date.toISOString() returns something like 2011-10-05T14:48:00.000Z. The first replace call changes the colons and the T to dashes; the second removes the dot and everything after it. So it is a tiny bit more elegant than the AppleScript way, after all.
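For reference, a minimal JavaScript sketch of that two-replace approach (the helper name is mine, not from the original post):

```javascript
// Turn an ISO timestamp into a filename-safe backup stamp.
// toISOString() yields e.g. "2011-10-05T14:48:00.000Z"; the first
// replace turns the colons and the "T" into dashes, the second
// strips the dot and everything after it.
function backupTimestamp(date) {
  return date.toISOString()
    .replace(/[T:]/g, "-")  // "2011-10-05-14-48-00.000Z"
    .replace(/\..*$/, "");  // "2011-10-05-14-48-00"
}

console.log(backupTimestamp(new Date("2011-10-05T14:48:00.000Z")));
// → 2011-10-05-14-48-00
```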
I’m sorely tempted to respond with an improved version of my AppleScript script (including the ability to cancel) but shall not because you will clearly win this little “contest”.
-- Daily backup archive - all open databases
-- Created by Christian Grunenberg on Mon Jun 22 2009.
-- Copyright (c) 2009-2022. All rights reserved.
-- addition for all open databases by coolgoose85
-- https://discourse.devontechnologies.com/t/backup-all-open-databases/50932
-- https://macscripter.net/viewtopic.php?id=24737
set this_time to replace_chars(time string of (current date), ":", "-")
tell (current date) to get "" & its year & "-" & (text -2 thru -1 of ("0" & (its month as integer) as text)) & "-" & (text -2 thru -1 of ("0" & its day as text))
set this_date to result
set backup_date to this_date & " " & this_time as string
-- https://macosxautomation.com/applescript/sbrt/sbrt-06.html
on replace_chars(this_text, search_string, replacement_string)
set AppleScript's text item delimiters to the search_string
set the item_list to every text item of this_text
set AppleScript's text item delimiters to the replacement_string
set this_text to the item_list as string
set AppleScript's text item delimiters to ""
return this_text
end replace_chars
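For readers more at home in JavaScript, the text-item-delimiters dance in replace_chars is equivalent to a split/join (function name mine):

```javascript
// Split on the search string, then join with the replacement —
// the same effect as setting AppleScript's text item delimiters,
// collecting every text item, and coercing the list back to a string.
function replaceChars(text, searchString, replacementString) {
  return text.split(searchString).join(replacementString);
}

console.log(replaceChars("14:48:00", ":", "-")); // → 14-48-00
```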
tell application id "DNtp"
set {od, AppleScript's text item delimiters} to {AppleScript's text item delimiters, "/"}
set all_databases to every database
set AppleScript's text item delimiters to od
repeat with this_database in all_databases
try
set this_path to path of this_database
set this_name to name of this_database
set this_archive to "/Volumes/YourVolumeName/Backup/" & this_name & " " & backup_date & ".dtBase2.zip" -- Change YourVolumeName to your external backup drive.
show progress indicator "Daily Backup Archive of " & this_archive steps 3
with timeout of 3600 seconds
step progress indicator "Verifying database " & this_name & "..."
if (verify database this_database) is not 0 then error "Database " & this_name & " is damaged."
step progress indicator "Optimizing database " & this_name & "..."
if not (optimize database this_database) then error "Optimization of database " & this_name & " failed."
step progress indicator "Zipping database " & this_name & "..."
if not (compress database this_database to this_archive) then error "Backup of database " & this_name & " failed."
end timeout
hide progress indicator
on error error_message number error_number
hide progress indicator
if the error_number is not -128 then log message "DEVONthink: " & error_message
end try
end repeat
Okay. I’ve run this a few times and it’s doing what I need it to do.
It no longer gives me permissions errors because it handles the dates with AppleScript instead of a shell.
Really helpful to simply open the databases I have changed and kick off the script after finishing work.
So thanks for all the tips and code ideas, guys. I appreciate it.
If you decide to use it, just be sure to edit YourVolumeName in the script and set it to the name of your backup drive… I’ve been running the script from Script Debugger, which gives me a log I can check in the morning to see if any errors occurred.
The only thing I might still like to change is to have all the messaging commands write to the DT3 log, but I haven’t looked at how to do that yet, so this is where I’m leaving it for now.
Time Machine is better than no backups at all. It’s easy to set up and generally performs well enough. However, redundant backups of important data are always a good idea.
imho Time Machine’s incremental backups are superior to full database copies
However, Time Machine data is stored locally; I also want to use cloud storage (offsite)
edit: for this backup, I use Arq Premium
In addition to Time Machine, I want a data backup disconnected from DEVONthink,
i.e. the ability to retrieve individual files without the complexity of retrieving the DT database
edit: for this backup, I use the export Files and Folders feature
Oh, I guess I have some homework to do. I didn’t realize you could back up the files within the DT databases as individual files without dealing with the DT database. I understand it’s easier with an index structure, but I need to explore how to do so when my data is stored within the DT database.
I’m not sure of @DTLow’s method here but we don’t suggest you mess about in the internals of the database package, in case you were headed that direction.
Does anyone have a script that will back up all open databases rather than doing it individually? I would like to set up a few databases as my primary grows, and this would simplify the backup process.
The database archive function isn’t intended for incremental backups; archives are best used as periodic full backups. Incrementals should be handled by your primary backup strategy, e.g., Time Machine and external drives.
Yes I have a synology for Time Machine, but I also do occasional full zip “snapshot” backups as well. Looking to make it a bit easier as I think several databases would make more sense for my setup, but would add a lot of manual steps (which would probably limit my keeping up with the backups as much).
First off, I have no clue about AppleScript, so I can’t do it…
The export script works fine per database, but the one above seemed to imply that it would make exports of all open databases. I saved this script as a new file, but it only does the one open database (not all).
-- First set the path of the folder to which you wish to backup
property pbackup_path : "/Users/[redacted]/Documents/DevonThink/Backups/"
property ws : "MyDefaultDatabase" -- Enter a default workspace, if needed. Makes for faster loading of an often-used workspace at the end of the backups. Note: you need to have saved it as a workspace first.
-- Next, the location of your DT databases
property databaseFolderPath : "/Users/[redacted]/Documents/DEVONthink/"
property databaseNames : {"Personal", "Financial", "Software tips"} -- The names of the databases you wish to open and backup
tell application id "DNtp"
display dialog "Create backups of all open databases?" with title "Backup all open databases" buttons {"Cancel", "Continue"} default button "Continue" with icon 1
-- Open the databases
repeat with a from 1 to length of databaseNames
set databaseName to databaseFolderPath & item a of databaseNames & ".dtBase2" as string
open database databaseName
end repeat
set this_date to do shell script "date +%Y-%m-%d-%H-%M-%S"
delay 3 -- Wait to ensure the databases have finished opening
set all_databases to every database
try
-- Set a general progress indicator
show progress indicator "Weekly backup of all open databases…" steps count of all_databases
repeat with this_database in all_databases
set this_name to name of this_database
-- We use the backup path property here
set this_archive to pbackup_path & this_name & " " & this_date & ".dtBase2.zip"
with timeout of 3600 seconds
-- Show which database we're processing
step progress indicator this_name as string
if (verify database this_database) is not 0 then error "Database " & this_name & " is damaged."
if not (optimize database this_database) then error "Optimisation of database " & this_name & " failed."
if not (compress database this_database to this_archive) then error "Backup of database " & this_name & " failed."
end timeout
end repeat
hide progress indicator
-- Show an alert when it's all done
display alert "Backups of all open databases complete. The next step will close all open databases and enable you to re-open your selected database."
set availableWorkspaces to workspaces
if availableWorkspaces = {} then return
if (ws is not missing value) and (ws is not in availableWorkspaces) then -- Check for the default workspace
log message "Workspace " & ws & " could not be found."
set ws to ""
end if
set chosenWorkspace to (choose from list availableWorkspaces default items ws multiple selections allowed no empty selection allowed no)
if chosenWorkspace is false then return -- Stop if the user cancelled
close every window
with timeout of 3600 seconds
close every database -- There is the potential this may not close a database if it's actively syncing.
end timeout
load workspace (chosenWorkspace as string)
on error error_message number error_number
hide progress indicator
if the error_number is not -128 then log message "DEVONthink:" & error_message
end try
end tell
Note:
I don’t run DT with all my favoured databases open at all times. Thus the script includes a list of those databases you wish to include in the backup.
The script also includes a database that you wish to open again at the end of the backup (there will be a prompt on screen asking you to do that when the backup ends). In order for that to work you’ll need first to have saved the workspace for that database.
I simply run the script manually once a week in response to a DT reminder.
In summary, to make this work all you need to do is complete the information required at the start of the script and then save the script somewhere you can access it easily (almost certainly in the DT scripts folder). Of course, it’s up to you whether or not you then wish to set a weekly reminder in DT.
Thank you this worked great! Multiple databases here we come!
(Quick naive question, as I know no AppleScript.) So it seems this script backs up everything that is open, and, where you have set the list of databases in the script, it would also back up those databases even if they were “unopened”?
I keep all of the databases open on my Mac at all times, but I am splitting the databases up more for DTTG (so I can have a wiki/pkm database to use wikilinks and do full sync). The rest will be shallow syncs (a reference for home, a work academic reference, family documents, and work documents). So I don’t think it matters whether I have the names in the script or not, if they’re always open.
Addendum:
Getting an issue which is above my pay grade with scripting.
When I put the folder location as user/…/Documents/DT Backup
My files end up in the user/…/Documents folder, and each backup’s file name starts with “DT Backup Database 1” etc.
How would I change the script to actually put the files in the DT Backup folder and keep the backup name as just the actual database name?
The list at the start of the script is, for me, a list of databases that will be opened so that they are backed up by the script. I believe (but have not tested it) that if any is already open there will not be a problem with the script. However, you do need, in that particular script, that list of database names to be there—because those are the databases the script will back up.
Try putting a forward slash at the end of the backup path in the script (as in my script).
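The reason the slash matters is that the script builds the archive path by plain string concatenation; a quick JavaScript illustration (paths are made up):

```javascript
// Without the trailing slash, the folder name becomes a filename prefix.
const withoutSlash = "/Users/example/Documents/DT Backup";
console.log(withoutSlash + "Personal.dtBase2.zip");
// → /Users/example/Documents/DT BackupPersonal.dtBase2.zip (wrong folder)

// With the trailing slash, the archive lands inside the folder.
const withSlash = "/Users/example/Documents/DT Backup/";
console.log(withSlash + "Personal.dtBase2.zip");
// → /Users/example/Documents/DT Backup/Personal.dtBase2.zip
```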
The one “/” fixed the naming issue! And I understand now. I made a dummy database real quick and it backed it up fine. Nifty script to open everything else up.
This is so helpful, I appreciate it. I do this weekly to keep a versioned history, and always hated it: it was such an annoyance to do manually that I was limiting the number of databases just because I knew I wouldn’t keep up with the backups if I had more.
Do you happen to have a script for exporting as “Files and Folders”? I usually do a once-a-month update of all our family documents to a Synology shared folder so my wife has access to the documents in a plain folder format (as she would not even consider DT). It would be nice to automate this as well.
I don’t think so. A script running in DT can’t back up (aka “copy”) a closed database. Simply because DT does not “see” it – it is nothing more than a file outside of DT. Only if it is open can it be worked on with a script (which need not be written in AppleScript, either).
Personally, I’d forgo this “backup my open database” stuff, simply because I don’t see a reason to do it besides the normal backups with Time Machine and the like (Backblaze and Arq, in my case). Those tools simply back up what I tell them to and don’t care whether anything is open or not.
You need a reliable (!) backup method (no, several reliable backup methods) anyway. Adding application-specific procedures on top of that seems like too much work (to me, at least).