backup and sync

I have been using the latest version of DT Pro on my Mac. I would like to keep a backup of my database on an external HD. How could I set things up so that any changes I make to the database on the Mac will be reflected in the archived copy on the external HD? In other words, I want backup and syncing between the actual and archived copy done automatically. Any help would be greatly appreciated.

If you want continuous, real-time automatic synchronization between a database on one computer and an archive on another computer or external hard drive, I don’t know how to do that.

Perhaps it can be done via scripting or an Automator workflow, for example. But I suspect that, given the computer resources available at the consumer level today, it would prove to be an intrusive process; one would likely experience slowdowns. When DT Pro's database structure is modified in version 2.0, synchronization of the file contents will become a simpler matter. (But I'll probably still operate as described below.)

So I’ll answer in a different way.

My databases are very important to me. I’ve got years of effort involved in collecting and organizing their information contents. And I know that things can go wrong. A hard drive can fail, for example.

So I've grown into the habit of periodically running DT Pro's Scripts > Export > Backup Archive. When I say "periodically" I don't mean that I do this on a daily, weekly, or monthly basis; I wouldn't trust any predetermined schedule. Instead, I tend to run the Backup Archive script whenever I've put a substantial investment of time and energy into changing the database, whether by adding or editing content or by changing the organization of material. Depending on the database and the level of activity, I may make 2 or 3 archives on the same day, or not create a new archive for months.

It only takes moments, at break time, to initiate the Backup Archive script and choose the location for saving the compressed and dated archive file. Later, it only takes moments to initiate a copy to another computer on my wireless network and/or to an external portable drive.
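That later copy is just an ordinary file copy. From Terminal it might look something like this (the archive name and volume name are placeholders; use whatever the Backup Archive script actually saved and whatever your drive is called):

    # copy a dated archive produced by Backup Archive to the external drive
    cp "$HOME/Archives/MyDatabase 2007-05-01.zip" "/Volumes/External HD/DT Archives/"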

Perhaps I’m remiss in not making copies on removable media to be stored at a different location, so that a fire wouldn’t result in a total loss of my databases. I would certainly recommend that a business do that. I don’t currently have access to any remote Internet data storage that has enough space to hold my current database archives. (I’ve got .Mac, but several of my database archive files are too large, individually, to fit in my account’s space.)

Please note that my databases tend to be almost totally self-contained (Imported rather than Indexed captures of files). And my habit is to put the information that's most important to me into a DT Pro database. A corollary is that I don't bother to put a good deal of the files on my computers into a DT Pro database.

In addition to database backups, I’ve got bootable backups of my three computer boot drives on external hard drives. But I rarely update those bootable backups; they are there just to get a computer back up and running if the boot drive were to fail.

I agree that continual syncs are not practical, but I make nightly copies of my entire Home folder, including DT databases, with Synchronize! X Plus, an inexpensive backup-and-synchronize utility from Qdea Software. The copies go to an external FireWire drive.

Its Backup mode creates an incremental copy of all new data, whereas its Synchronize mode creates an exact duplicate of the folder. I prefer the latter method, which takes less than two minutes. More than once the duplicate has saved me from loss or corruption of data on the main drive.

qdea.com/

Thank you all for your replies.

It seems I have to use Scripts > Export > Backup Archive, save the entire database as a .zip archive, and then transfer that to the external HD on a regular basis.

From the DT Pro preferences I have instructed the software to keep 3 backups on a daily basis, but it would not let me choose the location of those backups. In case the software crashes, etc., how do I actually access them and get up and running again?

As for backing up to another directory, you can do this from AppleScript:


tell application "DEVONthink Pro"
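    -- archive the current database, files included, to the external volume
    -- (the destination path is an example; change it to match your drive)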
    backup database current database to "/Volumes/External Drive/Backup" including files yes
end tell
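If you want this to run unattended, one approach (my own suggestion, not a built-in DT Pro feature) is to save the script with Script Editor and let cron invoke it through osascript, with a crontab entry along these lines (the script path is an example):

    # run the saved backup script every night at 2:00 a.m.
    0 2 * * * /usr/bin/osascript "/Users/you/Scripts/DT Backup.scpt"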

John

Your internal backups are stored inside the database package folder.
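If you're curious, you can peek at them from Terminal; a rough sketch (the Backup* folder names and the database path are from memory, so verify them on your own system):

    # list the internal backup folders inside the database package
    ls -d "$HOME/Databases/MyDatabase.dtBase/Backup"*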

If you need to use an internal backup, choose Tools > Restore Backup. See this quote from the online Help:

"Restore Backup: Restore an internal backup. Choose the backup you want to restore from the dialog window and click Open. You can identify backups by their creation date.

Note: Restoring a backup simply swaps the current database and the backup. The backup becomes the active database, the former database a backup. You’ll never lose any data by using this command."

Comment: When I’ve spent hours adding content, editing and organizing a database and it’s time to take a break, I’ll often select Tools > Backup & Optimize. While I’m getting a cup of coffee or whatever, DT Pro is compacting and backing up the database, so I’ve got a very recent backup should I need it. (A good idea, even though I haven’t needed a backup in well over a year.)

I do run a nightly sync between my home and work computers (ChronoSync), but I would greatly appreciate the ability to do partial backups rather than the hundreds or thousands of megabytes every time, as it currently stands.

ChronoSync cannot do partial backups of your database because it cannot read the database structure.

You could use ‘rsync’. It's like a copy command, but it only transfers those portions of a file that have actually changed. Very fast and reliable; I use it to transfer my complete home folder from one Mac to another on a daily basis.
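A daily mirror can be a one-liner; something like this (the paths are placeholders for your own source folder and backup volume):

    # -a preserves permissions and timestamps; --delete keeps the copy an exact mirror
    rsync -a --delete "$HOME/" "/Volumes/BackupDrive/HomeMirror/"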

Are there any potential issues with resource forks or HFS+ metadata when using rsync with a DEVONthink database? I love the convenience of rsync, but I've heard that it can do some scary things to Mac files. Are there any options that should be selected? (Right now I'm using -avE.)

Yes, the standard version of rsync does not honor HFS+ metadata. You could have a look at RsyncX for an older but HFS-aware variant of rsync.
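For comparison (the database path is a placeholder, and the RsyncX flag is quoted from memory, so check it against RsyncX's own documentation):

    # Apple's rsync on 10.4: -E copies extended attributes and resource forks
    rsync -avE "$HOME/MyDatabase.dtBase" "/Volumes/BackupDrive/"

    # RsyncX: --eahfs enables its HFS+ metadata handling, if I recall correctly
    rsync -av --eahfs "$HOME/MyDatabase.dtBase" "/Volumes/BackupDrive/"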