Backing up DTPO database

I know synchronization and backups are complicated topics, and I’ve read the various postings explaining why. I am considering using Devonsync or ChronoSync, but haven’t yet taken the plunge. More comments on the tradeoffs between the two, or user experiences with each, would be welcome. But really, I may not need a full-fledged synch program, because I now mostly use DTPO on one computer – and I can be religious about doing only that, and not trying to keep the database precisely synched on two computers. However, this leaves the question of backups for safety. I’m aware that I can make backups/archives from within DTPO, that I should do so regularly, and that doing so is completely safe. The downside, of course, is that I need to remember to do so at the proper interval, and this is one more thing in an already complex workflow. Hence I’m also wondering about automated approaches.

I believe – but would love to confirm – that Time Machine/Time Capsule backups will work with DTPO (and ideally with Sente and Scrivener, which I also use). I don’t think there is much doubt that they will. But further – and here’s where I hope someone on the forum has experience – I’m also using a Pogoplug, with Active Copy backing up all my files to a remote HD. This includes DTPO databases, Scrivener project files, and Sente libraries. Leaving aside the latter two programs – this is a DTPO forum, after all – does anyone have experience using Pogoplug for DTPO backups? Will the files be usable if I need them later? I think so, since the backups go in only one direction – I’m not working with any files on the Pogoplug while DTPO is open on my computer – but all advice is welcome.

The main question is whether the program in question keeps data cached while running, writing them to disk only occasionally. This is a trick that makes many programs run faster, but the caveat is that the data on disk are a faithful representation of what you, as a user, are seeing only after they have actually been written to disk.

The best way to make sure you are not missing anything in your backup is simply to quit the programs, because that forces them to write everything to disk. Then back up to your heart’s content.
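Assuming a typical setup, the quit-then-backup rule can be sketched in a few lines of shell. This is only a sketch – the database name, destination folder, and process name below are placeholders, not anything from this thread:

```shell
#!/bin/sh
# Sketch: refuse to archive while the app is still running, since its
# cached, unwritten data would be missing from the copy on disk.
# archive_db <package-dir> <dest-dir> <app-process-name>
archive_db() {
    db="$1"; dest="$2"; app="$3"
    if pgrep -x "$app" >/dev/null 2>&1; then
        echo "Quit $app before backing up." >&2
        return 1
    fi
    mkdir -p "$dest"
    stamp=$(date +%Y-%m-%d_%H%M%S)
    # Compress the whole package directory into one timestamped archive.
    tar -czf "$dest/$(basename "$db")-$stamp.tar.gz" \
        -C "$(dirname "$db")" "$(basename "$db")"
}

# Demo on a throwaway package so the sketch is self-contained.
demo=$(mktemp -d)
mkdir -p "$demo/MyDatabase.dtBase2"
echo "a record" > "$demo/MyDatabase.dtBase2/item.txt"
archive_db "$demo/MyDatabase.dtBase2" "$demo/archives" "no-such-app" \
    && echo "archived"
```

The guard is the whole point: if the process check fails, nothing gets copied, so you can never accidentally snapshot a live database.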

Thanks! I posted this question on the Pogoplug forum too, with your comment about data caching as the key issue. This is what someone from the company said:

"Short answer - I wouldn’t use Active Copy to back up a package database like this. It does not, in fact, cache data. The way it works is to listen for change events (from the OS kernel) for any file in a directory marked for Active Copy. At that point it will wait for 60 seconds to see if there are any other events. 60 seconds after the last event, it will copy from the computer to the Pogoplug directory ANY file with a different size or date (different, not newer, larger, etc.)

I do use it to back up all my documents, my desktop, and my outlook ost file, but haven’t tried to use it for the types of backup you have planned."

So – Pogoplug does NOT cache, and hence avoids the problem you mentioned. But that apparently isn’t enough, since they still do not recommend using Active Copy with a package database like DTPO.

The most reliable backups are made when the DT database is closed. I use ChronoSync (two clones and a mirror) and Time Machine. Have recovered databases successfully from each.

The most important point, IMO – don’t rely on the “how well does it work” opinions of others. Test your own backups. Then test again. Then test regularly. It’s your data; our opinions won’t recover you from a disaster.

There is more than one reason why data are somewhat volatile while the program is running, so shutting it down is a good way – sometimes the only way – to make sure the data on disk (i.e., the ones you will be backing up) are self-contained.

I use ChronoSync, Time Machine and QRecall to back up my data to five different hard drives stored at different locations (I am pretty paranoid about data safety) and so far haven’t run into a single problem. Nonetheless, test before relying on anything too heavily. And also try a restore – don’t take the backup procedure running through without a problem as an indication that your data are safe.

A tip for ChronoSync:
If your database is really large, tick the “Dissect packages” option when setting up a comparison. That way only the changed files inside a package will be transmitted. As long as I make sure that DTPO isn’t running, this has never caused a problem here. ChronoSync is a great program for mirroring to another hard drive, but it is not a full-fledged backup solution.

QRecall: An awesome program that, for some reason, is not well known for what it does. Very friendly and personal support, too. If I had only one choice of backup program, QRecall would be it for me.

Just to add to the suggestions on the topic, when I switched from a local hard drive for Time Machine backups to a Time Capsule, I also downloaded the utility TimeMachineEditor to schedule my TM backups. I did so because I found the normal TM backup routines to be unacceptably slow over my network. As a benefit of having a fixed backup schedule, I now use a timed QuicKeys macro to ensure that DT is closed before the backup starts. There are multiple ways one could trigger the same action without QuicKeys – a script attached to an iCal event is one example.
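For those without QuicKeys, the same close-DT-before-the-backup-window idea can be driven by a plain scheduled job. A hypothetical crontab entry, swapping cron in for the iCal-event approach mentioned above (the 1:50 am time and the script path are made-up examples – adjust to run just before your own TM schedule):

```
# Hypothetical crontab line: quit DEVONthink at 1:50 am, ten minutes
# before a 2:00 am Time Machine run, then make an archive.
# /Users/me/bin/archive-dtpo.sh is a placeholder for your own script.
50 1 * * * osascript -e 'tell application "DEVONthink Pro Office" to quit' && /Users/me/bin/archive-dtpo.sh
```

The `&&` matters: the archive script only runs if the quit command succeeded, which preserves the rule that backups happen only when the program is closed.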

It’s a good idea to periodically run Tools > Verify & Repair to check for any database errors. Otherwise, one may be making backups of a damaged database, which could lead to future frustration.

Also think about Bad Things that might happen to your computer equipment, including your backup drives – burglars, fires, floods, etc.

Periodically, I run one of the database archive scripts in Scripts > Export to produce complete compressed archives that are stored offsite. These scripts first run the Verify & Repair routine and will give notice of an error before making a backup. (My databases are self-contained, not Indexed, so these archives contain all my data.)

My own experience with ChronoSync’s “dissect packages” option and DTPO is that it can result in corrupt databases – even with databases closed and DTPO shut down on both machines. This happened frequently enough that I abandoned using that option and now just copy whole database packages.

I have been using TriBackup 5 to sync the DEVON folder (holding all of my databases) on a portable drive with my home computer, and have done so for over a year without problems. However, I do close DTP before proceeding.

As for Time Machine, I do not recall having seen a problem, but again, I have not tested every backup (it runs each hour during the work day).

Thanks very much – all this gives much food for thought, especially regarding the alternative programs for backups. And I appreciate all the wisdom from those with far more experience than I.

One additional thought. I’m now convinced NOT to put my DTPO database into Dropbox, or Active Copy, or anything like that. I will instead regularly save zipped archives, manually, as Bill very persuasively suggests. But having done so, could I perhaps also automate at least part of the process, to create the redundant backups and extra copies (on other HDs and in the cloud) that reduce the risks of other Very Bad Things like burglars, fires, etc.? (This was inspired by seeing that there are scripts for archival backups to iDisk, JungleDisk, and ZumoDrive, none of which I use.)

How’s this:

a) at the end of every DTPO work session, save a zipped archive (either with File > Export > Database Archive OR [as Bill suggests, because it includes extra verification steps] with Scripts > Export > Daily Backup Archive).

b) save this zipped archive into a separate folder on my computer, let’s call it, say, “Zipped DTPO archives”

c) close DTPO

d) tell Dropbox (or Active Copy) to synch the folder “Zipped DTPO archives”

This should – I think – mean that Dropbox (or Active Copy) then starts synching or duplicating the zipped archive to wherever I’ve told it (this could be external HDs, or other computers, or a cloud server). But it would ONLY be synching/duplicating the saved, zipped copy – in other words, nothing would be changing with the data inside any of the package files. Hence it should be safely backup-able.

(I do realize cloud backups can take a while, esp. for a large database, but even so there shouldn’t be a risk as long as I’m not trying to deal with the zipped version of the archive right away.)
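One small refinement to step d): if the zip is written straight into the folder Dropbox or Active Copy is watching, the watcher might start copying while the archive is still being written. A sketch of a safer pattern – finish the file elsewhere on the same disk, then move it in; the folder and file names here are placeholders:

```shell
#!/bin/sh
# Sketch: write the zip in a staging location first, then mv it into
# the synced folder. A mv within one filesystem is atomic, so the
# sync tool sees either nothing or the complete file, never a
# half-written archive. Paths and names are placeholders.
publish_archive() {
    zip_file="$1"; synced_dir="$2"
    mkdir -p "$synced_dir"
    mv "$zip_file" "$synced_dir/"
}

# Demo with a throwaway "archive" and synced folder.
demo=$(mktemp -d)
echo "zipped database archive" > "$demo/MyDatabase-archive.zip"
publish_archive "$demo/MyDatabase-archive.zip" "$demo/Zipped DTPO archives"
ls "$demo/Zipped DTPO archives"
```

Active Copy’s 60-second settle window makes a partial copy less likely anyway, but the atomic move removes the race entirely, for Dropbox too.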

One downside is that my folder of zipped archives could grow very large very fast – this is a problem for Dropbox, which synchs to all my computers. So in the case of a very large DTPO database, I wouldn’t want to keep lots of zipped archives. (I have a MacBook Air, with a small HD, so this is an issue for me.)

However, it’s not a problem for Active Copy on the Pogoplug, which doesn’t synch or mirror, but simply copies the files in the source folder (“Zipped DTPO archives”) to the destination (an external HD). So I could keep just the most recent zipped archive (or two) on my computer, and ALL the previous zipped archives would remain on my Pogoplug-connected drives. Lastly, since multiple Pogoplugs can be set up to duplicate each other over the Internet, it’s easy to have the same set of archival backups created on hard drives in physically distant places (even in different states or countries).
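Under that assumption (the full history survives on the Pogoplug side), keeping only the newest archive or two locally could look like this sketch; the keep-count and the `.zip` naming are placeholders:

```shell
#!/bin/sh
# Sketch: delete all but the newest N local archives; older copies are
# assumed to remain on the Active Copy destination drive.
# prune_archives <dir> <keep-count>
prune_archives() {
    dir="$1"; keep="$2"
    # List newest-first by modification time, skip the first N, delete the rest.
    ls -1t "$dir"/*.zip 2>/dev/null | tail -n +"$((keep + 1))" |
    while IFS= read -r f; do
        rm -f "$f"
    done
}

# Demo on a throwaway folder with five archives of increasing age.
demo=$(mktemp -d)
for i in 1 2 3 4 5; do
    f="$demo/archive-$i.zip"
    : > "$f"
    touch -t "202001010100.0$i" "$f"   # distinct mtimes; 1 is oldest
done
prune_archives "$demo" 2
ls "$demo"
```

Run after the sync tool has copied the new archive off the machine, not before, or the only remaining copy of an old archive could be deleted.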

What do you think? Does this make sense? By working only with the zipped backups, does this avoid the obvious data corruption issues? Yes, I agree I’ll need to test it to be sure, not rely on others’ opinions etc., but the experts on this forum have a huge amount of experience, and I’m sure you all will see whether there are any red flags in this procedure.

Thanks again!