Time Machine, Backup and ZIP-Archives

I am having trouble with Time Machine. Every hour Time Machine wants to back up my complete database, all 2 GB of it, even when there have been only minor changes, for instance after I have added some RTF from Safari. I can watch my hard disk's free space shrinking every hour. What am I doing wrong?

I have told Time Machine not to back up my main DT Pro database. In fact, one of the reasons I switched to Mac OS X 10.5 was Time Machine's ability to save my work continuously during the day, so I am not at all happy with this.

I make a backup every day to an external hard drive via Scripts > Export > Backup Archive… (the menu reads "Exportieren > Backup-Archiv" on my German system).

These archives are a lot bigger than the internal backups: the ZIP archive on the external hard drive is 2.54 GB, while the latest internal backup is 1.99 GB. (The Files folder inside the database is 1.62 GB.)

I thought the ZIP archive should be smaller than the daily backups. Is there something wrong with my database, or with my assumptions about the sizes of ZIP archives? (I get no error messages when using the various tools Bill DeVille has mentioned so many times.)

Thank you for any information about the sizes and, hopefully, help with Time Machine.


iMac G5, 2.1 GHz, 2.5 GB; Mac OS 10.5.2, DT Pro 1.5.1

The zip archives created by the script are optimized and include the Files folder, and are therefore usually larger than the internal backups, which are not always optimized and do not contain the Files folder.

You’re not doing anything wrong. It’s a problem with the design of Time Machine (you’ll hear similar complaints from Entourage users). It considers your database to be one document, and so if something changes it will back up everything. There are no hooks available for us to inform it that it needs to update only a part.

I want to have full bootable backups (I plan to use SuperDuper!) and also to have an incremental backup process in place.

Does anyone know if ChronoSync has the same incremental backup limitation as Time Machine, that is, backing up the entire DT database rather than just the changes?

What incremental backup solutions for DT databases do work?

What about offsite services that operate as incremental backup services, such as BackJack: do they transfer the entire DT database, or only the incremental changes?

Any feedback is appreciated.

Thank you Annard and Christian for the hints.
I am wondering why others obviously have no problems with Time Machine. Bill DeVille has mentioned several times that Time Machine works very well for him, and I can't imagine that his databases don't change several times a day. Maybe there is a workaround? (I know, waiting for DT Pro 2.0 …)

Besides, there has been a posting by Eric Boehnisch in another thread:

Must have been a misunderstanding, I assume.


That was probably wishful thinking on Eric's part. The behaviour of Time Machine is correct: we declare our database package to be one document to the system. When Time Machine sees that a single file inside a package has changed, it cannot know whether the package as a whole is still consistent, so it treats the entire package as changed. In our case this is most inconvenient.

And like most of Apple's consumer solutions, it is very elegant until you hit a situation different from that of 95% of the target market; then you're on your own. I'm not sure whether this will change for DT 2.0, as the package system will not go away.

I just saw this on the net: Time Machine may back up the entire Aperture Library on each run. I have the uncanny feeling that this may apply to your situation as well.

If you use the Finder's "Show Package Contents" command, you'll see that a database is actually stored as two chunks: a number of .database files, which contain material stored in DT's internal format, and a "Files" directory, which contains PDF files and other material stored in their native formats.
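The same two-chunk layout can be seen from Terminal. A minimal sketch, with a hypothetical package name; the mkdir/touch lines only fabricate a stand-in so the inspection commands have something to operate on, and you would skip them when looking at a real database:

```shell
# Hypothetical database package name -- substitute your own.
DB="MyDatabase.dtBase"

# Illustration only: fabricate a tiny stand-in package.
# Skip these two lines for a real database.
mkdir -p "$DB/Files"
touch "$DB/Content.database" "$DB/Files/paper.pdf"

# The .database files hold DT's internal format...
find "$DB" -name '*.database'

# ...while the Files directory holds PDFs etc. in their native formats.
ls "$DB/Files"
```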

BackJack appears to see the same level of detail that the Finder does. Its log shows that it backs up the .database files individually, and then only the changed items in the Files tree.

In my case, the .database files are relatively small, as the vast majority of my information came to me in PDF form. Thus, even though BackJack might have to back them up whenever I make small changes, the data load involved is relatively small. YMMV.

BackJack is also able to see package contents in other cases, such as .scriv files. So that appears to be the standard, Apple-supported behavior. If other programs can’t do it, that sounds like a problem with those programs, not DT.

(On the other hand, the opaqueness of the .database files is DT’s responsibility.)


Ursula, Time Machine does work well for me. Yes, my main database usually changes many times during the day. Yes, hourly backups of the package file can use a lot of space.

Remember, though, that Time Machine doesn’t keep all of those hourly backups. As time passes, the storage space devoted to a “slice of time” drops.

I do most of my work on a ModBook, a custom Mac tablet based on a MacBook. I’m not always connected to my Time Machine backup drive, though that’s active much of the time.

My Time Machine backups are to a 500 GB hard drive for which I paid less than $0.50 per GB of capacity. That drive is handling Time Machine backups for two computers, although one of them is used infrequently. Looking at the remaining space on the drive (and Time Machine’s design), it will handle my backup chores indefinitely. I may need to make some choices from time to time (pun intended). Do I need to keep all the backup files indefinitely, or not? Are there certain files that I may want to keep indefinitely?

If I wish to obsess over keeping every file that has been downloaded to my computers, or every version of every file that has been modified, Time Machine will pretty much let me do that, year after year, decade after decade. I could use multiple hard drives to maintain such an archive of that computer's environment over time. But I'm not that obsessive. There are many files that I consider transient; once in a while I throw them away, and I have no interest in keeping archived copies of them either.

There are some files that I'm rather obsessive about. I've lovingly built reference collections over the years, and they live in self-contained DT Pro databases that I want to preserve. Time Machine is a convenient backup system to help me do that: if something goes wrong, I can recover a database from Time Machine.

But as good as Time Machine is, it’s not enough for my most important files. For one thing, the hard drive resides at my home. Catastrophic events could occur. The hard drive(s) holding Time Machine backups could fail. My computers and drives could be stolen. My cabin could burn to the ground.

That’s why I also use the Backup Archive routine for DT Pro/DT Pro Office. It performs useful database maintenance every time it’s run. And it produces the smallest possible compressed and dated archive of a database. I can store those archives on external media. For added security, I periodically save recent archives to a DVD and store that offsite.
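For the curious, the effect of such an archive run can be sketched in a few lines of shell. All names and paths here are hypothetical, I use tar/gzip for the sketch where the actual Backup Archive script writes a .zip, and the mkdir/touch lines only fabricate a stand-in database:

```shell
# Hypothetical database and destination; in real use DEST might be
# something like /Volumes/External/Backups.
DB="MyDatabase.dtBase"
DEST="backup-dest"

# Illustration only: fabricate a stand-in package and the destination.
mkdir -p "$DB/Files" "$DEST"
touch "$DB/Content.database"

# Write a dated, compressed archive of the whole package.
STAMP=$(date +%Y-%m-%d)                        # e.g. 2008-03-15
tar -czf "$DEST/MyDatabase-$STAMP.tar.gz" "$DB"
ls "$DEST"
```

Such dated archives can then be copied to a DVD or other offsite media, exactly as described above.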

Finally, I don’t want to depend on scheduled backups, such as daily backups. In DT Pro Preferences > Backup I’ve set the frequency to Never. Is that a bad quality assurance decision? No; if I act prudently, it’s a quality assurance plus. If I’m in the process of adding batches of hundreds or thousands of new items to a database, I’m not going to wait until that night to back up my work. After each batch I’ll run Tools > Verify & Repair to make certain the database wasn’t messed up by adding corrupted files, and then Tools > Backup & Optimize to make a current internal backup (or I’ll run Backup Archive while taking a coffee break). If something goes wrong that day while I’m making major changes to a database, I’ll lose little or no data. Had I waited for a daily backup, I’d be in trouble.

I’ve had two computers fail. My TiBook’s hard drive crashed after 5 years of use, but I had external copies of my databases. My Power Mac G5 blew its power supply a few days ago (a common problem for those computers), and I’ve got external copies of the databases on that machine (though not a Time Machine backup of its two 500 GB drives). So a computer failure is a minor inconvenience for me.

The workaround in the above-mentioned document:

Does this mean that, when using Time Machine, we should never back up a DEVONthink database while DT is open, even when we are not changing any data at that moment?

For the moment I have excluded the DT database from Time Machine. What happens when I include this database again, say once a day? Do I have to quit DT for this? Could this be a workaround: using Time Machine all day long for (in my case) every application except DEVONthink?
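For what it's worth, on later systems such a toggle could in principle be scripted. This is only a sketch under the assumption that the command-line tool tmutil is available; tmutil first shipped with Mac OS X 10.7, so on 10.5 the exclusion list lives in the Time Machine preference pane instead, and the path below is hypothetical:

```shell
# Hypothetical database path.
DB="$HOME/Databases/MyDatabase.dtBase"

# Guarded so the script is harmless where tmutil does not exist
# (anything before Mac OS X 10.7, or a non-Mac system).
if command -v tmutil >/dev/null 2>&1; then
    tmutil addexclusion "$DB" || true      # daytime: skip the database
    # ...later, e.g. once a day from a launchd job:
    tmutil removeexclusion "$DB" || true   # let one backup include it
    echo "toggled Time Machine exclusion for $DB" > tm-toggle.log
else
    echo "tmutil unavailable; use the Time Machine preference pane" > tm-toggle.log
fi
cat tm-toggle.log
```

Whether DT needs to be quit while that one backup runs is a separate question, as discussed above.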

Is it possible that one day Time Machine could handle DT databases? I don’t understand anything about programming…


Thank you, Bill, for the explanation of your backup strategies. Mine is similar to yours; I am a little bit, maybe rather, paranoid about my data.

What I had never considered was setting DT’s internal backups to ‘Never’. I like routines I don’t have to remember, and Time Machine seemed to be what I was looking for: easy to maintain. It’s a pity that it doesn’t work together with DT the easy way when you are handling large databases.

(Of course I am also backing up my data to other volumes regularly.)