Backup all open databases

Point taken. I already use Time Machine, and do monthly exports of the full file/folder structure (for my wife to access, and because I don’t trust databases). I don’t fully trust Time Machine due to lost data in the past; it has gotten better but is still clunky. I personally use my Synology with C2 backup and two offsite Synologys for data backup, and much prefer the snapshots this provides of a full backup compared to Time Machine. (Worst-case scenario, I have a monthly full file/folder export with snapshots that I can open from any machine, such as a Windows PC, independent of the Mac and DT. Sure, I lose all the DB structure and DT add-ons, but I find the peace of mind of knowing all my files are in the folder structure I used in DT to be useful, and I do refer to these files often from my work PC via Synology.)

I like the idea of the incremental backup, and this seems to work pretty well. I keep all databases open, but it looks like this will actually open them if not in use. I also find this much easier when actually testing the backup (i.e., on my iMac I load a backup once every few months to make sure it still loads, etc.). A backup is pretty worthless if you aren’t testing it routinely to make sure it loads.

Kind of. Provided they’re all located in the same folder. And I doubt that there’s anything “incremental” about it.

In addition: all you do is create ZIPs of your databases on your disk. You still have to back up those, along with your other data. And you have to test those backups. I don’t see any advantage except that it keeps your machine from getting bored – now you’re backing up the same data twice.

I don’t disagree at all with what you say (as always :grinning:). It’s simply that I’m completely paranoid about losing months of manual transcription of my diaries (as well as other valuable databases) so I felt there was little harm in having that in addition to:

  • full backup of my MacBook Pro by CCC to an SSD nightly;
  • early morning daily offsite backup using Arq Premium; and
  • weekly Time Machine backups at home.

Now, please don’t anyone feed my paranoia by suggesting any additional backup method!

Stephen

I can understand that you don’t want to lose your data. But I don’t think that paranoia is the right counselor in this context. What you do with this script is simply to create a copy of your data (as a ZIP), on the hardware hosting your databases.
If that hardware is or becomes faulty, so might these copies. You do not reduce the risk of data loss by copying data from one part of your disk to another.

I’d suggest increasing Time Machine backups to hourly (that’s the default, afaik). That does not impede performance, as TM does incremental backups, and you can only change so much within one hour. Your current setup is, in my opinion, a bit too optimistic – whenever you create and then accidentally delete an important file between “early morning” and “nightly”, you won’t get it back. TM might help in that case.

There is probably no harm (except uselessly burnt CPU cycles). But there’s also no gain. Except perhaps a warm, fuzzy “secure feeling”. If I really wanted that, I’d rather create these ZIPs on a mounted volume (NAS, USB, whatever), i.e. separate from the hardware hosting my databases.

Otherwise, you might consider exchanging “paranoia” for “risk assessment”. I seem to remember that @Blanc explained their backup strategies in a lot of detail. And, as always, very convincingly.

These are saved to my Synology drive mount, so they auto-backup to my NAS, and I have versioning and snapshots (and two offsite backups) of all my data. This is a one-click backup that gives me peace of mind and is versioned/offsite/snapshotted.

Seems like a no-brainer additional layer for one click now. I see your point, but this gives me more comfort at night.

Thanks to @Stephen_C for the script. This works great for my needs.

I see and agree. To save the click (and avoid problems with occasionally faulty human memory), you could use a shell script that creates the ZIPs on the mounted volume regularly (using cron).

Over my head!

This is the point where I ask if anyone has basic scripting/Apple script resources if I need to do more.

My use of NAS etc. is all Synology-based (I feel comfortable with the GUI; the command line gets shakier).

@chrillek thanks for your usual, carefully considered and cogently argued, post. You’re right that I probably need to reconsider my backup strategy—and certainly correct that there’s a gap between early morning and late evening.

I’ll have a think…thank you.

Stephen

I am 99% iOS/iPadOS and only use DT on the Mac essentially for backup. My Time Machine backups obviously won’t catch changes on my iOS/iPadOS devices, as I don’t have an always-on Mac anymore.

Would you have any recommendations for incremental backups throughout the day from iOS/iPadOS?

(I use synology WebDAV as sync store, do not have great remote access from work due to poor cell reception).

Since everyone is reconsidering backup strategy… I wonder if there might be a better way than booting up the Mac weekly to get backups, so I could remove the Mac from the workflow entirely?
TIA

My backup loophole, kindly identified by you, is I think now filled. I’ve chosen to use CCC to make hourly backups during, effectively, the working day. As those are snapshots on my Data volume they are subject to CCC’s retention rules and will not be retained for more than 24 hours (which is ideal for me).

With apologies for slightly derailing this thread my DEVONthink backup script will now quietly slide into retirement.

Thank you, as always, for your wise counsel.

Stephen