Backup Strategies

Yeah. Wasabi bills a minimum of 1 TB of storage though, so it’s not ideal for anyone actually storing less.

I use Arq with B2, which costs $2.50 monthly for the 500 GB I need ($0.005/GB/month).

The cold storage options from Google and Amazon can be cheaper, but they are very expensive if you need to restore. B2 was the middle ground, since both storage and restores are very cheap.

All Arq destination options are listed here.

And if you happen to have Microsoft 365 and don’t know what to do with your TB of free space on OneDrive, Arq plays nice with OneDrive too :slight_smile:

I pay 69 Euro per year for 6 TB in OneDrive.

This is 11.5 Euro per year for each TB, or about 96 Euro cents monthly.

:hugs::hugs::hugs:

In regards to backup of my Mac Mini M1 and DEVONthink, my concept is as follows:

I use Time Machine to automatically back up my system over USB to an SSD, apart from some excluded folders such as local DT backup archives, Downloads and a few others.
Just in case, as it does not hurt and is both encrypted and automatic.

My main data is kept on an external APFS-encrypted Thunderbolt drive, or in DEVONthink.

I have OneDrive set up with the Sync / Cache location on that external drive, all content pinned.
As I do not trust any cloud provider, lots of content is encrypted with Cryptomator, which can be mounted on the Mac as a regular volume.

DEVONthink is configured to sync to a remote WebDAV server and additionally to OneDrive (the local Sync Location, which gets uploaded to OneDrive).
This saves my databases encrypted to those remote locations.
On my WebDAV server I create and rotate backups from time to time, but rarely.

I have some imported DT databases, some open, some as encrypted disk images.
Also, I started to index one of my Cryptomator volumes, which will be the way to go for me.
I plan to migrate all content to Cryptomator, which is stored on OneDrive and, via DT, synced to both WebDAV and OneDrive (the indexed database itself is not synced to OneDrive again, but all others are).

Then, every few days, after adding or changing something, I back up the OneDrive data to local VeraCrypt-encrypted disks: one SSD and two HDDs, which I rotate weekly as archives.
This backup is simply done with “rsync” and a bash script.
The same is done with my Cryptomator content, which frees that content from depending on working Cryptomator software: the backup contains the unencrypted files, but on an encrypted volume.

And finally, my /Users/tja folder is copied to those VeraCrypt volumes too, and here I am not sure what should be saved and what not.
Lots of stuff below ~/Library seems obsolete.

This is because I don’t trust Microsoft: I already lost one account and hundreds of GB of content due to Microsoft closing that account.
This could happen at any time.
So, I use OneDrive but always expect it to vanish. A family account with about 69 Euro yearly for 6TB is hard to beat.

The same goes for my Cryptomator volumes - at any time after a macOS update, this may cease to work.

Making regular backups / archives to local VeraCrypt disks ensures that I could “survive” both a OneDrive shutdown and a Cryptomator update problem.

So far, I rsync the whole /Users/tja in two steps:

The first step excludes ~/Library and ~/Databases (where my DT databases lie).

The second step rsyncs ~/Library and excludes ~/Library/Caches.

I am thinking about excluding any and all “Caches” directories, but am not sure.
There are also more cache* directories: “Cache”, “cache”, “caches.*”.

Just getting into this whole topic, as I have been using Arq to back up everything (including my DT databases whilst open).

  1. Am I to understand that backing up the DT databases themselves is not clever, and that I should only back up ~/Backups (where the ‘daily backup archive’ script stores the backups)?

  2. Using the ‘daily backup script’ is manual, so is there an easy way of automating this? If one is backing up the backups, and Arq does retention etc., wouldn’t it be better if the ‘daily backup script’ didn’t create a new file each time (with the date in it)?

…sorry for the dumb questions, but I find backing up DT confusing and I want to get it right.

There’s quite an interesting discussion in this thread. I use a modified version of the script I posted there to back up all open DEVONthink databases, plus a Hazel rule that ensures I keep only the two most recent versions of each DEVONthink archive.
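For anyone without Hazel, that retention rule can be approximated in plain shell (the archive directory and naming pattern below are assumptions for illustration, not the actual rule):

```shell
#!/bin/sh
# Keep only the two newest archives: a shell stand-in for the Hazel rule.
# A temp dir stands in for the real archive folder, e.g. ~/Backups.
set -e
ARCHIVES=$(mktemp -d)
cd "$ARCHIVES"
touch -t 202401010000 'db-2024-01-01.zip'
touch -t 202401020000 'db-2024-01-02.zip'
touch -t 202401030000 'db-2024-01-03.zip'

# ls -t lists newest first; tail -n +3 yields everything beyond the
# two most recent, which we then delete
ls -t db-*.zip | tail -n +3 | while IFS= read -r f; do
  rm -- "$f"
done

ls db-*.zip   # the two newest archives remain
```

This breaks on file names containing newlines; for a hand-named archive folder that is usually an acceptable trade-off for readability.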

Stephen

Thanks for the reply, @Stephen_C!

I am playing with a script I found on another thread (but I cannot remember who posted it; maybe you!).

  1. How do you kick off the script? Is it run every so often via Automator?

  2. I assume your reply confirmed that one shouldn’t back up the original (and possibly open) .dtbase2?

  3. Given that Time Machine, Arq et al. have versioning, why do you save each backup with the date stamp in the filename?

Thanks again!

Well, the DT databases are just folders inside another folder. Whatever technique you use (Time Machine, Arq, manual), it simply copies the folders (or a delta of them) to the destination. Quite simple, in fact.

I never understood the need for a “daily backup my DT databases as ZIPs” script anyway. Maybe that’s a remnant from the time before Time Machine, Arq, Backblaze etc.
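For reference, such a script boils down to compressing the (closed) database folder into a date-stamped archive, roughly like this sketch (temp dirs stand in for the real paths, and tar/gzip stands in here for the ZIP format the DEVONthink script actually writes):

```shell
#!/bin/sh
# Compress a database folder into a date-stamped archive.
# WORK is a temp-dir stand-in; really the database would live somewhere
# like ~/Databases and the archive would go to ~/Backups.
set -e
WORK=$(mktemp -d)
mkdir "$WORK/MyDatabase.dtBase2"                 # stand-in database folder
echo "content" > "$WORK/MyDatabase.dtBase2/f"

OUT="$WORK/MyDatabase-$(date +%Y-%m-%d).tgz"
# -C switches into the parent dir so the archive holds relative paths
tar -czf "$OUT" -C "$WORK" "MyDatabase.dtBase2"

tar -tzf "$OUT"   # lists MyDatabase.dtBase2/ and its contents
```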

I use it to archive my open DEVONthink databases weekly. Thus I simply open all of the databases and run the script effectively from the scripts folder in DEVONthink. (I have a reminder in DEVONthink to do that.) In fact I use a Keyboard Maestro conflict palette to facilitate access to scripts that I frequently use in the DEVONthink scripts folder.

No - the very point of the script is that it’s archiving open databases. Of course, I’m not working with the databases at the time the script runs.

I also make a weekly Time Machine backup, nightly Carbon Copy Cloner backups and daily Arq off-site backups. Call me paranoid but what I have now in DEVONthink is much too valuable to risk losing! As to the date stamp in the file name, it simply makes it easier to see at a glance the date and time of the archive if I ever need to restore it.
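The date stamp itself is trivially produced with `date`, e.g. (the base name is illustrative):

```shell
#!/bin/sh
# Build a date-and-time-stamped archive name so the most recent
# archive is obvious at a glance (the base name is a placeholder)
STAMP=$(date '+%Y-%m-%d-%H%M')
echo "MyDatabase-$STAMP.zip"
```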

Stephen

…so do you just use Arq / Time Machine etc. on the (possibly open) .dtbase2 files without separate archived copies?
This is what I have been doing until recently. Arq to B2 and a local NAS has always worked for me, but I recently added storj.io, which gives errors backing up open .dtbase2 files.

You misunderstand me (sorry if I wasn’t clear). I meant whether it is OK to use Arq or Time Machine to back up the (possibly open) .dtbase2 files, without the intermediate step of using the script to create a zipped archive for Arq/Time Machine to then back up off-site (or wherever).

I strongly concur with @chrillek’s comment here. I always try to ensure that my databases are closed when backing up to Arq, Time Machine or CCC. However, if I do, on occasion, overlook that, I don’t worry too much because whatever else I’m doing I’m certainly not working on any of the databases at the time any of those backups runs.

Edit: I certainly wouldn’t bother to create a DEVONthink archive simply for the purpose of backing up off-site (or wherever else you may choose).

Stephen

got it - thanks for clarifying!

Exactly. I don’t really care about open/closed, since at the time the backup occurs, nothing is going to change in the database. And frankly, even if it were: there’s always the next backup…

Given that Arq, Time Machine, and Carbon Copy Cloner* use snapshot-based backups, there is no need to close any databases.
I have DEVONthink open 24 hours a day and use all three backup apps mentioned. Since I use Arq with Google Cloud & Storj, that’s 4 backups per hour.
Never once has this been an issue. You can even modify content in an open database while backups are running. Snapshot-based backups are frozen in time; it doesn’t matter if data changes while backing up.

(*when used with APFS volumes)

Also, see this comment for documentation screenshots from Arq and CCC that confirm this.

That’s great to have confirmed - thanks.

I have seen several indications that Backblaze does not save metadata. I just restored one of my images from Backblaze and all of my metadata was there?

What am I missing?
Roger

Are we talking image-specific metadata here (so perhaps camera and lens type, focus etc.), or file-specific (such as creation date)? Or does “image” refer to a backup as a whole? In which case, a backup of what exactly?

Details from Backblaze themselves:
