What kind of off-site backup scheme do you use? For Devonthink data and otherwise

I currently use ChronoSync to back up daily to a USB drive connected to an always-on, somewhat vintage, Mac mini. That Mac mini then backs up itself and the attached drive to Backblaze Personal.

Wondering if I should consider other options… Thank you!

@mhucka copying you.

Every 30 minutes:

  • Time Machine
  • Carbon Copy Cloner

both to an always-attached external SSD.

Every Sunday:

  • ChronoSync
  • Carbon Copy Cloner

both to one of two external HDDs. I only attach them for the backup and one of the two HDDs is stored at my grandmother’s house (in another city).

In my opinion the most important thing is to have backups that go back for several years, simply because human (or software) errors might not be recognized early enough to prevent data loss.

I might add an online backup someday but am still not sure whether it’s really necessary.


Since you already have an offsite backup (the HDD at your grandmother’s), perhaps you don’t need one.

Time Machine and CCC every 30 minutes - wow. Why both? I am thinking that I need to activate Time Machine on my SSD.

Agree with your point about errors - human or software - not being recognized early enough. So how do you go back several years? Meaning, in which software do you enable archiving rather than deleting? Although I have in the past worried about the risk of building up clutter.

Thank you for sharing.

> What kind of off-site backup scheme do you use?

Arq Premium running on my Mac mini, providing automatic incremental backups stored in the cloud of both:

  • Mac mini data, including the raw DEVONthink databases
  • a weekly DEVONthink export (files/folders/tags)

What offsite location do you use?

Arq’s servers in Iowa.
Given the choice, I selected something distant from my home base (Canada’s west coast).

Got it. Arq is nice! What offsite location do you use?

Time Machine failed in the past; I had to erase the disk and start a new backup several times, i.e. I lost all backups. That happened with the HFS+ file system - APFS seems to be a lot more reliable. Because of that, using different apps for backups seems to be a good idea.

Actually, I have no idea how long this will work, but the HDDs are 14 TB, so I probably won’t have to think about it for several years.

  • all macOS devices have automatic daily full-system backups with Apple Time Machine, directed to attached USB drives and to the Synology NAS. The NAS has the space to keep these backups going back a few years.
  • Very important folders of files kept outside of DEVONthink are copied daily (automated with Carbon Copy Cloner) to a local Synology NAS
  • all macOS devices have automatic and frequent offsite backups of parts of the system to Dropbox’s new backup service (not sync).
  • my main device, an iMac, is connected with Backblaze which is another offsite backup of most everything
  • the Synology NAS device is backed up with Synology Hyper Backup to a connected USB drive and to Dropbox Backup, not sync (offsite)
  • For DEVONthink databases, a cron job automatically runs twice a week to create archive zips of all databases, stored in ~/Documents/Backups/DEVONthink so that these zip files are picked up by the Backblaze and Dropbox backups. Hazel manages a process to delete all but the last three of these zip files.
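The archive-and-rotate job in the last bullet can be sketched in plain shell. This is a hypothetical sketch, not the poster’s actual script: the source path is an assumption, tar/gzip stands in for zip, and the rotation step replicates in shell what Hazel does for the poster:

```shell
#!/bin/sh
# Sketch of a twice-weekly database-archive job with simple rotation.
# The SRC/DEST paths below are hypothetical; the poster uses zip + Hazel,
# while this sketch uses tar/gzip and a plain-shell rotation step.
set -eu

archive_databases() {
    src=$1
    dest=$2
    stamp=$(date +%Y-%m-%d_%H%M%S)
    mkdir -p "$dest"
    # One timestamped archive per database package/folder found in $src.
    for db in "$src"/*; do
        [ -e "$db" ] || continue
        tar -czf "$dest/$(basename "$db")_$stamp.tar.gz" -C "$src" "$(basename "$db")"
    done
    # Rotation: keep only the three newest archives (Hazel's role above).
    ls -t "$dest"/*.tar.gz 2>/dev/null | tail -n +4 | while read -r old; do
        rm -f "$old"
    done
}

# Example crontab entries for Tuesday and Friday at 02:00 (twice a week):
#   0 2 * * 2,5 /usr/local/bin/dt-archive.sh
# archive_databases "$HOME/Databases" "$HOME/Documents/Backups/DEVONthink"
```

Note that cron on macOS only fires while the machine is awake; launchd with `StartCalendarInterval` handles missed runs more gracefully.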

I have a reminder in my to-do system to check each backup store with a (small) test restore at least once a quarter. Since it’s easy, I click on the Time Machine, Dropbox, and Backblaze buttons in the menu bar every morning to check that everything is still running.

I do not count on any third party sync services as backup. Sync is not backup.

A bit more than the classic 3-2-1 backup method, and surely redundant. Do I care about the extra redundancy? No. I’ve seen too many failures of computers and lost backups over the years. It’s all about avoiding catastrophe, not being fully optimised, I believe.


Why are you backing up every 30 minutes?
Hourly or daily should be sufficient under normal circumstances.


Can you talk more about your automated backup?

I use a Synology NAS and do a weekly backup archive of my 3 DBs, and a biweekly-to-monthly full export to keep a copy of the retained file structure of my documents (or after any significant additions, such as after tax season). My Synology Hyper Backup goes to 2 separate remote Synology NASes (stored at family members’ houses) for the most important documents.

I would love to make this more automatic for the DB archive to my Synology and the full folder export to the Synology. Currently I just set a reminder and do it manually once a week after my weekly review.

Which “automated backup”? It’s all pretty much automated and described succinctly above. Which bit are you interested in?

This specifically. I was thinking you were automating the export of these folders, but now I’m wondering whether these might be indexed folders that CCC backs up to the Synology instead?

I would like to automate the export of my databases, in the preserved file structure, as a native folder (“non-database”) copy.

No. These are files in folders outside of DEVONthink, but very important to me, e.g. some folders in a third-party sync service shared with colleagues and family. Nothing to do with DEVONthink. I just do CCC copies of these to the NAS “just to be sure” they are backed up somewhere other than the source and the regular backups. Belt and braces.

I don’t see the need to automate export of DEVONthink groups replicated as folders in the file system.

That’s a leftover from setting up the Mac. It’s actually every 60 minutes - the idea was to run TM at e.g. 10:00 and CCC at 10:30, but that didn’t work out, as one can’t (easily) control when a TM backup runs.

It feels good to see CCC’s notification every hour, and the backup doesn’t slow down the Mac, so I just left it that way.


I admire the excellent schemes that people have created! This discussion thread has been a nice source of new ideas …

So far, my approach has been the following.

A. When at home:

  1. Hourly: Time Machine backups to a USB-attached HDD; TM is configured to back up the whole computer and an external SSD used for photos

  2. Daily:
    a) CCC bootable clone of whole computer to a USB-attached SSD.

    b) Arq backups (encrypted) to an off-site computer I control; Arq is configured to back up my whole computer, all attached external disks, and an 8TB HDD disk that I use to store archives and infrequently-accessed files at home.

  3. Weekly: copy, using a workflow defined in CCC (started manually), the most important contents to a small USB disk kept in a fire safe hidden at home.

  4. Occasional (maybe once every year or two), copying a selection of TM backups to an HDD ready to be retired, and putting that HDD in a box.

B. When traveling:

  1. Every night, use CCC to create a bootable clone of the full internal drive to a USB-attached SSD.

  2. If possible (depending on network speeds), also run Arq while in the hotel room.

The USB-attached HDD used for Time Machine has enough capacity to hold about a year’s worth of TM backups, using TM’s default pruning scheme (which reduces old backups to something like 1/week or 1/month).

My destination for Arq backups is a Linux box at a location in another city. Because of the nature of my work and certain quirks of the academic life, over the years, I’ve been fortunate to have inherited many retired computers and computer parts. One of these was an old Dell server and a pile of 3.5" HDDs that were retired from other projects. I reconfigured it to run Linux (Ubuntu 20) in a console-only non-GUI setup, bought a used Dell RAID card on eBay, and set up the disks in a RAID5 configuration. Then I closed all the TCP ports except SSH and a couple of other necessary ones, and now it acts as an SFTP destination. All administration and control can be done remotely via SSH (command line, no GUI); rarely, I need to access the bare machine before the OS boots (e.g., to get to the BIOS screens), which requires console access, so the computer also has a networked KVM interface. As part of my maintenance of several similar computers, I have a calendar reminder to ssh into the machine and run OS updates, check the logs for signs of trouble, etc.
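The “close all TCP ports except SSH” part of a setup like this can be sketched roughly as follows. This is a generic Ubuntu hardening sketch (ufw plus a few sshd_config lines), not the poster’s actual configuration, and the account name is hypothetical:

```shell
# Firewall: deny all inbound traffic, allow SSH only (run as root; ufw on Ubuntu).
ufw default deny incoming
ufw default allow outgoing
ufw allow 22/tcp
ufw enable

# /etc/ssh/sshd_config highlights for an SFTP-capable backup target:
#   PasswordAuthentication no         # key-based logins only
#   PermitRootLogin no
#   AllowUsers backupuser             # hypothetical account name
#   Subsystem sftp internal-sftp      # built-in SFTP server, no extra binary
```

Arq’s SFTP destination then only needs the host, port 22, the user, and the key.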

A natural question is, why set up my own backup server versus just buy space on Backblaze? I would have gone that route if it weren’t for the fact that I had this nice used hardware available and the background to be able to set it up. I don’t think backup speeds are any faster than with a service like Backblaze, and certainly it would have taken less time to use a cloud provider. However, there is one advantage, which is that the server is physically accessible to me and I can copy off multiple TB’s directly instead of waiting for network downloads or asking Backblaze to mail me a disk.

(I feel obliged to add some cautionary words here. Setting up your own server not only takes time once; you have to keep maintaining it, and you have to be careful about securing the system because everything opened to the internet is exposed to constant hacker attacks. Basic configuration, maintenance, and security on Linux servers for something that only uses SSH is not terribly hard, for someone with the right background. And I’m not special – it just so happens that I’ve been a Unix systems administrator in the past. So, please take these factors into consideration before going down this route.)

Edit 2023-04-22: I had been making bootable clones in the past, but after switching to a MacBook with Apple Silicon and macOS Ventura, I have now discovered that it’s basically impossible to make bootable clones anymore. Grrrr.


Oh to have been given so much useful and usable hardware and a suitable remote location with Internet access to host the remote backup server!


imho Time Machine should be the first backup solution implemented by all Mac users
I activated it over 10 years ago, and it’s still running

However, it does require an external drive
and it’s not an offsite solution


Welcome @MJP

Why not use Arq with an Amazon S3 account if you want a cloud backup?


Definitely! I’ve been fortunate in this regard. It’s partly due to the nature of my work, my work environment, my age (the longer you live, the more opportunities for collecting stuff you have), and my immigrant mentality of never throwing away something that might still be usable. (You should see my closet :eyes: )

I should also clarify that the system is not only used for my Arq backups. That would be kind of overkill. It’s used for other things too. One could build a cheaper backup destination by buying parts on, say, eBay or Craigslist or whatever is the local equivalent where one lives.

Finally, @BLUEFROG brings up an important point. It’s worth doing a cost calculation to compare using a ready-to-use service like Amazon or Backblaze vs the DIY route. Paying for a few years of those services may actually be less expensive than building your own …


Thanks @BLUEFROG and @mhucka
I was thinking both about cost and also about the “fun” of constructing and maintaining my own server, and the flexibility it would offer as described above.
However I’ve increasingly come to realise that as much as I used to love building computers and maintaining them myself, I am now at a stage in life where I prefer things to “just work” and not need constant tweaking - so commercial solution(s) are where I’m now focussed.
I need to review my backup strategy following a recent move and am likely to start making better use of cloud storage like Backblaze.


Here is my backup strategy:

Arq: user profile (with exceptions) to local NAS, every 6 hours
Arq: user profile (with exceptions) to a OneDrive for Business account, daily at 3 pm
CCC: full clone to an external disk, every Sunday at 3:30 am
CCC: full clone to a second external disk, as and when needed, e.g. before a macOS upgrade

I have considered using Backblaze for an off-site copy, just in case my house and all of Microsoft’s DCs go up in flames.
