I admire the excellent schemes that people have created! This discussion thread has been a nice source of new ideas …
So far, my approach has been the following.
A. When at home:
Hourly: Time Machine backups to a USB-attached HDD; TM is configured to back up the whole computer and an external SSD used for photos.
Daily: a) bootable clone of the whole computer to a USB-attached SSD; b) Arq backups (encrypted) to an off-site computer I control. Arq is configured to back up my whole computer, all attached external disks, and an 8TB HDD that I use to store archives and infrequently accessed files at home.
Weekly: using a manually started workflow defined in CCC, copy the most important contents to a small USB disk kept in a fire safe hidden at home.
Occasionally (maybe once every year or two): copy a selection of TM backups to an HDD that is ready to be retired, and put that HDD in a box.
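If you'd rather script a step like the weekly fire-safe copy instead of using a CCC workflow, an equivalent mirror can be sketched with rsync (the function name and paths below are placeholders, not my actual setup):

```shell
# Hypothetical sketch of the weekly "most important contents" copy.
mirror_to_safe() {
  # -a preserves permissions and timestamps; --delete keeps the
  # destination an exact mirror of the source.
  rsync -a --delete "$1"/ "$2"/
}

# Example usage (placeholder volume name):
# mirror_to_safe "$HOME/Documents" "/Volumes/FireSafe/Documents"
```

Note that --delete propagates deletions to the copy; drop it if you want the fire-safe disk to retain files you've removed from the source.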
B. When traveling:
Every night, use CCC to create a bootable clone of the full internal drive to a USB-attached SSD.
If possible (depending on network speeds), also run Arq while in the hotel room.
The USB-attached HDD used for Time Machine has enough capacity to hold about a year’s worth of TM backups under TM’s default pruning scheme (which keeps hourly backups for 24 hours, daily backups for a month, and weekly backups for everything older than that).
My destination for Arq backups is a Linux box at a location in another city. Because of the nature of my work and certain quirks of academic life, over the years I’ve been fortunate to inherit many retired computers and computer parts. One of these was an old Dell server and a pile of 3.5" HDDs that were retired from other projects. I reconfigured it to run Linux (Ubuntu 20) in a console-only, non-GUI setup, bought a used Dell RAID card on eBay, and set up the disks in a RAID 5 configuration. Then I closed all the TCP ports except SSH and a couple of other necessary ones, and now it acts as an SFTP destination.

All administration and control can be done remotely via SSH (command line, no GUI). Rarely, I need to access the bare machine before the OS boots (e.g., to get to the BIOS screens), which requires console access, so the computer also has a networked KVM interface. As part of my maintenance of several similar computers, I have a calendar reminder to SSH into the machine and run OS updates, check the logs for signs of trouble, etc.
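In case it's useful to anyone setting up something similar, here's roughly what the "close all TCP ports except SSH" step can look like on Ubuntu using ufw. This is just an illustration, not necessarily my exact configuration; add rules for whatever other ports your server needs:

```shell
# Sketch: lock down an Ubuntu box so only SSH is reachable (ufw frontend).
sudo ufw default deny incoming    # drop everything inbound by default
sudo ufw default allow outgoing
sudo ufw limit OpenSSH            # allow SSH (port 22) with built-in rate limiting
sudo ufw enable
sudo ufw status verbose           # confirm only the intended ports are open

# The periodic maintenance pass, done over SSH:
sudo apt update && sudo apt upgrade
journalctl -p err -b              # scan recent logs for errors
```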
A natural question is: why set up my own backup server instead of just buying space on Backblaze? I would have gone that route if it weren’t for the fact that I had this nice used hardware available and the background to set it up. I don’t think backup speeds are any faster than with a service like Backblaze, and it certainly would have taken less time to use a cloud provider. However, there is one advantage: the server is physically accessible to me, so I can copy off multiple TBs directly instead of waiting for network downloads or asking Backblaze to mail me a disk.
(I feel obliged to add some cautionary words here. Setting up your own server doesn’t just take time once: you have to keep maintaining it, and you have to be careful about securing the system, because anything opened to the internet is exposed to constant hacker attacks. Basic configuration, maintenance, and security for a Linux server that exposes only SSH is not terribly hard for someone with the right background. And I’m not special; it just so happens that I’ve been a Unix systems administrator in the past. So, please take these factors into consideration before going down this route.)
Edit 2023-04-22: I had been making bootable clones in the past, but after switching to a MacBook with Apple Silicon and macOS Ventura, I have now discovered that it’s basically impossible to make bootable clones anymore. Grrrr.