Understood—but how do you get the resulting ZIP files up to iCloud? Do you just deposit them somewhere in the Documents hierarchy, and let iCloud syncing handle the rest, or some other way?
I haven’t got this automated yet.
The backup files are created in folders on my desktop and I use Finder to move them to iCloud Drive (just drag and drop).
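If you ever want to automate that step, something along these lines should work (a sketch only; the Desktop folder name is an assumption):

```bash
#!/bin/bash
# Move finished backup ZIPs into iCloud Drive and let iCloud syncing
# handle the upload. The source folder name is an assumption.
SRC="$HOME/Desktop/Backups"
DEST="$HOME/Library/Mobile Documents/com~apple~CloudDocs/Backups"

mkdir -p "$DEST"
# Only move ZIPs untouched for at least a minute, so a ZIP that is
# still being written is left alone.
find "$SRC" -name '*.zip' -mmin +1 -exec mv {} "$DEST/" \;
```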
Sorry to butt in, going back to the beginning of this thread. I think I have the DTTGv3 problem, but only on one device, which made it easy for me: I just deleted the app for now.
You are using TM, CCC and Arq here. I stopped using TM many years back when it and Time Capsule turned out to be an expensive way to not have any backups at all.
I had not heard of Arq until this thread (looks good).
Why are you using all three? Is this because of backup mechanism redundancy, or are you getting different features from each method?
Sam.
Sure, that’s what the thread is for
I agree with your view re. Time Machine and Time Capsule. Time Machine seems to me to be much more reliable nowadays, and I use it with external SSDs attached via USB.
I use Arq for its off-site backup; whilst I do store off-site the SSDs I cycle through (in case of fire, for example), I could still lose three weeks’ worth of work in the worst case. With Arq, I would expect to lose no more than eight hours’ worth. I don’t back up all my data to Arq, though (I assume that anything not under my control could be compromised, and some data I am not legally able to put in “the cloud”).
I use Time Machine and CCC because I hope that using two systems will offer redundancy (think of a borked update which the maker of the software doesn’t notice until a month down the line). CCC allows me to make a bootable system backup, so if my internal SSD died, I could simply plug in the backup and keep working (that is not as seamless as it once was, because of Big Sur). Time Machine offers easy access to previous versions of a file, so restoring a single file is really just a matter of a couple of clicks.
So it’s a bit of both: mechanism redundancy and features.
Wait, what? That feature doesn’t work any more? What changed (besides Big Sur)?
It does work, but the system partition on the backup disk is no longer updated; Bombich have a detailed article on this.
I’m not sure how Arq is making money here.
I assume Arq have negotiated a good price with AWS, but even then the differences are large.
From a quick calc:
AWS storage for 2TB/year would be ~$565
Arq for the same would be ~$134
And it is actually cheaper to buy two premium instances of Arq (if they let you) at $60/TB/year than to buy one instance and put 2 TB through it.
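For what it’s worth, the arithmetic behind those numbers (assuming the S3 Standard list price of roughly $0.023/GB/month, which reproduces the figure above):

```bash
# Yearly cost for 2 TB (2048 GB) at the assumed S3 Standard rate:
echo "2048 * 0.023 * 12" | bc    # ~565 USD/year
# Versus two 1 TB Arq Premium instances at ~60 USD each:
echo "2 * 60" | bc               # 120 USD/year
```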
Sam.
That pretty much depends on which S3 plan you choose. S3 One Zone-Infrequent Access costs about 0.0108 USD per GB/month (prices for Europe/Frankfurt), summing to about 260 USD per year for 2 TB. Glacier is even cheaper at 0.0045 USD per GB/month, or about 110 USD.
Given that they probably write most of the time and read only very rarely, this might be the model they use. Backup is not exactly heavy read/write access 24/7/365.
Update: I just read in the Arq blog that they suggest Amazon S3 Glacier Deep Archive, which is about 12 USD per TB per year. They also mention Google’s Archive storage class, which is a bit more expensive (but still far below 50 USD/TB/year) yet faster to restore from.
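The same quick maths for those tiers (rates as quoted, per GB/month; the Deep Archive rate of roughly 0.001 USD/GB/month is inferred from the 12 USD/TB/year figure):

```bash
# Yearly costs at the quoted per-GB monthly rates:
echo "2000 * 0.0108 * 12" | bc   # One Zone-IA, 2 TB:  ~259 USD
echo "2000 * 0.0045 * 12" | bc   # Glacier, 2 TB:      ~108 USD
echo "1000 * 0.001 * 12" | bc    # Deep Archive, 1 TB:  ~12 USD
```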
Arq can be used with many providers when buying a license (B2, Wasabi, AWS, …).
Their premium subscription uses Google Cloud as a backend provider.
Wow, Wasabi is really cheap. That is two products I had never heard of until today!
Sam.
Yeah. Wasabi invoices a minimum of 1 TB storage though, so it’s not ideal for anyone actually storing less.
I use Arq with B2, which costs $2.50 monthly for the 500 GB I need ($0.005/GB).
The cold storage options from Google and Amazon can be cheaper, but they are very expensive if you need to restore. B2 was the middle ground, since both storage and restores are very cheap.
All Arq destination options are listed here.
And if you happen to have Microsoft 365 and don’t know what to do with your TB of free space on OneDrive, Arq plays nicely with OneDrive too.
I pay 69 Euro per year for 6 TB in OneDrive.
That is 11.5 Euro per year for each TB, or about 96 euro cents monthly.
Regarding backup of my Mac mini M1 and DEVONthink, my concept is as follows:
I use Time Machine to automatically back up my system over USB to an SSD, apart from some excluded folders, such as those for local DT backup archives, Downloads and some others.
Just in case, as it does not hurt and is both encrypted and automatic.
My main data is kept on an external APFS-encrypted Thunderbolt drive, or in DEVONthink.
I have OneDrive set up with the Sync / Cache location on that external drive, all content pinned.
As I do not trust any cloud provider, lots of content is encrypted with Cryptomator, which can be mounted on the Mac as a regular volume.
DEVONthink is configured to sync to a remote WebDAV server and additionally to OneDrive (the local sync location, which gets uploaded to OneDrive).
This stores my databases, encrypted, in both remote locations.
On my WebDAV server I create and rotate backups from time to time, but rarely.
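For illustration, a rotation along these lines would do the job (the server-side paths here are hypothetical):

```bash
#!/bin/bash
# Snapshot the DEVONthink sync store with a date stamp, then prune
# snapshots older than 60 days. Both paths are hypothetical.
STORE="/srv/webdav/devonthink"
ARCHIVE="/srv/backups"

mkdir -p "$ARCHIVE"
tar -czf "$ARCHIVE/devonthink-$(date +%Y-%m-%d).tar.gz" \
  -C "$(dirname "$STORE")" "$(basename "$STORE")"
find "$ARCHIVE" -name 'devonthink-*.tar.gz' -mtime +60 -delete
```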
I have some imported DT databases, some open, some as encrypted disk images.
Also, I started to index one of my Cryptomator volumes, which will be the way to go for me.
I plan to migrate all content to Cryptomator, stored on OneDrive and synced via DT to both WebDAV and OneDrive (the indexed database itself is not synced to OneDrive again, but all the others are).
Then, every few days, after adding or changing something, I back up the OneDrive data to local VeraCrypt-encrypted disks: one SSD and two HDDs, which I rotate weekly as archives.
This backup is simply done with “rsync” and a bash script.
The same is done with my Cryptomator content, which frees this content from depending on working Cryptomator software: the backup contains the unencrypted files, but on an encrypted volume.
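A sketch of what such a script might look like (every path below is an assumption; adjust to your own mount points):

```bash
#!/bin/bash
# Mirror the local OneDrive data and the mounted (decrypted) Cryptomator
# volume onto a mounted VeraCrypt disk. All paths are assumptions.
ONEDRIVE="$HOME/OneDrive"
CRYPTOMATOR="/Volumes/CryptomatorVault"
VERACRYPT="/Volumes/VeraCryptBackup"

# -a preserves permissions and timestamps; --delete mirrors deletions
# so the backup matches the source exactly.
rsync -a --delete "$ONEDRIVE/"    "$VERACRYPT/OneDrive/"
rsync -a --delete "$CRYPTOMATOR/" "$VERACRYPT/Cryptomator-plain/"
```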
And finally, my /Users/tja folder is copied to those VeraCrypt volumes too; here, I am not sure what should be saved and what not.
Lots of stuff below ~/Library seems obsolete.
This is because I don’t trust Microsoft: I already lost one account and hundreds of GB of content when Microsoft closed that account.
This could happen at any time.
So I use OneDrive but always expect it to vanish. A family account at about 69 Euro yearly for 6 TB is hard to beat.
The same goes for my Cryptomator volumes: after any macOS update, they may cease to work.
Making regular backups / archives to local VeraCrypt disks ensures that I could “survive” both a OneDrive shutdown and a Cryptomator update problem.
So far, I rsync the whole /Users/tja in two steps:
The first step excludes ~/Library and ~/Databases (where my DT databases lie).
The second step rsyncs ~/Library and excludes ~/Library/Caches.
I am thinking about excluding any and all “Caches” directories, but am not sure.
There are also more cache-like directory names: “Cache”, “cache”, “caches.*”.
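A sketch of what those two steps might look like (the destination path is an assumption):

```bash
#!/bin/bash
# Two-step rsync of /Users/tja to a mounted VeraCrypt volume.
# The destination path is an assumption - adjust to your mount point.
DEST="/Volumes/VeraCryptBackup/tja"

# Step 1: everything except the top-level Library and Databases folders
# (a leading slash anchors the pattern to the transfer root).
rsync -a --delete \
  --exclude '/Library/' \
  --exclude '/Databases/' \
  /Users/tja/ "$DEST/"

# Step 2: Library on its own, skipping the top-level Caches folder.
# To drop every cache-like directory at any depth instead, one could
# try --exclude '*[Cc]ache*/' (verify before relying on it).
rsync -a --delete \
  --exclude '/Caches/' \
  /Users/tja/Library/ "$DEST/Library/"
```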
Just getting into this whole topic as I have been using Arq to back up everything (including my DT databases whilst open).
- Am I to understand that backing up the DT databases themselves is not clever, and that one should only back up ~/Backups (where the ‘daily backup archive’ script stores the backups)?
- Using the ‘daily backup script’ is manual, so is there an easy way of automating this? If one is backing up the backups, and Arq does retention etc., wouldn’t it be better if the ‘daily backup script’ didn’t create a new file each time (with the date in it)?
…sorry for the dumb questions but I find backing up DT confusing and I want to get it right.
There’s quite an interesting discussion in this thread. I use a modified version of the script I posted there to back up all open DEVONthink databases, plus a Hazel rule that ensures I keep only the two most recent versions of each DEVONthink archive.
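(For anyone without Hazel, a rough shell equivalent of that rule might look like this; ~/Backups is an assumed location, and unlike the Hazel rule this keeps the two newest archives overall rather than per database:)

```bash
#!/bin/bash
# Keep only the two most recent DEVONthink ZIP archives.
# ~/Backups is an assumed location - adjust to where the archives land.
cd "$HOME/Backups" || exit 1

# ls -t lists newest first; tail -n +3 yields everything after the
# two most recent entries, which are then deleted.
ls -t ./*.zip 2>/dev/null | tail -n +3 | while IFS= read -r f; do
  rm -- "$f"
done
```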
Stephen
Thanks for the reply, @Stephen_C!
I am playing with a script I found on another thread (but I cannot remember who posted it; maybe you!).
- How do you kick off the script? Is it every so often in Automator?
- I assume your reply confirmed that one shouldn’t back up the original (and possibly open) .dtBase2?
- Given Time Machine, Arq et al. have versioning, why do you save each backup with the date stamp in the filename?
Thanks again!
Well, the DT databases are just folders inside another folder. Whatever technique you use (Time Machine, Arq, manual), they simply copy the folders (or a delta of them) to the destination. Quite simple, in fact.
I never understood the need for a “daily backup my DT databases as ZIPs” script anyway. Maybe that’s a remnant from the time before Time Machine, Arq, Backblaze, etc.
I use it to archive my open DEVONthink databases weekly. Thus I simply open all of the databases and run the script from the scripts folder in DEVONthink. (I have a reminder in DEVONthink to do that.) In fact I use a Keyboard Maestro conflict palette to facilitate access to scripts that I frequently use in the DEVONthink scripts folder.
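(For anyone who would rather automate this than rely on a reminder, a scheduled osascript call is one option. A sketch, assuming the archive script is saved as a compiled script at the hypothetical path ~/Scripts/Daily Backup Archive.scpt:)

```bash
# crontab entry (edit with crontab -e): run the archive script every
# Friday at 03:00. DEVONthink must be running with the databases open,
# and macOS may prompt for automation permissions on the first run.
# The script path is hypothetical - point it at your own copy.
0 3 * * 5 osascript "$HOME/Scripts/Daily Backup Archive.scpt"
```

launchd would be the more Mac-native scheduler, but the idea is the same.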
No - the very point of the script is that it’s archiving open databases. Of course, I’m not working with the databases at the time the script runs.
I also make a weekly Time Machine backup, nightly Carbon Copy Cloner backups and daily Arq off-site backups. Call me paranoid but what I have now in DEVONthink is much too valuable to risk losing! As to the date stamp in the file name, it simply makes it easier to see at a glance the date and time of the archive if I ever need to restore it.
Stephen