I'm resurrecting this thread to raise a problem much like @IvanPsy's: I too have a rather slow connection, and backing up DEVONthink to the cloud is painful. I use Arq to copy the database folder to the cloud, but every scan-and-upload job is very time-consuming: heavy files like .dtMeta are always re-uploaded, along with the internal backup folders (perhaps those could be excluded?). To make the job lighter, I was thinking of creating copies of the databases via the sync feature on the same machine and uploading those to the cloud instead; I've noticed the sync stores are more fragmented, and therefore probably more manageable. Is that a viable idea? And if so, could it create problems for restore operations?
As a starting point, I recommend you re-read the section “A Word about Backups” starting on page 19 (3.9.0 version) of the outstanding DEVONthink Handbook.
Me, I would never back up to a cloud service that is based on sync, e.g. Apple iCloud, Dropbox, etc. That is because if a flaw develops in the local copy or on the server, then, “poof”, backup gone, because the flaw is “synced”. And as you have issues with a slow network connection, the risk of corruption for you is probably high. IMHO.
I have a backup system which is probably “over the top”, but I don’t care. I won’t bore you with the details as it’s basically a very redundant 3-2-1 system (search the “interweb” to read about that).
Basically, I’d suggest you first get Time Machine going to a local external/USB drive. Time Machine backs up everything unless you exclude things. If you want offsite backups, consider a Backblaze (or equivalent) subscription. If you need to use Arq or similar, copy to a local external/USB drive. Backblaze will back up connected external/USB drives.
In addition to Time Machine, I’m using Arq Premium to do an off-site backup (through their service), and no other cloud provider. However, uploading and updating the DEVONthink databases always costs a lot of time and data, and I am looking for ways (e.g. wildcards to exclude the internal backup folders inside the databases themselves) to make them lighter.
Well, I’m not sure what you mean by “lighter” … but maybe only send the Database Archives (the zip files DEVONthink creates) to the Arq service. Those are discussed on page 19 of the outstanding “DEVONthink Manual”, mentioned above.
A single compressed zip might be faster on your slow network connection. With internet copies, they will take the time they take.
It’s been discussed here more than once how to automate the creation of those zip archive files. I keep mine in ~/Backups/DEVONthink so they are backed up by Time Machine and Backblaze too, all automatically.
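If it helps to see the idea, here is a minimal sketch of that kind of automation: a plain date-stamped zip of a database package into a folder that your other backups already cover. This is not DEVONthink’s own File > Export > Database Archive (which also verifies the database first); it is just a generic zip step, and all paths and names below are assumptions you would adapt.

```python
# Hedged sketch: zip a database package into a dated archive folder that
# Time Machine / Backblaze already back up. Paths here are assumptions.
import datetime
import shutil
from pathlib import Path

def archive_database(db: Path, dest: Path) -> Path:
    """Zip the package folder `db` into `dest` with a date-stamped name."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.date.today().isoformat()
    base = dest / f"{db.stem}-{stamp}"
    # shutil.make_archive appends ".zip" and zips the whole package folder
    return Path(shutil.make_archive(str(base), "zip",
                                    root_dir=db.parent, base_dir=db.name))

# Example with hypothetical paths:
# archive_database(Path.home() / "Databases" / "MyDatabase.dtBase2",
#                  Path.home() / "Backups" / "DEVONthink")
```

Scheduling that daily (launchd, cron, or an Arq pre-backup script) gives you the “one compressed zip” approach without manual steps.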
I wish it were so! But every morning I wake up to find the backup has failed with errors due to loss of connection. I don’t use a desktop computer but a MacBook, so there’s probably some setting I need to adjust. On the other hand, I’d also like to avoid keeping the MacBook on all the time and shortening its lifespan.
Arq can wake the machine up (at least it does so on a MacBook Pro). And if your network is really that shaky that you have problems every night, you might want to talk to your provider. In any case, it then is shaky all the time, and it doesn’t matter whether you run Arq at night or not.
Perhaps in your situation (if you can’t amend your network connection) a local solution via NAS is a better alternative? Of course, there are initial costs coming with that, but you don’t have recurring payments like with Arq.
I’m not a backup expert, but if my internet connection were that shaky, I’d be doing local backups and skipping the internet entirely. What’s the point of relying on an internet connection you know fails regularly, rather than just doing it yourself with far less stress and error? You already know your internet connection cannot be relied on for making the backup, which means that in the case of a catastrophic loss - when you’re already very stressed - it also cannot be relied on to restore your missing files, because you’ll be getting the same error messages at the same frequency.
It would be far less stressful (and less time-consuming) for you to back up to two external drives: one kept at home and one kept at an offsite location. If you needed to restore from backup, chances are you could just use your home backup; but if you couldn’t (e.g. there was a fire), a walk/cycle/drive to your offsite store to retrieve your backup disk sounds far more reliable and far quicker than crossing your fingers and hoping your internet connection will hold out long enough to download all the files you need to restore.
None, in fact. I am weighing the options and I think I will go for a double physical backup - although I find Arq Premium a very good service.
That said, a question for the more experienced Arq users: how can I create an exclusion or wildcard to remove these folders from the backup (see screenshot) and make it more streamlined (whether offsite, cloud or otherwise)? Has anyone managed this? I have tried different approaches (inserting the whole path, using */Backup, etc.) but to no avail.
You are playing with fire trying to figure out what parts of DEVONthink databases to backup.
Not recommended, and your restores probably won’t work. Best to follow the recommendations as documented in the “DEVONthink Manual”. Otherwise, your mileage will vary.
@beausoleil I have enabled Prevent computer sleep during backup but not Wake computer at backup time. It would probably make sense to activate the latter. In the end, it’s just a matter of whether you’re comfortable with your computer waking up in the middle of the night to run a backup.
@BLUEFROG I decided to do my own manual back up after this convo (in addition to my automated one). I found a brief reference to it in the manual and not much info in the forum about the process, please can you confirm I’ve understood correctly? I like to try and understand things so I know how not to break them! The manual says:
File > Export > Database Archive
I did that and have my databases zipped on a backup drive. Is that all that needs to be done? If I needed to restore from that zip, would I just unzip the relevant database, move to the correct folder on the system and open in DT?