To confirm, for those who might need to know this distinction: this process does back up the databases’ indexes to the indexed files, but does not include the indexed files themselves in the archive zip. Is this correct?
If correct, then, following the advice in the “DEVONthink Manual”, a system backup is also required for proper backups.
This is why I write DEVONthink archive backups to ~/backups/devonthink/ with an automated scheduled job, and also rely on system backups (multiple methods) to back up the indexed files, the DEVONthink archive zips, and the DEVONthink databases.
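For anyone who wants to replicate the scheduled-job part, here is a minimal sketch in Python. The database path, archive name, and the check that DEVONthink isn’t running are my assumptions for illustration, not anything DEVONthink itself provides (DEVONthink also ships its own backup/archive scripts, which may suit you better):

```python
#!/usr/bin/env python3
"""Nightly DEVONthink archive backup -- a hypothetical sketch.

Schedule with cron or launchd, e.g.:
    0 3 * * * /usr/bin/python3 ~/bin/dt_backup.py
"""
import datetime
import pathlib
import shutil
import subprocess

DB = pathlib.Path.home() / "Databases" / "Main.dtBase2"  # assumed location/name
DEST = pathlib.Path.home() / "backups" / "devonthink"

def devonthink_running() -> bool:
    # Zipping a database while DEVONthink has it open risks an inconsistent copy.
    return subprocess.run(["pgrep", "-x", "DEVONthink 3"],
                          capture_output=True).returncode == 0

if devonthink_running():
    raise SystemExit("DEVONthink is running; skipping tonight's backup.")

DEST.mkdir(parents=True, exist_ok=True)
stamp = datetime.date.today().isoformat()
# make_archive appends ".zip" itself, producing e.g. Main-2023-04-01.zip
shutil.make_archive(str(DEST / f"Main-{stamp}"), "zip",
                    root_dir=DB.parent, base_dir=DB.name)
```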
Thank you @beausoleil for having resurrected this thread!
I still have to solve this: I’d like to have a cloud backup (I already use Time Machine), so that I have an offsite backup, in case something happens to my Mac and/or to my HD.
But my database is more than 25 GB, too much for my connection: I’m not able to upload such a massive amount every day, or even every week.
Furthermore, if I resolve this question, the database may grow to more than 100 GB, because I’d move all my files into DT (I currently keep the heavy files, such as my videos, in the cloud).
Reading here and there in this forum, I’m sold on moving my database to an external SSD (since it’s detached from my machine, I think of it as another layer of security).
But I still fear losing my data: the SSD would always sit next to the Mac, so their fates are bound together.
I’m asking here; I don’t know if I should create a new thread.
Really? Interesting idea. Security from what risk(s)?
IMHO, given the “flakiness” of an external cable to an SSD compared to the connection to an internal SSD, I think you are adding risk. Just my two bits.
Pity your internet connection doesn’t allow you to move that much data. Backblaze (or an equivalent) would do nicely for you.
This is a good move, assuming it’s to an external drive. Perhaps purchase another external drive, dedicating one to be filled with the latest backup and then moved to an “offsite” location somewhere.
This is an old, tried-and-true method we certainly support. We even have a client who does a monthly full backup to an external drive and mails it to his brother several states away (packaged very well, of course!).
A backup service worthy of the name would not do that, but would instead create incremental backups, uploading only what has changed. Just as Time Machine does.
My thought: if something happens to my computer, I still have my data safe on the external SSD.
But I’m not a techie, so maybe I’m wrong about that.
Furthermore, I don’t have enough space on my Mac to store all the files I’d like to store.
OK, I put it badly: I mean that my internet connection is too slow to upload more than 25–30 GB in just one night.
I agree, but I don’t have another physical offsite location…
I already do that with Get Backup Pro and iCloud.
Each evening GBP syncs the database as-is to iCloud, and this upload is incremental.
It’s a manageable solution for me, but I’ve read somewhere in this forum that it’s a bad idea, because the sync may start before the copy is complete, so the database may end up corrupted on iCloud.
As we’ve trumpeted many times, sync is not a backup; it’s neither advertised nor advocated as such. This is in part because a backup should be application-agnostic, and sync data is only useful to DEVONthink and DEVONthink To Go.
I think you’re confusing two different issues, which hopefully someone else will explain better than I can. But in case they don’t: syncing and backing up data are not the same thing. Relying on a sync as a backup is a terrible idea, as others have explained.
As for your question (how do I do a cloud backup with large databases?), @chrillek said:
What they mean is that if your database is 25 GB, a good backup service isn’t backing up the whole 25 GB every time it runs. It has to do that the first time, so that it has a complete copy of your hard drive, but after that it only backs up the changes. So unless you make 25 GB’s worth of changes every time you use your database, you will be fine. You’re probably making, what, 1–5 GB’s worth of changes on a busy day, and that’s what you need to consider when assessing what your internet connection can handle.
And as pointed out, this is also what Time Machine does. It only backs up the changes each time it runs, not the entire hard drive from scratch. That’s why it takes ages to run the first time you set it up, and after that usually takes less than 20 minutes.
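To make the “only the changes” idea concrete, here is a toy sketch in Python. This is not how Time Machine or Backblaze are implemented; it just shows the core comparison an incremental backup builds on (all paths are made up):

```python
import pathlib
import shutil

def incremental_copy(src: pathlib.Path, dst: pathlib.Path) -> int:
    """Copy only files that are new or changed since the last run."""
    copied = 0
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if target.exists():
            s, t = f.stat(), target.stat()
            # Unchanged size and no newer modification time: skip it.
            if s.st_size == t.st_size and s.st_mtime <= t.st_mtime:
                continue
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)  # copy2 preserves the modification time
        copied += 1
    return copied

# The first run copies everything; later runs copy only what changed.
n = incremental_copy(pathlib.Path("~/Databases").expanduser(),
                     pathlib.Path("/Volumes/Backup/Databases"))
print(f"{n} files copied")
```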
Thank you for confirming. In the very olden days I used to make backups by copying files to floppy disks. I don’t know why I’ve waited until now to zip my databases to a hard drive just in case! It’s reassuring.
Hi, I’m new to DEVONthink 3. I’ve read several threads on this forum about backing up DT3 databases, and I have a couple of questions that I’m hoping others could weigh in on:
Does DT3 have the ability to schedule backups natively, e.g., perform a backup once a month?
Does DT3 have the ability to automatically delete backups older than a certain date and/or delete the oldest backups to maintain a constant number of backups? This question pertains to managing storage at the backup location.
Personally, I think I’d be comfortable editing existing DT3 scripts, combining them (e.g., calling one script from another), and writing new ones, if these things are possible. For question 2, I imagine something along the lines of the sketch below.
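A sketch of what I mean, assuming the backup archives carry a sortable date stamp in their names (both the folder and the naming scheme are made up):

```python
import pathlib

KEEP = 6  # retain the six most recent archives; pick whatever number suits you
backup_dir = pathlib.Path.home() / "backups" / "devonthink"

# Names like "Main-2023-04-01.zip" sort chronologically, so plain
# lexicographic sorting is enough to find the oldest archives.
archives = sorted(backup_dir.glob("*.zip"))
for old in archives[:-KEEP]:
    print(f"Deleting {old.name}")
    old.unlink()
```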
Perhaps with a smart rule. But those are executed only when DT runs.
Why not rely on your standard backup strategy/tool? That does everything from regular (incremental) backups to thinning them. I don’t see the point of per-app backups.
Thanks for commenting. I don’t have a standard backup strategy/tool and have been setting up per-app backups so far. Would you be able to recommend one or more backup tools?
Yes, I intended for this to be my primary backup method, at least in the interim. The primary motivator that drove me to purchase a DT3 licence was its ability to perform advanced searches across directories. Currently, I am purely indexing, never importing, material on my machine into DT3 databases. For now, I’d just like to back up the paths of this indexed material. In the event that I needed to restore DT3 from its backup, only these paths would be restored. At least, that’s the idea I have in my head.
Why am I not importing content into DT3?
I am considering importing files into DT3 databases and working in DT3, but I don’t have sufficient confidence yet, partly due to some bugs I’ve experienced in DT3: unresponsive dialogue windows that fail to register (repeated) clicks on buttons and selections of files and folders. There’s also the issue of possible database corruption, which I haven’t looked into, and which would be fatal for a “knowledge worker” like myself. I can’t say that DT3 being buggy when performing such simple actions has instilled confidence in its ability to maintain its databases’ integrity.
Additionally, I haven’t figured out how to incorporate my current workflow into DT3. Applications that edit content have to, I’m guessing, save directly into a DT3 database? I’m not sure how to do this with Markdown apps like Obsidian, or with LaTeX editors, which involve both the source code and the PDF it compiles into; I’m guessing something has to be done to automatically import the compiled PDF and overwrite the associated existing file in DT3. All of this is, in principle, tangential to the discussion in this thread, but I hope it gives the seasoned DT3 user or developer an idea of the difficulties a beginner has in deciding whether to integrate DT3 into their workflow.
I am using DT3 as an “advanced” search engine for content that is “scattered” across a handful of directories, exists in a handful of formats (mostly PDF, MD, EPUB, PPTX, DOCX), and may still be edited.
Sorry if I’ve digressed; I’d be happy to hear from any users or developers who might have comments on my situation. Cheers, and have a great week ahead, everyone.
I make regular, automated DEVONthink backup archive files stored on a local drive. To back up local drives I use macOS Time Machine, Carbon Copy Cloner (CCC), Backblaze, and Dropbox’s new backup service. I follow the 3-2-1 backup regime, which you can read about in numerous places on the web. Backup destinations are two USB drives, a Synology NAS, and of course the Backblaze and Dropbox backup servers.
See the posts above (it’s a long thread) for other advice already given.
So how are you backing up the data underlying your DT databases? If it’s indexed, rather than imported, then DT doesn’t have it. A DT backup will capture the indexes and metadata, but that won’t be very helpful without the underlying files.