What's your Security strategy for your personal documents?

Google Drive is a defined storage location for Arq backups
Google Drive sync is not used


If it’s Google Drive, a Google server is involved. So either it’s using Google Sync or Google’s published API.

~500 MB Obsidian vault:

Synced to phone.
3x file versioning turned on.
Encrypted & backed up manually once a week.
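The weekly manual backup with 3x versioning could be sketched as a small script: make a dated archive and prune all but the newest three. Paths and the retention count here are assumptions, and the encryption step (e.g. piping the archive through gpg) is left out of the sketch:

```python
import tarfile
from datetime import date
from pathlib import Path

def backup_vault(vault: Path, dest: Path, keep: int = 3) -> Path:
    """Create a dated .tar.gz of the vault and prune old archives."""
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / f"vault-{date.today():%Y-%m-%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(vault, arcname=vault.name)
    # Keep only the newest `keep` archives (the 3x versioning above).
    for old in sorted(dest.glob("vault-*.tar.gz"))[:-keep]:
        old.unlink()
    return archive
```

Encrypting the resulting archive before it leaves the machine would be a separate step with a tool of your choice.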

~10 MB private notes:

  1. Password manager with 2FA, but on the cloud. All eggs in one basket!
  2. StandardNotes

20 GB persistent teaching resources:
On hard drive.
Supposed to be backed up once a week, but the process is often too slow, so in practice it sometimes happens only once every 3 months.
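If the full copy is what makes the weekly run too slow, an incremental pass that only copies new or changed files (by modification time) can shorten it considerably. A minimal sketch, with source and destination paths assumed:

```python
import shutil
from pathlib import Path

def incremental_copy(src: Path, dst: Path) -> int:
    """Copy only files that are new or newer than the backup copy."""
    copied = 0
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves the mtime
            copied += 1
    return copied
```

On a second run over an unchanged source, nothing is copied, which is what makes the weekly pass cheap.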

80 GB temporary files:
Not backed up.


  1. I’d lose up to 6 days of data if hit by a cyberlocker, since sync services are vulnerable to cyberlocker attacks.

  2. Some of those temporary files are less temporary than I think, but in general I should be able to get them from their original sources.

  3. If I lost my teaching resources folder (the main threat: losing the laptop), I’d lose a few months of work, but not all of those resources took a lot of time and effort. In general, making something a second time allows improvement. It shouldn’t sink the business.

My phone has a long PIN, so when I lost the last one, I didn’t worry that much. The password manager is a worry, though, so keeping 2FA backups is most important. In general, I worry more about losing things to disorganization than about getting hacked. That’s why I’m here.

That said, AI-assisted phishing attacks are getting really sophisticated, so a massive new threat now is techno-illiterate friends and family getting duped!


What DTLow writes is (as always) very interesting. I never addressed this possible threat. Given that I have about thirty DT3 databases, it would be quite time- and space-consuming to back them up regularly in this way. I am wondering how great this threat is. If a DT3 database is not encrypted, is there a way to “break into it” to extract one’s own data in the event that it will not open with DT?

All files are stored inside the database package in its subfolder Files.noindex without any modification and are always accessible via the Finder whenever necessary (but don’t modify the internals of the database package on your own!).
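As an illustration, inspecting one’s own files from outside DT is just a matter of walking that subfolder read-only (never modifying the package internals, per the warning above). The package name here is hypothetical:

```python
from pathlib import Path

def list_database_files(package: Path) -> list[Path]:
    """List all files stored inside a DT3 package's Files.noindex,
    as paths relative to that subfolder. Purely read-only."""
    src = package / "Files.noindex"
    return sorted(p.relative_to(src) for p in src.rglob("*") if p.is_file())
```

In an emergency, the same walk could copy the files out to a safe location rather than merely listing them.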


Thank you for that. So by using “Show Package Contents”, all files are available. The main benefit of using Export > Files & Folders… is that the structure of the database is also saved?

I would say that in 15 years of use I never had a problem opening a DT database!

Plus additional (custom) metadata.

In case of a good backup strategy and especially with multiple computers, each having local copies of databases, there’s no real reason to worry.

But disabling automatic updates of macOS on one computer might be a good idea to avoid that a buggy system update/upgrade might immediately affect all machines.


Thank you!

I wrote an article about backup strategies for DevonThink; maybe that helps: DevonThink: Backup strategy for 2023 - future proof - Welcome to Steffis Cloud


Threat level: minor.
Did anyone mention ransomware? Also a minor threat level.
More likely my ancient Mac failing, but I have my iPad as backup.
The extra backup is a minor effort for me (time & storage), so I just go for it.


Interesting. And I see you too refer to Synology, which I had not heard of before today. I must check it out!

If you need help, let me know.


Thank you!

My former employer got hit with a ransomware attack a few years ago. No data was extracted from the network; it was all simply corrupted (I understand that ransomware attacks often don’t remove data, they simply corrupt the files in situ). There’s a lot I could say about this (well, lots I want to say but probably can’t for contractual reasons :grimacing:), but one funny thing (to me, as someone not in the tech team suddenly working 16-hour days trying to resolve the problem) is that once the network had been hit/accessed, the attack dominoed through the network and even hit the backups. Eventually the system had to be restored from secondary manual backups that hadn’t been connected to the network at the time of the attack.

Edited to add: when I say “eventually”, I mean like 18 months later, and even then not everything had been rebuilt. Wiping out an entire company’s IT system means a colossal amount of data loss, and you can’t just reboot from backups a few days later. I had a friend whose employer was also hit at the same time (it was during a well-known spate of attacks in which many big companies were hit together), and it took them over 2 years to bring some of their systems back online.

This absolutely shouldn’t have happened (if your backups are vulnerable to the same threats as your live data, you are doing something wrong), but it’s an important lesson in what backups are actually for and therefore how you implement them.


Then please tell us exactly what to do to avoid this happening.

To avoid the consequences of ransomware attacks? That’s a bit beyond this forum, I think. But Arq and Backblaze (and probably others) offer “immutable snapshots” that supposedly remedy this kind of problem. Time Machine does not, AFAIK.


Air gaps are remarkably effective data protection. Having at least one backup on an external drive sitting on a shelf (or in a safe deposit box) is a good first step.
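An air-gapped copy is only as good as its integrity, so verifying a checksum after copying to the external drive, before it goes back on the shelf, is worth the extra minute. A minimal sketch, with the drive's mount path assumed:

```python
import hashlib
import shutil
from pathlib import Path

def copy_and_verify(src: Path, dst: Path) -> bool:
    """Copy a file to the (temporarily mounted) external drive and
    confirm the copy's SHA-256 matches the original."""
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)

    def digest(p: Path) -> str:
        return hashlib.sha256(p.read_bytes()).hexdigest()

    return digest(src) == digest(dst)
```

A `False` result means the copy should be redone before the drive is disconnected again.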