Encrypted database empty after restore from TimeMachine? - Resolved

Whilst I can shed no light on why this has happened, you could try the following:

  • Make a backup copy of your database file before starting.

  • Rename the database (which I will call “file” here) from file.dtsparse to file.sparseimage (or file.sparsebundle, depending on which option you used along the way). Open that file from Finder (with DiskImageMounter) - you will be asked for the password. When the file is mounted, you can open it and should find your database inside.

That would be my first step to see whether the file is “valid”; I think in theory you can now pull the database file from the mounted sparse image and drop it wherever you keep your databases; opening it should open the (now unencrypted) database in DT. The next step would presumably be to copy the content to a new encrypted database (although others may shed more light on the options here; once I had checked that the database was still healthy, I might wait for input from DT support or others here in the community).
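If you are comfortable in Terminal, the same steps could look roughly like this (a minimal sketch with placeholder names; “file” and the volume name under /Volumes/ stand in for whatever your database is actually called, and hdiutil attach is simply the command-line counterpart of opening the image with DiskImageMounter):

```bash
# 1. Work on a copy, never on the only original
cp "file.dtsparse" "file-copy.dtsparse"

# 2. Rename the copy (use .sparsebundle instead if that is what you created)
mv "file-copy.dtsparse" "file-copy.sparseimage"

# 3. Mount it -- you will be prompted for the encryption password
hdiutil attach "file-copy.sparseimage"

# 4. The database should now be visible on the mounted volume; copy it out
#    (the volume name below is a placeholder -- check /Volumes/ for the real one)
cp -a "/Volumes/file/"*.dtBase2 ~/Databases/

# 5. Eject the image again when done
hdiutil detach "/Volumes/file"
```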

When you have the unencrypted database in front of you, you can inspect its contents by Control-clicking (right-clicking) it in Finder and selecting “Show Package Contents”. Whilst you should never make changes to the database this way, you can verify that your files are contained in it.
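For a purely read-only check from Terminal, something like this lists what is inside the package without touching it (the database name is a placeholder):

```bash
# List files inside the database package, read-only -- do not move or edit anything here
find "MyDatabase.dtBase2" -type f | head -n 25
```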

A couple of additional questions:

  • how did you set up the encrypted database? Did you use the option in DT, or did you follow advice to make a sparsebundle (found in a number of posts throughout this forum)?
  • was your encrypted database routinely closed after use? (there is a problem with TimeMachine on Catalina not backing up encrypted databases when they are open)

Thanks Blanc…so sorry I wasted your time, but it was operator error. I’ve figured it out, but the mistake was so silly I’ll keep it to myself. Could’ve been solved by a pot of coffee and a walk. This is what happens when you restore in panic mode :slight_smile:

Thanks!

well that’s a very pleasing outcome :slight_smile: all the best :slight_smile:

Now that everything is the way it was, might I ask whether you routinely test your backup by copying a random file from the disk?

There’s still a chance your backup somehow fails after such a test, but many people test their backup only when they actually need it. That should work, of course, but you don’t want to find out it didn’t when you most need it.

I agree with what you are saying wholeheartedly - I recommend having more than one backup in more than one location. I personally back up DT using TimeMachine to one set of disks, Carbon Copy Cloner to another set of disks and sync to a total of 3 further devices using Bonjour. In addition, I back up some content using Arq. The backups to TimeMachine and CCC are to disks which I cycle; those not currently active are stored offsite. (What I hope to have considered with this approach is software and hardware failure, timed ransomware and other timed or non-timed destructive malware, loss through carelessness, theft or fire; in addition, because they are both local and the files are stored in non-proprietary format, access to the databases is guaranteed even if DEVONtech were to fail.)

Great ideas…I need to explore this. I pay for Dropbox and iCloud, but neither is recommended for backing up DT, as far as I can tell. Although, I suppose it would be kosher to write a cron entry which would copy the databases to Dropbox, as opposed to storing the DT databases in Dropbox while in use, correct?

I have no experience backing up to dropbox or iCloud; from what I remember it is not recommended to copy the DT databases to any location whilst in use. Copying the databases to a backup location whilst the database is closed would not be problematic.

You might want to look at Arq. It isn’t specifically advocated by DT, but it is used by the company president (source: Heads up when using Arq to backup Devonthink). There appears to be no problem backing up open databases. Arq will back up to a cloud provider you specify (not iCloud as far as I can tell, but Dropbox works; if you use Office 365 you might also have a terabyte of OneDrive space, which could be used as well). It is a one-off purchase if you don’t use Arq’s cloud service. (I am not affiliated, and I’ll point out that when going from v5 to v6 Arq caused some of their user base quite a headache; their format is proprietary, which is not what I want from a backup, but looking into all this I found it to be the solution which best fit my needs.)

You should never put your DEVONthink databases in any cloud-synced folder.

This might sound silly, but have you considered also backing up to optical media like DVD-R or Blu-ray? To be clear, not as a replacement, but as an addition to the whole hoopla of backups, as I think optical media are more or less 100% resistant to ransomware. They’re also water resistant and most likely of no value to most thieves and burglars.

Apple surprisingly still sells the SuperDrive, but any such device will probably work. Store the discs in the dark and replace them in time (which you’ll probably do anyway, as a read-only backup is obviously nothing more than a fixed snapshot of that moment in time). As with other backups (including cloud-based solutions), you might consider encrypting them before writing.

That’s actually what I used to do (before the days of affordable or even fast external HDDs/SSDs) - although I found the backups to be somewhat error-prone and assumed degradation of the DVD-R material over time. The environmental aspect is also something I don’t want to brush aside completely.

It might be time to revisit the concept - do you back up in this way? If so, how often? Which software do you use (some of my databases are or will be larger than a dual-layer DVD-R)?

I also use DVD-R, though not only DVD-R, as you only make a backup this way every so often. Sorry for the long reply.

I think the frequency is up to you, taking into account your personal risk/benefit trade-off. I must admit my workflow requires several tedious steps, which makes it a process that I perform less often than I want to. I’m thinking about automating parts of it, as that increases the chance of committing to it.

You can simply back up from DT into a ZIP, as long as the zipped database is smaller than the disc capacity minus a tiny bit of overhead. Either use optical discs with a large capacity like the more expensive Blu-ray if your database is large, or create multiple databases of a manageable size.

There are several ways to handle the backup process, I guess. One way is to create an (encrypted) disk image with Disk Utility. Then mount that image and have DT write the ZIP to the image. Then insert the DVD-R, drag the image to the disc and hit burn. The software is built into macOS, at least when you use a SuperDrive. I haven’t used other drives, so those might require additional software.
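The disk-image part of that workflow can also be done with hdiutil instead of Disk Utility; a rough sketch, with size, volume name and file name as placeholders you would adjust:

```bash
# Create an AES-256 encrypted sparse image (grows as needed, up to 4 GB here);
# you will be prompted to set a password
hdiutil create -type SPARSE -size 4g -fs HFS+ -encryption AES-256 \
    -volname "DTBackup" DTBackup.sparseimage

# Mount it -- it appears under /Volumes/DTBackup after you enter the password
hdiutil attach DTBackup.sparseimage

# ... have DEVONthink write the ZIP archive to /Volumes/DTBackup ...

# Unmount again before burning the image file to the DVD-R in Finder
hdiutil detach /Volumes/DTBackup
```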

Steps to automate might be to automatically copy the database to a temporary location, give it a unique name, zip it, empty the disk image, write the files to the disk image and create a SHA checksum. When you’re ready to back up, you only have to drag the image to the disc after you’ve loaded one.
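A sketch of what such an automation script might look like (all paths and names are assumptions; the database must be closed in DEVONthink, and the encrypted image is assumed to be mounted at /Volumes/DTBackup):

```bash
#!/bin/zsh
DB="$HOME/Databases/MyDatabase.dtBase2"   # placeholder path; database must be closed
VOLUME="/Volumes/DTBackup"                # the mounted (encrypted) disk image
STAMP="$(date +%Y-%m-%d_%H%M)"
STAGE="$(mktemp -d)"

# 1. Copy the closed database to a temporary location under a unique name
cp -a "$DB" "$STAGE/MyDatabase_$STAMP.dtBase2"

# 2. Zip it
ditto -c -k --keepParent "$STAGE/MyDatabase_$STAMP.dtBase2" "$STAGE/MyDatabase_$STAMP.zip"

# 3. Create a SHA-256 checksum next to the archive
( cd "$STAGE" && shasum -a 256 "MyDatabase_$STAMP.zip" > "MyDatabase_$STAMP.zip.sha256" )

# 4. Empty the disk image and copy the new archive plus checksum onto it
rm -rf "${VOLUME:?}"/*
cp "$STAGE/MyDatabase_$STAMP.zip" "$STAGE/MyDatabase_$STAMP.zip.sha256" "$VOLUME/"
```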

After burning, I test the backup simply by unmounting and remounting the disc to decrypt the content on it (and, if I’m motivated enough, copying it, comparing the checksum, opening it in DT and viewing some files :slight_smile: )
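The verification step could be scripted the same way (volume name is again a placeholder):

```bash
# Re-mount the burned disc or image (you will be asked for the password), then verify
cd /Volumes/DTBackup
shasum -a 256 -c ./*.sha256
```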

This might appear as a huge undertaking, but it’s doable. You can of course minimize parts of it to your liking.

As a side note: DT might benefit from a ‘built-in’ checksum to compare databases, e.g. when creating an archive. When you copy the archive from a different medium and reopen it in DT, you could have DT automatically check its integrity based on the checksum in the archive. Would that be something to incorporate @BLUEFROG? This would also require a ‘restore archive’ function (basically nothing more than copying the ZIP, unzipping the content and checking the checksum). SHA-256 checksum generators are a standard part of macOS.

I wasn’t suggesting placing it there…just copying it to the sync’d server on a periodic basis for backup. I presume this would work?

@cgrunenberg would have to comment on this but a checksum is something we could look into.

just copying it to the sync’d server

What are you referring to as the “synced server”?

Sorry…that was meant to be service.

What I mean is that I’m guessing the problem with using sync services is that they may try to sync the database while it’s open by DT. I’m assuming that there’s no problem if I copy the database myself, via the command line, to the Dropbox folder. So I wouldn’t be running DT against files used by Dropbox…I’d still use the databases locally on the filesystem, but have a cron job which runs rsync or cp -a to a Dropbox folder on a regular basis.

Does that make sense?

Yes, but you still should not copy the database, i.e., the .dtBase2 file, to a cloud location. It is not data-safe.

Doing a periodic Script > Export > Daily Backup Archive creates a ZIP file that can be put in a cloud location. That is safe.
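If you want to automate that last step, a cron entry along these lines would copy finished archive ZIPs into your Dropbox folder; the folder paths and schedule are assumptions, not anything DEVONthink sets up for you:

```bash
# crontab -e: every night at 02:00, copy any backup archive ZIPs to the Dropbox folder
0 2 * * * /usr/bin/rsync -a --include='*.zip' --exclude='*' "$HOME/Backups/DEVONthink/" "$HOME/Dropbox/DT-Backups/"
```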

Perfect information. Thank you!

Perhaps I need to start another thread here, so tell me if so. Even though I’ve successfully verified the database, and have manually run File -> Optimize Database with no errors, when I run that script I get “Optimization of database failed”. Any ideas?

Also, is there a way to make this happen automatically on a periodic basis?

Thanks!

You’re welcome.

Did you run a File > Verify & Repair on it?