Daily Backup Archive and the Cloud

Would there be any notable reason against changing my Daily Backup Archive script destination folder to a folder on the cloud? (I use, depending on the client, DropBox, iCloud, Google Drive, OneDrive).
My main backup is making CarbonCopyCloner backups of my system drive to two alternating external HDs.
I am aware of the advice against putting a live DB on the cloud, but I wasn’t sure whether zipping DBs directly to a cloud folder would also expose them to some possibility of corruption?

Are you referring to a local folder, like Dropbox? I would not try to zip to a location that exists only on the cloud.

You’d have to test this, as I can’t say whether any of these services waits for the file to be finished before trying to upload it. It wouldn’t corrupt the originating database, but I would say there’s potential for the ZIP itself to be corrupted.

Dropbox is Rock-Solid in this regard. Would not be a problem.

OK. I thought there might be an issue: during the short period while the zip is being made, something could get messed up by a cloud-syncing function.
As it is now, I just open each DEVONthink DB (5 mail archives, General Notes, 6 active projects), run the script, and then open the backup folder and copy the finished ZIP files to the local Dropbox folder. I guess the last step would be a test of a complete restore of a DB from one of the zipped backups.

If that were true then there would be the same risk when saving to a local hard drive.

DEVONthink is completely correct that there are major risks when syncing a file via Dropbox, i.e. actively editing an existing file in the Dropbox cloud. However, there are no risks when saving a file to Dropbox, i.e. a standard one-time file save. The ability to do a regular “Save” to your Dropbox folder and then have Dropbox reliably upload it to its cloud servers is the essence of Dropbox’s business.

Dropbox is Rock-Solid in this regard.

In what regard? Are you actually doing the procedure the OP has suggested?

If that were true then there would be the same risk when saving to a local hard drive.

I disagree with this. When you’re saving to your local hard drive, there isn’t another mechanism transferring the file to another location, especially a networked location. And yes, if you refer to iCloud, I would have the same question as I do about the Dropbox situation.

Also, the mechanism used to create the archive could play into this. Notice the beginning of this export: it generates a 2 MB file…

and the end, almost a minute later, yields a ~235 MB file…

So the file exists at the beginning of the process but is still being acted on by DEVONthink’s process. I have no solid data on what Dropbox would do here; it may or may not work, and if it doesn’t, the corruption could be sporadic. So I have no data to prove or disprove the safety of the method, but I would err on the side of caution.
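To make the hazard concrete, here is a hypothetical Python sketch (nothing Dropbox actually exposes; the function name and parameters are mine) of the size-stability check a sync client would have to perform before it could safely upload an archive that is still being written:

```python
import os
import time

def wait_until_stable(path, interval=1.0, checks=3):
    """Poll a file's size until it has stopped changing for `checks`
    consecutive intervals. A sync client that uploads before this point
    would capture a partially written archive."""
    stable = 0
    last = -1
    while stable < checks:
        size = os.path.getsize(path)
        stable = stable + 1 if size == last else 0
        last = size
        time.sleep(interval)
    return last
```

If a client instead starts copying the moment the 2 MB file appears, the uploaded copy and the finished ~235 MB archive can diverge.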

Yes I have done that. Though my current preference is to not use any script but rather to shut down DT3 entirely and copy files to their backup destination because I think there is a risk of backing up any database while the live app is running.

When you use Dropbox, you are literally saving to your local hard drive; you are just saving to a folder inside your Dropbox folder instead of elsewhere on the drive. As far as the application doing the saving is concerned, it is identical. The upload occurs afterwards, and that does not create any increased risk; it is just copying what is already on your local drive to the cloud. Thus it is a one-way copy to the Dropbox cloud.

As far as the application doing the saving is concerned, it is identical. The upload occurs afterwards, and that does not create any increased risk; it is just copying what is already on your local drive to the cloud. Thus it is a one-way copy to the Dropbox cloud.

Do you have a whitepaper on this?

No, but you can save to your Dropbox folder with networking/Wi-Fi/Ethernet turned off. Then it syncs just fine after you turn networking back on.

You can also watch the Dropbox status report/log in realtime as it identifies which files it is copying.

No, but you can save to your Dropbox folder with networking/Wi-Fi/Ethernet turned off. Then it syncs just fine after you turn networking back on.

Yes, but that doesn’t prove the point. If there is no internet connection, then Dropbox has to wait until one is available.

You can also watch the Dropbox status report/log in realtime as it identifies which files it is copying.

I know, which is what I’m suspicious of. The moment the file first appears, Dropbox is initiating activity. As shown in my screen captures, the file is not done and accounted for when it is first created. It is still being worked on by the process DEVONthink initiated. Meanwhile, Dropbox is showing activity that could be unsafe.

Also, if you look at their help, they aren’t saying it’s explicitly safe to compress files to your Dropbox folder. I’m seeing many comments like this…

This all being said, the safest option (which I would definitely advocate) would be to create the ZIP file outside a Dropbox (or other cloud-synced) folder and put it in the location after the process has finished.

For sure that is safest of all. Moreover, the script itself (which comes with DT3) is not truly “safe,” because it is never “safest” to back up a live database.

That said, surely DT3 is not changing the data after the .zip creation starts - correct? In other words, it writes the file to the drive once; it doesn’t write to the file and then edit it thereafter, correct?

All that said, you would know if there were problems with the file because there is a CRC check or similar when unzipping a .zip file.
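That CRC check can be run without fully unzipping. As a sketch, Python’s standard zipfile module exposes a testzip() method that reads every member and verifies its CRC (the helper name here is my own):

```python
import zipfile

def verify_zip(path):
    """Return True if every member of the archive passes its CRC check,
    False if the file is not a valid zip or any member is corrupt."""
    try:
        with zipfile.ZipFile(path) as zf:
            # testzip() returns the name of the first bad member, or None
            return zf.testzip() is None
    except zipfile.BadZipFile:
        return False
```

Running something like this on the copy inside the Dropbox folder (or on a re-downloaded copy) would settle whether the sync ever damages the archive.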

I very regularly take huge files of various types (sometimes over 1 GB) and compress them “live” in a Dropbox folder. I have never had a file integrity problem related to that.

I completely get why you don’t want a live database stored in a Dropbox folder. But I cannot see why there is any problem creating a .zip file inside a Dropbox folder. Most notably, if that really low level of risk is of concern to you, then you ought to shut down DT3 entirely during your backup process and forget the backup script. The backup script on a live database is at least as “risky” as “zipping within Dropbox.”

FWIW I’ve backed up dozens of Scrivener projects (zipped) to DB & never had a problem.

I am not referring to corrupting the originating database, so quitting DEVONthink wouldn’t be necessary. I am referring to Dropbox trying to sync the file while it’s being created and compressed.

Why not modify the script so it zips to a local folder, then moves the completed archive to the Dropbox folder? Just to be sure.
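As a rough illustration of that approach (in Python rather than the AppleScript that ships with DT3; the paths and function name are placeholders):

```python
import shutil
import tempfile
from datetime import date
from pathlib import Path

def backup_to_dropbox(source_dir, dropbox_dir):
    """Zip `source_dir` in a temporary folder, then move the finished
    archive into the cloud-synced `dropbox_dir` in a single step, so the
    sync client only ever sees a completed file."""
    stamp = date.today().isoformat()
    with tempfile.TemporaryDirectory() as tmp:
        # Build the archive entirely outside the synced folder
        archive = shutil.make_archive(
            str(Path(tmp) / f"{Path(source_dir).name}-{stamp}"),
            "zip",
            source_dir,
        )
        dest = Path(dropbox_dir) / Path(archive).name
        shutil.move(archive, str(dest))
    return str(dest)
```

On the same volume (the usual case for a home folder and ~/Dropbox), the final move is a rename, so there is no window in which a half-written file sits in the synced folder.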


Sounds good to me. Can you send me that modified script in 20 minutes? I have a REALLY important meeting in 30 minutes and would love to take credit for doing it myself, but I am lazy and don’t know how.
smiley emoticon x infinity

BTW: I started this topic because back in the early days of Dropbox I tried using it to send a screening copy of a video by rendering the export from Final Cut Pro (Classic) directly to a Dropbox folder. Exports of 2-hour shows could take several times their running length. The local copy was fine and played well, but any copy made to synced folders was unplayable. It only worked if we rendered to a local folder and then copied to the Dropbox folder. It wasn’t worth anyone’s time to figure out where the problem stemmed from, as there was a simple workaround with available labour, i.e. someone on the overnight shift could drag and drop a file when the progress bar went away.
When I started using DEVONthink earlier this year and dug into the backup process, that experience came back to me when I saw the ZIP files being made for some large databases, like my imported Evernote notes.

Actually… the Script menu > Export > Daily Backup Archive already exports to ~/Backup by default.