Sync DEVONthink between two machines.

I want to use DEVONthink at home and at work. How can I keep the databases on the two machines synced?
Thanks

FG :unamused:

Hi,

This topic has been discussed on this forum several times; just search a bit. Nothing has changed since the last discussion.

I still work with the solution of having a “master db” on my desktop: I export changes from my iBook and import them into my desktop database, so I don’t need to copy the huge database too frequently.

Maria

Note to Bill D.: this is a FAQ-worthy topic. :slight_smile:

sjk: You’re right. The only problem is that it’s also a complex, book-worthy topic. :slight_smile:

I can tackle some simple, straightforward procedures. Maria’s tricks for working with her two computers really ought to be considered by many users. DEVONthink’s Export and Import procedures make it easy to transfer documents between two databases. Tools > History can be used to identify recently added material that one might wish to send to the other database, for example.

But I barely know enough to be dangerous when it comes to what many users want – 2-way synchronization of databases between two computers. What little I do know tells me there are lots of pitfalls to using ‘synchronization’ software that probably won’t do what is expected.

So I lean towards the brute force approach. Just copy the database back and forth. (And make certain that I’ve chosen the right direction for the copy!)
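For the record, the brute-force approach really is just one copy command. Here is a minimal sketch; the package name and paths are hypothetical, and two local temp directories stand in for the two machines (over a network you would use scp or rsync instead of cp):

```shell
# Brute-force "sync": copy the whole database package from one machine to
# the other. Names and paths are made up for illustration; the two temp
# directories stand in for the desktop and the PowerBook.
set -e

desktop=$(mktemp -d)   # stands in for the desktop Mac
laptop=$(mktemp -d)    # stands in for the PowerBook

# A fake database package on the desktop.
mkdir -p "$desktop/MyBase.dtBase/Files"
echo "records" > "$desktop/MyBase.dtBase/DEVONthink-1.database"

# The copy itself -- be certain of the direction before running it!
cp -R "$desktop/MyBase.dtBase" "$laptop/"

ls "$laptop/MyBase.dtBase"
```

The only safeguard here is discipline: always copy from the machine where you last worked, never the other way around.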

I know that this has been suggested before, but why is it that we can’t sync the DT database without having to go through hoops like this? Something as rudimentary as what StickyBrain does with a .Mac backup/restore would be better than the current “export, import” cycle. For something as ambitious as DT not to have a sync feature seems like a glaring oversight to me. Especially considering it seems like a much sought after feature. I know I’d like to have a sync feature so I can use DT on my desktop without worrying about screwing something up.

Is this planned for 2.0? Or in the (what seems like the oft-promised, never quite delivered) DT Pro version?

See what I mean about “synchronization” having a wide span of meanings? :smiley: In principle, .Mac could be used to swap databases between two computers. The practical problem is that a great many users have large DT databases. In that case, a .Mac transfer is unacceptable both for speed and size reasons. The current database structure of DEVONthink is monolithic, which means that most synchronization programs can’t deal very well with it.

DT Version 2 will have a different database structure that separates the document/image contents from DT’s database “about” the documents/images. Among other things, Spotlight searches will work in conjunction with DEVONthink. This database structure will in some ways simplify synchronization between two computers.

Because DEVONthink can use a variety of means to import information from external files, the concept of synchronization between computers can be more complicated than, for example, FileMaker databases. Are all the documents contained internally in the database, for example? Yes for FileMaker, often not for DEVONthink.

I make backups of my database on a portable FireWire drive. And that’s my personal solution for moving my very large database between computers. But that doesn’t totally synchronize the computers, as some contents are linked to files resident on one computer and not the other. Like Maria, I keep a master working database on one computer. A copy of this can run on another Mac, but I don’t try to make it a complete clone (including external files) of the master database, as that’s not necessary for my purposes. If I add new material to the second database, I’ll export it periodically for transfer to the master database.

I’ve settled for copying all files into the database package. At least I won’t lose links.

The drawback now is that my database is really big, and copying it back and forth between PowerBook and desktop is more than just a little hassle (time-wise) - especially since the 3 internal backups would be copied too and quadruple the amount of data, unless I select only the current 10 database files from within the package. Plus I forget easily, and remembering on which machine I made the latest changes does not always work out; and I use DTP intensively enough that using History to export/import has become a drag.

Besides that, I found that exporting a sheet sometimes does weird stuff, and it will not arrive as it was in the original location (e.g. a tab-delimited file arrives with quotes, plus deleted columns suddenly show up again - I tried this about 5 times with the same sheet and ended up with an altered sheet each time).

Thus, my wish for the future is to be able to choose, via some preference settings, to either:

a) sync DTP fully between 2 machines, including the Files folder within the database package - or
b) sync DTP between 2 machines w/o the backup files and w/o the Files folder, but with links pointing to the files within that Files folder on that volume

For DT Pro users who want to move the complete database back and forth between two computers, here’s an approach that will make the smallest possible backup of your database.

First principle: Copy all imported files into your database or your database Files folder, via preferences settings. That means you won’t have to worry about external links to files that are not on both computers.

Now, when you are ready to make a copy of the database for transport to the other computer, select (in DT Pro) Script (symbol “S” to the left of the Help menu) > Export > Backup Archive.

This script will Verify, Backup & Optimize, then make an archived copy of the database (without internal Backup folders). The date of the archived database will be appended to the file name, thus reducing confusion as to which backup is most recent.

Because the archived database is as small as possible, it will take less time to copy it to the other computer, whether one is using a network transfer, Target Disk mode or external medium such as a portable FireWire drive. Then double-click the archive to expand it, and open your database normally.
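For the curious, what the Backup Archive step amounts to can be sketched at the command line. This is only an illustration of the idea (archive the package minus its internal Backup folders, with the date appended to the file name), not DT Pro’s actual script; the package layout and file names are assumptions:

```shell
# Sketch of the "smallest possible archive" idea: tar up the database
# package while skipping its internal Backup folders, and stamp the date
# into the archive name. Layout and names are assumptions for illustration.
set -e

work=$(mktemp -d)
db="$work/MyBase.dtBase"
mkdir -p "$db/Files" "$db/Backup" "$db/Backup1"
echo "records" > "$db/DEVONthink-1.database"
echo "a pdf"   > "$db/Files/report.pdf"
echo "stale"   > "$db/Backup/DEVONthink-1.database"

# Date-stamped archive, excluding Backup, Backup1, Backup2, ...
archive="$work/MyBase-$(date +%Y-%m-%d).tar.gz"
tar -czf "$archive" -C "$work" --exclude='MyBase.dtBase/Backup*' MyBase.dtBase

tar -tzf "$archive"
```

On the other machine you would expand the archive with `tar -xzf` (or, for the script’s own archives, just double-click them) and open the database normally.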

And it has the added benefit of making it easy to keep recent external backups of your database, which is a Very Good Thing. :slight_smile:

This seems to be a working solution - until full sync is implemented :slight_smile:

But seriously, this does take care of the overblown size issue - it made my database 5x smaller than it is on the desktop, and getting it zipped in the process also makes copying across the network much easier; the time for the whole process wasn’t too gruesome either. The main drawback is still that one has to remember where the most current version is. I’m actually considering moving the main database onto its own drive on our home server and having both desktop and PowerBook work with that; only when I have to take DTP along with me would I copy it to the PowerBook.

Thanks for this suggestion.

When I travel, I just carry my database on a small portable FireWire drive and run it off that drive. :slight_smile:

Next week, I’ll be visiting with the DEVONtechnologies folk in Europe. As they have DT Pro on their computers, no need for me to carry my PowerBook.

I was considering using my 20 GB FireWire iPod as my external DT carrier, just for this. But it adds up when traveling: iPod, Palm, camera, GPS, cell phone, batteries, cables, printer, and and and - so I am glad for any add-on I can avoid :slight_smile:

…you forgot your desk, chair and bed there!! - just kidding :slight_smile:

I flog my 40 GB iPod at the end of every day to carry my 1 GB database home with me. It takes less than two minutes and is so worth it. Agreed, an intelligent and foolproof method to sync the desktop database contents with those on the iPod would be the most efficient way to sync the daily updates without slamming it with the iron brick again every night :slight_smile:

Spyder

P.S.: I too keep everything self-contained in the database. Frankly, I don’t think I fully appreciate the logic of that middle option of importing into a database folder instead of the database itself - isn’t .dtBase itself a package of database files plus a folder for imported files anyway?

LOL. I think I basically just resist the idea of needing an external drive to manage data which I use on 2 different machines; my PB and my desktop have ample space - and btw. my computer bag has more than once acted as desk AND also as pillow :wink:

I do believe there is quite a big difference between the import options. With “database package/folder” you’ll end up with 10 database files within the package PLUS a separate folder which contains all images, PDFs and other files used in the database. If you import into the database itself, the data is stored inline within the 10 database files, which would a) increase their size a lot and b) mean that, if a database file got corrupted for whatever reason, you could potentially lose images and PDF files etc. That at least is the difference as I understood it, which very possibly is a totally wrong concept :slight_smile:

Almost, but not quite right.

Practically speaking, there’s not a significant size difference between the options of storing, e.g., PDF documents in the ‘body’ of the database, or in the database’s Files folder. In either case, adding files increases the size of the database package file (or the DEVONthink Personal database folder).

But my own preference is for storing images, PDFs, QuickTime files and some ‘unknown’ file types in the database Files folder. These options are set in DT Pro preferences.

Importing or creating file types such as text, RTF, RTFD, HTML and some others results in storing them in the ‘body’ of the database.

I don’t worry about database corruption. My large main database has files dating back almost 4 years. I’ve had database problems only 3 times that I can recall, during periods of using alpha versions of the software for my daily work flows. In one case I had imported some corrupt text files, which a Rebuild eliminated. In all cases I had a current or only hours-old backup. (I do manual runs of Tools > Verify & Repair and Backup & Optimize whenever I’ve invested hours in database additions and/or editing. Something that can be done at break time, which can save grief.) I highly recommend making external backups of databases – although I’ve actually never had to use one, it is prudent.

I run my desktop Mac on an uninterruptible power supply (UPS), which has saved my neck a number of times when electric power went off – including lots of voltage fluctuations and extended outages related to Hurricane Katrina! And I recommend routine OS X System and drive housekeeping to keep one’s computer running cleanly. I always do System and disk housekeeping before and after OS upgrades and Security upgrades, and I always install incremental upgrades just using Software Update. The only time I’ve ever had to reinstall an operating system was that time I was playing with some System files and munged things up (my DT database was fine). :slight_smile:

The most common cause of database corruption is a power outage while data is being written to disk. Other common causes, of course, are messed up drive directories (easily preventable), and System crashes (generally easily preventable). Restarting your computer when operations slow down (usually because Virtual Memory swap files are being used a lot), or every few days, goes a long way towards keeping your computer happy. Don’t try to set records for the longest time between restarts – that’s only for servers, which generally use error-tolerant RAM.

Thanks for the explanation. Good to hear that you don’t worry about corruption - unfortunately, some 2 years ago (or so) I had a BIG DT database go corrupt and lost quite a lot of gathered data (that was the time when sync between 2 machines was even promoted via the sync option). I don’t remember what the final conclusion about the problem was, but certainly some user error was involved next to some other problem. That made me “doubly cautious”. And it’s not only DT: anything that creates one big monolithic database becomes (from my experience and from what I’ve heard) “rather prone” to problems (e.g. I know people who had HUGE Entourage data files go corrupt, and many similar stories). Often it’s just an unfortunate combination of coincidences, but I still like to avoid those potential pitfalls :slight_smile: