Encryption of DEVONthink Pro data file

A newbie question here–

What is the practical size limit for storing an encrypted Dropbox archive of DTP? I’ve been using it for less than a week and already have a database that is 3 GB. I could see 6–8 GB total by the time I’ve moved all the data in.


[See Bill’s comments, below.]

If you mean a database file stored within an encrypted sparse disk image (or other encryption method), the practical limit would be tied to the time taken to upload/download it as a single file, and so to your broadband speeds. Changes can’t be handled incrementally, so the whole image has to be transferred each time. Many ISPs allow much slower uploads than downloads.
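To put rough numbers on that, here’s a quick back-of-the-envelope calculation. The 1 Mbps upload speed is just an assumption for illustration; plug in whatever your ISP actually gives you:

```python
def upload_hours(size_gb, upload_mbps):
    """Rough time to upload a single file of size_gb gigabytes
    over an upload_mbps megabit-per-second link (ideal conditions,
    no protocol overhead -- real transfers will be slower)."""
    bits = size_gb * 8 * 1000**3          # decimal GB -> bits
    seconds = bits / (upload_mbps * 1000**2)
    return seconds / 3600

# A 6 GB disk image over a 1 Mbps upload link:
print(round(upload_hours(6, 1), 1))       # → 13.3 (hours)
```

So re-uploading a multi-gigabyte image after every working session adds up quickly, which is why the upload side of your connection is the number that matters here.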

I’ll assume that part of your question is about the practical size limitations of a DEVONthink database.

Although the database file size in gigabytes is relevant to data transfer speeds, there are other measurements that matter more to the responsiveness of a database.

Christian suggests that databases should be limited to a maximum of 200,000 documents and a total word count of 300,000,000. But I want to be able to run my databases on a laptop with 4 GB RAM, and I’m spoiled, as I want quick performance and no spinning balls. That means I don’t want heavy use of Virtual Memory, which would require swapping data back and forth between RAM and Virtual Memory swap files on disk. I’m happy as long as there is free physical RAM.

In practice, I don’t want a database to exceed 40,000,000 total words. My main database contains about 35,000 documents and about 35,000,000 total words.

The word count density of documents varies widely by filetype. A plain text document is ‘heavy’ (almost all of its file size is indexable text), while a PDF or WebArchive is ‘light’. Often, a PDF with a file size measured in megabytes contains only tens of kilobytes of text.

Thus, a database that contains 2 GB of PDFs would be well below the size maximum I use for my databases, but a database with 2 GB of plain text files would probably seem too large to me.
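If you want a rough sense of how text-heavy a folder of files is before importing it, a small script along these lines gives a ballpark total for plain-text files. (This is just a sketch: whitespace splitting is a simplification, and DEVONthink’s own indexer may count words differently.)

```python
import os

def total_words(folder, extensions=(".txt", ".md")):
    """Rough total word count for plain-text files under a folder.
    Counts whitespace-separated tokens; binary formats such as PDF
    are skipped, since their word content can't be read this way."""
    total = 0
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith(extensions):
                path = os.path.join(root, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    total += len(f.read().split())
    return total
```

Comparing the result against a comfort limit like the 40,000,000 words mentioned above is a better guide than raw gigabytes alone.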

When I’m hammering away at my databases, I keep an eye on the remaining amount of free RAM, because I know procedures will begin to slow down when free RAM is exhausted. Apple’s memory management is pretty good at freeing up ‘unused’ RAM, but it’s not perfect. The latest release of Cocktail for Snow Leopard has a procedure to clear unused RAM and add it to available free RAM. My tests so far indicate that it helps keep my databases very responsive.