Maximum size of a DT database

Is there a maximum size limit for a DT database, for technical or practical reasons (backup, etc.)?

I would like to save all my PDF magazines and eBooks in ONE DT database.

This could quickly reach 50 GB (gigabytes!) or more.

Any suggestions (other than “don’t do it” :slight_smile: … )

There’s a maximum number of images – 10,000 including PDFs – that can be stored in a DT Personal database.

There’s no such limitation with DT Pro/DT Pro Office databases.

Select some of your typical PDF documents in a database and examine the Size and File information for each file in its Info panel. For many PDFs the Size, which indicates the text content, is substantially smaller than the File, which indicates the disk storage space for that PDF.
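(If you want to run that comparison in bulk outside DEVONthink, here's a minimal Python sketch. It assumes the third-party pypdf library, and it treats the length of the extracted text as a rough stand-in for the Size figure; that equivalence is my assumption, not documented behavior.)

```python
import os
from pypdf import PdfReader  # third-party: pip install pypdf

def text_vs_file_size(path):
    """Compare a PDF's extracted-text length (a rough proxy for
    DEVONthink's 'Size') with its on-disk size (its 'File' figure)."""
    reader = PdfReader(path)
    # Character count, not bytes -- close enough for a ballpark comparison.
    text_chars = sum(len(page.extract_text() or "") for page in reader.pages)
    file_bytes = os.path.getsize(path)
    return text_chars, file_bytes

# "example.pdf" is a placeholder -- point this at one of your magazines.
text_c, file_b = text_vs_file_size("example.pdf")
print(f"text: {text_c:,} chars, file: {file_b:,} bytes "
      f"(roughly {text_c / file_b:.1%} of disk size)")
```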

For practical purposes, Size is related to the memory requirement of a PDF in your database, but File is not. Note, however, that you must have enough free space on your hard drive to allow duplication (copying the PDFs into the Files folder inside the database) of the PDFs if you are Import-capturing them, plus the additional space required to store the 3 internal Backup files within the database. Plus, of course, enough free space for temporary files, especially any Virtual Memory swap files that may be required. It’s a good idea to keep about 15% of your hard drive’s capacity in reserve for those temporary and swap files, so that your applications and the System never run out of free drive space.
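To make that 15% rule concrete, here's a tiny Python check using the standard library's shutil.disk_usage; the threshold simply mirrors the advice above:

```python
import shutil

RESERVE_FRACTION = 0.15  # keep ~15% of the drive free, per the advice above

def drive_has_headroom(path="/"):
    """Return True if the volume holding `path` still has at least
    15% of its capacity free for temporary and VM swap files."""
    usage = shutil.disk_usage(path)
    return usage.free / usage.total >= RESERVE_FRACTION

print("enough headroom" if drive_has_headroom() else "drive is getting tight")
```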

Let’s assume that you have ample free hard drive space available. We’ll focus on memory requirements, which in turn relate to the responsiveness of your database in the event that loading and operating the database requires significantly more memory than the free RAM available on your computer.

We’ve already noted that the memory requirement of a PDF may be less than its file size. So your 50 GB collection of PDFs will require, let’s say, perhaps 15 GB of memory in your database. (It could be less, but I’m making assumptions.)

If your computer has 2 GB RAM, not all of which can be allocated to the database, you will be able to load the database into memory. But the larger the database, the longer it will take to get through verification and the other steps involved in opening the database. Worse, because you don’t have enough free physical RAM, your computer will use Virtual Memory to supplement physical RAM. That means that most of the data in your database may be stored in VM swap files on your hard drive and must be accessed from the drive for any data manipulations. Reading and writing data on your hard drive is much slower than performing those operations in RAM.

I don’t think you would be happy with the responsiveness of the database on that computer.

But if you’ve got a Mac Pro with 32 GB RAM, you would likely find the responsiveness of that database perfectly acceptable.

That’s why, as a practical matter, I use topically designed databases that load with a good amount of free RAM still left over. As I often note, I’m spoiled. I want most of my single-term search queries to complete in less than 50 milliseconds, and I want See Also lists to pop up as quickly as a new document is displayed. I never want to see a “wait for me” spinning pause indicator. My main database currently contains 27,128,061 total words. On my ModBook with 4 GB RAM and a few other apps open, I’ve still got almost 2 GB free RAM left when that database loads. So my database runs at full speed and still has considerable room left for growth on that computer.

Apple’s Virtual Memory is remarkable. It allows procedures to chug along to completion with limited RAM. But as the number of pageouts becomes an appreciable fraction of the number of pageins, spinning balls appear. The art of Zen becomes increasingly important.
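If you'd like to watch for that condition yourself, here's a rough Python sketch that shells out to macOS's vm_stat and reports pageouts as a fraction of pageins. The parsing is my guess at the output format and may need adjusting for your OS version:

```python
import re
import subprocess

def paging_ratio():
    """Parse macOS `vm_stat` output and return pageouts as a fraction
    of pageins; a climbing ratio is the 'spinning ball' warning sign."""
    out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
    stats = {}
    for line in out.splitlines():
        m = re.match(r'^"?([^":]+)"?:\s+(\d+)\.?', line)
        if m:
            stats[m.group(1).strip()] = int(m.group(2))
    return stats.get("Pageouts", 0) / max(stats.get("Pageins", 1), 1)

print(f"pageouts/pageins: {paging_ratio():.3f}")
```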

Suggestion: Go ahead and experiment. Assuming you have enough free hard drive space, create a new database and see how big you can make it before performance becomes unacceptable for you. If you don’t delete the original files that you copied or Indexed into the new database, nothing will be harmed if you subsequently delete a test database.

Hello,

DevonThink stores all its imported files in one folder. With a large number of files, this can become a source of trouble outside the DEVON world.

I have recently had problems with such a “Files” folder within a DTP database. It held over 63,000 files and caused Micromat’s TechTool Pro (current version) to never finish some tests/repairs, even though the progress bar had long since completed. One test seemed to loop through said folder, as I could guess from the flashing filenames. Moving files into smaller folders (subfolders or elsewhere) eliminated the problem for TechTool, so there was a manual workaround. Finding this out cost a lot of working hours… Micromat was alerted and the folder sent to them; hopefully there will be a fix.

More generally speaking: would it be useful to set DevonX to limit the files per folder, creating new (sub-/side-)folders as needed?
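Just as an illustration of what such a cap could look like (this is not an existing DEVONthink feature, and the 10,000 limit and folder names below are hypothetical), here is a small Python sketch that shards a flat folder into numbered subfolders:

```python
import shutil
from pathlib import Path

MAX_PER_FOLDER = 10_000  # hypothetical cap -- pick whatever your tools tolerate

def shard_folder(src, dest):
    """Move the files in `src` into numbered subfolders of `dest`,
    none of which holds more than MAX_PER_FOLDER files."""
    files = sorted(p for p in Path(src).iterdir() if p.is_file())
    for i, f in enumerate(files):
        bucket = Path(dest) / f"{i // MAX_PER_FOLDER:04d}"
        bucket.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(bucket / f.name))

# shard_folder("Files", "Files_sharded")  # illustration only -- never point
#                                         # this at a live database's Files folder
```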

HTH,
Br@