Performance with large DB and large PDF files

I have many large database files, and I also import/export PDF files in and out of the devonthink databases. I use DTPO 1.5.1.

One thing I noticed is that DT makes many disk I/O calls while importing files, or when the “See Also” and “Classify” functions are used. CPU usage is close to zero and the data throughput is very low, less than 5 MB/s, even though there is plenty of free memory. It does not appear to be a hardware problem.
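To rule out the disk itself, one quick sanity check is to time a raw sequential write and compare it to the ~5 MB/s observed during imports. If the disk sustains far more than that, the bottleneck is likely in how the application issues I/O (many small scattered reads/writes) rather than raw disk speed. A minimal sketch (file size and chunk size are arbitrary, and this measures only sequential writes, not the random-access pattern a database produces):

```python
import os
import tempfile
import time

def write_throughput_mb_s(size_mb=64, chunk_mb=4):
    """Time a sequential write of size_mb megabytes and return MB/s."""
    chunk = b"\0" * (chunk_mb * 1024 * 1024)
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(size_mb // chunk_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # force data to disk before stopping the clock
        elapsed = time.perf_counter() - start
        return size_mb / elapsed
    finally:
        os.remove(path)

print(f"Sequential write: {write_throughput_mb_s():.1f} MB/s")
```

A modern internal drive should report well above 5 MB/s here; a much lower number would point back at the hardware or the connection (e.g. a slow external disk).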

Is there a way to improve performance when handling large databases? (The fact that I can’t open multiple databases simultaneously encourages me to make each database larger, and this is the result of it…)



You might check whether Tools > Backup & Optimize improves the performance.