Long beach ball with any first operation upon opening DTPO


As the title says, the very first database-modifying operation that I perform in DEVONthink after each application startup takes a very long time to complete, imposing a standstill (beach ball) in the meantime. It can take up to a couple of minutes. I understand that my database is large (almost 5 GB), but on a dual-core Mac with 6 GB of RAM (albeit without SSD storage) I would expect better results. Why would a simple folder rename require two minutes of runtime?

I have optimized and maintained the database with all available mechanisms in DTPO but there has been no noticeable improvement.

Thanks for listening.


Using Exposé to display all DTPO windows (note: not all running app windows) revealed three icons at the bottom of my screen, where the app's minimized windows are supposed to be shown. I found that weird: not only did I have just one database open, but the other two supposedly minimized databases were no longer anywhere on my hard disk! They had been deleted long ago.

What solved the issue was a simple “Open Recent Files > Clean List” (or something like that, the precise wording escapes me now).

The problem was actually not solved! Saving a simple rich-text note into my admittedly large (~5GB) database often takes a couple of minutes.

Any remedial ideas? These slowdowns are quite disruptive!

Thank you.

The most meaningful measure of database size is not the disk storage size, but the total number of words in the database. What’s your total word count displayed in File > Database Properties?

A database holding plain text files could easily be a thousand times larger in total word count than a database holding PDF files, although both have the same file size.
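A rough back-of-envelope calculation shows why word count, not disk size, matters. The bytes-per-word figures below are assumptions for illustration, not measurements:

```python
# Compare the word "density" of plain text vs. scanned PDFs
# for the same on-disk size. Both density figures are assumed.
FILE_SIZE = 5 * 1024**3              # 5 GB database, as in the post above
BYTES_PER_TEXT_WORD = 6              # assumed: ~5 letters plus a space
WORDS_PER_MB_SCANNED_PDF = 500       # assumed: image-heavy OCR'd pages

plain_text_words = FILE_SIZE // BYTES_PER_TEXT_WORD
pdf_words = (FILE_SIZE // 1024**2) * WORDS_PER_MB_SCANNED_PDF

print(f"plain text:   ~{plain_text_words:,} words")
print(f"scanned PDFs: ~{pdf_words:,} words")
print(f"ratio:        ~{plain_text_words / pdf_words:.0f}x")
```

With these assumed densities the ratio already comes out in the hundreds; heavier page images push it toward the thousand-fold difference described above.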

I can run my standard set of five databases, which total 6.2 GB on disk, on my MacBook Air with 4 GB RAM at full speed, without slowdowns (as long as several hundred MB of physical RAM remain free). But I would expect very poor performance if that 6.2 GB of database files contained only plain-text documents. My set of open databases contains just under 40 million total words. I expect most of my searches to take 50 milliseconds or less, Classify and See Also suggestions to appear in no more than a second or two, and spinning balls never to appear.

Launch Apple’s Activity Monitor app, which lets you check the amount of free physical RAM, the number of page outs occurring, and the size of the Virtual Memory swap files. When your Mac runs out of free RAM, virtual memory takes over, allowing continued processing by swapping data back and forth between RAM and disk-based swap files. But heavy use of swap files results in slowdowns, sometimes long pauses during which the dreaded spinning ball appears.
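The same figures are available on the command line from `vm_stat`. A minimal sketch of reading them programmatically, using sample `vm_stat`-style output (the numbers below are illustrative, not from a real machine):

```python
import re

def parse_vm_stat(text):
    """Parse `vm_stat`-style output into a dict of page counters."""
    stats = {}
    for line in text.splitlines():
        # Lines look like "Pages free:    51200." (note trailing period)
        m = re.match(r"(.+?):\s+(\d+)\.?$", line.strip())
        if m:
            stats[m.group(1)] = int(m.group(2))
    return stats

SAMPLE = """\
Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free:                         51200.
Pages active:                      900000.
Pages inactive:                    400000.
Pageouts:                           12345.
"""

stats = parse_vm_stat(SAMPLE)
PAGE_SIZE = 4096  # bytes, as reported in the header line
free_mb = stats["Pages free"] * PAGE_SIZE / 1024 / 1024
print(f"Free RAM: {free_mb:.0f} MB, pageouts so far: {stats['Pageouts']}")
```

A steadily climbing pageout counter while DEVONthink beach-balls would point at exactly the swap-file thrashing described above.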

My guess is that the slow performance you see happens because the amount of free physical RAM is so low that Virtual Memory is busily moving data back and forth between RAM and the swap files. Between the memory needs of your database, the memory demands of other applications, and the accumulation of inactive “crud” data stuck in RAM, free RAM may vanish over time. That’s not good!

Depending on the total word count in your database, you might consider splitting your database into two or more databases so that they can individually run without slowdowns, along with the other applications you wish to keep open. I’ve been doing that for years. Altogether I manage more than 250,000 documents among a number of databases, each of which meets a particular need or interest. Needless to say, I couldn’t fit all of them onto my Air’s 256 GB SSD. That’s fine; I spend about 99% of my time working in the set of databases that I keep on the Air. If needed, I could access rarely used databases from an external drive mounted on the Air via Thunderbolt. (The read/write speeds of the Air’s SSD and the Pegasus Thunderbolt RAID unit make my quad-core iMac feel bog slow.)

Apple has just released a Safari update, and the release notes say that memory management is improved. I hope so, as the previous version soaked up RAM more than any other app on my Mac, including sticking a great deal of currently inactive data in RAM. I found that, with only Safari open, I could get it to eventually use up all free RAM - even on my iMac with 8 GB RAM.

I use two utilities to avoid running out of free RAM, so as to keep my databases at top speed. Although I frequently preach about the potential hazards of installing hacks, I use one called MenuMeters that lets me monitor memory usage in the menu bar. When I see that free RAM has dropped to about 700 MB, I click the Purge button in Cocktail to clear inactive data from RAM and optimize memory. Presto! I’ve once more got plenty of free RAM “headroom”, and the Air never slows down.

Hi Bill,

Thanks for the thorough post.

I regularly check the memory allocation on my 6 GB Core2Duo MacBook using the iStat Pro widget. Anyhow, the issue is largely uncorrelated with the amount of free memory, as it persists even with whole gigabytes of free RAM. I am intrigued by the maintenance app you mention and its “release” button. I’m tempted to try it out but also reluctant to poke directly at memory management (Lion is already slightly unstable as it is).

My database contains few text notes but quite a few OCR’ed books. It amounts to approx. 50m words (1.5m unique words). I am attaching a screenshot of my stats, should you find it useful. The database seems comparable in size to yours, which as you say responds instantly, so I am still puzzled.


Not sure if this could be connected with your problem, but since you mention OCR’ed (= scanned) documents, I remember having problems with some OCR’ed documents (not all!) that took very long to be displayed in DEVONthink.

Is this also the case for you?

Honestly, I don’t remember the details, but Christian Grunenberg found out that it had to do with the resolution of the preview images in the scanned PDFs (if I remember correctly).

@Bill DeVille
Thanks for your elaborate explanation.
Unfortunately, my old MacBook is limited to 3 GB of RAM, so I’m reaching the limits constantly with my large databases, which makes working painfully slow.
I had the same thought as you when I read about the optimization of Safari’s memory usage, since it is unfortunately one of the most resource-hungry applications.

(In addition, I often have many windows and tabs open in Safari and do not want to close them because I’m afraid of losing information… :unamused:
Sure, a good workflow with DEVONthink could help, but it is not established yet.
What helped me a lot was discovering the Sessions plugin for Safari (dl.dropbox.com/u/8247646/sessions/index.html), which lets me easily save the last state of open windows and tabs and gives a good overview of the browsed sites in a saved session.)

@ Macula:

elwood’s comment about document file loading time is worth thinking about.

When you launch the DEVONthink application, is it opening (automatically or by your manual action) a set of windows that would require opening some large documents to display them? That might take a while, and would be consistent with your comment that the beach ball shows up early on.

Re your database size of 50 million total words: that exceeds my rule-of-thumb limit of 40 million or fewer total words in any one database, or in the aggregate of a set that’s to be kept open for an extended period of operation. It’s 25% larger, so I would split it if it were to be run routinely on my laptops, which hold 4 GB RAM. I treat my databases like information Lego blocks that can be assembled and searched across as needed.

@macula, Bill DeVille:

I just stumbled upon my report about the above mentioned problem:

(Maybe it helps; however, the information about the causes is not (yet) in that discussion: it seems that some of the PDFs contained incorrect dpi values and were therefore rendered as huge bitmap images, loading very slowly in DTPro and Preview.app.)


I opened my database package file in the Finder and discovered a number of very large files (e.g. metadata files) that were duplicates. These had been produced by some failed synchronization session attempted by Dropbox. I have no idea what the exact sequence of events was, as I currently have only one machine and therefore would not expect sync conflicts to occur. Yet multiple conflicts had indeed led to this very bloated database file, and possibly to the delays that I complained about originally.

Interestingly, despite those ‘conflicted’ files, no warning signs were given by the “Verify & Repair” mechanism of DTPO.

I deleted those duplicate files from the package (always double-checking that their ‘last modified’ date was earlier than that of the respective original file) and reopened the database. There have been no delays whatsoever since. I will let you all know if the problems resurface, though so far they seem resolved.
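Dropbox marks sync conflicts by appending “conflicted copy” (plus a date) to the file name, so a package can be audited for leftovers before deleting anything by hand. A minimal sketch; the package path at the bottom is hypothetical, and any hits should be inspected manually, never deleted blindly:

```python
import os

def find_conflicted(package_root):
    """Walk a folder (or package shown via Finder's Show Package
    Contents) and list Dropbox-style conflict files with their
    modification times, for manual review."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(package_root):
        for name in filenames:
            if "conflicted copy" in name:
                path = os.path.join(dirpath, name)
                hits.append((path, os.path.getmtime(path)))
    return hits

# Example (hypothetical package name; review before touching anything):
for path, mtime in find_conflicted("MyDatabase.dtBase2"):
    print(path, mtime)
```

Comparing each hit’s modification time against the matching original mirrors the ‘last modified’ check described above.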

Moral of the story: Use Dropbox and DevonThink together only with extreme caution, and keep your eyes open for possible complications.

Thanks for the report.

The forthcoming Sync plugin for DEVONthink should make Dropbox safer to use.

In the meantime, we don’t recommend using Dropbox for sync or backup.

I have the same problem (extended beach-ball spinning) with my default database, which has 60+ million words in it. Here’s a screenshot of the database properties:

Since reading this thread I’ve divided its contents into other databases to keep their word counts lower. The way I found files with high word counts was to do an Advanced Search of my database for files with word counts greater than 100,000. Then I thought carefully about whether they could go someplace else or not.
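Outside DEVONthink’s Advanced Search, the same triage can be approximated on a folder of exported plain-text files with a naive whitespace word count. A rough sketch; the 100,000-word threshold comes from the post above, and the export-folder path in the comment is hypothetical:

```python
import os

WORD_LIMIT = 100_000  # threshold used in the Advanced Search above

def oversized_documents(root, limit=WORD_LIMIT):
    """Yield (path, word_count) for plain-text files whose naive
    whitespace-split word count exceeds `limit`."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".txt", ".md")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    count = sum(len(line.split()) for line in f)
            except OSError:
                continue
            if count > limit:
                yield path, count

# Example (hypothetical export folder):
# for path, words in oversized_documents("DT-export"):
#     print(words, path)
```

The results give the same candidate list for moving heavyweight documents into a separate database.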

The problem is that the majority of my folders have replicants of other folders in them. If I were to move those folders to a new database, it would break that chain.

If only it were possible to replicate materials across databases, or to link to materials that live in other databases! :wink: