Devonthink runs more and more slowly

My database has grown to 1.82 GB, and when I add material to DT Pro and switch views it runs more and more slowly, sometimes taking ten seconds (or so it seems) to perform a task. I almost always get the Mac’s “rolling beach-ball of doom” on the screen when it performs a function. The only time it seems to run efficiently is when it is the only program in use, which I find impractical. Have I done something wrong, or is this the nature of the beast?


That is a pretty big database…

It sounds to me like you are paging out to virtual memory. If that’s true, the only real solutions are to buy more memory and/or split the database so that it is smaller.


Katherine’s suggestions are on point. A smart document management and information analysis application is memory-intensive.

But there are a few things you can do to speed things up, at least for a time, with your present computer configuration.

  1. Run Tools > Verify & Repair, then Tools > Backup & Optimize after adding new content. Optimizing compacts the database and yields a modest speed increase. I like to use Scripts > Export > Backup Archive at a convenient break, after making significant changes to a database.

  2. Katherine’s diagnosis that you are slowing down because of Virtual Memory use is right. If you were to quit DT Pro when it gets slow, then relaunch it, you would see an immediate speedup of operations. If you were to restart the computer, the VM swap files would be cleared and you would start off using physical RAM, which is much faster, since Virtual Memory involves swapping data back and forth between RAM and your hard drive.
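You can check whether the machine is actually paging by running `vm_stat` in Terminal and watching the “Pageouts” counter climb. Here is a small sketch of how to read that counter; the sample output below is made up for illustration, since the real numbers vary per machine:

```python
# Rough sketch: parse `vm_stat`-style output to see whether the Mac is paging.
# SAMPLE_VM_STAT is invented sample text; on a real machine you would run
# `vm_stat` in Terminal and feed its output in instead.

SAMPLE_VM_STAT = """\
Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free:                    12345.
Pages active:                 234567.
Pageins:                      456789.
Pageouts:                      98765.
"""

def pageouts(vm_stat_text: str, page_size: int = 4096) -> int:
    """Return the number of bytes paged out, per the Pageouts counter."""
    for line in vm_stat_text.splitlines():
        if line.startswith("Pageouts:"):
            count = int(line.split(":")[1].strip().rstrip("."))
            return count * page_size
    return 0

if __name__ == "__main__":
    mb = pageouts(SAMPLE_VM_STAT) / (1024 * 1024)
    print(f"Paged out: {mb:.1f} MB")
```

A Pageouts count that keeps rising while you work is the telltale sign that you are short on physical RAM.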

  3. Temporarily shutting down other open applications when you are about to do searches or See Also operations in DT Pro will help. Fortunately, OS X lets you relaunch the other apps quickly when you need them.

Comment: There are also other possible causes of slow performance. Activity Monitor (Applications > Utilities) can display the current state of memory and CPU usage.

  • Lots of active Widgets can consume RAM and CPU. Some Widgets can also tie up network resources. I’m not much of a Widget fan. The only one I have consistently active lets me convert units of measurement.

  • A “stuck” process can eat up CPU resources and slow a computer to a crawl. Activity Monitor lets you take a look at what’s going on and kill an errant process. When the computer is idling, it shouldn’t normally show much CPU activity.
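A runaway process is easy to spot: sort the process list by CPU and look for anything pinned near 100% while you’re idle. A toy sketch of that sort, using invented `ps`-style sample lines (on a real Mac you might capture them with `ps -Ao pcpu,comm` in Terminal):

```python
# Toy sketch: find the top CPU consumers in `ps`-style output.
# The sample lines are invented for illustration.

SAMPLE_PS = """\
 0.3 Finder
97.8 StuckHelper
 2.1 DEVONthink Pro
 0.0 Dock
"""

def top_cpu(ps_text: str, n: int = 3) -> list[tuple[float, str]]:
    """Return the n processes using the most CPU, highest first."""
    rows = []
    for line in ps_text.splitlines():
        pcpu, _, name = line.strip().partition(" ")
        rows.append((float(pcpu), name))
    return sorted(rows, reverse=True)[:n]

if __name__ == "__main__":
    for pcpu, name in top_cpu(SAMPLE_PS):
        print(f"{pcpu:5.1f}%  {name}")
```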

My main database file size is 4.52 GB. Total word count is 27,107,121 at this moment. As it’s my default database, I also use it to capture new content intended for transfer to another database. It’s due for a pruning by transfer, which will likely bring it down to about 24 million total words.

It runs pretty responsively most of the time on my MacBook Pro with 2 GB RAM. But the longer I work it, the more it starts using Virtual Memory. When that bothers me (I’m spoiled) it’s time to quit and relaunch DT Pro Office. That same database runs without pageouts on my Power Mac G5 with 5 GB RAM. So my next laptop will have 4 GB RAM.


Just for info: my database is about 12 GB and contains 15,000 PDF files.
I don’t think that my database is slow. If I do “See Also” for the first time I wait
35 seconds, but the next time I use “See Also” it takes only 3 seconds.
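That first-run/second-run gap is the classic signature of caching: the first See Also has to pull index data off disk, while repeats hit data already sitting in memory. A toy illustration of the pattern (an analogy only, not DEVONthink’s actual mechanism):

```python
# Toy illustration: the expensive work (a fake "disk read") happens only on
# the first call; later calls are served from a cache. This is an analogy,
# not DEVONthink's actual implementation.

from functools import lru_cache

disk_reads = 0

@lru_cache(maxsize=None)
def see_also(document: str) -> list[str]:
    global disk_reads
    disk_reads += 1                       # expensive, happens on first call only
    return sorted(set(document.split()))  # stand-in for similarity analysis

see_also("virtual memory and databases")  # slow: hits "disk"
see_also("virtual memory and databases")  # fast: served from cache
print(f"disk reads: {disk_reads}")
```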

My hardware is a Mac mini with 2 GB of RAM and a MacBook Pro with 2 GB of RAM.


Right, Stefan. The size of indexed text is more relevant as a measure of database size than the file size required for disk storage, especially for PDF content (and for other content stored in the Files folder inside the database package file).

For example, I’m looking at one of my PDF documents. It has lots of graphics content. The Info panel lists its “Size” as 72 KB (that’s the text content, which is relevant for loading the database and for information analysis), whereas the “File Size” is 2435 KB.
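The distinction is easy to demonstrate: what a document contributes to the index is its text, not its bytes on disk. A small sketch, with made-up padding standing in for the graphics content of a PDF:

```python
# Sketch: a document's indexed-text size vs its size on disk.
# A graphics-heavy PDF carries far more bytes than text; here we fake that
# with a temp file that is mostly non-text padding.

import os
import tempfile

text = "The quick brown fox jumps over the lazy dog. " * 100  # the indexable part
padding = b"\x00" * 500_000                                   # stands in for images

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(text.encode("utf-8"))
    f.write(padding)
    path = f.name

indexed_kb = len(text.encode("utf-8")) / 1024
on_disk_kb = os.path.getsize(path) / 1024
print(f"Size (text): {indexed_kb:.0f} KB")
print(f"File Size:   {on_disk_kb:.0f} KB")
os.remove(path)
```

The first number is what matters for loading the database and for information analysis; the second only matters for disk storage.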


I had this problem quite some time ago. I bought more RAM and split the database. It did not help. What helped was deleting all HTML files and materials that tried to connect to the internet and look for updates (even though we have a cable connection here in Japan that provides excellent speed).

With this solution I have no more speed problems. I just store RTFs and check HTML archives.

Hope this helps,

Maria, I’m interested in what you say. I have rather lengthy slow-downs when doing something new in DTPO (i.e., creating a new group or file). I’ve always assumed it was just as Bill indicates above.

But now that I think about it, I have a small number of ‘links’ in a specific group within my database. I still have quite a few webarchives, but these are not ‘active’ links to the internet in the way that the actual links are. As such, I didn’t think they would access the internet unless they were open in front of me within the database. Perhaps Bill can clarify whether this is indeed the case?

Bookmark links stored in a database don’t cause problems. I’ve got a Bookmarks group with probably more than 200 bookmarks – sites that I routinely visit such as journals, others that I might look at only occasionally for a special purpose. A bookmark document doesn’t go out to the Web unless selected.


My experiences are from, I think, 2005, so I cannot say what has changed since then; in my case the effect was obvious.

Sorry I cannot say more, I am going to leave my computer for a while now.

All the best, Maria

My own database is currently 1.1 GB and contains around 43 million words. I have 3 GB of memory on my machine, and the RSS for DTP is usually around 250 MB.

Most days, DTP is pretty fast to load and to search. But sometimes – unrelated to anything else going on – it just slows to a CRAWL. Once I clicked on a menu item in the menu bar and had to wait two full minutes before the menu was painted. This kind of performance is very spotty; some days are good, some days DTP is so consistently slow that I end up switching to another task and forgetting what I wanted to do with it.


Man, do I know all about that!

Whether it has anything to do with DT or not… :frowning:

But seriously, a year or so ago I decided to split the big db into 8 smaller, more topic-specific ones. It is a PITA to deal with, but so far the benefits have outweighed the pain. These days, I’m in the habit of doing a thing, then closing the db but leaving the app running. I created a keyboard shortcut for doing that, so it’s easy, almost like saving. When I’m ready for the next thing, I use LaunchBar to open the specific db (by naming them and training LaunchBar, I know the correct keys to use for a given topic). I transfer the new stuff to the opened db, then close it again. Things are pretty quick that way, plus the db stays optimized and fit for duty.

The reason I close it afterward, instead of switching to the desired db when I need to enter a new item, is that it’s a forced laziness-prevention technique. If I don’t close it, I’m tempted to put the new thing into the wrong db, and that really becomes a management nightmare: trying to remember which db I put an item into, when all I know is that it’s somewhere in one of them.


I actually have split into two databases, since my Java technologies database alone is almost as large as the other one. But the reason I’m loath to split further is that while I program, I’m often researching questions on the Web. While doing so, it’s common for me to come across interesting tidbits that I want to “tuck away” for later. I’m used to using a Cmd-I macro to quickly snapshot the page I’m viewing as a webarchive in DTP. If I started using multiple databases for everything, this snapshotting would become extremely cumbersome – whereas it’s exactly the ease of it that ensures it happens at all!

This will all be fixed once DTP supports multiple open databases. Then I can have my scrapbook database open 100% of the time, and various topic databases open depending on my task. It’s safe to say that DTP 2 is perhaps my Most Anticipated Upgrade ever.


Well, I know all about that approach as well. That’s what I do on the laptop. The challenge there, for me, is that it becomes a dumping ground that I never see again. From time to time I try to ‘deal with’ the dumping ground and become overwhelmed; with thousands and thousands of items in there, I start avoiding it. Unfortunately, I typically drag URLs there, which DT doesn’t help categorize automagically. The reason I do that is that I really don’t want whole pages polluting the db, and I know I only want parts of them, but snipping the important parts from each page at the time seems too time-consuming. Plus, mine are mostly from forum discussions, where there is usually more to be gained from the conversation after the first time I come across it, until the subject goes stale.

And the other challenge for me is that when I have a ‘triage’ db, I can’t remember if the item is in the temp db, or in the ‘real’ master db. Yuck. I know there’s gotta be a better way.

Basically, you’ve got to pay one way or the other. Pain now, or pain later, just as in real life! LOL :unamused:

I hope I haven’t hijacked the thread, but as far as speed is concerned, doing things this way is the only manageable way I’ve discovered.

[edit: from what I understand, it’s not DT’s fault that it can’t deal with URLs as real data, it’s Apple’s, or at least that’s what I remember.]