Frequent crashes on Mavericks

Has anyone else found DEVONthink very unstable on Mavericks? I’m running DTPO 2.7.2 on a fast machine with plenty of RAM and all the software up to date, and I get crashes all the time. Sometimes it’s when I do something (open a file, search for something), and sometimes I’m not interacting with DT at all and it suddenly crashes in the background. It’s very annoying, especially when I’m looking at several files at once and then have to go and find them all again.

Mountain Lion didn’t seem to have these problems - DT would crash occasionally, but nowhere near this much. I also don’t have problems with most of my other programs on Mavericks.

I’ve run DTPO now for two years, on Snow Leopard and then Mavericks. Never had a crash on either OS. I’d suggest you open a support ticket, and send in your crash logs.

OK, opened one, thanks.

I have had the same experience. At first I thought it was due to a very large database (>15 GB). I tried to split the database, but couldn’t, as DTPO crashes on every attempt.
So at the moment, whenever I open any of my databases - whatever I do (adding new files or mails, or just searching the database) - DTPO quits ungracefully :frowning:


Please start a Support Ticket.

@dt-mor: It sounds like you have created a database that’s much too large for the memory resources of your computer. The total number of words in a database (displayed in File > Database Properties) is a more meaningful measure of database size than the database file size.

An alternative approach is to create multiple topical databases, each of which has a total word count that leaves enough free RAM available for the computer to carry out procedures at full speed, without resorting to using Virtual Memory swap files.

For example, on a Mac with 4 GB installed RAM my rule of thumb would be to keep the total word count of all open DEVONthink databases below 40,000,000. Even so, it may be necessary to monitor free RAM use over time, to avoid slowdowns caused by data swapping between RAM and Virtual Memory swap files on disk. Safari (and perhaps other browsers) can be a memory hog, especially if multiple tabs are used; quitting the browser occasionally can restore free RAM.
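For what it’s worth, that rule of thumb can be written down as a quick back-of-the-envelope check. This is only an illustration: the linear scaling to other RAM sizes, and the `word_budget`/`over_budget` helper names, are my own assumptions extrapolated from the single 4 GB / 40,000,000-word data point above, not anything DEVONthink documents.

```python
# Rough sketch of the rule of thumb above:
# ~40,000,000 total words across all open databases per 4 GB of installed RAM.
# Scaling this linearly to other RAM sizes is an assumption on my part.

WORDS_PER_GB = 40_000_000 / 4  # 10 million words per GB of RAM

def word_budget(installed_ram_gb):
    """Rough total word count to stay below, across all open databases."""
    return int(installed_ram_gb * WORDS_PER_GB)

def over_budget(db_word_counts, installed_ram_gb):
    """True if the open databases together exceed the rough budget."""
    return sum(db_word_counts) > word_budget(installed_ram_gb)

# Example: two databases totalling 2.8 million words are far below
# the budget, even on a 4 GB machine.
print(word_budget(4))                        # 40000000
print(over_budget([2_400_000, 400_000], 4))  # False
```

Word counts come from File > Database Properties, as noted above; the check is no substitute for actually watching free RAM over time.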

Given the current size of your database, the best approach to splitting it would be to select some of the content, choose File > Export > Files & Folders and create a new Finder folder to hold exported content, then confirm the export. Afterwards, delete the selected material and empty the DEVONthink Trash. Continue doing that (saving the exports into different Finder folders) until you have reduced the total word count of the current database to an acceptable level.

At that point, you can create a new, empty database and import (File > Import > Files & Folders) the contents of one of the Finder folders that holds previously exported material.

Obviously, you will not want to open all of those “split” databases simultaneously. Instead, treat them like information Lego blocks.

Although in your circumstance you will not be able to open and search across all of your databases within DEVONthink, you will be able to do so using Spotlight. By default, each of your databases provides indexes of its contents to Spotlight. When you do a Spotlight search, choose the option to view all results. Items contained in your DEVONthink databases will display the blue ammonite shell icon. Select one and press the Space bar to view it in Quick Look, or double-click it to open it in its DEVONthink database.

Hello everybody,
I opened a ticket already.

I do not think the problem comes from too-large databases. The largest one contains 2,500,000 keywords. Splitting into more databases does not make sense to me, as I would start losing all the advantages. For example, I receive licences in different ways - by mail or on paper, in which case I have to scan them. Keeping the scans in a different database than the mails forces me to create separate tags and to search two databases independently (searching across all open databases is only possible from the menu “search in databases”). So splitting my databases into even smaller fragments would cost me the advantages and buy me only discomfort.
Indeed, the problem seems to come from Mavericks’ memory management (as support told me). With the latest Mavericks update the problem has been reduced to a level I can live with (DTPO crashes once or twice a day - as an old Windows user, you learn to cope :frowning: )

Anyhow, thank you a lot for your explanation - any further tips are greatly appreciated!


I opened a ticket :wink:

Let me describe my current situation.
I have four different databases with roughly 2,800,000 words in total. My largest database contains about 15,000 files (85% pictures, the rest LaTeX, doc(x), xls and pdf) - about 100 GB in sum - but it should have a pretty small footprint.
The second database contains only mails (about 25 GB) and has the largest footprint - 2,400,000 words (as the mails are already grouped topically in Mail, I use it mainly for searching, which is a lot slower than in Mail).
The other two databases are fairly small.
One contains scans of receipts, snail mail, … and the other holds web findings. These two databases are really tiny in comparison to the other two.
So I presume memory is not the limiting factor - even with all databases open.

What the crash logs do suggest is that Mavericks’ memory handling is somehow unstable (it seems to have improved a little after the last Mavericks update). Still, DTPO crashes more often than I would like (though still far less than some other applications I have used under other OSs).
Nevertheless, I would say that further splitting of the databases does not make sense, as I think it would mean losing some of DTPO’s big advantages. Even using four different databases sometimes gives me headaches, since the databases are split more or less by source, and all of them could share the same tags (receipts, manuals, information regarding an area of research, …).
Searching for information across several databases is not really great (as they have four different indexes, …).

Maybe my strategy could be improved?

Thank you so far for clarifying the matter! I’m open to any hints on working better with DTPO :wink: