My only major criticism of DT is that it’s too slow, especially when saving text from another app. StickyBrain, while limited in other ways, is much faster, and consequently I find myself using it for saving scraps of text.
How much memory does your Mac have, and how big is the database? Anyway, v1.8 (coming in a few days) will be much faster (e.g. importing/adding plain & rich text is usually 2-4 times faster, modifying text around 10 times faster, and deleting contents up to 20 times faster).
I have to say that speed (or the lack of it) really is a BIG issue for me too. First of all, because I back up and verify on launch, it takes AGES for DT to launch. Once it has launched, the first few operations I perform take MINUTES. You did explain to me in the past that DT doesn’t load the database until a few operations have been performed … but why can’t the database be loaded as soon as the app is launched?

This speed problem is a real deterrent compared to the wonderful Hog Bay Notebook, which I use to prepare all my data before importing it into DT: it is instantly accessible from launch, with lightning-fast searches as well. For reference, I am using a 700 MHz G3 iBook with 640 MB RAM. I really do hope 1.8 is as fast as you suggest … and I am also desperately waiting for the Pro and client/server versions, as I have prepared a knowledgebase of 100,000 pages of data for everyone in my company to access as soon as this becomes available.
I can tell you that the 1.8 beta is MUCH faster than prior releases. It feels like a whole new app.
And version 1.8 will be released today… However, as notebook hard discs are much slower than desktop ones, it’s definitely not a good idea to load the whole database on startup (and it would be a huge waste of memory too).
But I’m still curious about the size of your database, as the 800 MB database created yesterday for testing purposes on my private iMac/800 performs much better than the database on your machine.
By the way - verifying on startup takes no time at all. Backing up a huge database on startup does, of course, take a long time, but that is limited only by the speed of your hard disc - you can’t expect DT to copy hundreds of megabytes in no time.
The database currently stands at 100.7 MB, optimized. That’s the size of the four files that make up the database, not counting the backups. It currently comprises 4069 rich text pages, 19 HTML pages and 441 groups.
Approximately 20,000 further rich text pages are currently in different stages of preparation, ready for eventual import into the database. The pages vary in size, ranging from 100 to 20,000 words. I will then be extracting a lot of data from those pages within the database, possibly creating up to another 10,000 rich text pages in the process.
The problem as it stands is that it takes so long from startup to get the database to the point of speedy performance that it becomes a deterrent to use DT. In comparison, the same data in Hog Bay Notebook is instantly accessible, and searches are performed as rapidly as I can key in the search terms right from the moment the database is launched.
I shall therefore look forward to using DT 1.8, and let you know how well it overcomes this speed problem. I am also, for the reasons mentioned in my previous post, very interested to know when the Pro and EE versions will be available …
Actually this database should perform VERY well even with v1.7.5. Is there anything special about your system? E.g. some tricky third party software installed? Virus scanners? Anything accessing the harddisc concurrently and therefore decreasing DT’s performance dramatically? Because seek times of harddiscs are still very poor.
It performs fine once it’s up to speed; it’s just very sluggish for the first few minutes after launch. Typically I have to perform 6-7 searches (which start off very slow, but get progressively faster) before it performs in the manner I expect. After that it’s fine … the problem is just the hassle of getting it to perform well immediately after launch.
No, I don’t have any tricky third party software installed. I avoid everything like that like the plague. Hence no virus checkers either!!!
I’ve just downloaded 1.8, so will report back on the difference it makes.
Maybe running sudo fs_usage DEVONthink immediately after starting DT could help isolate where the performance bottleneck is? Since the output has timestamps, that would at least show whether there are any significant delays while the database is loading.
I usually run fs_usage first, then sc_usage, then ktrace for basic process debugging. fs_usage helped me discover how "noisy" Palm Desktop is.
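For anyone trying this, here is a rough sketch of how you might sift fs_usage output for slow filesystem calls rather than eyeballing it. Note the assumptions: fs_usage normally needs sudo and a live process, so the sample lines below are fabricated stand-ins for real output, and the elapsed-time field is matched by its seconds.microseconds shape, which may vary between macOS versions.

```shell
# Sketch: flag filesystem calls whose elapsed time exceeds 0.1 s.
# In real use you would pipe `sudo fs_usage DEVONthink` in here;
# these two printf lines are fabricated sample records instead.
printf '%s\n' \
  '10:38:41.271 open F=3 (R___) db/DEVONthink-1.rtf 0.000013 DEVONthink' \
  '10:38:41.405 read F=3 B=0x40000 1.234567 W DEVONthink' |
awk '{
  # Look for a field shaped like an elapsed time (e.g. 1.234567)
  # and report the whole record if it took longer than 0.1 s.
  for (i = 1; i <= NF; i++)
    if ($i ~ /^[0-9]+\.[0-9][0-9][0-9][0-9][0-9][0-9]$/ && $i + 0 > 0.1)
      print "slow call (" $i "s): " $0
}'
```

With the sample data above, only the 1.234567 s read is reported; the quick open is filtered out.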
Thanks for that suggestion. That’s taking me into unfamiliar territory, so I’ll try it when I don’t have time pressures.
I have now tried 1.8 and it does launch a lot faster. The first few searches were still a little slow, but not as slow as before. It’s looking a lot better. I’ll comment further once I’ve been able to observe it for a few days.
I am very glad that you’ve incorporated a display of the current selection’s location in the status bar of the search window. Can I request that you also add the same feature to the main browser window, alongside the number of items found in the search (which I was also very glad to see)?
Just started using 1.8 and all I can say is – wow! MUCH faster. Thanks. This program suddenly became a lot more useful to me.
Which search settings do you usually use? E.g. phrase/wildcards/fuzzy are much slower than other settings but still very fast over here. And did you rebuild the database or just switch to v1.8?
I appreciate the difference the search settings make. All searches were on “any word” … and yes, I rebuilt the database so that it uses the latest 1.8 format. Certainly it is now a LOT faster than it was, and launches a lot faster, so I really appreciate what you’ve done to speed it up. It’s just the first few searches after launch that are a bit slower … but overall speed is really not so much of an issue any more. Thank you!