I’ve installed DTpro 1.1 over 1.0 and tried to open a “heavy” database with 8,000 emails, hoping that the upgrade could handle it.
I tried to open it via the “Recently used” entry in the Open menu. DTpro started indexing the database, but after 20 minutes of indexing there was still no open database. I force-quit DTpro via Cmd-Opt-Esc.
Please make sure you use the 1.1.1 version of DEVONthink Pro that was released a couple of days ago.
From your post it isn’t clear to me whether you are trying to convert an existing database or import 8,000 emails into a new one. I would quit all other apps you don’t need so that the application has as much memory available as possible. Also make sure you have free disk space of about twice the size of the original database. Then I would start the process before going to sleep; it should be ready by the time you wake up (don’t forget to set the machine never to sleep in the Energy Saver System Preferences).
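To make the disk-space advice above concrete, here is a minimal shell sketch that checks whether the volume holding a database has at least twice the database’s size free before you start a long conversion. The database path is a hypothetical placeholder, and the sleep-prevention line at the end is macOS-only (commented out here); adjust both for your setup.

```shell
#!/bin/sh
# Check that the volume holding a database has at least twice
# the database's size in free space before starting a rebuild/import.
check_space() {
    # Size of the database (file or folder) in KB
    db_kb=$(du -sk "$1" | cut -f1)
    # Free space on the volume containing it, in KB (POSIX df output)
    free_kb=$(df -Pk "$1" | awk 'NR==2 {print $4}')
    if [ "$free_kb" -ge $((db_kb * 2)) ]; then
        echo "OK"
    else
        echo "LOW"
    fi
}

# Hypothetical database path -- substitute your own:
# check_space "$HOME/Documents/MyMail.dtBase"

# On Mac OS X, keep the machine awake during the overnight run:
# sudo pmset -a sleep 0
```

The same check works on any volume; if it prints `LOW`, clear space or move the database to a larger disk before starting.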
We are aware that processing a large number of files takes too long to finish. Expect improvements in this area in a future release.
No, I’m not kidding about importing. We needed to import a couple of very large email databases of ~15,000 emails each, and it really took several hours (on an 800 MHz iMac G4)… Once imported, the database was fine and snappy.
With regard to rebuilding an existing database, I have no direct experience. I know Christian has a HUGE database for testing purposes, but I don’t know what his experience is with rebuilding it.
But nevertheless I’m also not kidding when I say that this import behaviour will improve in a future release.
Five minutes is still very slow. My guess is that you’re running out of real memory, and virtual memory is much slower (up to 100 times). How large is the database, how much RAM does your computer have, and which version of Mac OS X are you using?
The speed of importing emails, on the other hand, is currently limited by the speed of AppleScript, but (as Annard already said) we’re working on this.
In the end this sounds like a memory issue: several applications running at the same time, a huge database, and only 768 MB of RAM. At the moment (until v2 is available) the only possible solutions are to either…