I have created a DTP database just by indexing my various data stores. It holds roughly 88,000 items and the file size is about 310MB. Currently DTP is taking up about 333MB of resident memory and about 580MB of virtual memory. Is this normal?
Thanks for the info. Just wanted to make sure something goofy wasn’t going on. Last night while indexing, DTP took up over 800MB of resident memory! (Out of 1.5GB) When I relaunched, it was back to its normal 300+MB. Looks like I may need to add another GB of RAM just to support DTP properly.
My main database hovers around 24 million total words of content, and runs quickly on my MacBook Pro with 2 GB RAM or on my Power Mac with 5 GB RAM. Searches often return in only a few milliseconds.
But when I’m really exercising the database by adding lots of content and running AI routines such as Classify or See Also, I’ll start seeing pageouts on the MacBook Pro, as even 2 GB RAM isn’t enough to keep the system from spilling over into virtual memory swap files on disk.
But the 5 GB of RAM on the Power Mac lets me run at full speed for days without triggering pageouts.
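If you want to watch for pageouts yourself while DTP is working, macOS ships a `vm_stat` utility that reports cumulative pager activity since boot; a steadily climbing "Pageouts" count while an app runs means RAM is exhausted and the system is swapping to disk. Here's a rough sketch (the figures in the heredoc are made-up sample output, included only so the parsing part runs anywhere):

```shell
# On a real Mac you would run:
#   vm_stat        # one-shot snapshot
#   vm_stat 5      # resample every 5 seconds while DTP indexes
#
# Sample output stands in for a live vm_stat call below, so the
# parsing can be demonstrated on any system.
sample_vm_stat() {
cat <<'EOF'
Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free:                    102345.
Pageins:                       913711.
Pageouts:                       51423.
EOF
}

# Pull the cumulative pageout count out of the snapshot.
pageouts=$(sample_vm_stat | awk -F'[: .]+' '/^Pageouts/ {print $2}')
echo "Pageouts so far: $pageouts"
```

If that number keeps growing between snapshots taken during a Classify or See Also run, you've hit the swap behavior described above and more RAM is the usual cure.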
“RAM is good; more RAM is better.”