The most important measure of database size is usually the total word count, not the file storage size.
Your database, with 61.5 million total words, is larger than I would usually try to run on a Mac with 4 GB RAM. (My rule of thumb with 4 GB RAM is to hold the total word count of all open databases to 40 million words or less.) I suspect you see slowdowns and the spinning beach ball during operation. That happens because, once free RAM is exhausted, the computer begins swapping data back and forth between RAM and virtual memory swap files on disk; and because disk reads and writes are orders of magnitude slower than RAM accesses, everything slows down.
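To make that rule of thumb concrete (the 40-million-word budget is my own heuristic for a 4 GB machine, not a DEVONthink limit), the check is just simple arithmetic:

```python
# Rule-of-thumb check: total word count of all open databases vs. a RAM budget.
# The 40-million figure is a heuristic for 4 GB of RAM, not a hard limit.

WORD_BUDGET_4GB = 40_000_000

def over_budget(word_counts, budget=WORD_BUDGET_4GB):
    """Return (total, exceeds) for the open databases' word counts."""
    total = sum(word_counts)
    return total, total > budget

# Example: a single 61.5-million-word database on a 4 GB Mac
total, too_big = over_budget([61_500_000])
print(total, too_big)  # 61500000 True
```

With several smaller topical databases you would pass each one's word count into the list and only open the combination that stays under the budget.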
You can check the status of RAM and swap files by launching Activity Monitor (Applications > Utilities > Activity Monitor.app) and clicking the System Memory tab. Low free RAM and a large number of pageouts indicate a current or impending slowdown.
When performance becomes unsatisfactory, restarting the computer can speed it up again, at least temporarily. There are also utilities that can purge inactive “crud” data from RAM and so free up more RAM – rather like removing a blockage from an artery. I use Cocktail for OS X maintenance, and it has a routine that can do that.
You might consider splitting the database into two or more topically organized databases; each would probably fit more comfortably into the RAM on your Mac and reduce the potential for slowdowns. I treat such topical databases like informational Lego blocks that can be opened or closed as needed.
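If it helps to think through how to divide things up, a simple first-fit sketch assigns topical groups to new databases so that each stays under the word budget. The group names and counts here are hypothetical; substitute the word counts of your own top-level groups:

```python
# First-fit split: assign topical groups to databases without exceeding a
# per-database word budget. Group names and word counts are hypothetical.

BUDGET = 40_000_000  # heuristic ceiling per database on a 4 GB machine

def split_groups(groups, budget=BUDGET):
    """groups: list of (name, word_count). Returns lists of group names,
    one list per new database, each kept at or under the budget."""
    databases, totals = [], []
    for name, words in groups:
        for i, total in enumerate(totals):
            if total + words <= budget:
                databases[i].append(name)
                totals[i] += words
                break
        else:  # no existing database has room; start a new one
            databases.append([name])
            totals.append(words)
    return databases

groups = [("History", 25_000_000), ("Science", 20_500_000), ("Recipes", 16_000_000)]
print(split_groups(groups))  # [['History'], ['Science', 'Recipes']]
```

The point is not the algorithm but the budget: however you group the content, no single database (or combination of simultaneously open databases) should push the total past the comfortable word count for your RAM.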
If your memory is already strained, the safest approach to splitting a large database is to select the content to be moved, choose File > Export > Files & Folders, and create a new folder to hold the exported items. When the export is complete, delete the still-selected items that were just exported.
Next, create and name a new, empty database and import (File > Import > Files & Folders) ALL of the contents of the folder that holds the exported content.
Note that when DEVONthink opens a database, it must load information about the text index and other metadata, such as groups and tags, into memory. It does not load documents into memory unless documents were open in their own windows when the database was last closed; if large documents must be loaded when the database opens, initialization time increases accordingly.