The amount of free physical RAM is the ultimate limiting factor in performance as databases grow large. Procedures such as Search, See Also, and Classify involve memory-intensive operations. These days, 1 GB of RAM is “only” 1 GB of RAM. Some current Mac laptops can hold up to 8 GB of RAM. (Gee, I remember spending hundreds of dollars to upgrade my Mac Portable to 4 MB RAM, in preparation for an Egyptian project.)
The reason memory-intensive operations slow down when they exhaust available RAM is that Virtual Memory comes into play: VM swaps data back and forth between RAM and disk. Of course, disk reads and writes are far slower than RAM access, so there can be perceptible pauses.
The same database created in DT 1 should be much more responsive in DT 2, because of differences in the database structure, which reduce memory requirements.
But I suspect much of the problem you're seeing relates to database errors. If the Verify & Repair procedure finds non-correctable errors, the database needs maintenance; matters will only get worse otherwise.
Before working on a damaged database, I often recommend making a compressed copy of it, as a resource that can come in handy if repair attempts really mess up the original. Caution: always quit DEVONthink before making a Finder copy of a database; a copy made while the database is open may be incomplete or contain errors.
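This can be done from Terminal as well as the Finder. A minimal sketch, using a stand-in folder in place of a real database (a .dtBase2 database is just a folder bundle; the paths and names here are placeholders, so substitute your own):

```shell
# Stand-in for a database package; a .dtBase2 is just a folder bundle.
mkdir -p /tmp/demo/MyBase.dtBase2
echo "placeholder" > /tmp/demo/MyBase.dtBase2/contents.txt

# Quit DEVONthink first, then make a compressed safety copy.
# (Finder's File > Compress produces a .zip; tar from Terminal works too.)
cd /tmp/demo
tar -czf MyBase-backup.tgz MyBase.dtBase2
```

Either way, keep the archive somewhere outside the database folder until the repair is finished and verified.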
Now that you have a zipped copy of the database, launch DT 2 and choose Tools > Rebuild Database. The idea is that the database first exports all of its contents (groups and documents), then imports them back. Any files with errors will likely fail the export/import procedure and be listed in the Log. Note: if there is a list of ‘failed’ files, save it for future reference.
After the Rebuild, does the database seem OK? Inspect it for obvious problems such as lost groups, etc. If it looks right, run Tools > Verify & Repair. If no errors are reported, run Tools > Backup & Optimize.
Remember that zipped copy of your database? It may come in handy if a number of files failed to be included in the rebuilt database. They should be in the Files.noindex folder inside that database. Copy that Files.noindex folder to, e.g., the Desktop and start mining it for your missing files. I suggest capturing them into the rebuilt database in small batches. Check the Log after each batch to see whether any failed to import, perhaps because of damage. Periodically, run Tools > Verify & Repair to make sure you haven’t reintroduced the cause of the problems.
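If you prefer to mine that copy from Terminal, here is a minimal sketch. The paths and the pdf subfolder are stand-ins for illustration; in practice, expand the zipped copy first and point the commands at its real Files.noindex folder:

```shell
# Stand-in for the expanded copy of the database.
mkdir -p /tmp/demo2/MyBase.dtBase2/Files.noindex/pdf
echo "x" > /tmp/demo2/MyBase.dtBase2/Files.noindex/pdf/report.pdf

# Copy Files.noindex out of the database package for mining.
cp -R /tmp/demo2/MyBase.dtBase2/Files.noindex /tmp/demo2/Recovered

# List the recovered files, so they can be re-imported in small batches.
find /tmp/demo2/Recovered -type f
```

The file listing gives you a checklist to work through batch by batch, comparing against the list of ‘failed’ files you saved from the Log.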
Note that when new data is added to a database, if you have turned on Spotlight indexing, Spotlight will likely go to work indexing the new content.
I doubt that the slowdown is related to RSS feeds, unless you have a great deal of content being dumped into the database, which triggers Spotlight indexing as well. As a check, turn off Internet access (Wi-Fi) for a while and watch performance.