Reviewing the forum on this matter, BLUEFROG indicated the following in 2020: “for a given database, the number of words / unique words are more critical. On a modern machine with 8GB+ RAM, a comfortable top limit is 200,000,000 words and 4,000,000 unique words in a da…”
and in 2023: “If you get above 4.5 million unique words or 250,000+ items in a database, you may want to start thinking about a split.”
So the limit has increased a little bit …
For my database I get 2,422,562 unique words and 279,471,639 total words, totaling 26 GB, so I am under the limit for unique words but above the limit for total words.
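If it helps anyone run the same comparison, here is a minimal sketch in Python. The thresholds are the figures quoted above, and the stats are my own, copied by hand from the database's properties, so swap in your own numbers:

```python
# Sanity check against the recommendations quoted above.
UNIQUE_WORD_LIMIT = 4_500_000    # 2023 figure: "above 4.5 million unique words ... think about a split"
TOTAL_WORD_LIMIT = 200_000_000   # 2020 figure; no updated total-word figure was quoted

my_unique_words = 2_422_562      # from my database's properties
my_total_words = 279_471_639

print(f"unique words: {my_unique_words:,} "
      f"({'under' if my_unique_words <= UNIQUE_WORD_LIMIT else 'over'} the limit)")
print(f"total words:  {my_total_words:,} "
      f"({'under' if my_total_words <= TOTAL_WORD_LIMIT else 'over'} the limit)")
```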
I have an M1 Max MacBook with 32 GB of RAM, and everything is running like a charm.
I would like to merge it with another database totaling about 50 GB; with this scenario I will need flexibility to reach 100 GB … I have no idea how many unique words I will get in this scenario.
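For what it's worth, the total word count of a merge simply adds up, while the unique words only grow by whatever vocabulary the second database adds on top of the first, so the merged count lies somewhere between the larger of the two figures and their sum. A rough sketch; the second database's numbers below are pure placeholders, since I don't know them yet:

```python
# Bounds for a merged database, assuming total word counts add exactly
# and the two vocabularies overlap to an unknown degree.
db_a = {"unique": 2_422_562, "total": 279_471_639}   # my current database
db_b = {"unique": 3_000_000, "total": 500_000_000}   # hypothetical 50 GB database

merged_total = db_a["total"] + db_b["total"]         # totals add exactly
unique_low = max(db_a["unique"], db_b["unique"])     # full vocabulary overlap
unique_high = db_a["unique"] + db_b["unique"]        # no overlap at all

print(f"merged total words:  {merged_total:,}")
print(f"merged unique words: between {unique_low:,} and {unique_high:,}")
```

Even the optimistic lower bound would tell me whether the 4.5 million unique-word threshold is in play after the merge.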
Questions: I usually change my MacBook every three years; will the M3, M4, etc. change the limit? Have the figures changed in 2024-2025? Finally, has anyone had experience with databases of 50-100 GB? If so, how many unique words and total words did you get?
Whether this always works for everyone remains to be seen, of course.
What is certain is that anyone selling a product will never base their recommendations on the actual limit. They will always maintain a certain safety margin so as not to have to revoke the recommendation. That is fundamentally wise. In the end, you simply have to try such things out for yourself … at your own risk.
A question of a similar nature concerns the limit of the human population on Earth (or of a certain city on that planet). There are experts who have calculated an estimate. Their number is not an absolute (or precise) limit, because human society will still work (albeit with major caveats) even if it overshoots the calculated limit by a considerable margin.
It’s impossible to define what constitutes a precise limit. If you mean a size beyond which the system will immediately stop working, then it’s perhaps the same as the capacity of your SSD.
Yes, “precise limit” sounds like a weird expression, all the more so as I cannot “anticipate precisely” how tomorrow's innovations will impact the limit of my database (my SSD is 2 TB).
That is not prescriptive; it is only descriptive, especially as the great majority of users don't have 96 GB of RAM or anywhere near that. So it is not something we advocate.
Could it work? Possibly.
Is there the potential for serious performance problems? At that scale, yes. If you have a machine with 8GB RAM, it would definitely be inadvisable.
This is why we recommend comfortable limits, balancing performance and utility for most situations.
“I would like to merge it with another database totaling about 50 GB; with this scenario I will need flexibility to reach 100 GB …”
Currently I have eight databases; the biggest is 216 GB, and the total size is about 500 GB. With all of them open on my 2019 iMac with 24 GB of RAM and a 2 TB SSD, normal use (navigating across databases, moving items, or jumping to groups with CTRL+CMD+J) is almost immediate, and search takes its time but works. On my Mac mini M2 Pro with 32 GB of RAM and a 2 TB SSD, the only difference is search, which is significantly faster. My MacBook Pro M1 Pro with 16 GB of RAM performs well too.
As far as I remember, your computer isn't the typical, average Mac. And actually, I've seen screenshots of databases with millions of items and billions of words. It all depends on the hardware. And a little bit of patience might be necessary.