Maximum file size DEVONthink can import/index?

What is the maximum file size DEVONthink can import or index?

I am trying to index a text file of 8.94 GB containing mixed Arabic and English text, and DT simply gives me a “Failed” error.

Thanks for any explanation.

Where did you get this file?

I concatenated it myself. Plan is to make a word cloud from it.

Concatenating isn’t actually necessary; you could simply import smaller files (max. 256 MB for plain text), as the Concordance inspector supports multiple selected items. However, since this amount of data far exceeds our recommendations for a database, a computer with lots of RAM (64 GB or more) and CPU power would be ideal. Even then, the Concordance might require some patience.

Thanks for letting me know the maximum size for plain text files (256 MB).
I concatenated the file in R by repeating a number of source texts according to how often each was shared on social media (thousands to tens of thousands of times). The aim is to construct a word cloud weighted by the popularity of the shared texts, i.e. a text shared 52,365 times would carry relatively more weight than a text shared only 1,001 times.
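For what it’s worth, the repetition step in R looked roughly like this (file names and share counts here are just illustrative placeholders):

```r
# Repeat each source text once per share, streaming to the output file
# so the full corpus never has to sit in memory at once.
texts  <- c("text_a.txt", "text_b.txt")  # placeholder source files
shares <- c(52365, 1001)                 # placeholder share counts

con <- file("concatenated.txt", open = "w", encoding = "UTF-8")
for (i in seq_along(texts)) {
  body <- readLines(texts[i], warn = FALSE, encoding = "UTF-8")
  for (k in seq_len(shares[i])) {
    writeLines(body, con)  # one copy per time the text was shared
  }
}
close(con)
```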
Splitting the concatenated file into manageable pieces might prove somewhat challenging, as I don’t want to split words.
Any other thoughts on how to solve this issue / create a weighted word cloud?

Why not split at line breaks?
Also, splitting a 9 GB file into 256 MB chunks would result in about 36 files, and thus at most about the same number of incorrectly split words. Why would that be a problem given the overall number of words?
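A chunked split that only ever breaks at line boundaries could look something like this in R (a sketch only: “concatenated.txt” is an assumed file name, max_bytes stays a little under the 256 MB limit for safety, and no single line is assumed to exceed it):

```r
# Stream the big file line by line and start a new part file whenever
# the current one would exceed max_bytes, so cuts only fall at newlines.
split_at_lines <- function(path, max_bytes = 250 * 1024^2) {
  con <- file(path, open = "r", encoding = "UTF-8")
  on.exit(close(con))
  part  <- 1
  bytes <- 0
  out <- file(sprintf("part_%03d.txt", part), open = "w", encoding = "UTF-8")
  repeat {
    lines <- readLines(con, n = 10000, warn = FALSE)
    if (length(lines) == 0) break
    for (ln in lines) {
      sz <- nchar(ln, type = "bytes") + 1L  # +1 for the newline
      if (bytes + sz > max_bytes) {
        close(out)                          # current part is full
        part  <- part + 1
        bytes <- 0
        out <- file(sprintf("part_%03d.txt", part), open = "w", encoding = "UTF-8")
      }
      writeLines(ln, out)
      bytes <- bytes + sz
    }
  }
  close(out)
}

split_at_lines("concatenated.txt")
```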