Several days of continuous processing for a 4TB disk. Doesn’t that hurt the potential life of the drive?
What do you mean by “continuous processing” and why are you adding a full 4TB drive to DEVONthink?
DEVONthink is neither a Finder nor a Spotlight replacement. I suggest you read Help > Documentation > Getting Started > Building Your Database and this blog post…
Not adding the files, just indexing.
Continuous activity is “Uploading items to MBP_M1.dtcloud” for > 100 000 items…
That may be more important than the drive size. Jim may understand better what is going on, but can you please explain where you are seeing that message? I would read it to mean that items are being uploaded to a local sync store, which may well have to happen, but which is not (to me) obviously part of the indexing process; it could happen as an independent stage. So is it really still indexing?
Whether you're importing or indexing doesn't matter.
My original question stands: why are you adding a full 4TB drive to DEVONthink?
The goal is to have access to the list of files and their annotations even if the disk is not connected.
Could this be a deprecated use of DT3?
You won’t have access to the files if the disk is disconnected. You could only search and see results, but not open the found documents or search inside them.
I have about 500 GB of indexed data in 15 databases, and I cap each database at about 5,000 files (mostly PDFs). I think I'm one of the heaviest users here in terms of file count.
DT performs well (a few seconds' delay in search) on my iMac (2TB SSD, 24GB RAM), my Intel MacBook Pro (1TB, 16GB), and my M1 MacBook Air (1TB, 16GB; there search takes less than a second), but I think 4TB is too much.
I was thinking of trying to use DT for something like what I think the OP was aiming for.
I have tons (four large storage bins' worth) of hard drives full of media (at least 50TB) for editing projects, in all sorts of naming conventions and labeling states. I have to spin them up regularly as part of maintenance.
There are a few drive cataloguing packages out there, but none that I've stuck with over the years. I was thinking of indexing a drive in DT, making a database of it, and backing that up before putting the drive in storage. If I'm looking for something, I can find it in the DB and go get that drive from storage.
I stopped the first drive I tried to index after it got past "preparing 1000000 files". Video work means thousands of files and tons of data in every form. I use every editing package out there, so there's no one way to organize every project, and every client has different needs and deliverables.
These are well-organized drives, so I could see it working if I could index only to a certain depth, since I don't need to know where every render/pre-compute/media-database file is. But the more I think about it, the more I realize DT isn't the way to do this.
Maybe one of those drive catalog systems could be used within DEVONthink, like an Excel/Numbers file or something.
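A minimal, depth-limited catalogue along those lines could be sketched in Python: walk the drive down to a fixed depth, skip the render/cache folders below it, and write one CSV row per file that DEVONthink (or a spreadsheet) could then index. The `catalog_drive` function and `max_depth` parameter here are hypothetical names for illustration, not anything DEVONthink provides:

```python
import csv
import os

def catalog_drive(root, csv_path, max_depth=3):
    """Walk `root` down to `max_depth` directory levels and write one
    CSV row per file: relative path, size in bytes, modification time.
    Anything deeper (renders, pre-computes, media databases) is skipped."""
    root = os.path.abspath(root)
    with open(csv_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "size_bytes", "mtime"])
        for dirpath, dirnames, filenames in os.walk(root):
            rel = os.path.relpath(dirpath, root)
            depth = 0 if rel == "." else rel.count(os.sep) + 1
            if depth >= max_depth:
                dirnames[:] = []  # prune: don't descend further
            for name in filenames:
                full = os.path.join(dirpath, name)
                st = os.stat(full)
                writer.writerow([os.path.relpath(full, root),
                                 st.st_size, int(st.st_mtime)])
```

Run once per drive before it goes into storage; the resulting CSV stays searchable even with the drive disconnected, which is the whole point.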
I might be seeing everything as a nail while I carry the DEVONthink hammer around right now.
Thanks for the comment. I came to the same conclusions.