If you import files into your DT library, the integrity of the data is checked each time you open the database.
But what if you just index files that stay outside the database? Is it possible to run an integrity check on those files?
I have several big files (such as videos and photos in RAW format). Currently I use EagleFiler to regularly check their integrity. However, as DT is my primary data management tool, I would prefer to use DT for this purpose.
Of course I could just import all those files into my database, but a database of some 100 GB would just “kill” my Time Machine.
Any suggestions? Thanks in advance,
No, this can’t be checked right now. The only thing that can be checked via AppleScript is whether the files exist.
Thanks a lot for the quick info. As a side note, there are only two advantages that EagleFiler has over DT (at least in my opinion):
a) EagleFiler stores files in their original form (but DT 2.0 will do this, too)
b) EagleFiler databases are Time Machine-compatible: if you make some small changes in a big database, the entire database will not be copied the next time you run Time Machine. And I think chances are pretty good that DT 2.0 will also be Time Machine-compatible.
Other than that, I think DT is a great product. Actually, DTPO has become by far the single most important application I have on my iMac.
Wouldn’t another possible check for indexed files be to synchronize the folder?
That way DEVONthink recursively parses the whole folder structure and checks each path stored in the database (updating the database and index for all newly added and removed files/folders).
Unfortunately, the log/protocol that gets created is unreliable for indexed files.
Sometimes items that were newly indexed during synchronization are missing from the protocol. Hopefully this will be fixed soon, as the log/protocol is the only way to check database integrity with huge collections of files.