I’m not getting the “synchronize” operation to work. I observed this problem with a db converted to DT2, but have since tested with a freshly created one.

i.e.: import a folder and its files to a db, then add new documents to the folder and execute “synchronize.” … “nothing” happens?

If you had Index-captured that folder using File > Index, a continuing relationship is established between that folder and its contents in the Finder, and the corresponding group and its contents in the database.

Thus, if you subsequently add new files to that folder, or edit and save one or more files with modifications, there will be divergence between the Finder folder and its contents and the corresponding group and contents in the database.

Now, if you select that database group and choose File > Synchronize, the Path to that folder will be followed and the Index-capture will be updated. You will see all new and changed files from the external folder in the database.

How can you tell if a group or document was Index-captured? In DEVONthink 1 the symbol for Indexed is a blue circle with white “lightning bolt” inside it, to the right side of the Name. In DEVONthink 2, that symbol is a gray arrow curving upwards to the right, also at the right edge of the Name field in a view.

There’s another significant indicator: look (in the Info panel) at the Path of an Index-captured group or document. It will point to a location outside the database package file.
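To illustrate that Path check, here’s a small sketch (the file and database paths are hypothetical examples, and DEVONthink itself exposes the Path only in the Info panel — this is just the logic of the test):

```python
import os

def is_imported(item_path, database_path):
    """Return True if item_path lies inside the database package,
    i.e. the item was Import-captured rather than Index-captured."""
    item = os.path.abspath(item_path)
    db = os.path.abspath(database_path)
    return os.path.commonpath([item, db]) == db

# An Import-captured file lives inside the database package...
print(is_imported("/Users/me/Databases/Work.dtBase2/internal/report.pdf",
                  "/Users/me/Databases/Work.dtBase2"))   # True
# ...while an Index-captured file's Path points outside it.
print(is_imported("/Users/me/Documents/Project/report.pdf",
                  "/Users/me/Databases/Work.dtBase2"))   # False
```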

Important: if you select an Index-captured document, such as a PDF, you may find it impossible to read. That happens when the Path to the external file is broken, so that the database can no longer find it; information has been lost. If that external file is deleted, renamed or moved to another volume, the database will lose information.

So there’s one-way synchronization from the Indexed folder to the Index-captured group in the database.

If you had Import-captured that folder to your database, there is no continuing relationship between the Finder folder and the database group. That’s because the group’s Path now points internally within the database to the copy of the folder within the database. Any subsequent changes to the contents of the Finder folder that had been imported are irrelevant to the database. You are free to delete that Finder folder, and there will be no loss of information in the database.

An Import-captured database is self-contained. An Index-captured database remains dependent on unbroken Paths to external files.

I routinely move my databases from one computer to another, or run them on an external drive that moves between computers. For that reason, I prefer Import-captured databases, as it’s much trickier to move both an Index-captured database and its associated external files without breaking Paths.

I understand the difference between import and index: with DT1 I always imported the top folder of a category that I wanted to maintain in DT, because I wanted the database to be self-contained.

Synchronization did work to import the newly added files, but not modified files, except in some circumstances (the conditions for which I never worked out … my version-control “problem” I took care of elsewhere … except that synch would also re-import certain unchanged files with a -xx suffix). That was slightly annoying, but the overall effectiveness was still powerful.

The user guide does not distinguish between the two db styles.

"To create this link, which is maintained by the Path field in the Info panel, you have to freshly import/index your folder of choice.

Select the group(s) and/or document(s) and choose File > Synchronize to import all files that have been newly added to the synced folder(s) in the file system."

The thing is, Bill, with respect, what you may do is not the same as what we may want (or indeed need) to do. Things would be much easier for an evangelist if users’ requirements were invariably a subset of his own; but 'tis not always so…

Yes, archiving files rather than indexing them is undoubtedly safer. But it won’t always do. It’s fine for archived material which isn’t going to change. But for material which is “live”, and which IS going to change, archiving is unsatisfactory. Anything produced in a foreign format (that is, not RTF or TXT) is going to be modified externally. I am working in Pages at the moment. For very good reasons, my files for this project are indexed in DT Pro. For equally good reasons, many of those files are “live”. Here’s what happens:

  1. I index a file in DT
  2. I work on that file in Pages. (Doesn’t matter whether I open it from Pages or from DT, of course).
  3. I save my changes.
  4. QuickLook will preview the file, complete with changes, in DT; but…
  5. DT knows nothing about those changes until I sync the changed file.

“Okay” you may say, “so sync the file, then.”

Not as easy as it looks. I have to remember every file I changed, select each one (if they are not contiguous, or not in a DT folder) and run File > Synchronize. I might easily miss something; or, long after the event, go looking for something and run straight into the I-Know-It’s-In-Here-But-I-Can’t-Find-It problem.

A number of things could make this easier.

  1. A command which finds all files/folders which have changed since their last sync date and synchronizes them.

  2. A smart folder which will find those files. (Kind = Indexed File/Folder, perhaps)

  3. A background version of (1) applicable on a file-by-file basis, for people who don’t want background sync all the time (version control lashups, for example).

While we’re at it, how about another command which says “Okay, I’ve done with this file now, archive a copy and delete the indexed-file link” – because it’s usually once a project has ended that we move our files to an Archive folder or throw them away for good.

And, still while we’re at it, how about

(1) Finder-style file links which can track a file even if it’s moved

(2) An “import-and-delete-original” command or CM.

The thing is, Indexing isn’t properly up to the job. It’s a one-time, unidirectional process that has to be manually updated but doesn’t offer a convenient toolset for doing so.

But indexing isn’t the poor relation to archiving. It’s a different method for different circumstances, and could do with some attention. Conceptually, what’s required is very straightforward. I can’t comment on the actual code-level work it would take, of course.

If I could sum up the single tool which would make Indexing really useful, it would simply be: Automatic background sync.

mbywater, I agree with your point that maintaining synchronization of Indexed files by individual synchronization of groups or files can be a pain. Currently, the only way to simplify that would be to organize the external files and folders that are Indexed within one or a few containing Index-captured folders. That may not be practical for some users.

Even a command that would invoke global synchronization of all Indexed items would be useful. Although that might take a bit of time on a large database, it would ensure that the entire content was updated at the time of the command.

Like you, I don’t know the coding complexities involved for automatic synchronization with individually modified external folders or files, versus a command to globally update all Indexed content.

Suggestion: Make a feature request, listing the reasons for the suggested change and perhaps one or more options that would be improvements. File it as a bug report.