verify and repair error

When I try to “verify and repair” my database, I get the following message:

Found 0 inconsistencies, 0 incorrect checksums, 19 missing and 2 orphaned files.

If I select “repair”, it says “reparation failed. 21 errors left.”

What does this mean? Is it something I need to worry about?

Yes, if there are errors in your database, from whatever cause, it’s time to undertake maintenance, as continuing to use the database with errors may cause still more errors.

Quit your DEVONthink application. In the Finder, make a zipped copy of the database and reserve it for possible future reference. NOTE: Never make a Finder copy of an open database, as the copy may be incomplete or contain errors.

Relaunch DEVONthink. Choose Tools > Rebuild Database. When the Rebuild operation is complete, examine the Log (Window > Log) to see if there are files that are listed as having failed to be included. If so, save the Log list as you will probably want to try to find and re-import those files. (Perhaps they are recoverable from the copy of the database you had made, in Files.noindex within the database.)

Select Tools > Verify & Repair. If there are no errors, you have recovered a sound working database. I would recommend frequent external backups, such as via Time Machine and/or (for DT Pro/Office) the Backup Archive procedure (Scripts > Export > Backup Archive).

Now it’s time to worry about what may have caused the problems. If you haven’t done so regularly, run a suite of OS X maintenance operations using a utility such as Cocktail or OnyX, including cleaning out caches. Run Apple’s Disk Utility disk-check routine to see if there are problems with your disk directory.

Have you recently installed some new software, perhaps a utility that modifies the appearance or behavior of OS X in some way? Think about any third-party preference panes, or software that has installed an Input Manager plugin, a QuickTime plugin or a Safari add-on. Try removing such software at least temporarily. Monitor your computer’s operations and check your DEVONthink database regularly using the Verify & Repair routine.

Make certain that you are not running out of hard drive space. Apple engineers recommend keeping at least 15-20% of the nominal HD space free, to accommodate OS X needs and application needs for temporary and swap files. If OS X runs out of free HD space, it may start overwriting data, with obviously data-threatening results.

Are you having problems with electricity brownouts or outages? That can result in data damage if power fluctuates while data is being written to disk, unless you have an uninterruptible power supply (or use a laptop).

What would happen to your data were your hard drive to fail? It’s a good idea to keep external backups of your important data, using Backup Archive, Time Machine or one of the good backup applications.

Hey Bill,

Thanks for your help on this. Unfortunately, after doing what you suggested, I still have the same error. What should I do?


Hi Sappleba: Thanks for your message. Bill’s suggestions are, as usual, excellent. In addition, be extremely careful that you don’t have corrupt PDF files in your DTPO database, or you could waste many hours on unnecessary debugging. My previous posts explaining this issue should be easy for you to find.

Hey Redacted,

Thanks for the suggestion. Is there an easy way to find out if I have corrupted PDF files in my database?



How do I figure out what is wrong with my database if rebuilding it didn’t help? Do I need to be worried about losing all of my work? I am actually more concerned about losing the structure (especially replications) of my database than the actual content. I have put countless hours into organizing and replicating files into the folders I want them in. Do I need to be worried about losing this work?

I backup my database regularly, in three ways:

  1. I run the “backup and optimize” script whenever I have done significant work.

  2. I use Mozy for off-site backup.

  3. Once a day, I close DTP and drag a copy of my database to an external drive. I usually allow each copy to overwrite the previous one, but occasionally I do keep a separate zipped and dated version using the “Backup Archive” script.

Is this sufficient backup? I currently only have a laptop and do not have the money to buy Time Capsule or another external drive, so Time Machine is not an option for me.

Thanks for your help,

Sam, another procedure to try would be to switch to the Split view, Select All and choose File > Export > Files & Folders. Create a new folder to hold the exported content. When the export is complete, check the Log to see if there are files that failed the export. If so, save the list.

Now create a new, empty database with a different name. In that database, choose File > Import > Files & Folders. Select ALL the contents of the folder that holds the previously exported material. When the import is finished, check the Log to see if there were files that failed the import. If so, save the list.

This export/import procedure is similar to Rebuild, but gives the opportunity to check the fate of files in both steps.

Now run Verify & Repair. If there are no errors, good. If there are ‘orphan’ files, select them, then export and reimport them, then delete the Orphan files folder.

The only problem I see with your backup procedure (item 3 in your post) is that, if you already have a database problem, overwriting the previous backup means that you now have only a damaged backup. Same for Mozy, with the added caution that the database should always be closed before the backup. I like the fact that you run Backup & Optimize after significant changes in the database, but it might be prudent to first run Verify & Repair to check integrity before the backup. That’s one of the things that I like about Backup Archive, as it always checks the database as the first step.


Thanks. I will try what you suggested ASAP. The real question I have, though, is what is the significance of my database being damaged? It seems the only problem that the “verify and repair” routine finds is these same 21 files (19 missing and 2 orphaned, actually). It lists the files, and they are all from the same folder. They are all PDFs of an old MacJournal I used to keep. I have other copies of them, and losing them is no big deal. So far, I haven’t noticed any increase in the number of files that are missing, and I’ve been running the routine for over a week.

The database is in good enough shape that the “backup archive” script seems to work fine.

Given this, what is the actual concern? Is it simply a worry that I might lose more files, or is my database actually broken in some way that will impair future use? What I am trying to discover is: do I potentially need to revert to the last non-corrupted database file I have? I am worried that I would lose hours and hours of organizational work if that is the case. What are the implications of simply continuing to use this database, even if I cannot do anything about these 21 files?

Any answers you could provide would be greatly appreciated. I am really concerned about this.


Hi Sappleba,

I had problems with missing and orphaned files in the early beta-testing stage of DTP 2 (some were the result of some nasty things I do when beta testing :wink: ).
As in your case, the maintenance tools coughed and were not able to help me out.
I brought the database back to normal by manually deleting those files from the database (and emptying the trash).

One word of caution about using import/export and rebuild:
in the present beta both techniques have two flaws:

  1. the unsorted order of your files in DTP will not be preserved
  2. empty files (only a name, no content) will not be exported.
    So if you depend on those things, you will lose part of the information/structure.
    (Rebuild makes an internal backup before rebuilding, so you can switch back if you notice problems soon enough.)

On your backup strategy:
The main issue with all database backups is that it may take days or weeks to discover a mistake. Just imagine you accidentally deleted a folder buried deep in your structure. Or another example from early beta: I stumbled on plain text files I had recently worked on that were suddenly empty. We found the issue after some investigation, but I would have lost some content with your strategy.
To make a long story short: a database backup has to keep more than just the last version. Of course Time Machine is one solution for this (with the flaw that a Time Machine backup of an open DTP database might cause trouble). I use a modified version of the Backup Archive script that adds the date to the name and places the archive in a certain folder. I call the script manually at the end of each day I worked on my main database (in the early beta stage I did that hourly). I keep those backups going back more than a year.



Thank you so much for your help.

I did what you suggested and it has taken care of the missing files, though when I run “verify and repair” it still tells me I have two orphan files. When I try to repair, it tells me that it failed and does not show anything in the log. Is this something I should be worried about?

Also, thanks for your backup advice. I think I will switch to keeping old zipped copies of my database. In your experience, how much smaller are the zipped archives than the unzipped ones? My database is about 5 GB and my zipped archive was over 3 GB. Is that about right?



I am not sure what makes the difference between missing and orphaned files, technically speaking. I guess orphaned files are files that are stored in the database package but no longer referred to by the database. If this is true, the issue is harmless: two files that DTP does not know of are wasting some disk space, and you will not be able to find them from within DTP.
Of course you can start digging for them manually, but this may take time depending on the number of files you have in your database and whether you have any idea which two files these could be.
Perhaps Christian can confirm whether my thoughts are correct.

The size of the zip files depends on several things. If you have set DTP to keep internal backups, those backups will be in the original database but not in the zipped one. Beyond that, the size depends on the files you have imported: JPEG and GIF files are already compressed and (as far as I know) will not be compressed much further by zipping, while text files can be compressed more than many other file types.

My main database is 26 MB, containing one internal backup of 8.5 MB. The zip is 6.1 MB. It holds mainly plain text and RTF files, and I index most images (mainly to keep my daily backups small, but also to access them easily via the Finder).


Sam: As I mentioned in previous posts, the issue with corrupt PDFs was an Apple issue, not a DEVONthink issue, and no longer exists in the current version of Leopard (10.5.6). See Apple may have corrected PDF import crashes & hangs, viewtopic.php?f=3&t=5321#p32125 .

When I was fighting with these issues in December, before 10.5.6 was released, I noticed that the DTPO Export function worked normally, but after creating the new database, DTPO’s Import function would hang at the first corrupt PDF. Activity Monitor would then show that Leopard’s ATServer (Apple Type Server) had become a runaway process, sometimes taking up all of the available CPU resources. This is the basis for Apple’s characterizing this situation as a denial of service in their technical note, because the entire computer would become unusable at that point and the only way to recover was to restart.

What we did was to write down the filename of the PDF file that would hang the system during the DTPO import into a new database, then export it and the folder it was contained in separately, then delete that folder from the DTPO database. We then repeated the process until all the folders containing corrupt PDFs were removed from the database. It wasn’t easy, but the corrupt PDFs seemed to appear only in clusters in a few folders.

After these procedures, the DTPO Import routine finished normally, and we were back in business.

So no, I’m not aware of any easy way to identify corrupt PDF files other than the trial-and-error method I explained above. Perhaps there might be an application that checks PDF files for integrity, but I’m not yet aware of one. Maybe some of our other forum members could recommend some possible solutions.