Why can't I move documents into repository using Sorter?

I noticed I can copy documents into the repository, but cannot hold down Command and move documents into the repository. This costs me an extra step, since I have to copy documents in and then delete the originals.

Also, is there any way to save directly to the repository using “Save As”? That would also be a time-saving workflow.


The reason we don’t support the “move” file drag operation is to safeguard against possible data loss. Even though the Sorter takes all kinds of precautions against it, until the dropped data has been deposited safely in the database it has to be considered to be in purgatory.

I don’t understand your “Save as” question.

I would like to save directly to DevonThink rather than having to save to an intermediate folder, drag to DevonThink, and then delete the source file.

Also, I think you should be able to move documents into DevonThink and have the original automatically deleted. If it needs to be recovered, it will be in the Trash.


I understand and I came to that conclusion as well. I’ll put it on the (longish) to-do list.

Rob, an important consideration in designing an application, especially a database, is data preservation in case anything goes wrong during a series of actions — especially things that are outside the control of the application such as an operating system error, a power supply glitch or user inattention.

In almost any context, one can describe an action that appears to be more efficient but increases the riskiness of the operation, perhaps in some cases turning out to be extremely inefficient.

Example: I’m walking along a sidewalk in the middle of a block in a busy urban environment. Across the street, I see a store that I would like to enter. What’s the most efficient route to get to that store? Obviously, the shortest path would be to jaywalk directly across the street, through traffic. If I do that, and don’t get killed, injured or arrested I would have saved time and effort in achieving my goal. But if I do get killed, injured or arrested then that pathway turns out to have been very inefficient.

Take the recommendation that, whenever a Finder file is Imported into a database, the original should be deleted. Most of the time, that would suit me fine, as that would save me the time and effort of deleting that original file, which is what I usually do.

But for a large number of users, who have different workflows than you and I do, that could be disastrous. Perhaps the file that has just been deleted is being used by another application, or even by a different database. Now it’s gone, and that user has lost information. Unless that user is paying attention and immediately recovers the lost file from the Trash and replaces it in its proper location, DEVONthink has acted very inefficiently, even malignantly, for that user. It has caused harm. Even a Preferences option letting the user choose whether the original file is deleted when Imported (perhaps one way in one context and differently in another) may not be wise database design, for it increases the riskiness of the action and demands constant user attention, rather like jaywalking.

I, too, would often like the option to save a file, perhaps a newly created file in Papers or Keynote, directly into a database, just as I can do with my Web captures. Perhaps that may come in the future, when it can be done safely. Until then, a redundant step protects the time and effort I’ve spent in creating that file. For the time being, I can save that new file into a folder to which I’ve attached a Folder Action script to send it to a database. Even were the Folder Action script to fail, I still have access to that file.
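The watched-folder workaround above can be sketched in a few lines. This is purely an illustration of the principle (copy, never move, so a failed import leaves the original intact); the folder names are hypothetical, and DEVONthink’s real Folder Action scripts are AppleScript, not Python:

```python
import shutil
from pathlib import Path

def import_inbox(inbox: Path, target: Path) -> list[Path]:
    """Copy every file from a watched inbox folder into a target folder.

    The originals are deliberately left in place: if any copy fails,
    the source file is still available and nothing is lost.
    """
    target.mkdir(parents=True, exist_ok=True)
    imported = []
    for src in sorted(p for p in inbox.iterdir() if p.is_file()):
        dst = target / src.name
        shutil.copy2(src, dst)   # copy, never move
        imported.append(dst)
    return imported
```

Deleting the originals would then be a separate, deliberate step, taken only after the user (or the database) has confirmed the import succeeded.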

Thanks, Bill. I’ve been an IT professional for over 32 years. I know what I’m asking for, and since DevonThink is really using the file system anyway, you should still be able to move an imported file to the Trash after importing. It’s no different from moving a file from one folder to another. I appreciate your taking the time to express your point of view. I respectfully disagree. As a user, the choice should be mine. Longer workflows are never optimal.


IMO, these are two key points that would go a long way toward streamlining many functionalities of Devon’s products (DT and DA). Of course, this holds true for any developer.

This is especially true when similar, successful products on the same platform already have the requested features.

Rob, in IT or anything else, the reality is that longer workflows are often more optimal than short workflows.

That’s why we have traffic lights and driving regulations. Yes, the workflow of driving from point A to point B then takes longer, but the likelihood of actually getting from point A to point B is greatly increased, so the longer workflow is more efficient.

That’s why we have checks and redundancies in moving packets of data around on the Internet. It’s certainly not the fastest workflow for moving information, but it’s more efficient than the fastest possible routings, because it is more likely to get the data properly transferred.

I used to teach analytical chemistry, which involves making observations of measurements. Every such observation has errors, and the effects of such errors on the precision and accuracy of the measurement can be quantitatively expressed. It can be statistically demonstrated that a good analytical chemist can consistently get better accuracy and precision of measurements than the nominal capabilities of the methodology and instrumentation. Such observations take a bit longer than a quick glance, but are enormously more productive.

A governmental agency in which I worked was losing too many enforcement actions because their technical basis could be called into question. I won’t belabor this, but I took on the job of instituting a quality assurance system that required identification of errors and documented procedures to mitigate them. The agency’s efficiency in enforcement cases and settlement agreements was greatly improved. Yes, it now took longer on average to develop the technical documentation in support of regulatory actions. But it had become obvious that inadequate technical support was a complete waste of time, effort and resources.

Christian Gruenenberg is properly cautious in introducing new procedures in DEVONthink 2, especially in the beta phase. I completely agree with you that I would like to move data from one database to another, with the result that the moved data is incorporated in the second database and then automatically removed from the first.

But it should be realized that people using such new procedures range from newbies to experts, on computers that range from “clean” to “totally frakked up”. (I’ve been amused to see cases of the latter type of computer being used by some IT academicians.)

Bottom line: It’s best to verify that users are not experiencing failure — for whatever reasons — of the transfer of data from the first to the second database before a procedure to delete the data from the first database is made automatic. Why? Because loss of data would be a more significant problem to the user than a slowdown of the full data movement procedure.
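The verify-before-delete principle described above amounts to a “safe move”: copy the data, confirm the copy is intact, and only then remove the original. A minimal sketch in Python, purely hypothetical (DEVONthink does not expose such an API; this just illustrates the ordering of steps):

```python
import hashlib
import shutil
from pathlib import Path

def _sha256(path: Path) -> str:
    """Checksum a file incrementally, so large files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def safe_move(src: Path, dst: Path) -> Path:
    """Move a file, deleting the original only after the copy is verified."""
    shutil.copy2(src, dst)
    if _sha256(src) != _sha256(dst):
        dst.unlink(missing_ok=True)  # discard the bad copy
        raise IOError(f"verification failed copying {src} to {dst}")
    src.unlink()  # original removed only after successful verification
    return dst
```

The point of the ordering is exactly Bill’s: at no moment does only one copy of unverified data exist, so a crash or OS error at any step leaves at least one good copy behind.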

I understand your point, Bill. Especially while we are in BETA. However, in the long term, I don’t agree with you. I should not have to go to unnecessary steps to delete something I have moved to the repository. Further, I should be able to save directly to the repository. This is something that Together did very well and for which I applaud them.

The argument you are making is academic. The most important thing for a software developer to do is listen to its users, not try to argue its position. If you get enough requests for a feature/function, it should seriously be considered.

That is my professional opinion.


I put together a computer information center back in the 1960s, when input was done with punch cards and the storage medium was magnetic tape. I’ve followed some of the gremlins that can pop up in file system procedures with interest.

Copies used to be made of magnetic tapes, often to protect against data loss from flaking and other problems that could threaten data integrity. Sometimes the original tapes were wiped and reused for less critical purposes. Every now and then, someone would come up with the bright idea to streamline the process, so that a tape would be erased immediately after the copy. But for any of a number of reasons, a copy might fail (even if a verification read of the copy indicated, wrongly, that it was perfect). If that happened, the data was gone. Did that ever happen? Over and over. It’s happened to federal agencies, universities and businesses.

Computer technology has advanced greatly since those days. But there are still horror stories about simple file system procedures going awry, usually because someone assumed that nothing could go wrong.

During the beta period I’ve seen several user messages to Support, stating that a data copy from one database to another had failed. So far, that hasn’t turned out to be real, except in conjunction with a crash, resulting from something in the computer’s software environment that has caused OS errors.

Crash reports, together with Console messages, the profile and a description of what the user was doing can be very helpful to diagnose a problem.

Speaking of crash reports, one change in software environments that had not been anticipated was Apple’s release of a beta of Safari 4. It seems that Safari 4 can cause crashes. Unfortunately, it also causes problems with the logging of crash reports, resulting in useless crash reports, typically repeating the same line over and over.

I agree with this wholeheartedly. Now and again I discover I have imported files into DT and neglected to delete the originals, leading to a clean-up process. I realise this is human error, but when I’m working with a lot of PDFs (as I invariably am), it is an ever-present potentiality.

I don’t see why a checkbox selecting between the copy and move options, set by default to copy (i.e., an ‘opt-in’ implementation), would not be a straightforward solution to this issue.