Request that DT grab URL while "taking a note"

Well, that’s a pretty straightforward header. I believe StickyBrain does this, but I don’t use SB. I use DT. There’s that URL thingamabob down there, but it doesn’t do all that much.

I browse in Safari or Camino, and I like the ability to CMD + ) to take a rich text note. It’s a great feature, but DT should also capture the URL.

Good news!

DT PE 1.9 will be released soon, and it will capture the URL of a Web page note taken from Safari or OmniWeb. The URL will also be captured when a rich text note is created via the contextual menu option from within DT’s own WebKit browser view. However, URLs cannot be captured from a page displayed in Camino.

Is there any way to edit the URL that is grabbed when taking a note from a browser? I can’t seem to figure it out.



I don’t understand the question.

If you are using Safari, a recent version of OmniWeb, or DT’s own Web browser, the URL of a Web page is captured and stored in the Info pane for the captured note or page. Use Show Info to see the pane for a document stored in DT.

To capture URLs:

Your options are to use DEVONthink > Services > DEVONthink > [plain or rich note from selected text and/or images] when in Safari or OmniWeb, or to use DT’s WebKit browser and the contextual menu options for capturing either a page or a note (from selected text and/or images).

Most other browsers can capture a note (selection from Web page) via Services, but the URL isn’t captured.

Hope this helps.

It’s possible to edit the URL via the “Info” panel or just display the URL column (see menu “View > Columns > …”) which is editable too.

What happened to the editable URL field? The only way I could (in 1.8) get web pages into DEVONthink in the format I desired was to browse there in DEVONthink and use the ‘capture page’ feature. (I like to keep the look of the original page.)

I had created an item called “placeholder”, which I would use like a regular browser. I would type in the URL, and after it loaded I would ‘capture page’. It looks like I can still do this, but now I have to open the Info window first, move to the right field, then enter the URL. A bit more tedious. I lament the loss of the editable URL field.

I like the ‘capture … note’ options in Safari’s Services menu, but why is there no ‘capture page’? I could avoid this problem altogether if I had that option. Also, any plans to support Firefox with this?

While I’m at it, could you guys throw in the kitchen sink? Just kidding. Just wanted to make sure you know that besides these few workflow issues, I’m very happy with the update.


There’s no “Capture Page” service because no browser (and probably no application at all) delivers the HTML source to Services. In addition, supporting Firefox isn’t yet possible because it’s impossible to retrieve the URL of its frontmost window (supporting Camino, on the other hand, should be possible but doesn’t work due to bugs in Camino’s AppleScript support).
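For the curious, a minimal sketch of why Safari works where Firefox doesn’t: Safari’s scripting dictionary exposes the frontmost page’s URL directly, which is what a capturing application can query. Something like the following (the property names are Safari’s own; treat this as an illustration, not DEVONthink’s actual implementation):

```applescript
-- Sketch: reading the frontmost page URL from Safari via AppleScript.
-- Safari exposes "URL of front document"; Firefox (at the time) offered
-- no equivalent, which is why it can't be supported.
tell application "Safari"
	set pageURL to URL of front document
	set pageTitle to name of front document
end tell
```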

One of the next builds will also make the URL field editable and draggable again. Finally, DT Pro will include some scripts such as “Add image/page/linked images/links from Safari”, which can be accessed via the script menu extra.

Thanks, Christian. Viewing the URL column helps, and I’ll try it for a few days. At first glance it seems better than having to use the Get Info panel.

But why not have the thin strip at the top of the Vertical Split window be editable? This would take up the least space, and be most intuitive.

For Bill – the reason I want to edit the URL is that I often grab a section of a blog page and copy a “PermaLink” to the clipboard. When I Take a Note of the page (using Services in OmniWeb), DT grabs the home page URL, which I would like to replace with the PermaLink.

Thanks - now I understand your point.

I sometimes do the same thing, for blogs on the M.I.T. Technology Review site. A way to do this now is to Show Info and replace the URL for the captured note or page with the Permalink URL.

Note: I use Safari Services to capture blogs on the Technology Review site, because they often have an interesting Comments section. After capturing the blog, I go back and open Comments, then use Safari Services to append the comments to the previously captured note.

I don’t understand why this is not possible. The browser doesn’t need to send the HTML source to DT, only the URL. DT could then load the URL and do its “Capture Page” thing. Both “Capture Plain Note” and “Capture Rich Note” seem to be able to take the URL with them, so why not add a “Capture Page” service that ignores the selected text and takes only the URL? I admit I know little about how Services are implemented, but it seems doable to me.
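The workflow proposed above can in fact be sketched as a script rather than a service: grab the URL from Safari and hand it to DT Pro, letting DT fetch the page itself. A hedged sketch, assuming DT Pro’s “create record with” AppleScript command (verify the exact property names in Script Editor before relying on this):

```applescript
-- Hypothetical "capture page by URL" script: take only the URL from
-- Safari and let DEVONthink Pro load and store the page itself.
-- The record properties below are assumptions, not a verified API.
tell application "Safari"
	set pageURL to URL of front document
	set pageTitle to name of front document
end tell
tell application "DEVONthink Pro"
	create record with {name:pageTitle, type:html, URL:pageURL}
end tell
```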

What exactly are the linked images added by those scripts, and where are they saved in DT Pro? I got a message about there being no thumbnails when I ran them while viewing a random page that contained several images.

Thanks for bringing my temporarily blinded attention to the Add image from Safari script. I’ve recently been testing different ways to add images directly from Safari with the URL preserved, and that script does the trick. Drag and drop seems to work now, too. Coincidentally, I’d made a note to ask about it only an hour ago. :slight_smile:

If you want to add all the images on the page, use the “Add images from Safari” script. But if the page displays, for example, only thumbnails linking to the full high-resolution images, use the “Add linked images from Safari” script.

Or use “Add linked images from Safari to downloads”, and DTP’s download manager will download the images (rather than the script itself). This is probably the preferred method, as it doesn’t block the application while downloading.
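The difference between the two routes can be sketched in a few lines: the non-blocking variant simply queues each image URL in DT Pro’s download manager instead of fetching it inside the script. A hedged sketch, assuming an “add download” command in DT Pro’s scripting dictionary (check the dictionary in Script Editor; the URLs below are placeholders):

```applescript
-- Sketch of the non-blocking route: queue image URLs in DT Pro's
-- download manager rather than downloading inside the script.
-- "add download" is assumed from the scripting dictionary; the URLs
-- are illustrative placeholders.
tell application "DEVONthink Pro"
	add download "http://example.com/images/photo1.jpg"
	add download "http://example.com/images/photo2.jpg"
end tell
```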

As of DT Pro 1.9.beta3 there’s only an “Add image …” script, for a single image, in the Scripts (Additional) directory of the distribution.

Okay, that makes sense.

Got it.

I’m still struggling with the Download Manager, but I finally managed to capture a page with images that was viewable offline. However, a couple of GIF images returned 404 errors during downloading, and DT got stuck on them during offline viewing. For some reason I can’t get an Offline Archive of the page I was testing working today (with any combination of presets or options); only an index.html page shows up in the Archive group, minus any images hierarchy. The Download Manager remains the most puzzling and unpredictable part of DT for me, but I can live without it for now.

beta 3?

Beta 3 should include 6 Safari scripts (and beta 4 will include 6 more scripts for DEVONagent 1.5). However, there are still some known issues related to downloading/archiving sites (especially when links are not correctly formatted).

I’ve got the same six Safari scripts that showed up months ago, some last modified on October 18th. I just noticed that a couple of the copies installed under ~/Library/Scripts were modified a couple of days ago. They appear identical in Script Editor, but running strings on them shows different filesystem paths, e.g.:

Modified: Macintosh

(Cesky :slight_smile:)

I didn’t know scripts were “self-modifying” that way, if indeed that’s what happened to 'em.

. . .

Like I said, the site downloading/archiving isn’t a priority for me right now but I’ll report issues I find when tinkering with it. While rereading the ReadMe notes this morning I noticed:

- As the download manager is actually a site sucker, URLs already downloaded/accessed can’t be added anymore to the list during a session.

Made me wonder if the trouble I had with yesterday’s download was related to retrying during the same DT Pro session, even though I’d removed all traces of the previous download from the day before. So…

I restarted DT Pro, ran the Add Page To Downloads contextual menu command on the captured HTML page, selected Offline Archive in the Download Manager, and ran the queue. That worked(!) Next I removed everything from the Archive group, redid the same steps as above, and only the index.html appeared.

If it’s intended to work like that it’s not how I’d expect it to work, especially without any sort of “not again during this session” feedback.

Enough with that for today. :slight_smile:

There are definitely issues in beta 3 which beta 4 will hopefully fix.