Archiving ENTIRE websites with all internal links

Hi DA forum folks,

Are you aware of an easy way to archive ENTIRE websites, with all their internal links, with DA, that is, without clicking each link separately and capturing a web archive for each one?

Thank you,

Do you want to add the pages/links to DEVONagent’s archive or add them as web archives to DEVONthink? At least the second task should be scriptable.
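
(For the scripting-inclined, a minimal sketch of the second task. It assumes DEVONthink Pro’s AppleScript dictionary offers a create web document from command, which creates a web archive; verify the exact command name in Script Editor’s dictionary, since that’s an assumption here. The Python wrapper just shells out to osascript.)

```python
# Minimal sketch: add a URL to DEVONthink as a web archive by shelling
# out to osascript. The "create web document from" command is assumed
# to exist in DEVONthink Pro's dictionary; verify before relying on it.
import subprocess

def archive_in_devonthink(url: str) -> None:
    script = f'''
    tell application id "DNtp"
        create web document from "{url}"
    end tell
    '''
    subprocess.run(["osascript", "-e", script], check=True)

archive_in_devonthink("https://www.devontechnologies.com/")
```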

Or do you want to download complete websites for offline viewing? This is doable with DEVONthink Pro’s download manager or with third-party software like SiteSucker.

Hi,

Thank you for your reply. I’d be interested in downloading complete websites to DEVONthink Pro, actually. That’d be great. How would you do that?

Thanks!

b

Choose File > Import Site… and enter the desired address. Afterwards choose “Subdirectory (Complete)” in the action menu and select the desired “Download To…” destination (either a database or a folder). Finally start the download queue.
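
(For the curious, here’s a rough Python sketch of what “Subdirectory (Complete)” amounts to conceptually: fetch each page, queue every link on the same host, and persist the responses. It’s an illustration only, not how DEVONthink is implemented; the save callback and all names are made up, and error handling, throttling, robots.txt, and non-HTML content are ignored.)

```python
# Rough sketch of a same-host recursive download, i.e. roughly what
# "Subdirectory (Complete)" does. Illustration only: no error handling,
# no throttling, and every response is treated as HTML for simplicity.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, save):
    host = urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue:
        url = queue.pop()
        if url in seen or urlparse(url).netloc != host:
            continue  # skip duplicates and off-site links
        seen.add(url)
        html = urlopen(url).read().decode("utf-8", errors="replace")
        save(url, html)  # persist the page however you like
        parser = LinkParser()
        parser.feed(html)
        queue.extend(urljoin(url, link) for link in parser.links)

# e.g. crawl("https://example.com/", lambda url, html: print(url))
```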

I downloaded a website recently and it was a bigger file than I expected, mainly because of all the indexes and other non-substantive stuff. Is there a way to download only certain types of content from an entire website, e.g. all PDFs, text, and images?

Thanks!

Sure. Just choose “Options…” in the contextual/action menu of the download manager and enable or disable the desired file types.
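
(In crawler terms, those checkboxes boil down to an extension filter applied before a file is fetched or saved. A hypothetical Python stand-in, with the allow-list picked to match the PDFs/text/images example above:)

```python
# Rough stand-in for the file-type checkboxes: keep a URL only if its
# extension is on the allow-list. The extension set is illustrative.
import os
from urllib.parse import urlparse

ALLOWED = {".pdf", ".txt", ".html", ".htm", ".png", ".jpg", ".jpeg", ".gif"}

def wanted(url: str) -> bool:
    ext = os.path.splitext(urlparse(url).path)[1].lower()
    return ext in ALLOWED or ext == ""  # keep extensionless pages too

print(wanted("https://example.com/paper.pdf"))  # True
print(wanted("https://example.com/movie.mov"))  # False
```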

Can someone explain how this is currently done in DA Pro? I don’t see any File > Import > Site command, or anything labeled the Download Manager. Nor does the online help contain references to either interface.

EDIT: Oops. Those are in DTP, not DAP. Got it, works well. It seems a bit odd that “webwhacking” functionality isn’t in DAP, though? Links can be dragged out of the sidebar, so one can capture pages nested one level deep that way. Maybe this feature could be added as a sidebar action, i.e. capture selected pages and their children (specify the level of nesting, or restrict to a hostname/subdomain), as in the sketch below.
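
(For completeness, approximating that proposed sidebar action outside of DA only takes a depth counter on top of a crawl. A hypothetical sketch, with regex link extraction standing in for a real HTML parser just to keep it self-contained; none of this is DA code:)

```python
# Hypothetical sketch of the requested action: capture a page and its
# children down to max_depth, restricted to one hostname. Regex link
# extraction keeps this short; a real HTML parser would be sturdier.
import re
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

def capture(url, save, max_depth=1, host=None, seen=None, depth=0):
    host = host or urlparse(url).netloc
    seen = set() if seen is None else seen
    if url in seen or urlparse(url).netloc != host or depth > max_depth:
        return  # already captured, off-site, or nested too deep
    seen.add(url)
    html = urlopen(url).read().decode("utf-8", errors="replace")
    save(url, html)
    for link in re.findall(r'href="([^"#]+)"', html):
        capture(urljoin(url, link), save, max_depth, host, seen, depth + 1)

# e.g. capture("https://example.com/", lambda url, html: print(url))
```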