Updating webarchives

@pete31,

Some of this may turn out to be superseded by the fresh input from yourself and @chrillek on whether webarchives are - after all - the best solution for what I’m trying to do.

When I kept the webarchives I had from EagleFiler, they seemed to be the best compromise.

I need to think about that and post here again as soon as I have decided whether webarchives have any advantages over bookmarks after all.

Abbreviated workflow from your post - and many thanks for your persistence!

Because:

  1. I wasn’t fully aware of the relationship between ‘live’ URLs and what exactly webarchives contain
  2. I understood (perhaps mistakenly, from a couple of threads here) that there is now only one way to confirm (or otherwise) the accuracy of web pages stored in any format in DT - and that is the Check Links script
  3. that script appears to be deprecated or superseded.
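For what it’s worth, the underlying check is conceptually simple, whatever the Check Links script itself does internally. Here is a minimal sketch in Python (the function name and the sample URL list are my own, not anything from DT): try each stored URL and flag the ones that no longer answer.

```python
import urllib.error
import urllib.request


def url_is_reachable(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers with an HTTP status below 400."""
    # Some servers reject HEAD requests, so fall back to GET on failure.
    for method in ("HEAD", "GET"):
        try:
            req = urllib.request.Request(url, method=method)
            # urlopen raises HTTPError for 4xx/5xx, URLError for dead hosts.
            with urllib.request.urlopen(req, timeout=timeout):
                return True
        except (urllib.error.URLError, ValueError):
            continue
    return False


# Hypothetical list of URLs taken from stored records.
urls = ["https://example.com/"]
dead = [u for u in urls if not url_is_reachable(u)]
```

A real workflow would of course pull the URLs out of the database records rather than a hard-coded list, and collect the dead ones for review.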

So, instead, I’d simply see the invalid URL in the ‘Invalid URLs’ group (of replicants) which the Check Links script creates - and reload it? (And then use Update Captured Archive if it’s OK?)

It’s looking that way, isn’t it?

It is. That’s what I’m aiming for - as much automation as possible :slight_smile: .

Yes. But because it now appears necessary for me to rethink the question of webarchives vs Bookmarks, I might find a script that converts webarchives to Bookmarks more useful - because it looks as though this one doesn’t do what I thought it would.