Updating webarchives

Thanks to you!

That may well be the case, Pete; but of the seven webarchives which the script appeared to break, I’m almost certain I ran Update Captured Archive on only one or two of them.

IOW I do believe that Update Captured Archive really is irrelevant in our case here.

No trouble at all!

Apart from anything else, your script is extremely useful.

In the back of my mind still lies the suspicion that the internal DT script I’m running, ‘Check Bookmarks’, may have that name for a reason: its main job may not be to reliably update webarchives, but rather to check ‘regular’ URLs and update them or mark them as invalid.

I have several thousand webarchives in my 10,000+ item DT database (which is basically an import from EagleFiler).

Being the meticulous kind of person I am, I really want the contents of those pages to be as up to date and ready for use as I can make them.

The ‘Check Bookmarks’ script is a real boon, as it seems to work and helps me keep things up to date. But I suspect I wouldn’t be having this trouble if I’d chosen one of the PDF options, particularly because, as @cgrunenberg keeps pointing out, webarchives are officially deprecated by Apple. My first few attempts with PDFs were not successful, though: partial pages, missing images, corrupt content, etc. As it happens, EagleFiler is excellent at capturing webarchives.

If I may (and at the slight risk of going off topic), reading this thread from this post by @cgrunenberg onwards, I’m still a little confused.

While, as Jim points out, there should be a ‘Check Bookmarks’ script, my installation doesn’t have it (perhaps ‘Check URLs’ has superseded it); and ‘Check links’ appears not to be documented.

I wonder if there is a better way altogether of checking for outdated links of all kinds.
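
For what it’s worth, and purely as a rough sketch rather than a polished solution, something like the following AppleScript is the sort of thing I have in mind: it walks the selected records, asks curl for each URL’s HTTP status, and flags anything that doesn’t answer 200. The choice of label number, the timeout, and the assumption that an empty URL string means “no URL” are all mine, not anything taken from the official scripts.

```applescript
-- Sketch only: flag selected records whose URL no longer answers 200.
-- Assumptions (mine, not from the thread): label 1 is free to mean
-- "dead link", and a 10-second timeout per request is acceptable.
tell application id "DNtp"
	set theRecords to selection
	repeat with theRecord in theRecords
		set theURL to URL of theRecord
		if theURL is not "" then
			try
				-- Ask curl for just the HTTP status code, nothing else.
				set theStatus to do shell script "curl -s -o /dev/null -w '%{http_code}' --max-time 10 " & quoted form of theURL
				if theStatus is not "200" then set label of theRecord to 1
			on error
				-- curl failed outright (no DNS, timeout, etc.)
				set label of theRecord to 1
			end try
		end if
	end repeat
end tell
```

Adding -L to the curl call would make it follow redirects rather than flagging them, and one would probably want to restrict the loop to bookmarks and webarchives rather than everything selected; but it at least shows that a bare-bones link check of all kinds of records is feasible outside the bundled scripts.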