Website backup CLI tool: Monolith

I have been messing around with this new CLI tool I came across and wondered if it fits anyone’s workflow. It’s called Monolith.
I don’t need to back up webpages that often, but I have seen the topic discussed a lot on this forum. Webarchive is one format I recall people discussing, along with its attendant problems.
I’ve experimented with a few news sites and it seems to do what it promises. I imported the resulting HTML file into DEVONthink, and opening the file from within DT presents a pretty clean version of the page.
Nothing earth-shattering here, but I thought the researchers/archivists might find it interesting.
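
For anyone who wants to try it, the basic call is just the URL plus an output file (example.com below is obviously a placeholder):

```
# Save a page plus everything it references into one HTML file
monolith https://example.com/article -o article.html

# Monolith writes to stdout by default, so a plain redirect works too
monolith https://example.com/article > article.html
```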

While not always perfect, it’s often very good. But have you tried formatted notes?

Yeah sure.
I think what made me think of DT in this instance was reading here about people’s problems with dynamically loading sites, and the general lack of adherence to standards that leads to incomplete copies of web pages. A tool that downloads all the dependencies into one simple HTML file seems to answer some of the questions people have been asking here. But like I said, I don’t need to do this all that often.
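
From what I remember of the README, the dynamic-loading problem can be worked around by letting a headless browser render the page first and piping the resulting DOM into Monolith. I’m quoting the flags from memory, so check monolith --help before relying on them:

```
# Let headless Chromium run the JavaScript, then hand the rendered
# DOM to monolith for embedding (flags from memory -- verify them)
chromium --headless --incognito --dump-dom "https://example.com/app" \
  | monolith - -b "https://example.com/app" -o app.html
```

(The - should make it read from stdin, and -b should supply the base URL so relative asset links still resolve.)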

Interesting approach – they simply embed all the referenced CSS, JavaScript, and image data in the HTML. The result can become quite huge and probably isn’t manageable with the usual tools, though. And fonts … those can’t be embedded.
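
To make that concrete, the trick in miniature: an external image reference gets rewritten as a base64 data: URI (logo.png is just a stand-in file name):

```
# Roughly what monolith does to <img src="logo.png">:
# inline the bytes as a data: URI so the HTML has no external reference
b64=$(base64 < logo.png | tr -d '\n')
printf '<img src="data:image/png;base64,%s">\n' "$b64"
```

Which is also why the files balloon – base64 adds roughly a third on top of every embedded asset.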