My most important question is: how can I export the complete list of metadata in some delimited format (e.g. CSV or tab-separated)?
Is it possible to export all of the metadata (especially URLs, labels, etc.) from a database whose images are missing but whose metadata is intact?
Then I could figure out some way to download the images again using that data, if I can get it.
I had a big database that I had stupidly let swell too large between full backups. It contained a great many images and web pages I'd saved.
It was being copied into my main subject database when the copy froze, and I had to reboot.
When I opened the program again, a lot of the images were broken. I made a backup and then tried to repair the database, but it did not work: the images were still broken. I have not been able to find them in either database's "files.noindex" folder, or anywhere else on my hard disk. My existing backup is too old to be useful.
However, I still have the original database. Although thousands of the images themselves are missing, it still contains their metadata, and that is what I want now.
The URLs, labels, filenames, etc. are significant to me; using them I may be able to reconstruct important information. Some of the URLs are fairly long.
Of course, it would be ideal if DTP could refetch ONLY the missing web-based files. But DTP is great; I was just pushing it too hard. I will be happy simply to salvage the info from those several days.
Is there some way to get a list of just the missing files, with their URLs and paths, that I could feed to wget or curl?
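In case it helps anyone suggest an approach: if I can get even a simple tab-separated export, the re-download step seems scriptable. Here is a minimal sketch of what I have in mind, assuming a hypothetical `missing.tsv` file with `filename` and `url` columns — the file name and column names are my assumptions, not anything DTP actually produces:

```python
import csv
import pathlib
import urllib.request

def load_missing(tsv_path):
    """Read a tab-separated metadata export and return (filename, url) pairs.

    Rows without an http(s) URL are skipped, since there is nothing to refetch.
    """
    pairs = []
    with open(tsv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            url = (row.get("url") or "").strip()
            if url.startswith("http"):
                # Fall back to the last URL segment if no filename was recorded.
                name = (row.get("filename") or "").strip() or url.rsplit("/", 1)[-1]
                pairs.append((name, url))
    return pairs

def refetch(pairs, dest="refetched"):
    """Download each URL into dest/, keeping the recorded filename."""
    out = pathlib.Path(dest)
    out.mkdir(exist_ok=True)
    for name, url in pairs:
        target = out / name
        if target.exists():
            continue  # already fetched on an earlier run
        try:
            urllib.request.urlretrieve(url, target)
        except OSError as exc:
            print(f"failed: {url} ({exc})")
```

Equivalently, if the export can be boiled down to a plain list of URLs, one per line, `wget -i urls.txt -P refetched/` would do the same job. But that still leaves my main question: getting the list out in the first place.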