I’m having trouble with duplicates again, and I’m not sure whether I’m asking the same question in yet another variation or whether this is something new. I did find out that part of my previous duplicate problem came from doing something silly when initially importing my databases: I actually ended up with several databases that were themselves duplicates of each other. Anyway, here’s what I’ve been noticing about duplicates lately.
I periodically notice duplicated folders, and to a lesser extent files, which I index from a relatively static Finder folder structure inside my Documents folder or my Desktop on iCloud, where the rest of my material on those topics is stored. Sometimes the folder is duplicated with all of its contents, sometimes with some of its contents, and sometimes it’s just the empty folder, but in every instance it’s an indexed folder that gets duplicated. If there’s any rhyme or reason as to why one happens over another, it’s entirely lost on me.
Anyway, the duplicated files and folders always have the same path as the originals. I’ve tried removing duplicates wholesale with the script, I’ve tried deleting them manually, and I’ve even started that database over: adding individual files and folders back from the janky one, or starting fresh from Finder by re-indexing the folder where all of my stuff lives. I took care to do some prep work first and lay down a basic organization that should be easy enough to maintain and expand later, since I’ve already made the mistake of moving an indexed folder and having it throw everything out of sorts; getting that sorted out again took what felt like forever, if it ever fully was.
Still, these duplicates randomly reappear, and I’m not sure whether it’s something I’m doing. I feel like I notice it in a recently worked-on database, or a recently worked-on portion of a database, though it’s unclear whether that’s simply because that’s where my attention happens to be at the moment or whether it means anything at all. Either way, I ultimately end up with a duplicate file or folder where the only difference between the two is that one has a much more recent “Added” date.
Other than that, all the metadata appears to be exactly the same, though sometimes one of the two sits one folder level deeper in the hierarchy, and I haven’t the foggiest what would cause one versus the other there either. Sometimes it’s one way and sometimes the other, but the otherwise identical metadata is a constant.
The other odd thing I noticed is that if I delete the more recently added file, the other one will often end up showing as missing or unable to be found, which seems counterintuitive to me, since I’d assume the more recent one is the obvious duplicate.
Also, not sure what this has to do with the price of rice, but it just popped into my mind and could be relevant: the duplicates often have different thumbnail icons. For example, with a Word document, one might have the generic Word icon and the other a thumbnail that looks more like the document itself, even though the two files are otherwise exactly identical when compared via Quick Look or simply viewed from within the database.
Has anyone else experienced anything similar, or does anyone have a hunch about what I’m doing wrong or what I could do to kill these duplicates once and for all? I feel like I waste more time taming the dupes than being productive. It’s driving me nucking futs, and it’s highly distracting, making it difficult to tell which file I should be working with at any given moment.
I suppose I could convert them to replicants, but I really have no need or desire for the additional copies in my database at all, and in the event I did eventually want a replicant or a duplicate, I’d much prefer a deliberate, small-batch approach to the present setup.
Sorry for the brain-dump; hopefully someone can make sense of it and steer me right.