I ought to know this but I don’t. Nor do I want to risk experimenting.
The more I use DT 3, the more I marvel at its functionality.
I enjoy making the most of DT, so much so that I've almost (though not quite: I'd never rush into that) begun to go through my meticulously named and ordered Group structure to take advantage of Replication.
When I genuinely believe that a Group (and its documents) should rightfully be accessible in two places (not just to 'cover all bases' by putting records/files/data in two places because I can), I successfully Replicate it.
But I can't see a way to undo Replication when I realise I've made a mistake. Do I just delete the second ('target') items? Or is there a way to mark the first ('source') Group(s)/files/docs as having a Replicate to… destination of nowhere/null/clear, please?
So I can safely just navigate to the Group into which DT conceptually copied the Groups/docs (the 'target', so to speak) and move them to the Trash?
I'm OK with that, thanks, because where I Replicated the 'originals' from is fresh in my mind.
There's no such thing as an original; all replicants reference exactly the same item internally. Therefore it's only important not to delete all of the replicants: one has to remain.
And nowadays, when you look at the contents of the Trash, you can see whether a copy remains or not: items that are crossed through will be deleted permanently, while items that are not crossed through still have a replicant somewhere in the database (and that replicant will not be removed when you empty the Trash).
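That mental model (replicants as multiple references to one and the same underlying item, safe to delete so long as at least one reference remains) can be sketched in a few lines of toy Python. To be clear, none of the names below are DEVONthink's actual API; this is purely an illustration of the reference semantics described above:

```python
class Item:
    """The single underlying record that all replicants share."""
    def __init__(self, content):
        self.content = content

class Database:
    def __init__(self):
        # group name -> list of Items referenced in that group
        self.groups = {}

    def add(self, group, item):
        self.groups.setdefault(group, []).append(item)

    def replicate(self, item, target_group):
        # A replicant is just another reference to the *same* Item;
        # nothing is copied.
        self.add(target_group, item)

    def delete_replicant(self, group, item):
        # Removing one reference never destroys the item, as long as
        # another group still references it.
        self.groups[group].remove(item)

    def instances(self, item):
        # How many replicants of this item remain anywhere?
        return sum(items.count(item) for items in self.groups.values())

db = Database()
doc = Item("notes.md")
db.add("Projects", doc)
db.replicate(doc, "Reading")         # now visible in two places

db.delete_replicant("Reading", doc)  # "undo" the replication
print(db.instances(doc))             # 1 -- the item survives in "Projects"
```

Deleting the 'target' replicant leaves the item intact in its other location; only deleting the last remaining replicant would lose the item itself.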
So, following your directive not to think in terms of 'originals': would it be safe to delete only the second file, that is, the replicant which (after Replication) can be seen in the place to which I replicated the first file?
What you could do is set up a test database and just play with that. That's what I do: the database is full of pointless files which I can lose without pain. That way I can test rules and functions and see what happens.
In fact (and I've been a DT addict for only six months), everything about it is so well designed that I've had to do much less dummy testing like that than is usual with such complex software.