Is it a good idea to sync 30 GB / 100,000 files?

Hello,

I read in some blog comments that a user had problems syncing a large number of files to DTTG. I am currently evaluating DT and DTTG, syncing over a Synology NAS WebDAV server.

Secure and solid file synchronization is absolutely essential for me.

Are there any opinions or experiences concerning the synchronization of 30 GB / 100,000 files? The files change regularly, but only a few of them at a time.

Thanks for any help!

100,000 files should be fine. The biggest problems usually occur when single files contain a very large amount of text (the file size of e.g. a PDF doesn’t matter; what counts is the number of words inside the file, since those go into the index).
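If you want a rough feel for how much text a given file would contribute to the index, a quick Python sketch like the one below can count the words in a PDF. This uses pypdf and a hypothetical file name of my own choosing; it has nothing to do with DEVONthink’s actual indexer, it just gives a ballpark figure.

```python
from pypdf import PdfReader

# Rough word count for a PDF: a proxy for how much text an indexer
# would have to process, not DEVONthink's actual metric.
reader = PdfReader("sample.pdf")  # hypothetical file name
words = 0
for page in reader.pages:
    text = page.extract_text() or ""  # may be empty for image-only pages
    words += len(text.split())

print(f"{len(reader.pages)} pages, roughly {words} words")
```

A long text-heavy PDF can easily contain several hundred thousand words while being only a few MB on disk, which is why such files weigh much more heavily on the index than their file size suggests.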

OK, thanks for the info. I indexed a part of the files (60,000) to see what happens. After a few hours it seems to run very well, and updates are synchronized within a few minutes. Great!

The only thing that makes me nervous is the search box, which still shows a high number of objects to be indexed - the number falls slowly, only to rise again later on.

I’ll do some testing on my PDFs as well (they are only a few MB each, but up to 1,000 pages of text).