How many connections can DEVONagent use?

I love the idea of deep searches and never want to wait for them to finish. I have a very good idea of the volume of work I’m asking of the tool, and nonetheless I want to push the limits. I have a brand-new M3 Max with 64 GB RAM. My home internet connection is 1 Gbps fibre; realistically I get 500 Mbps down.

The only control I seem to have is “connections” – how far can I push it usefully? I’ve set it to 40, and in the past 3+ hours it has seen 120k files. What’s the useful number of connections: 100? More? I assume more connections means more files fetched in parallel. Looking at the network interface, we’re not even hitting 1 MB/s.
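For context, my mental model of the setting is a simple cap on in-flight requests. The sketch below is a guess at the general mechanism, not DEVONagent’s actual code (aiohttp and the URL list are stand-ins): a semaphore of size N means at most N pages downloading at once.

```python
# A guess at the general pattern behind a "connections" setting, not
# DEVONagent's actual code: a semaphore of size N caps in-flight requests.
import asyncio

import aiohttp  # stand-in HTTP client; any async client works the same way


async def fetch(session: aiohttp.ClientSession,
                sem: asyncio.Semaphore, url: str) -> bytes:
    async with sem:                      # waits once N fetches are in flight
        async with session.get(url) as resp:
            return await resp.read()


async def crawl(urls: list[str], connections: int = 40) -> list[bytes]:
    sem = asyncio.Semaphore(connections)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, sem, u) for u in urls))
```

If that model is right, raising the cap only helps while the network and the remote servers can keep all N slots busy; an idle interface suggests something else is pacing the requests.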


40 is actually the maximum per search. What’s the average network traffic (see Activity Monitor.app) and CPU usage while DEVONagent is searching? Actual network traffic depends, of course, also on your ISP, the host server, and all the other servers in between.

Please note that by default DEVONagent honors robots instructions (robots.txt) provided by websites, and this might throttle the network traffic a lot. See Preferences > Search.
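As a rough illustration of why that can dominate: a single Crawl-delay directive caps the per-host request rate no matter how many connections are configured. A minimal sketch using Python’s standard robotparser (placeholder host and user-agent name; this is not DEVONagent’s implementation):

```python
# Minimal sketch: how a robots.txt Crawl-delay throttles a polite crawler.
# Placeholder host and user agent; not DEVONagent's implementation.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()

if rp.can_fetch("MyCrawler", "https://example.com/some/page"):
    delay = rp.crawl_delay("MyCrawler")  # seconds between requests, or None
    # With "Crawl-delay: 10" this host allows at most 6 pages per minute,
    # regardless of CPU speed, RAM, or the size of the connection pool.
```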

Thanks for the reply. The whole runtime was 4–5 hours; I only glanced at Activity Monitor occasionally.

  • Network traffic never even hit 1 Mb/s.
  • CPU: the highest spike was 33%, and that was on an efficiency core.

I get that robots.txt might have some effect; however, it’s crawling 10 sites in parallel, so that shouldn’t matter overall.

Is there any reason not to set the hard limit to 150 and then put up a warning that anything over 40 may harm your network/CPU performance? That would allow those of us with modern systems to search much faster.

Is there any reason not to set the hard limit to 150

Why would it be set this high, especially if “anything over 40 may harm your network/CPU performance”? Why not 60? Or 75?

Because I don’t seriously think it will have any effect on either. Having done this before on a plain 13-inch M1, I can say that 40 connections never stressed any part of my system: not CPU, RAM, or network.

Why not let users with systems capable of going much faster make use of that performance?

If you need an experiment, feed me a special build and I’ll explore where the boundary is.

That would be something for development to assess.

Given the low network and CPU usage, the robots instructions might indeed be the main bottleneck, at least if all 10 sites throttle bot access that way.
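A back-of-envelope check makes that plausible. The per-site delay below is an assumption, not a measured value; only the 120k files in roughly 3 hours comes from the report above.

```python
# Assumed numbers, not measurements: could a per-site Crawl-delay alone
# explain ~120k files in ~3 hours with an otherwise idle network?
sites = 10
assumed_delay_s = 1.0                    # hypothetical Crawl-delay per site
ceiling_per_s = sites / assumed_delay_s  # 10 pages/s across all sites

observed_per_s = 120_000 / (3 * 3600)    # ~11.1 files/s from the report
print(ceiling_per_s, round(observed_per_s, 1))
```

A one-second per-site delay already puts the ceiling in the same ballpark as the observed rate, which would explain why neither the network nor the CPU ever gets busy, no matter how many connections are allowed.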