No Content Error when search terms are present

I am creating a plugin for the American Chemical Society, but unlike the example in the documentation's tutorial, I want to search across all of its publications, not just the ACS website.

So far I have come up with the following:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
	<string>Searches across all ACS Publications.</string>
	<string>ACS Plugin</string>
	<string>&lt;nav aria-label="Pagination" class="pagination"&gt;</string>
	<string>&lt;ul class="rlist search-result__body items-results "&gt;</string>
	<string>ACS-All Publications</string>
	<string>american,society,chemical,acs,issn,chemistry,volume,number,log, journals,cas,eni,join,network,store,about,contact,meeting,meetings,careers,membership,networks,policy,funding,awards,press,room,pacs,immediate,release,publications,presspac,news,items,this,edition,service,weekly,education,governance,journal,briefings,reporting,releases,article,episode,podcasts,inquiries,peer,reviewed,podcast,member,doi</string>
</plist>

This seems to work well for my search query (Cannabis OR Cannabinoid); however, when I inspected the log, many entries have the “no content” error. On further inspection, the pages behind these entries do contain the query terms.

How does one make sure that DA can “see” the webpage?
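One way to check what a plugin can “see” is to test whether the scanner markers from the plist actually occur in the HTML the server sends. A minimal sketch below does that with plain string matching; the two markers are copied from the plugin above, but which one delimits the start versus the end of the results block is my assumption, since the `<key>` names were lost in the post.

```python
# Check whether the plugin's scanner markers appear in a page's HTML.
# If either marker is missing, there is no results region to scan,
# which would show up as a "no content" error.

START_MARKER = '<ul class="rlist search-result__body items-results ">'  # from the plugin (assumed start)
END_MARKER = '<nav aria-label="Pagination" class="pagination">'         # from the plugin (assumed end)

def extract_results(html):
    """Return the HTML between the two markers, or None if either is missing."""
    start = html.find(START_MARKER)
    if start == -1:
        return None
    end = html.find(END_MARKER, start)
    if end == -1:
        return None
    return html[start + len(START_MARKER):end]

# Synthetic page: both markers are present, so extraction succeeds.
page = (
    '<html><body>'
    '<ul class="rlist search-result__body items-results ">'
    '<li>Result 1</li>'
    '<nav aria-label="Pagination" class="pagination">...</nav>'
    '</body></html>'
)
print(extract_results(page))  # → <li>Result 1</li>
```

If this returns None for pages that visibly contain the query terms in a browser, the markup served to a non-browser client differs from what the plugin expects.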

I just tried this and can reproduce the issue. This website seems to return HTTP 403 (“Forbidden”) errors in some cases; when that happens, DEVONagent cancels all further attempts to query the host. A higher value for CrawlDelay or a reduced number of concurrent connections (see Preferences > Search) might fix this.
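For experimenting with request spacing outside of DEVONagent, a minimal rate limiter along these lines enforces a delay between successive requests to the host. This is only a sketch of the idea behind CrawlDelay, not DEVONagent's implementation, and the search URL is a placeholder.

```python
import time

class RateLimiter:
    """Enforce a minimum delay between successive requests to a host."""

    def __init__(self, min_delay):
        self.min_delay = min_delay
        self._last = 0.0

    def wait(self):
        """Sleep until at least min_delay seconds have passed since the last call."""
        now = time.monotonic()
        remaining = self.min_delay - (now - self._last)
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()

limiter = RateLimiter(min_delay=1.0)  # roughly "CrawlDelay = 1 second"
for url in ["https://pubs.acs.org/action/doSearch?AllField=cannabis"] * 3:
    limiter.wait()
    # the actual fetch of url would go here; iterations are spaced >= 1 s apart
```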

Taking these steps:

  • Reducing concurrent connections to 2
  • Increasing CrawlDelay to 1000 units

Still results in the same outcome:

  • The first 2 log entries are results with links
  • The 3rd entry has a 403 error
  • The following entries have the “no content” error
  • The last entry has a 403 error

Could this be a user-agent issue?
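The user-agent hypothesis can be tested outside of DEVONagent by requesting the same URL with two different User-Agent headers and comparing status codes. A sketch using only the Python standard library; the browser user-agent string is just an example, and the commented-out URLs are placeholders.

```python
import urllib.request
import urllib.error

def status_for(url, user_agent):
    """Fetch url with the given User-Agent and return the HTTP status code."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request, timeout=30) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code  # e.g. 403 when the server refuses this client

BROWSER_AGENT = (  # an example browser-like user-agent string
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) "
    "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.0 Safari/605.1.15"
)

# Example (requires network access):
# url = "https://pubs.acs.org/action/doSearch?AllField=cannabis"
# print(status_for(url, "Python-urllib/3.11"))  # script-style agent
# print(status_for(url, BROWSER_AGENT))         # browser-like agent
```

If the script-style agent gets 403s while the browser-like one does not, the server is filtering on the User-Agent header.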

On a side note, it would be nice to be able to set the maximum number of connections per search set instead of globally.

That’s one possibility; another is that these are (currently) temporary server issues.
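If the failures really are temporary, retrying with exponential backoff usually gets through once the server recovers. A minimal sketch of the pattern, with a deliberately flaky stand-in for the real fetch; the retry count and base delay are arbitrary.

```python
import time

def fetch_with_retries(fetch, retries=5, base_delay=1.0):
    """Call fetch() until it succeeds, sleeping with exponential backoff.

    fetch is any zero-argument callable that raises on a transient
    failure (e.g. an HTTP 403 or 5xx response)."""
    for attempt in range(retries):
        try:
            return fetch()
        except Exception:
            if attempt == retries - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...

# A fetch that fails twice and then succeeds, to show the retry behaviour.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("HTTP 403")
    return "page content"

print(fetch_with_retries(flaky_fetch, base_delay=0.01))  # → page content
```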