Logs and Plugin Development

Is there an application “defaults” key to enable a debug mode that dumps every attempt made, rather than just what the Log tab shows about why result pages were filtered?

Currently I use HTTPScoop (highly recommended for any HTTP protocol work, BTW) to monitor all the traffic and figure out how wide a net my search query is casting, but it would be handy if I could get this without having to continually switch between applications.
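By “defaults” key I mean a hidden preference I could flip on, something that would end up in the application’s preferences plist along these lines (the key name here is entirely made up; I’m only illustrating the idea):

<!-- Hypothetical hidden preference; this exact key is invented for illustration
     and does not exist in DEVONagent as far as I know -->
<key>DebugLogAllAttempts</key>
<true/>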

The “Log” tab lists all (possible) downloads and why they were filtered; there are no additional attempts. Or do you want to know why links on a result page were ignored by an XML plugin?

The latter. It’s not exceedingly painful to develop plugins with HTTPScoop as my tool for mapping the breadth of an executed search, but I was simply wondering whether an application defaults key existed that might show everything, which would save me from constantly switching back and forth between DA and HTTPScoop.

If a link on a results page is ignored, then no HTTP request is sent and HTTPScoop won’t see anything at all. If a link is not ignored and a request is sent, then it should be listed in either the “Log” or the “Results” tab.

What you’re probably looking for, then, is for the “Log” tab to also list the results shown in the “Results” tab, right?

Yes. The breadth of the search covers both the results and the filtered pages; I use HTTPScoop so I can examine the collective whole, because the lists are currently segmented across two tabs.

A hypothetical implementation of what I’m asking for would look something like this:

Change the column header from “Error” to “Status” and add a new value that represents a result page, leaving the rest of the view the same. A setting could default to showing filtered pages only, since the Results tab is where most people spend their time, but optionally the Log tab could display a full log with both results and filtered pages shown in the same fashion. Clicking the column header could sort it alphabetically or reverse-alphabetically.

This would make it much easier to see the collective whole of what was attempted, in an effort to identify a better granularity for link filtering. You could quickly spot common areas of a website that the plugin isn’t following and that you might want to add to LinksMatching (note: I’m implying that you might be altering the text and link start/stop markers between updates to harvest new links, so there’s a bit of “let’s try it and see where we go” happening here). Or you might notice you’re getting matches in an area of a website with less signal and more noise, which is poisoning your plugin’s effectiveness. There are probably more reasons, but in the end I wind up staring at a glorified packet sniffer to obtain that list, and it isn’t as effective as it could be if DEVONagent were capable of repurposing its data in this fashion.
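For context, this is roughly the slice of a plugin I keep tweaking between runs. LinksMatching is the only key named above; the other key names are written from memory and may not be exact, and the site and marker values are just placeholders:

<!-- Fragment of a search plugin plist, link-related fields only.
     LinksMatching is the key discussed above; the start/stop marker key
     names are approximate, and all values are placeholders. -->
<key>LinksMatching</key>
<array>
    <string>*example.com/articles/*</string>
    <string>*example.com/archive/*</string>
</array>
<key>LinksStart</key>
<string>&lt;div id="results"&gt;</string>
<key>LinksEnd</key>
<string>&lt;div id="footer"&gt;</string>

A combined Log view would tell me quickly whether patterns like these are too narrow (useful areas filtered out) or too broad (noisy areas getting through).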