Ken Krugler wrote:
Jeremy Bensley wrote:
There are posts every three or four days to the nutch-agent list regarding bots submitting empty forms to websites. I don't think I've seen any regular devs reply in-list to these issues, and I'm just wondering if these cases are being analyzed.

1. Is there a known (resolved or current) bug regarding Nutch submitting forms? I could find no bug listings in JIRA for this. If it is known and
resolved, what versions of the bot exhibit this behavior?

Yes, there was a discussion on the list about this - I'm afraid this behavior is present in both 0.7.x and 0.8. I'm going to remove the offending code (or make it a configuration option, turned off by default).

I think the biggest issue is following links for a form POST. That definitely seems wrong to me and should never be done.

I don't think this is happening anymore; there is an explicit check for the POST method in DOMContentUtils that should prevent it. However, some horribly broken HTML may be fooling Neko or TagSoup into losing the 'method' attribute, in which case it defaults to GET.
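The check described above might look something like this sketch - hypothetical code, not the actual DOMContentUtils implementation. The key point is that, per the HTML spec, a missing 'method' attribute defaults to GET, so a parser that silently drops the attribute makes a POST form look followable again:

```java
// Hypothetical sketch of the POST check described above; not the
// actual Nutch DOMContentUtils code.
public class FormMethodCheck {

    // A form is a POST form only if its "method" attribute explicitly
    // says so (case-insensitively). A missing or empty attribute
    // defaults to GET per HTML.
    public static boolean isPost(String methodAttr) {
        return methodAttr != null && methodAttr.trim().equalsIgnoreCase("post");
    }

    // Outlink extraction would skip the form's action URL for POST
    // forms, which is exactly the check a parser losing the attribute
    // would defeat.
    public static boolean shouldFollowFormAction(String methodAttr) {
        return !isPost(methodAttr);
    }
}
```

If Neko or TagSoup drops `method="POST"` from mangled markup, `methodAttr` arrives as null here and the form falls through to the GET path.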


There's a separate issue re whether it's OK to follow form links that do a GET, since that's what the guy complained to us about recently. He agreed that his form should be doing a POST, since it triggers a massive build process, but he also said that no other crawler besides Nutch was following these links.

I could see making that a configurable option that's false by default. But we'd probably need to make the setting domain-specific, i.e. some sites we crawl require us to follow these types of links to get at the content, but in general we wouldn't want to follow them.

For now I modified the code to skip form action URLs, depending on a boolean option. I'll commit this in a moment.
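A boolean option gating form-action outlinks could be read off the configuration roughly like this. This is only a sketch; the property name `parser.html.form.use_action` is an assumption on my part, not necessarily what gets committed, and plain `java.util.Properties` stands in for Nutch's Configuration class:

```java
import java.util.Properties;

// Hypothetical sketch of the boolean option mentioned above.
// "parser.html.form.use_action" is an assumed property name, and
// Properties stands in for the Nutch configuration object.
public class FormActionOption {

    // Default false: skip form action URLs unless the crawl operator
    // explicitly opts in.
    public static boolean followFormActions(Properties conf) {
        return Boolean.parseBoolean(
            conf.getProperty("parser.html.form.use_action", "false"));
    }
}
```

With a false default, out-of-the-box crawls stop hitting form actions, while an operator who needs form-reachable content can flip the property for their deployment.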


This brings up an issue I've been thinking about. It might make sense to require everybody to set the user-agent string, instead of it having default values that point to Nutch.

The first time you run Nutch, it would display an error saying the user-agent string is not set, but if the instructions for how to set it were explicit, this wouldn't be much of a hardship for anybody trying it out.
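The proposed startup check could be as simple as the following sketch. It is hypothetical, not existing Nutch code, though `http.agent.name` is the configuration property Nutch uses for the agent name:

```java
// Hypothetical sketch of the fail-fast startup check proposed above:
// refuse to run with an unset agent name instead of silently using a
// default that points to Nutch.
public class AgentNameCheck {

    public static String requireAgentName(String configured) {
        if (configured == null || configured.trim().isEmpty()) {
            // Explicit instructions, so a first-time user knows exactly
            // what to do rather than just seeing a failure.
            throw new IllegalStateException(
                "http.agent.name is not set; edit conf/nutch-site.xml "
                + "and give your crawler a descriptive user-agent string.");
        }
        return configured.trim();
    }
}
```

Failing fast here means that every crawl identifies its actual operator, which is what webmasters hit by the form-submission behavior need in order to contact the right person.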

I could write up some quick text for the Wiki re what a good user agent string should contain, and what should be on the web page that it refers to, since we also went through that same process not too long ago.

I like this idea. I know that I've been guilty of this in the past, out of pure laziness ...

--
Best regards,
Andrzej Bialecki     <><
___. ___ ___ ___ _ _   __________________________________
[__ || __|__/|__||\/|  Information Retrieval, Semantic Web
___|||__||  \|  ||  |  Embedded Unix, System Integration
http://www.sigram.com  Contact: info at sigram dot com

