Markus Jelsma commented on NUTCH-2730:

Hello [~wastl-nagel]!

Yes, Crawler-Commons is definitely the better place. I patched it in Nutch for 
now so our crawler can deal with this weirdness. 

Once it is fixed in CC and you upgrade Nutch's dependency, please close this 
ticket :)


> SitemapProcessor to treat sitemap URLs as Set instead of List
> -------------------------------------------------------------
>                 Key: NUTCH-2730
>                 URL: https://issues.apache.org/jira/browse/NUTCH-2730
>             Project: Nutch
>          Issue Type: Improvement
>          Components: sitemap
>    Affects Versions: 1.15
>            Reporter: Markus Jelsma
>            Assignee: Markus Jelsma
>            Priority: Minor
>             Fix For: 1.16
>         Attachments: NUTCH-2730.patch
> https://archive.epa.gov/robots.txt lists 160k sitemap URLs, which is absurd! 
> Almost 160k of them are duplicates; there are no friendly words to describe 
> this astonishing fact.
> And although our Nutch chews through this list locally in 22 seconds, the big 
> job on Hadoop fails for some weird reason, even though it also handles a lot 
> more.
> Maybe this is a problem, maybe it isn't. Either way, treating the URLs as a 
> Set instead of a List makes sense.
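The deduplication the ticket proposes can be sketched as below. This is a minimal illustration, not the actual SitemapProcessor code: the class and method names are hypothetical, and only the core idea is shown, collecting sitemap URLs into a `LinkedHashSet` so duplicates from robots.txt are dropped while insertion order is preserved.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

/**
 * Sketch of the Set-based deduplication suggested in NUTCH-2730.
 * Class and method names are illustrative, not Nutch's real API.
 */
public class SitemapDedup {

    /**
     * Collects raw sitemap URLs into a LinkedHashSet: duplicate entries
     * (e.g. the same sitemap listed thousands of times in robots.txt)
     * collapse to one, and the original discovery order is kept.
     */
    static Set<String> dedupSitemapUrls(List<String> rawUrls) {
        return new LinkedHashSet<>(rawUrls);
    }

    public static void main(String[] args) {
        // The robots.txt from the ticket listed ~160k Sitemap: lines,
        // almost all of them duplicates of a handful of URLs.
        List<String> raw = List.of(
                "https://example.org/sitemap.xml",
                "https://example.org/sitemap.xml",
                "https://example.org/sitemap2.xml");
        Set<String> unique = dedupSitemapUrls(raw);
        System.out.println(unique.size()); // prints 2
    }
}
```

With a Set, the per-host work becomes proportional to the number of distinct sitemaps rather than the number of `Sitemap:` lines in robots.txt.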

This message was sent by Atlassian Jira
