[
https://issues.apache.org/jira/browse/NUTCH-2730?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16911694#comment-16911694
]
Sebastian Nagel commented on NUTCH-2730:
----------------------------------------
Hi [~markus17], should be fixed, no question. I just wonder whether
crawler-commons would be the better place for it - I've opened the issues
[#261|https://github.com/crawler-commons/crawler-commons/issues/261] and
[#262|https://github.com/crawler-commons/crawler-commons/issues/262].
> SitemapProcessor to treat sitemap URLs as Set instead of List
> -------------------------------------------------------------
>
> Key: NUTCH-2730
> URL: https://issues.apache.org/jira/browse/NUTCH-2730
> Project: Nutch
> Issue Type: Improvement
> Components: sitemap
> Affects Versions: 1.15
> Reporter: Markus Jelsma
> Assignee: Markus Jelsma
> Priority: Minor
> Fix For: 1.16
>
> Attachments: NUTCH-2730.patch
>
>
> https://archive.epa.gov/robots.txt lists 160k sitemap URLs, which is absurd!
> Almost all 160k of them are duplicates; there are no friendly words to
> describe this astonishing fact.
> Although our Nutch chews through this list locally in 22s, the big job on
> Hadoop fails for some weird reason, even though that job is also processing a
> lot more. Maybe this is a problem, maybe it isn't. Either way, treating the
> sitemap URLs as a Set rather than a List makes sense.
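The change described above boils down to deduplicating the sitemap URL list parsed from robots.txt before processing it. A minimal sketch of the idea (class and method names here are illustrative, not taken from NUTCH-2730.patch):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class SitemapDedupe {

    // Deduplicate sitemap URLs while preserving first-seen order.
    // LinkedHashSet gives O(1) membership checks and a stable iteration
    // order, so 160k near-identical entries collapse to the unique set.
    static List<String> dedupe(List<String> sitemapUrls) {
        return new ArrayList<>(new LinkedHashSet<>(sitemapUrls));
    }

    public static void main(String[] args) {
        List<String> urls = List.of(
            "https://example.org/sitemap1.xml",
            "https://example.org/sitemap2.xml",
            "https://example.org/sitemap1.xml"); // duplicate entry
        System.out.println(dedupe(urls).size()); // only the unique URLs remain
    }
}
```

Using a `LinkedHashSet` (rather than a plain `HashSet`) keeps the original ordering from robots.txt, so downstream processing sees the same URLs it would have seen before, minus the repeats.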
--
This message was sent by Atlassian Jira
(v8.3.2#803003)