[ https://issues.apache.org/jira/browse/NUTCH-1331?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13535970#comment-13535970 ]

Julien Nioche commented on NUTCH-1331:
--------------------------------------

Any objections, or shall I commit this new plugin?
                
> limit crawler to defined depth
> ------------------------------
>
>                 Key: NUTCH-1331
>                 URL: https://issues.apache.org/jira/browse/NUTCH-1331
>             Project: Nutch
>          Issue Type: New Feature
>          Components: generator, parser, storage
>    Affects Versions: 1.4
>            Reporter: behnam nikbakht
>         Attachments: NUTCH-1331.patch, NUTCH-1331-v2.patch
>
>
> There is a need to limit the crawler to a defined depth. The importance of 
> this option is to avoid crawling the infinite loops of dynamically generated 
> URLs that occur on some sites, and to help the crawler select important 
> URLs.
> One option is to define an iteration limit on the generate, fetch, parse, 
> updatedb cycle, but that works only if, in each cycle, all unfetched URLs 
> become fetched (without recrawling them, and with some other considerations).
> Instead, we can define a new field in CrawlDatum, named depth, and, like the 
> scoring-opic algorithm, compute the depth of each link after parse; generate 
> then selects only URLs with a valid depth.
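
The proposed mechanism can be sketched roughly as follows. This is a hypothetical illustration, not the actual patch or Nutch API: the class, method names, and the `MAX_DEPTH` constant (standing in for a configuration property) are assumptions. It only shows the two rules the description implies, incrementing depth for outlinks at parse time, and filtering by depth at generate time.

```java
// Hypothetical sketch of depth-limited crawling (not real Nutch classes).
public class DepthLimiter {

    // Assumed stand-in for a configurable limit (e.g. a nutch-site.xml property).
    static final int MAX_DEPTH = 3;

    // After parse: each outlink inherits the parent's depth plus one,
    // analogous to how scoring-opic propagates score to outlinks.
    static int outlinkDepth(int parentDepth) {
        return parentDepth + 1;
    }

    // In generate: select only URLs whose recorded depth is within the limit.
    static boolean eligibleForFetch(int depth) {
        return depth <= MAX_DEPTH;
    }

    public static void main(String[] args) {
        int seed = 0;                         // seed URLs start at depth 0
        int level1 = outlinkDepth(seed);      // direct outlinks of a seed
        int level4 = outlinkDepth(outlinkDepth(outlinkDepth(level1)));
        System.out.println(eligibleForFetch(level1)); // true
        System.out.println(eligibleForFetch(level4)); // false
    }
}
```

In the real implementation the depth value would live in the CrawlDatum (or its metadata) so that updatedb can persist it across cycles; a loop of dynamically generated URLs then stops expanding once its pages exceed the limit.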

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
