Sure, you can remove the check from the code and recompile.
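For context, Nutch delegates robots.txt handling to the crawler-commons library, and the fetcher simply skips URLs that the parsed rules disallow, so "removing the check" boils down to feeding it allow-all rules instead. Below is a minimal sketch using plain crawler-commons (class and method names are from memory and the parseContent signature varies between versions, so check against the version your Nutch build ships with):

  import crawlercommons.robots.BaseRobotRules;
  import crawlercommons.robots.SimpleRobotRules;
  import crawlercommons.robots.SimpleRobotRules.RobotRulesMode;
  import crawlercommons.robots.SimpleRobotRulesParser;

  public class RobotsCheckSketch {
      public static void main(String[] args) {
          byte[] robotsTxt = "User-agent: *\nDisallow: /private/\n".getBytes();

          // Parse robots.txt the way Nutch's protocol plugins do, via crawler-commons.
          SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
          BaseRobotRules rules = parser.parseContent(
                  "http://example.com/robots.txt", robotsTxt,
                  "text/plain", "mycrawler");

          // This is the kind of check the fetcher applies before fetching a URL.
          System.out.println(rules.isAllowed("http://example.com/private/page.html")); // false

          // "Removing the check" is equivalent to substituting allow-all rules.
          BaseRobotRules allowAll = new SimpleRobotRules(RobotRulesMode.ALLOW_ALL);
          System.out.println(allowAll.isAllowed("http://example.com/private/page.html")); // true
      }
  }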

Under what circumstances would you need to ignore robots.txt? Would something like having the site allow access for particular IPs or user agents be an alternative?

Tom


On 29/11/16 04:07, jyoti aditya wrote:
Hi team,

Can we use Nutch to do impolite crawling?
Or is there any way by which we can disobey robots.txt?


With Regards
Jyoti Aditya

