You may be experiencing data loss through your proxy: Scrapy flags a response
as ['partial'] when it receives fewer bytes than the Content-Length header
announced, i.e. the body is truncated.
How do you know that the crawl fails? Does the log print a traceback?
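For what it's worth, the flag shows up in response.flags, so you can detect it
in your callback and re-queue the request. Below is a minimal sketch, not your
actual spider; the spider name, URL, and the simple "retry once more" policy
are just illustrative assumptions:

import scrapy


class ProxySpider(scrapy.Spider):
    name = "proxy_check"                    # hypothetical spider name
    start_urls = ["http://example.com/"]    # placeholder URL

    def parse(self, response):
        # Scrapy adds 'partial' to response.flags when the downloaded body
        # is shorter than the Content-Length the server promised.
        if "partial" in response.flags:
            self.logger.warning("Partial response for %s, retrying", response.url)
            # Re-issue the same request; dont_filter=True bypasses the
            # duplicate filter so the retry is not dropped.
            yield response.request.replace(dont_filter=True)
            return
        # ... parse the complete response here ...

If the partial responses only happen through certain proxies, that points at
the proxies themselves dropping connections mid-transfer.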


On Saturday, 4 January 2014 15:16:45 UTC+2, Shivkumar Agrawal wrote:
>
> Hi,
>
> I have created a crawler in Scrapy that uses a list of proxies. I have
> 3-4 sites to crawl daily. During crawling, the Scrapy logs show a
> ['partial'] flag and the crawl fails on those requests. I have spent a
> lot of time googling with no luck.
> Can anybody help with this?
>

