Hello,
I am new to Scrapy and hope to find some help here. I was using Scrapy with 
unstable proxies, and the spider always closed earlier than expected. How can 
I ignore all these errors and let the spider keep going? I have tried adding 
an errback function to handle HttpError, decreasing the CONCURRENT_REQUESTS 
setting, increasing the DOWNLOAD_TIMEOUT setting, etc., but none of it worked 
in the end. So, can anybody tell me how to handle errors like "User timeout 
caused connection failure"?
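
Here is roughly what I have so far, a minimal sketch of what I tried (the 
spider name, start URL, and exact setting values are placeholders, not my 
real ones):

import scrapy
from scrapy.spidermiddlewares.httperror import HttpError
from twisted.internet.error import TCPTimedOutError, TimeoutError

class ProxySpider(scrapy.Spider):
    name = "proxy_spider"                  # placeholder name
    start_urls = ["http://example.com/"]   # placeholder URL

    custom_settings = {
        "CONCURRENT_REQUESTS": 4,   # lowered from the default 16
        "DOWNLOAD_TIMEOUT": 300,    # raised from the default 180
    }

    def start_requests(self):
        for url in self.start_urls:
            # attach an errback so a failed request is handled
            # instead of ending the crawl
            yield scrapy.Request(url, callback=self.parse,
                                 errback=self.handle_error)

    def parse(self, response):
        self.logger.info("Got %s", response.url)

    def handle_error(self, failure):
        # log and swallow the failure so the spider keeps going
        if failure.check(HttpError):
            self.logger.error("HttpError on %s",
                              failure.value.response.url)
        elif failure.check(TimeoutError, TCPTimedOutError):
            self.logger.error("Timeout on %s", failure.request.url)
        else:
            self.logger.error(repr(failure))

Even with the errback wired up like this, the spider still closes early on 
those timeout errors.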
THANKS :) 
