Did you make it? I ran into a similar problem here. I was using Scrapy with unstable proxies too, and the spider always closed earlier than expected on all kinds of network exceptions. I'm curious how you managed it. Can you give me some help here? Thanks :)
On Thursday, April 4, 2013 at 9:44:22 PM UTC+8, lin di wrote:
>
> Sorry to disturb you guys.
>
> Finally, I found that the process_exception function in a downloader
> middleware can capture it. If that doesn't work, I will try hooking the
> spider_error signal.
>
> On Thursday, April 4, 2013 at 9:40:27 PM UTC+8, lin di wrote:
>>
>> I use free proxies from http://www.freeproxylists.net/. When one IP
>> gets an error, I need to switch to the next low-latency IP.
>>
>> There are several kinds of errors, like:
>>
>> "failed 1 times): An error occurred while connecting: 104:
>> Connection reset by peer"
>> "Connection to the other side was lost in a non-clean fashion."
>> "Connection was refused by other side: 111: Connection refused"
>> "User timeout caused connection failure"
>>
>> How can I capture them?
>>
>> Thanks in advance.
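For anyone landing here later, this is a minimal sketch of the process_exception approach described above. The middleware name, module path, priority, and hard-coded proxy list are placeholders for illustration, not from the original thread; swap in your own pool (e.g. one scraped from freeproxylists.net).

    import itertools

    from twisted.internet.error import (
        ConnectError,
        ConnectionLost,
        ConnectionRefusedError,
        TimeoutError,
    )

    class ProxyRetryMiddleware:
        # Twisted exceptions behind the error strings quoted above:
        #   "An error occurred while connecting: 104: Connection reset by peer"
        #   and "Connection was refused by other side: 111"
        #       -> ConnectError (ConnectionRefusedError is a subclass of it)
        #   "Connection to the other side was lost in a non-clean fashion."
        #       -> ConnectionLost
        #   "User timeout caused connection failure"
        #       -> TimeoutError
        HANDLED = (ConnectError, ConnectionLost, ConnectionRefusedError, TimeoutError)

        def __init__(self):
            # Hypothetical hard-coded pool for the sketch; in practice,
            # load the list from your proxy source and sort by latency.
            self._pool = itertools.cycle(
                ["http://203.0.113.10:8080", "http://203.0.113.11:3128"]
            )

        def process_exception(self, request, exception, spider):
            if isinstance(exception, self.HANDLED):
                spider.logger.info(
                    "proxy failed (%r), retrying %s", exception, request.url
                )
                # Re-issue the same request through the next proxy in the
                # pool; returning a Request reschedules it for download.
                retry = request.replace(dont_filter=True)  # bypass dupe filter
                retry.meta["proxy"] = next(self._pool)
                return retry
            return None  # fall through to other middlewares / default handling

Enable it in settings.py ("myproject" is a placeholder):

    DOWNLOADER_MIDDLEWARES = {
        "myproject.middlewares.ProxyRetryMiddleware": 560,
    }

Note that Scrapy's built-in RetryMiddleware already retries most of these exceptions; the point of a custom middleware like this is to also rotate the proxy on each failure.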
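And a sketch of the spider_error fallback mentioned in the thread, written as an extension (the class name and module path are again placeholders). One caveat: spider_error fires when an exception escapes a spider callback, not on download failures, so for dead proxies process_exception (or a per-request errback) is the right hook.

    from scrapy import signals

    class SpiderErrorReporter:
        @classmethod
        def from_crawler(cls, crawler):
            ext = cls()
            crawler.signals.connect(ext.on_spider_error, signal=signals.spider_error)
            return ext

        def on_spider_error(self, failure, response, spider):
            # Receives the twisted Failure, the response whose callback
            # raised, and the spider instance.
            spider.logger.error(
                "callback failed for %s: %s", response.url, failure.getErrorMessage()
            )

    # settings.py ("myproject" is a placeholder):
    # EXTENSIONS = {"myproject.extensions.SpiderErrorReporter": 500}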