On Wednesday, 3 September 2014 11:51:21 UTC+2, Gabriel Birke wrote:
>
> The crawler just sits there waiting instead of terminating. Can anyone 
> point out to me the correct way to collect the deferred objects into one?
>
>
After more research I found the correct solution: defer.DeferredList. 
I made a new version of MultiFeedExporter at the gist address 
(https://gist.github.com/gbirke/abc10c81aca8242b880a). It also fixes a 
reference bug when logging the number of scraped items.
