I think one neat way is to write everything to a single file, which 
scrapy's feed exports can do trivially, and then hook 
on http://doc.scrapy.org/en/latest/topics/signals.html#spider-closed to do 
a grep-like pass that splits the output by item type. It's more of a 
batch approach, and it respects the "pipeline" design of scrapy better 
than an item processing pipeline that saves to 3 files, which would mean 
a lot of manual code.
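Something like this, as a rough sketch (untested) -- it assumes the feed 
export writes JSON lines to items.jl and that each item carries a 
"_type" field naming its class; both names are placeholders you'd adapt 
to your project:

import json

from scrapy import signals


class SplitByTypeExtension(object):

    @classmethod
    def from_crawler(cls, crawler):
        ext = cls()
        # run the split once the crawl finishes
        crawler.signals.connect(ext.spider_closed,
                                signal=signals.spider_closed)
        return ext

    def spider_closed(self, spider):
        # grep-like pass: route each line of the combined feed
        # to a per-type file based on its "_type" field
        outputs = {}
        with open('items.jl') as combined:
            for line in combined:
                item_type = json.loads(line).get('_type', 'unknown')
                if item_type not in outputs:
                    outputs[item_type] = open(item_type + '.jl', 'w')
                outputs[item_type].write(line)
        for f in outputs.values():
            f.close()

You'd enable it with something like 
EXTENSIONS = {'myproject.extensions.SplitByTypeExtension': 500} 
in settings.py.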

On Thursday, June 16, 2016 at 6:37:35 PM UTC+1, Felipe Eltermann wrote:
>
> Hello all, 
>
>
> I developed a spider that yields 3 kinds of items, let's say 
> TypeOneItem, TypeTwoItem and TypeThreeItem. 
> I'd like to be able to execute the crawl and collect the results in 3 
> different files (one for each item type). 
>
> What is the proper way to do this? 
>
