Don't worry about the crawler manager. Someone already emailed me a solution, 
which is signals: http://doc.scrapy.org/en/latest/topics/signals.html 

The manager I'm building is basically a dispatcher that will fire up scrapyd 
instances on Amazon EC2, set them running the spider, and then terminate the 
instance when the crawl is done.

On Wednesday, August 20, 2014 5:31:48 PM UTC+1, Nicolás Alejandro Ramírez 
Quiros wrote:
>
> You have to be more specific about your "crawler manager"; there are 
> several channels of communication you can use. You should check out scrapyd, 
> which is our "crawler manager".
>
> On Wednesday, August 20, 2014 12:39:19 UTC-3, 
> le...@lewismcmahon.co.uk wrote:
>>
>> I want to be able to notify my crawler manager (some software I'm 
>> building) that the crawl has completed so it can close down the AWS 
>> instance it is on. I notice you can do this with an extension, but it seems 
>> a little hacky. Is that the best way to do this currently? If so, should I 
>> look into building something into the spider to do this and sending a pull 
>> request?
>>
>> Lewis
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to scrapy-users+unsubscr...@googlegroups.com.
To post to this group, send email to scrapy-users@googlegroups.com.
Visit this group at http://groups.google.com/group/scrapy-users.
For more options, visit https://groups.google.com/d/optout.
