After some trial and error, I was finally able to deploy my crawler to a 
remote EC2 instance using scrapyd.  Details are here: 
http://bgrva.github.io/blog/2014/04/13/deploy-crawler-to-ec2-with-scrapyd/
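
To answer the original question directly: scrapyd only needs to be installed 
and running on the remote machine (machineB); on the dev box (machineA) you 
only need the scrapyd-deploy tool (shipped with scrapyd-client, or bundled 
with older scrapyd releases), plus a deploy target in your project's 
scrapy.cfg pointing at the remote host.  A rough sketch -- the hostname, 
target name, and spider name below are placeholders:

```shell
# On machineB (the remote server): install and start scrapyd.
pip install scrapyd
scrapyd

# On machineA (your dev box): install the client-side deploy tool.
pip install scrapyd-client

# In project1's scrapy.cfg, point a deploy target at machineB, e.g.:
#   [deploy:remote]
#   url = http://machineB:6800/
#   project = project1

# From the project directory on machineA, push the project to machineB,
# then schedule a spider run over scrapyd's HTTP API:
scrapyd-deploy remote -p project1
curl http://machineB:6800/schedule.json -d project=project1 -d spider=myspider
```

The deploy step eggs up the project locally and uploads it over HTTP, so 
machineA never needs scrapyd itself running.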

On Sunday, April 6, 2014 10:53:29 AM UTC-4, Michael Pastore wrote:
>
> Forgive the confusion, but I was hoping that someone could clear up the 
> following.  
>
> I would like to run/manage several spider instances on a remote server 
> using scrapyd, but I am developing the spiders on my local system.  So, if 
> I am developing spiders on machineA and want to deploy and run them using 
> scrapyd on machineB, do I need scrapyd installed on both machines?  For 
> example, the scrapyd-deploy command 
> <http://scrapyd.readthedocs.org/en/latest/deploy.html#deploying-a-project> 
> (scrapyd-deploy scrapyd -p project1) implies you are on the same instance 
> on which scrapyd is running.  If that is the case, then the example command 
> also implies project1 is a scrapy project on that same instance.
>
> Thanks in advance for any clarification.
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
