One way to do it is to pass the value as a spider argument, store it on the spider, and then read it from your middleware via crawler._spider.PROXY_NUM, though I don't think that is the best way.

On Tuesday, December 31, 2013 4:58:50 PM UTC-8, Ali Bozorgkhan wrote:
>
> Hi,
>
> I know how to pass parameters to scrapyd in order to set settings or using 
> a parameter in my spider like:
>
> curl http://localhost:6800/schedule.json -d project=crawly -d 
> spider=my_spider -d setting=CONCURRENT_REQUESTS=20 -d 
> setting=CONCURRENT_REQUESTS_PER_DOMAIN=20 -d start_id=100
>
> with which I override settings.CONCURRENT_REQUESTS and 
> settings.CONCURRENT_REQUESTS_PER_DOMAIN and pass start_id to my spider. 
> What I want to do is pass a parameter to my proxy middleware. I tried 
> '-d setting=PROXY_NUM=10', but it looks like -d setting only overrides 
> known settings, not custom ones.
>
> Is there any way to do this?
> Cheers,
> Ali
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.