You are pretty close; you just need to assign it to self.user_input_proxy in your 
spider's __init__, and then read it in the middleware as spider.user_input_proxy 
(process_request receives the spider instance as its second argument).
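A minimal sketch of the whole pattern (the class and middleware names here are placeholders; in a real project MySpider would subclass scrapy.Spider, the middleware would live in my_middleware.py, and you would enable it via DOWNLOADER_MIDDLEWARES in settings.py):

```python
# Sketch only: `object` stands in for scrapy.Spider so the pattern
# is visible without a Scrapy install.

class MySpider(object):
    name = "my_spider"

    def __init__(self, proxy=None, *args, **kwargs):
        # `scrapy crawl my_spider -a proxy=host:port` arrives here
        # as the `proxy` keyword argument; stash it on the instance.
        self.user_input_proxy = proxy


class ProxyMiddleware(object):
    def process_request(self, request, spider):
        # The middleware gets the spider instance, so it can read
        # the argument back off it. getattr guards against spiders
        # that never set the attribute.
        proxy = getattr(spider, "user_input_proxy", None)
        if proxy:
            request.meta["proxy"] = "http://%s" % proxy
        return None
```

Note the getattr fallback: it lets the same middleware run for spiders that take no proxy argument at all.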

On Sunday, February 8, 2015 at 2:15:32 PM UTC-8, Sungmin Lee wrote:
>
> Hi,
>
> Is there any way to take a proxy address as an argument, something like,
> $ scrapy crawl my_spider -a proxy=xxx.xxx.xxx.xxx:xxxx
>
> The user argument comes into the spider class, and I wonder if there is a way 
> to pass it down to the custom middleware level (a proxy middleware).
>
> The proxy from the user argument should apply to the process_request 
> method in the middleware. 
>
> #my_spider.py
> class MySpider(Spider):
>     def __init__(self, proxy="", *args, **kwargs):
>         user_input_proxy = self.proxy
>
> # my_middleware.py
> def process_request(self, request, spider):
>     request.meta['proxy'] = user_input_proxy
>
>
> Thanks all.
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.