Have you run a profiler on your code?  What kind of hardware are you
running on?

I'd point you to Paul Graham's well-known advice on optimization:

The key to optimization is profiling. Don't try to guess where your code is
slow, because you'll guess wrong. *Look* at where your code is slow, and
fix that.
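In concrete terms, you could wrap the suspect code in Python's built-in cProfile module and look at the top entries before changing anything. The `slow_work` function below is just a stand-in for whatever your spider actually spends time on:

```python
import cProfile
import io
import pstats

def slow_work():
    # Placeholder for the code you suspect is slow.
    total = 0
    for i in range(100000):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_work()
profiler.disable()

# Sort by cumulative time and print the 10 most expensive calls.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(10)
print(stream.getvalue())
```

If the hot spots turn out to be in network or serialization code rather than your own callbacks, that changes which settings are worth tuning.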



On Wed, Oct 15, 2014 at 9:21 PM, Shravan kumar <shrakum...@gmail.com> wrote:

> I have multiple spiders running in parallel across 4 instances. All of
> them are using almost 100% CPU.
>
> I've deployed them using scrapyd. I tried lowering scrapyd settings such as
> max_concurrent_requests, CONCURRENT_REQUESTS, and
> CONCURRENT_REQUESTS_PER_DOMAIN to their minimum, but no luck.
>
> I'm using Python 2.7.5 and Scrapy 0.24.
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to scrapy-users+unsubscr...@googlegroups.com.
To post to this group, send email to scrapy-users@googlegroups.com.
Visit this group at http://groups.google.com/group/scrapy-users.
For more options, visit https://groups.google.com/d/optout.
