The maximum number of running jobs is given by the max_proc and
max_proc_per_cpu config settings:
http://scrapyd.readthedocs.org/en/latest/config.html#max-proc
http://scrapyd.readthedocs.org/en/latest/config.html#max-proc-per-cpu

Once that limit is reached, new jobs stay in the pending queue and start as the
running jobs finish and slots free up.

You can increase those settings to run more jobs in parallel. Is that what you
were asking?
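
For illustration only (the values here are just an example, adjust them to your
machine), something along these lines in your scrapyd.conf would raise the
limits; 16 running jobs would match the default of 4 per CPU on a 4-core box:

    [scrapyd]
    # 0 means no fixed global cap; the effective limit becomes
    # max_proc_per_cpu multiplied by the number of CPUs
    max_proc = 0
    # allow 8 concurrent processes per CPU instead of the default 4
    max_proc_per_cpu = 8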


On Sun, Sep 22, 2013 at 7:48 AM, mourad mourafiq
<[email protected]> wrote:

> Hi guys,
>
> I am a new user of Scrapy. I have managed to write some spiders and some
> custom extensions, and I am using Scrapy with Django and Celery.
>
> My problem is with scrapyd: when I schedule jobs via the API, scrapyd runs
> up to 16 jobs (which is normal with the default settings), but these jobs
> keep running forever, which means that all newly scheduled jobs stay
> pending, waiting for the running jobs to end.
>
> Is this normal behaviour? How can I change their state to finished without
> killing scrapyd?
>
> Thank you very much for your help,
> Mourad
>

