My eventual solution was to rework the way my watcher scripts execute. I
now use Celery and RabbitMQ, and let pyinotify hand each new job to any
available worker. For my application, 20 or so workers is enough, and
this can scale as needed.
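
Roughly, this is the shape it took. A minimal sketch only, assuming the
Celery app API; the broker URL, watch path, and the handle_file() task
are placeholders rather than my actual code:

    # tasks.py - a Celery task that any available worker can pick up.
    from celery import Celery

    app = Celery('watcher', broker='amqp://guest@localhost//')  # placeholder broker

    @app.task
    def handle_file(pathname):
        # per-file processing goes here
        print('processing %s' % pathname)

    # watcher.py - pyinotify enqueues a job per filesystem event and
    # returns immediately; whichever worker is free picks it up.
    import pyinotify
    from tasks import handle_file

    class Handler(pyinotify.ProcessEvent):
        def process_IN_CLOSE_WRITE(self, event):
            handle_file.delay(event.pathname)

    wm = pyinotify.WatchManager()
    wm.add_watch('/srv/incoming', pyinotify.IN_CLOSE_WRITE)  # placeholder path
    pyinotify.Notifier(wm, Handler()).loop()

The workers run as separate processes, so the watcher itself never blocks
on the actual work.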

I'm still looking for a definitive answer on whether the built-in process
modules really cannot launch and detach a process in the background (a la
"nohup <process> <args> &") and return control to the view.

Thanks for the responses.

-h




On Sep 6, 12:05 pm, kmpm <[email protected]> wrote:
> I have been using Celery some, and from what I gathered, the
> concurrency/workers setting in the config controls how many processes
> are started, at maximum, by the worker that checks the queue, but I
> could be wrong. Another twist on the Celery approach is that you could
> actually have several machines looking at the same config and queue,
> spreading the load among them; if the queue grows too big, you can
> start up even more servers/machines to deal with the increased load
> until things cool down again.
> This is a blog entry I found talking about some of the advantages of
> Celery.  
> http://ericholscher.com/blog/2010/jun/23/large-problems-django-mostly...
>
> /kmpm
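
(For what it's worth, the concurrency cap kmpm describes above is what I
ended up relying on. A minimal sketch, assuming django-celery / Celery 2.x
style setting names, which may differ between versions:

    # settings.py - broker credentials are placeholders.
    BROKER_HOST = "localhost"
    BROKER_PORT = 5672
    BROKER_USER = "guest"
    BROKER_PASSWORD = "guest"
    BROKER_VHOST = "/"

    CELERYD_CONCURRENCY = 20  # max worker processes on this machine

Every machine that runs a worker against the same broker consumes from the
same queue, so adding capacity is just a matter of starting another worker
node.)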
>
> On Sep 6, 7:20 pm, Heath <[email protected]> wrote:
>
> > Thanks! Yes,
>
> > os.system() will launch the process and return control, but then I'd
> > have to write a utility to get the PID and other data about the
> > process.
>
> > I guess I'm looking for a definitive answer that the built-in process
> > modules cannot launch a process in the background.
>
> > Here are my findings:
>
> > subprocess.Popen and multiprocessing offer properties and methods
> > ideal for my task, but won't return to the view until the process
> > ends.
> > Daemonizing these will allow the calling script (Django, in this case)
> > to quit, but still won't return from the view.
>
> > celery/ghettoq/RabbitMQ allows for the creation of a queue, but you
> > must specify a predefined (as far as I know) number of workers in the
> > config. This is great for processor-intensive tasks that may only run
> > for a short period of time, as the worker pool stays the same size and
> > processes new requests when a worker is available. But in my case the
> > processes are not doing much until there is folder activity, and I
> > need to add worker instances dynamically.
>
> > I feel like I'm just missing something... Launching system processes
> > must be a common idiom for a web app, no?
>
> > I do appreciate the response,
>
> > -heath carlisle
>
> > On Sep 6, 12:25 am, Aljoša Mohorović <[email protected]>
> > wrote:
>
> > > On Mon, Sep 6, 2010 at 7:19 AM, Heath <[email protected]> wrote:
> > > > What I require seems simple, just run the requested process in the
> > > > background. The terminal equivalent would be:
>
> > > > "nohup <process> <args> &" and return control to the view.
>
> > > > Any ideas on how to achieve this?
>
> > > if this already works in a shell, is there some reason why
> > > "os.system('cmd')" or something similar doesn't work for you?
>
> > > Aljosa Mohorovic
