This is a copy & paste of something I wrote a month ago or so:

I was able to get background processes to work pretty well. I have an import 
script that takes a minute or two to run and is started by an ajax call from 
a button in one of my views, and the progress is reported back to the page 
as the script runs. The way I did this was to use multiprocessing. Here is 
an example class:

from multiprocessing import Process, Queue

class ImportScript(Process):
    queue = Queue()
    progress = (0, 0, 'Idle')

    def __init__(self, environment, db):
        Process.__init__(self)
        self.environment = environment
        self.db = db

    def run(self):
        self._do_import()

    def _do_import(self):
        # long running task here; report progress as it goes
        self._update_progress((1, 100, 'Importing'))

    def _update_progress(self, progress):
        ImportScript.queue.put(progress)
        print('%s: %s of %s' % (progress[2], progress[0], progress[1]))

    @staticmethod
    def get_progress():
        queue = ImportScript.queue
        progress = ImportScript.progress
        while not queue.empty():
            progress = queue.get(False)

        ImportScript.progress = progress
        return progress
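The same pattern can be seen in isolation in a self-contained sketch like the one below. The names `worker` and `latest_progress` are illustrative, not part of the class above: the worker process pushes (current, total, status) tuples onto a Queue, and the parent drains the queue to get the newest one, exactly as get_progress() does:

```python
from multiprocessing import Process, Queue

def worker(queue, total=3):
    # Stand-in for the long-running import: push (current, total, status)
    # tuples onto the queue as work progresses.
    for i in range(1, total + 1):
        queue.put((i, total, 'Importing'))

def latest_progress(queue, default=(0, 0, 'Idle')):
    # Same drain logic as ImportScript.get_progress(): empty the queue,
    # keeping only the most recent progress tuple.
    progress = default
    while not queue.empty():
        progress = queue.get(False)
    return progress

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=worker, args=(queue,))
    p.start()    # the parent keeps running while the child works
    p.join()     # joined here only so the demo output is deterministic
    progress = latest_progress(queue)
    print('%s: %s of %s' % (progress[2], progress[0], progress[1]))
    # prints: Importing: 3 of 3
```

In the real script the parent never joins; it just polls the queue whenever the browser asks for progress.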

I called this file scripts.py and added it to my modules folder. Then in 
db.py, I initialize it by calling:

scripts_module = local_import('scripts', reload=False)

(reload=False is important; with reload=True the module, and with it the 
class-level queue and progress, would be re-created on each request and any 
state would be lost.)

In my controller, when I want to start this process, I call:

import_script = scripts_module.ImportScript(globals(), db)
import_script.start()

The script starts and the controller continues execution while the script 
runs in the background. Then if I want to check progress on the script, I 
make an ajax call to another action in the controller that returns progress:

progress = scripts_module.ImportScript.get_progress()
return '%s: %s of %s' % (progress[2], progress[0], progress[1])

This would return something similar to:
Importing records: 5 of 100
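The class-level queue/progress pair is what lets get_progress() keep returning the last known value between updates. Here is a minimal sketch of that caching behaviour in isolation, using a plain thread-safe queue.Queue for determinism (ProgressTracker is a hypothetical name, not part of the code above):

```python
from queue import Queue

class ProgressTracker(object):
    # Stand-in for the class-level queue/progress pair in ImportScript.
    def __init__(self):
        self.queue = Queue()
        self.progress = (0, 0, 'Idle')

    def get_progress(self):
        # Drain everything currently queued, keeping only the newest tuple;
        # when the queue is empty, the last seen value is returned instead.
        while not self.queue.empty():
            self.progress = self.queue.get(False)
        return self.progress

tracker = ProgressTracker()
tracker.queue.put((1, 100, 'Importing'))
tracker.queue.put((5, 100, 'Importing'))
print(tracker.get_progress())   # (5, 100, 'Importing') - newest wins
print(tracker.get_progress())   # (5, 100, 'Importing') - cached between polls
```

This is why repeated ajax polls keep showing the latest progress even when the background script has not reported anything new since the last poll.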
