I've got the following error:
  File "/usr/local/lib/python2.7/multiprocessing/managers.py", line 500, in 
connect
    conn = Client(self._address, authkey=self._authkey)
  File "/usr/local/lib/python2.7/multiprocessing/connection.py", line 169, 
in Client
    c = SocketClient(address)
  File "/usr/local/lib/python2.7/multiprocessing/connection.py", line 304, 
in SocketClient
    s.connect(address)
  File "/usr/local/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
error: [Errno 2] No such file or directory

Maybe I mixed up directories? Is the layout of files and code strict? 
The code lies in the same directory, but it is not the root directory.
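The [Errno 2] is raised by s.connect() on the Unix socket path: no socket 
file exists at the address the client computed. The usual causes are that 
the manager process isn't running yet, or that the client and the manager 
resolved different paths for the .sock file (os.path.dirname(__file__) can 
be relative, depending on how the script was loaded). A quick sanity check, 
mirroring the sockpath computation from the quoted scripts (the abspath 
call is my addition):

```python
import os

# Compute the socket path the way both quoted scripts do, but force it
# absolute so a relative __file__ cannot shift with the working
# directory; then check whether the manager actually created the file.
sockpath = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                        'task-queue-manager.sock')

print('expecting socket at: %s' % sockpath)
print('socket file exists: %s' % os.path.exists(sockpath))
```

If the printed path is not where the manager put its socket, the two 
scripts are computing different directories.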

On Thursday, February 26, 2015 at 3:00:09 PM UTC+2, Paul Royik wrote:
>
> Thank you very much for your response.
> I will explore it; however, it is hard for me to imagine how to insert my 
> long-running functions into the manager class.
>
> On Thursday, February 26, 2015 at 12:15:28 PM UTC+2, Graham Dumpleton 
> wrote:
>
> You will need to install the latest mod_wsgi from the git repo, as I made 
> some tweaks to make this easier.
>
> So, on your own personal system (not WebFaction) where you can experiment, do:
>
>     pip install -U https://github.com/GrahamDumpleton/mod_wsgi/archive/develop.zip
>
> Now create three files.
>
> The first is called 'task-queue-client.py'.
>
> This contains the following and is a simple WSGI hello world program.
>
> import os
>
> from multiprocessing.managers import BaseManager
>
> class MyManager(BaseManager):
>     pass
>
> MyManager.register('Maths')
>
> sockpath = os.path.join(os.path.dirname(__file__), 
> 'task-queue-manager.sock')
>
> def application(environ, start_response):
>     status = '200 OK'
>     output = b'Hello World!'
>
>     m = MyManager(address=sockpath, authkey='abracadabra')
>     m.connect()
>
>     maths = m.Maths()
>     print maths.add(1, 2)
>
>     response_headers = [('Content-type', 'text/plain'),
>                         ('Content-Length', str(len(output)))]
>     start_response(status, response_headers)
>
>     return [output]
>
> The second is called 'task-queue-manager.py'.
>
> import os
>
> from tasks import MyManager
>
> sockpath = os.path.join(os.path.dirname(__file__), 
> 'task-queue-manager.sock')
>
> try:
>     os.unlink(sockpath)
> except OSError:
>     pass
>
> m = MyManager(address=sockpath, authkey='abracadabra')
> s = m.get_server()
> s.serve_forever()
>
> And finally 'tasks.py'. This can't be in 'task-queue-manager.py', as you 
> can't have functions you want to pickle in the service script itself, 
> because of how mod_wsgi names modules.
>
> import os
> import multiprocessing
> import multiprocessing.managers
> import signal
> import time
>
> class TimeoutException(Exception):
>     pass
>
> class RunableProcessing(multiprocessing.Process):
>     def __init__(self, func, *args, **kwargs):
>         self.queue = multiprocessing.Queue(maxsize=1)
>         args = (func,) + args
>         super(RunableProcessing, self).__init__(target=self.run_func,
>                 args=args, kwargs=kwargs)
>
>     def run_func(self, func, *args, **kwargs):
>         try:
>             result = func(*args, **kwargs)
>             self.queue.put((True, result))
>         except Exception as e:
>             self.queue.put((False, e))
>
>     def done(self):
>         return self.queue.full()
>
>     def result(self):
>         return self.queue.get()
>
> def timeout(seconds, force_kill=True):
>     def wrapper(function):
>         def inner(*args, **kwargs):
>             now = time.time()
>             proc = RunableProcessing(function, *args, **kwargs)
>             proc.start()
>             proc.join(seconds)
>             if proc.is_alive():
>                 print 'still alive'
>                 if force_kill:
>                     print 'kill it'
>                     proc.terminate()
>                     proc.join(3)
>                     print 'is it dead', proc.is_alive()
>                     if proc.is_alive():
>                         try:
>                             print 'force kill'
>                             os.kill(proc.pid, signal.SIGKILL)
>                         except Exception:
>                             pass
>                         proc.join(1)
>                         print 'alive', proc.is_alive()
>                 runtime = int(time.time() - now)
>                 raise TimeoutException(
>                         'timed out after {0} seconds'.format(runtime))
>             assert proc.done()
>             success, result = proc.result()
>             if success:
>                 return result
>             else:
>                 raise result
>         return inner
>     return wrapper
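As a standalone illustration, the run-in-a-subprocess-and-join-with-a-timeout 
idea behind the decorator above can be sketched like this (the slow function 
and the 0.2-second limit are invented for the demo; an explicit fork context 
keeps the sketch Unix-only, matching the fork behaviour the decorator relies 
on):

```python
import multiprocessing
import time

def slow(seconds):
    # stand-in for a long-running task
    time.sleep(seconds)

# Run the target in a subprocess and wait at most 0.2 seconds for it.
ctx = multiprocessing.get_context('fork')
proc = ctx.Process(target=slow, args=(10,))
proc.start()
proc.join(0.2)               # returns after the timeout if still busy
timed_out = proc.is_alive()  # still alive => it exceeded the limit
print('timed out: %s' % timed_out)
if timed_out:
    proc.terminate()
    proc.join()
```

The decorator adds the queue so a result (or exception) can be handed back 
when the function does finish in time.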
>
> class MathsClass(object):
>     @timeout(5.0)
>     def add(self, x, y):
>         print os.getpid(), 'add', x, y
>         return x + y
>     @timeout(5.0)
>     def mul(self, x, y):
>         print os.getpid(), 'mul', x, y
>         return x * y
>
> class MyManager(multiprocessing.managers.BaseManager):
>     pass
>
> MyManager.register('Maths', MathsClass)
>
> With those all in the same directory run with that latest mod_wsgi repo 
> version:
>
>     mod_wsgi-express start-server task-queue-client.py --service-script tasks task-queue-manager.py
>
> When you start the server, what will happen is that in addition to creating 
> a process to run your WSGI application, mod_wsgi-express will create an 
> extra daemon process labelled 'tasks'. This will not handle any web 
> requests. Instead, the task-queue-manager.py script will be loaded into 
> that process.
>
> So Apache/mod_wsgi is simply acting as a process supervisor for that 
> process and will keep it running.
>
> The task-queue-manager.py script is going to use the multiprocessing 
> module to run a server that accepts client requests made from some other 
> process using the multiprocessing module's mechanisms for remote calls.
>
> When that call is made from the client code, the manager service, using 
> your existing timeout decorator, will actually run the target function in 
> a forked subprocess rather than in the manager process. When the result is 
> available, it gets pulled back from the forked worker process into the 
> manager process and then back across to the client.
>
> So the key thing here is that the manager process, which runs the workers 
> as forked subprocesses, is independent of your web application. Don't 
> include your web application in the manager. Keep the manager to the bare 
> minimum of code it needs to run your algorithm, or even delay importing 
> that code until it is needed in the forked subprocess when the function 
> runs. That way it will be light on memory.
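The delayed-import suggestion can be sketched like this (run_algorithm is a 
hypothetical name, and json merely stands in for a heavy dependency):

```python
def run_algorithm(data):
    # Importing inside the function means the dependency is only loaded
    # in the process that actually calls it -- here, the forked worker --
    # so the manager process itself stays small.
    import json  # stand-in for a heavy library such as numpy
    return json.dumps(data)

print(run_algorithm([1, 2, 3]))  # [1, 2, 3]
```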
>
> So look very carefully through that and study 
> what multiprocessing.managers.BaseManager is all about.
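A minimal, self-contained BaseManager round trip may help with that study. 
Everything here (the Counter class, the loopback address) is invented for 
the demo, but the register/connect/proxy flow is the same one the quoted 
scripts use:

```python
import multiprocessing.managers
import threading

class Counter(object):
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1
        return self.value

class DemoManager(multiprocessing.managers.BaseManager):
    pass

# Registering a callable means clients get back a proxy whose method
# calls execute in the server process.
DemoManager.register('Counter', Counter)

# Server side: port 0 picks a free loopback port; a Unix socket path
# works the same way, as in the quoted scripts.
server_mgr = DemoManager(address=('127.0.0.1', 0), authkey=b'abracadabra')
server = server_mgr.get_server()
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: connect with the same authkey and call methods through
# the proxy.
client_mgr = DemoManager(address=server.address, authkey=b'abracadabra')
client_mgr.connect()
counter = client_mgr.Counter()
print(counter.increment())  # 1
print(counter.increment())  # 2
```

The two increment calls run in the server process and the results travel 
back over the connection, which is exactly what happens between the WSGI 
client script and the task-queue manager.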
>
> When you totally understand it, look at working out how to integrate that 
> into your web application.
>
> Remember that when trying to use this on WebFaction, first update mod_wsgi 
> to that repo version.
>
> ...

-- 
You received this message because you are subscribed to the Google Groups 
"modwsgi" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/modwsgi.
For more options, visit https://groups.google.com/d/optout.
