> We are using the pymongo driver with mongodb and as it stands, it looks
> like each worker gets its own connection.  Is there any way to share a
> database connection in memory among the workers?  I've read about ksm on
> the readthedocs page, is that applicable here?

No, you cannot share mongodb connections; you need at least one connection
per worker. If you are multithreaded you can use connection pools (I do
not know if they are supported by pymongo).
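
A common pattern for per-worker connections is to open the client after each worker forks, so nothing is shared with the master. The sketch below is an assumption about your setup, not code from this thread; it uses uWSGI's postfork hook and pymongo, with the connection URL as a placeholder:

```python
# Sketch: (re)create the MongoDB client once per worker, after fork,
# so each worker owns its own connection instead of inheriting one
# opened in the master process.
client = None

def make_client():
    # Hypothetical helper; the URL is a placeholder for your deployment.
    global client
    from pymongo import MongoClient  # assumed installed in your app
    client = MongoClient("mongodb://localhost:27017")

try:
    # uwsgidecorators is only importable when running under uWSGI
    from uwsgidecorators import postfork
    postfork(make_client)
except ImportError:
    pass  # outside uWSGI (e.g. local testing), call make_client() yourself
```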

>
> We also have a bunch of python objects being stored in memory; would KSM
> be helpful in sharing memory among the workers for this?

>
> For example:
> Currently we have an identical list being stored on each worker (we keep
> the list stored in a python variable because it's large and we need to
> access it quickly, so storing it in the uwsgi cache and pickling and
> unpickling it each request (the only way I can think of to store a
> python object in the cache) would be too slow) and we have 6 workers.
> This means we are using several times the memory we actually need to.
> Is there any way to share a global python object among all of the
> workers?  All we do is read from it; we only modify it when we are
> completely refreshing it periodically.
>
> The overarching question is, how do I share db connections and python
> objects globally, so each worker doesn't need to have their own copy of
> the

You will hardly find anything faster than the uWSGI cache (there are some
quite astonishing benchmarks of it against memcached and redis), so I would
focus on finding a better serialization system. Can't you represent the
object in a more binary-friendly way, without using pickle/marshal?
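
As one illustration of a more binary-friendly format: if the list is homogeneous (say, all floats), the stdlib struct module can pack it into a compact blob that is much cheaper to decode than a pickle. This is a minimal sketch, assuming a list of floats; the uwsgi cache calls are shown only as comments:

```python
import struct

# Sketch: pack a list of floats into fixed-width little-endian doubles,
# a format each worker can decode quickly on every request.

def pack_floats(values):
    # "<%dd" = little-endian, len(values) 8-byte doubles
    return struct.pack("<%dd" % len(values), *values)

def unpack_floats(blob):
    count = len(blob) // 8
    return list(struct.unpack("<%dd" % count, blob))

# Under uWSGI you would then store and fetch the blob, e.g.:
#   import uwsgi
#   uwsgi.cache_set("shared-list", pack_floats(values))
#   values = unpack_floats(uwsgi.cache_get("shared-list"))
```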

In the ruby world the approach is to have a server process storing the
object and a specialized framework for manipulating it remotely (drb). I
do not know if there is something similar in python.
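
For what it's worth, Python's stdlib can approximate the drb pattern with multiprocessing.managers: one server process owns the object, and workers manipulate it through a network proxy. A minimal sketch (the names SharedList/get_list and the address/authkey are illustrative, not from any framework):

```python
from multiprocessing.managers import BaseManager

# Sketch of a drb-like setup: a single server process owns the list,
# workers read it remotely so only one copy lives in memory.

class SharedList:
    def __init__(self):
        self._items = []

    def refresh(self, items):
        # Periodic full refresh, as described in the question
        self._items = list(items)

    def all(self):
        return list(self._items)

shared = SharedList()

class ListManager(BaseManager):
    pass

# Expose the single shared instance to remote callers
ListManager.register("get_list", callable=lambda: shared)

# Server side (run once, outside the workers):
#   m = ListManager(address=("127.0.0.1", 9999), authkey=b"secret")
#   m.get_server().serve_forever()
# Worker side:
#   m = ListManager(address=("127.0.0.1", 9999), authkey=b"secret")
#   m.connect()
#   data = m.get_list().all()
```

Note the remote round-trip per access: for a read-heavy hot path this will be slower than a local copy or the uWSGI cache, so it mainly helps when memory matters more than latency.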


-- 
Roberto De Ioris
http://unbit.it
_______________________________________________
uWSGI mailing list
[email protected]
http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi
