On Oct 19, 3:06 pm, "Jeremy Dunck" <[EMAIL PROTECTED]> wrote:
> There is an ongoing discussion on the psycopg list and trac regarding
> the use of psycopg with multiple interpreters.
>
> fog is proposing dropping support for multiple interpreters rather
> than dealing with the (notably sticky) issues.
>
> Trac: http://www.initd.org/tracker/psycopg/ticket/192
>
> Mail archives (look for 192; the threads are broken):
> http://lists.initd.org/pipermail/psycopg/2007-October/thread.html
> http://lists.initd.org/pipermail/psycopg/2007-September/thread.html
>
> This is FYI. If the support is dropped, Django will need to either
> advise people not to do this (where the alternatives seem to be to run
> wsgi out of process or run separate apache processes for each vhost),
> or we may need to do something crazy like writing a C extension and
> making CursorWrapper return converted decimals for the running
> interpreter.
>
> Notably, Graham Dumpleton reported the original bug and raised the
> issue on the dj-dev list a while ago -- it's just now coming to a head.
FWIW, I have suggested to them what they should do. It would result in the fastest performance possible for command line Python and for where the web application runs in the main interpreter of an Apache/mod_python/mod_wsgi process. It would incur a minimal performance hit where the web application runs in a non-main interpreter of an Apache/mod_python/mod_wsgi process.

My argument here is that people who need the best performance possible would be dedicating the Apache server to running the one application. As such, they would have the choice to delegate the application to run in the main interpreter. Anyone who is trying to run many application instances on the one Apache server in embedded mode would already have to accept that performance is going to be less anyway, so a minimal additional overhead from some dynamic lookups in psycopg isn't going to be noticeable. I am also skeptical that the dynamic lookups would be noticeable in the first place, and suspect they are perhaps trying to over-optimise it; but, as I explained to them, optimise the main interpreter case and you have covered the vast majority of usage anyway.

To see my actual reply to them where I describe all this, you will have to wait until it clears the moderation queue for their mailing list, as I am not a member of the list. When that happens it should turn up on:

http://lists.initd.org/pipermail/psycopg/2007-October/thread.html

in the 'Ticket 192: patch' thread. Or at least I hope it turns up there, as other messages I have posted don't appear to end up in their archive. That must mean the moderator isn't actually passing them on to the list.

BTW, if this emphasises anything, it is that Django really needs to move away from being dependent on the environment variable DJANGO_SETTINGS_MODULE, and instead allow the settings module to be dynamically looked up via a value supplied with the request from the WSGI or mod_python request environment.
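To make the idea concrete, here is a minimal sketch (not anything Django supports today) of a WSGI dispatcher that selects a per-site handler from a value in the request environ rather than from a process-wide DJANGO_SETTINGS_MODULE environment variable. The environ key 'django.settings_module' is my own invention for illustration; under Apache/mod_wsgi such a value could be supplied per vhost or location with SetEnv.

```python
# Hypothetical sketch: per-request settings selection from the WSGI
# environ. The key name 'django.settings_module' is an assumption for
# illustration, not an existing Django convention.

def make_dispatcher(handlers):
    """Return a WSGI app that routes each request to a per-site handler
    chosen by a value supplied in the request environ."""
    def application(environ, start_response):
        settings_module = environ.get('django.settings_module', 'default')
        handler = handlers.get(settings_module, handlers['default'])
        return handler(environ, start_response)
    return application

def make_site(name):
    """Stand-in for a real per-site application."""
    def app(environ, start_response):
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [name.encode('utf-8')]
    return app

dispatcher = make_dispatcher({
    'default': make_site('default site'),
    'mysite.settings': make_site('mysite'),
})
```

The point of the sketch is only that one interpreter can serve many sites without any process-global state, which is exactly the property the environment-variable approach lacks.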
This is possible with Trac, and means that it is easy to host many distinct Trac instances within the same Python interpreter instance, with the location of the Trac data in the file system being passed through from the Apache configuration automatically. For an example of this, see the end of:

http://code.google.com/p/modwsgi/wiki/IntegrationWithTrac

Alternatively, within the existing configuration structure, Django should support some other means of hosting multiple sites out of the one Django instance. I know that some have done work on this, but I don't know how it has progressed. In either case, the trick is ensuring that the sites are separated well enough that they don't interfere with each other and leak data/pages between the sites.

Graham
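For reference, the Apache configuration pattern described on that mod_wsgi wiki page looks roughly like the following: each location passes its Trac environment path through to the (shared) WSGI script via SetEnv, so one interpreter serves many Trac instances. The paths here are illustrative only.

```apache
# Sketch of the mod_wsgi/Trac multi-instance pattern (paths illustrative).
WSGIScriptAlias /trac /usr/local/wsgi/scripts/trac.wsgi

<Location /trac/project-1>
    SetEnv trac.env_path /usr/local/trac/project-1
</Location>
<Location /trac/project-2>
    SetEnv trac.env_path /usr/local/trac/project-2
</Location>
```

Trac's dispatcher reads the environment path out of the per-request environ, which is precisely the kind of lookup being suggested above for Django's settings module.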
