On 06/09/2011 03:50 PM, Graham Dumpleton wrote:
> In short, Apache is not necessarily the best platform for long polling
> (comet) style applications. It is workable for smaller numbers of concurrent
> requests, but not when the number of concurrent requests can balloon out.
>
> The important thing to note is that although you may have certain URLs
> which require long polling, and so a solution such as gunicorn with an
> eventlet worker, or even Tornado, may be better, this doesn't mean that
> you need to, or should, convert your whole application to such a
> mechanism as a result. What some large systems do is still use
> Apache/mod_wsgi behind nginx to do the heavy lifting and only proxy
> certain URLs from nginx through to a separate Python web application
> which handles the long polling.
>
> In other words, recognise that different parts of your application
> have different requirements and use different hosting technologies for
> each as appropriate.
>
> Graham
Thanks Damjan and Graham,
I have never looked at nginx, but is this what you propose (assuming only
one external port can be used)?
- nginx listening on external port
- use the nginx push module for the 'long poll' URLs
- hide apache behind nginx
- use mod_wsgi within Apache
- the web clients will use a tiny piece of JavaScript to do the long poll
- the WSGI application can do a POST request to the push module (rough
  sketch below)
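For that last step I imagine something along these lines in the WSGI code
(just an untested sketch with Python 2's urllib2; the publisher URL and the
channel id are my assumptions about how the push module would be set up):

    import urllib2

    PUBLISH_URL = "http://127.0.0.1/publish?id=%s"  # hypothetical publisher location

    def notify_clients(channel_id, message):
        # The push module should then deliver this body to every browser
        # currently blocked on the matching subscriber URL.
        req = urllib2.Request(PUBLISH_URL % channel_id, data=message,
                              headers={"Content-Type": "text/plain"})
        return urllib2.urlopen(req, timeout=5).getcode()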
Are there any threads / urls explaining the sweet spots of nginx /
apache / wsgi modules?
Contents I have to provide behind one external https port:
- static content without authentication (nginx?)
- dynamic content (WSGI)
- static content with user-based authentication (nginx? apache?)
- static content with fine-grained, session-based access control
  (WSGI grants permission and sends X-Sendfile headers? to Apache
  or nginx? see the sketch after this list)
- content requiring https client certificates (??)
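For the fine-grained case I picture the WSGI application doing the permission
check and then handing the actual file transfer back to the front end, roughly
like this (untested sketch; user_may_access() and the file paths are made up,
and which header applies depends on whether Apache with mod_xsendfile or
nginx does the serving):

    def user_may_access(environ, filename):
        # Placeholder: the real check would consult the user's session.
        return "session=" in environ.get("HTTP_COOKIE", "")

    def application(environ, start_response):
        if not user_may_access(environ, "report.pdf"):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return ["forbidden\n"]
        # Let the front-end server stream the file instead of Python:
        # X-Sendfile for Apache + mod_xsendfile, X-Accel-Redirect for nginx.
        start_response("200 OK", [
            ("Content-Type", "application/pdf"),
            ("X-Sendfile", "/srv/protected/report.pdf"),      # Apache variant
            # ("X-Accel-Redirect", "/protected/report.pdf"),  # nginx variant
        ])
        return [""]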
It really seems I should start looking at nginx. So far I have only used
Apache (or, for testing, Twisted).
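Just so I understand what the 'separate Python web application which handles
long polling' would look like: I guess it can be as simple as something like
this, run under gunicorn with an eventlet worker (untested sketch; the shared
Event and state are made up):

    import json
    import threading

    new_data = threading.Event()   # set by whatever produces an update
    latest = {"seq": 0}            # made-up shared state

    def application(environ, start_response):
        # Block up to 30 seconds waiting for an update, then answer either
        # way.  Under an eventlet/gevent worker this wait is cheap; under
        # Apache/mod_wsgi it ties up a whole thread for the duration.
        new_data.wait(30)
        updated = new_data.is_set()
        new_data.clear()
        body = json.dumps({"updated": updated, "data": latest})
        start_response("200 OK", [("Content-Type", "application/json"),
                                  ("Content-Length", str(len(body)))])
        return [body]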
>
> On 9 June 2011 19:11, Damjan <[email protected]> wrote:
>>> Thus I thought about long polling.
>>>
>>> As I have never used this technology, I am not sure about the load such an
>>> implementation would put on the server host and whether I could use
>>> mod_wsgi for such an application.
>>>
>>> My setup / intended application.
>>> - the web server has only port 443 (https) as its incoming port
>>> -- a max of about 100 web browsers might be connected at the same time
>>
>> This means you will have 100 mod_wsgi threads blocked holding client
>> connections, which is possible but not optimal at all.
>>
>>> Are there any other suggestions for how to set up such a server in a way
>>> that it doesn't waste too many resources?
>>
>> There are a lot of ways to do this.
>>
>> One is to use nginx and the optional push module[1]. With it, nginx
>> keeps the long-poll connections active, and you just HTTP POST to a
>> special (private) URL to notify all the clients. The good thing is
>> that this can work with any backend language. And it's generally
>> preferable to use nginx in front of Apache anyway.
>>
>> Another option is to use a specially crafted backend (Python gevent,
>> Erlang?) to handle the long polls. There are also special servers that
>> are supposed to generically handle long-poll connections (but they
>> seem complicated).
>>
>> [1] http://pushmodule.slact.net/
>>