Howdy!

I recently noticed the gunicorn hype on this list. Could anybody explain to me what the big advantage of gunicorn (as a separate HTTP server) is over running web.py in FastCGI mode (via flup)?

I deploy in production using FastCGI; the reduced overhead of header processing is nice.

On the other hand, I've developed my own fully HTTP/1.1-compliant HTTP server, using a preforking async model with Tornado's IOLoop/IOStream under the hood, that outperforms Tornado by ~1K requests/sec (~50%) when benchmarked on the same (select-based) machine. (On an epoll-based server, forking one process per core, it can also survive C10K bombardment and serve >10K requests/sec at smaller concurrency levels.)
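For the curious, the preforking-async shape I mean can be sketched in plain stdlib Python. This is only an illustration of the model (parent binds one listening socket, each forked worker runs its own select() loop on it); all names and details here are mine, not my server's actual code:

```python
# Minimal preforking + select() sketch (illustrative only, not production code).
import os
import select
import socket

RESPONSE = (b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n"
            b"Connection: close\r\n\r\nok")

def serve_forever(listener):
    # Non-blocking listener: several workers select() on the same socket,
    # so a worker that loses the accept race must not block.
    listener.setblocking(False)
    conns = []
    while True:
        readable, _, _ = select.select([listener] + conns, [], [])
        for s in readable:
            if s is listener:
                try:
                    conn, _ = listener.accept()
                except BlockingIOError:
                    continue  # another worker won this accept
                conn.setblocking(True)
                conns.append(conn)
            else:
                # select() says this connection is readable: handle it.
                data = s.recv(4096)
                if data:
                    s.sendall(RESPONSE)
                s.close()
                conns.remove(s)

def prefork(listener, workers):
    # Fork one event-loop process per worker; the listening socket is shared.
    pids = []
    for _ in range(workers):
        pid = os.fork()
        if pid == 0:
            serve_forever(listener)  # child never returns
            os._exit(0)
        pids.append(pid)
    return pids
```

The real thing additionally needs proper HTTP parsing, keep-alive handling, and (on Linux) epoll instead of select, but the process/loop layout is the same.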

When the underlying HTTP server is efficient enough, the savings from FastCGI can be negligible.

        — Alice.


