Cliff Wells wrote:
> I was doing some rather simplistic benchmarks between Pylons and
> TurboGears (not only simplistic, but unfair since they are two different
> sites) and was a bit surprised to see that the TurboGears site handily
> outperformed the Pylons site.  The sites are dissimilar so it isn't a
> true benchmark, but frankly the TurboGears page I was benchmarking was
> actually larger and more complex than the Pylons page.  I'll probably do
> some better tests later, but my initial suspicion is that the default
> "paster serve" isn't as fast as CherryPy (both are proxyied to via
> Nginx).

In my simplistic tests CherryPy 3 is about 50% faster than 
paste.httpserver.  That's keeping everything else equivalent, and not 
including any framework.  I don't know how other factors would affect that.
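
That kind of comparison is easy to reproduce: the same bare WSGI app 
served by each server in turn, then hit with ab or httperf.  A minimal 
sketch (not my exact harness; the host, port, and hello-world app are 
arbitrary):

    import sys

    def app(environ, start_response):
        # Trivial application so only the server itself gets measured
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return ['Hello, world!\n']

    if __name__ == '__main__':
        if 'cherrypy' in sys.argv:
            from cherrypy import wsgiserver
            server = wsgiserver.CherryPyWSGIServer(('0.0.0.0', 8080), app)
            try:
                server.start()
            except KeyboardInterrupt:
                server.stop()
        else:
            from paste import httpserver
            httpserver.serve(app, host='0.0.0.0', port='8080')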

Pylons has some performance gotchas if you use threadlocals in certain 
ways.  Otherwise I would expect it to perform better than TurboGears, 
but it's hard to generalize these things.
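
One example of the kind of thing I mean (a rough sketch -- the real 
gotchas depend on how the code uses Pylons' proxied globals): every 
attribute access on a global like pylons.request goes back through the 
StackedObjectProxy lookup, so resolving it once and working with the 
real object is noticeably cheaper in a hot path.

    import pylons

    def slow_way():
        # Each pass through the loop resolves the threadlocal proxy again
        values = []
        for i in range(1000):
            values.append(pylons.request.GET.get('q'))
        return values

    def faster_way():
        # Resolve the proxy once, then use the real request object
        request = pylons.request._current_obj()
        values = []
        for i in range(1000):
            values.append(request.GET.get('q'))
        return values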

> I've considered trying to use CherryPy's WSGI server but came across
> this:
> 
> http://william-os4y.livejournal.com/2924.html
> 
> Since TurboGears currently uses CP2, and CP3 is much faster than CP2 and
> *this* (as the author claims) is again faster than CP3, perhaps this
> project might be of interest to Paste/Pylons.  However, as the author
> notes, it needs some help with WSGI support (so naturally I thought you
> might be the one to best provide that help ;-)

I'm not much of a performance guy myself, but there's a good chance 
someone involved with Pylons might be interested.

The first thing to do with FAPWS would be to create a Paste Deploy 
entry point for it.  The author might want to include that directly in 
FAPWS (though he'd have to move to setuptools from distutils), or we can 
put it in Paste Script (there's a bunch of them there already).
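
Roughly, that means exposing a server_runner.  A sketch of what the 
entry point side looks like (the fapws import and serve call below are 
placeholders -- I haven't looked at its actual API -- but the Paste 
Deploy plumbing is the standard pattern):

    def server_runner(wsgi_app, global_conf, host='127.0.0.1',
                      port='8080', **kwargs):
        # Paste Deploy calls this with the composed WSGI app plus whatever
        # settings appear in the [server:main] section of the .ini file.
        import fapws                  # placeholder import
        fapws.serve(wsgi_app, host=host, port=int(port))  # placeholder call

    # In setup.py (hence the move to setuptools):
    #
    #   entry_points="""
    #   [paste.server_runner]
    #   main = fapws.paster:server_runner
    #   """
    #
    # and then an .ini file can just say:
    #
    #   [server:main]
    #   use = egg:fapws#main
    #   host = 0.0.0.0
    #   port = 8080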

Then make sure it validates (a quick sketch of that step is below), and 
try a performance test.  Hopefully he's not comparing an async FAPWS 
backend against a threaded CherryPy WSGI backend.  But if the comparison 
is fair and the speed is as he says, that could be quite useful.  I'd 
also be somewhat pleased to see a simple async HTTP 
server supporting WSGI, but more for the ability to handle lots of slow 
clients and things like that.  Boring old Medusa is still a contender 
there as well.  Twisted would be, but they can't seem to get their act 
together.
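
The validation step is mostly a matter of wrapping a trivial app in 
paste.lint (wsgiref.validate works too) and serving it under FAPWS; the 
lint middleware asserts on the environ and start_response the server 
provides as well as on what the app returns.  A quick sketch:

    from paste.lint import middleware as lint_middleware

    def app(environ, start_response):
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return ['ok\n']

    # Serve this under FAPWS and hit it with the usual requests (plus an
    # ab run); any WSGI spec violation surfaces as an AssertionError.
    validated_app = lint_middleware(app)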

-- 
Ian Bicking | [EMAIL PROTECTED] | http://blog.ianbicking.org
