Google Summer of Code 2007
Hi Pylonists,

Google Summer of Code 2007 is on. Why is Pylons missing from the list of projects (Django is included)? Django isn't better than Pylons, but their marketing is much better :(

Cheers,
Artur

You received this message because you are subscribed to the Google Groups pylons-discuss group. To post to this group, send email to pylons-discuss@googlegroups.com. To unsubscribe from this group, send email to [EMAIL PROTECTED]. For more options, visit this group at http://groups.google.com/group/pylons-discuss?hl=en
Re: Fast Python webserver
Hi Robert, Ian,

> I'm going to be doing some performance tests on my setup in the next few
> days, but one thing I've noticed in preliminary playing is that using
> fastcgi/flup with nginx is noticeably faster than a straight proxy.

I've been interested to see how well Pylons works with Nginx and FastCGI too, because I've heard lots of people say good things about the setup. On my server with a fairly simple app, using Nginx 0.4.13 to proxy to a Paste HTTP server takes about 11.5ms per request (over 1000 requests). With Nginx and FastCGI the same app took about 51ms per request. Using Paste without the proxy takes about 10.6ms per request. Unless I've set something up very wrong, that means there is about a 1ms overhead using Nginx as a proxy compared to doing the requests directly, but that using HTTP is about 5 times faster than using FastCGI. Should I be using a different version of Nginx?

I would say though that Nginx is very easy to set up and I do like it, even if the FastCGI setup doesn't seem faster than the HTTP setup with Pylons.

> FastCGI doesn't seem substantially easier to parse than HTTP, so I'm not
> sure why that'd be. Maybe flup is just faster than paste.httpserver. Or
> maybe there's something different about the way connections are handled
> (are FastCGI connections persistent in any way?).

I would have been surprised if the FastCGI version was faster too, and in my tests it isn't. I'd be very interested to hear your results though, Robert.

Cheers,
James

P.S. For anyone interested, this is still my favorite method of deploying Pylons/Paste apps in production: http://pylonshq.com/project/pylonshq/wiki/DaemonTools
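The per-request timing described above can be sketched with nothing but the standard library. This is not the actual test setup from the thread: `wsgiref` stands in for `paste.httpserver`, the trivial app stands in for a "fairly simple" Pylons app, and no Nginx or FastCGI layer is involved; it only shows the measurement method (total wall-clock time over N sequential requests, divided by N).

```python
import threading
import time
from http.client import HTTPConnection
from wsgiref.simple_server import make_server, WSGIRequestHandler

def app(environ, start_response):
    """Trivial WSGI app standing in for the benchmarked Pylons app."""
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello']

class QuietHandler(WSGIRequestHandler):
    def log_message(self, *args):
        pass  # silence per-request access logging during the benchmark

server = make_server('127.0.0.1', 0, app, handler_class=QuietHandler)
port = server.server_address[1]  # port 0 above means "pick a free port"
threading.Thread(target=server.serve_forever, daemon=True).start()

n = 100
start = time.perf_counter()
for _ in range(n):
    conn = HTTPConnection('127.0.0.1', port)
    conn.request('GET', '/')
    body = conn.getresponse().read()
    conn.close()
elapsed_ms = (time.perf_counter() - start) * 1000 / n
print('%.2f ms per request' % elapsed_ms)
server.shutdown()
```

To reproduce the comparison in the thread you would point the same loop (or `ab`) at Nginx proxying to the backend, then at the backend directly, and compare the two averages.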
Re: Fast Python webserver
On Sun, 2007-02-18 at 17:50 +, James Gardner wrote:
> Unless I've set something up very wrong, that means that there is about
> a 1ms overhead using Nginx as a proxy compared to doing the requests
> directly but that using HTTP is about 5 times faster than using FastCGI.
> Should I be using a different version of Nginx?

0.4.13 is pretty old, but I'm still surprised by that amount of overhead. Maybe you could try with the latest (0.5.12) version?

> I would say though that Nginx is very easy to setup and I do like it,
> even if the FastCGI setup doesn't seem faster than the HTTP setup with
> Pylons.

Most of the benchmarks I've seen (which is only a couple) have shown FastCGI on Nginx to be slightly faster than proxying, although by a very small amount.

Regards,
Cliff
Re: Fast Python webserver
On 2/18/07, Cliff Wells [EMAIL PROTECTED] wrote:
> On Sun, 2007-02-18 at 17:50 +, James Gardner wrote:
> > Unless I've set something up very wrong, that means that there is about
> > a 1ms overhead using Nginx as a proxy compared to doing the requests
> > directly but that using HTTP is about 5 times faster than using
> > FastCGI. Should I be using a different version of Nginx?
>
> 0.4.13 is pretty old, but I'm still surprised by that amount of
> overhead. Maybe you could try with the latest (0.5.12) version?

Well by default nginx is going to cache the proxied server's response before sending it to the client, that could explain the 1msec or so. I'm sure you'd have different results over a slower link where it makes more sense to do that kind of caching.

> > I would say though that Nginx is very easy to setup and I do like it,
> > even if the FastCGI setup doesn't seem faster than the HTTP setup with
> > Pylons.
>
> Most of the benchmarks I've seen (which is only a couple) have shown
> FastCGI on Nginx to be slightly faster than proxying, although by a very
> small amount.

However, proxying is a lot easier to set up than FastCGI. I'm sure there's things that can be done to paste.httpserver to make it come closer to FastCGI in performance.

-bob
Re: Fast Python webserver
On Sun, 2007-02-18 at 12:20 -0800, Bob Ippolito wrote:
> However, proxying is a lot easier to set up than FastCGI.

Absolutely. That's what I always use. I doubt the small performance gain is going to add up to much in the way of scalability anyway ;-) What I typically use is a small cluster of Pylons servers proxied to by Nginx, which is something else not as easily done with FastCGI.

> I'm sure there's things that can be done to paste.httpserver to make it
> come closer to FastCGI in performance.

Maybe. I'm going to be investigating fapws (and perhaps CherryPy's WSGI server as well) to see if there's any significant gain by using those rather than paste.httpserver (although I suspect most of the overhead is in the framework and application, not the HTTP server itself, so even significant gains in HTTP performance might not add up to much overall).

Regards,
Cliff
Re: Fast Python webserver
Hi guys,

> Maybe you could try with the latest (0.5.12) version?

I've rerun the tests with 0.5.12 and the difference is exactly the same. FastCGI is 5 times *slower* than simple HTTP!

> Well by default nginx is going to cache the proxied server's response
> before sending it to the client, that could explain the 1msec or so. I'm
> sure you'd have different results over a slower link where it makes more
> sense to do that kind of caching.

I'm not so worried about the 1ms difference. There is bound to be an overhead adding another component to the stack. What does seem strange is that people are claiming FastCGI performance is better than HTTP performance when in my tests the HTTP performance is 5 times faster!

> I'm sure there's things that can be done to paste.httpserver to make it
> come closer to FastCGI in performance.

Hang on a sec, FastCGI is 5 times slower than HTTP; surely it is the FastCGI implementation that requires improvement?

> Maybe. I'm going to be investigating fapws (and perhaps CherryPy's WSGI
> server as well) to see if there's any significant gain by using those
> rather than paste.httpserver (although I suspect most of the overhead is
> in the framework and application, not the HTTP server itself, so even
> significant gains in HTTP performance might not add up to much overall)

Again, my tests clearly show HTTP is 5 times faster than FastCGI. Am I missing a trick?

Cliff: do you still have the links to the benchmarks you mentioned? I'd be interested to see the setup being tested. Perhaps you are proxying to multiple FastCGI backends, which is of course going to be faster than a single HTTP backend, but you can also proxy to multiple HTTP backends, so that isn't comparing like with like. I'd really like to get to the bottom of these rumours, because if there is a faster way of serving a Pylons app I'm keen to document it so that everyone can benefit.

Cheers,
James
Re: Fast Python webserver
James Gardner wrote:
> Hi guys,
>
> > Maybe you could try with the latest (0.5.12) version?
>
> I've rerun the tests with 0.5.12 and the difference is exactly the same.
> FastCGI is 5 times *slower* than simple HTTP!

Someone mentioned caching -- are you sure that the HTTP server is getting all the requests? If Nginx is caching some responses and not passing them through, it will of course be much faster.

> Perhaps you are proxying to multiple FastCGI backends which is of course
> going to be faster than a single HTTP backend but you can also proxy to
> multiple HTTP backends so that isn't comparing like with like.

It shouldn't dramatically improve performance to use multiple FastCGI backends. Unless you have a SMP machine or something, which could change performance in all kinds of ways.

-- Ian Bicking | [EMAIL PROTECTED] | http://blog.ianbicking.org
Re: Fast Python webserver
Hi Ian,

> > I've rerun the tests with 0.5.12 and the difference is exactly the
> > same. FastCGI is 5 times *slower* than simple HTTP!
>
> Someone mentioned caching -- are you sure that the HTTP server is
> getting all the requests? If Nginx is caching some responses and not
> passing them through, it will of course be much faster.

Yes, I'm sure there is no caching when using HTTP, because I can benchmark the Paste server directly and it is about 1ms faster per request than using Nginx to proxy to it.

> > Perhaps you are proxying to multiple FastCGI backends which is of
> > course going to be faster than a single HTTP backend but you can also
> > proxy to multiple HTTP backends so that isn't comparing like with
> > like.
>
> It shouldn't dramatically improve performance to use multiple FastCGI
> backends. Unless you have a SMP machine or something, which could change
> performance in all kinds of ways.

Fair point, but it depends a bit on what is going on that is making the FastCGI performance so poor.

James
Re: Fast Python webserver
On 2/18/07, Ian Bicking [EMAIL PROTECTED] wrote:
> James Gardner wrote:
> > Hi guys,
> >
> > > Maybe you could try with the latest (0.5.12) version?
> >
> > I've rerun the tests with 0.5.12 and the difference is exactly the
> > same. FastCGI is 5 times *slower* than simple HTTP!
>
> Someone mentioned caching -- are you sure that the HTTP server is
> getting all the requests? If Nginx is caching some responses and not
> passing them through, it will of course be much faster.

It definitely does not cache any proxied requests ever, unless you're using the memcached module, but that's very explicit.

> > Perhaps you are proxying to multiple FastCGI backends which is of
> > course going to be faster than a single HTTP backend but you can also
> > proxy to multiple HTTP backends so that isn't comparing like with
> > like.
>
> It shouldn't dramatically improve performance to use multiple FastCGI
> backends. Unless you have a SMP machine or something, which could change
> performance in all kinds of ways.

I'm curious as to why anyone would want to use FastCGI in the first place if proxying is available? Implementation-wise, there's very little reason why FastCGI would be markedly faster or slower than the HTTP protocol.

-bob
Re: Fast Python webserver
Hi Bob,

> I'm curious as to why anyone would want to use FastCGI in the first
> place if proxying is available? Implementation wise, there's very little
> reason why FastCGI would be markedly faster or slower than the HTTP
> protocol.

Agreed, there's no point unless FastCGI is significantly faster, which would be unexpected. Since the tests show it isn't faster, I'm going to carry on using my existing HTTP setup. We can put the FastCGI rumour to bed.

Cheers,
James
Re: Pagination research
Sorry... I was offline for a while (thanks, telco)...

On Tuesday 06 February 2007 00:43, David Smith wrote:
> Regarding queries, you don't need them when you're using assign_mapper

For normal queries like model.mytable.get_by(name='Chris') it's great to use the additional methods that assign_mapper gives me. [1] And it's also easy to paginate a whole database table. But what if I want to page only through mytable where certain criteria match? Like I just want to see pages of entries that start with 'C'. As soon as I give the paginator something like model.mytable.select(), I select everything and don't use proper LIMITs.

Understood. But what is the correct way to pass something to the paginator? I thought of something like:

    paginator.paginate(model.mytable.query(name.like='C%'))

> Anyway, I sent up a patch to fix the paginator. Try it out. With the
> patch, you send the paginator anything with count and select methods,
> like model.Person or a select and where clause.

I'm not sure why you/we don't just take the object class that a query is derived from. Why is isinstance(obj, 'sqlalchemy.orm.query') wrong?

Cheers
Christoph (who still didn't give up with his paginator redesign)

[1] By the way, I found the URL where these methods are documented, just in case anyone wonders: http://www.sqlalchemy.org/docs/plugins.myt#plugins_assignmapper
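A hypothetical sketch of the "anything with count and select methods" interface the patch is described as taking -- this is not the actual patch, and `FakeTable` is an invented stand-in for an assign_mapper'd class; a real backend would issue `COUNT(*)` and `LIMIT`/`OFFSET` queries instead of slicing a list:

```python
class FakeTable:
    """Invented stand-in for a paginatable collection (list-backed)."""
    def __init__(self, rows):
        self.rows = rows
    def count(self):
        return len(self.rows)
    def select(self, offset, limit):
        # A real implementation would apply LIMIT/OFFSET in SQL.
        return self.rows[offset:offset + limit]

class Paginator:
    """Pages through anything exposing count() and select(offset, limit)."""
    def __init__(self, collection, per_page=10):
        self.collection = collection
        self.per_page = per_page
    def page(self, number):  # pages are 1-based
        offset = (number - 1) * self.per_page
        return self.collection.select(offset, self.per_page)
    @property
    def page_count(self):
        total = self.collection.count()
        return (total + self.per_page - 1) // self.per_page

table = FakeTable(list(range(25)))
paginator = Paginator(table, per_page=10)
print(paginator.page_count)  # 25 rows at 10 per page
print(paginator.page(3))     # the last, partial page
```

The point of the duck-typed interface is exactly the question raised above: anything filtered (e.g. only names starting with 'C') can be paginated, as long as it can report a count and fetch one window of rows.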
Re: Fast Python webserver
Robert Leftwich wrote:
> As I'm the one that said it was faster earlier in the thread, I think I
> should be the one to put the rumour to bed :-))

Sure. I've heard the same rumour in other places too though actually, particularly related to Rails but also with Pylons, eg: http://www.rkblog.rk.edu.pl/w/p/pylons-benchmark-various-servers/

> As mentioned in the earlier post, I was playing around with some
> preliminary configurations on my laptop and it was faster using
> flup/fastcgi (via ab, not wall clock).

Interesting. I was using ab too.

> I have been waiting for another server to be setup side by side at my
> hosting company before I did any 'real' testing, as any testing done
> outside the host network just saturated the b/w I had available from my
> office (nothing faster than 128k ISDN where I live/work) w/o getting the
> server warmed up.

OK, I was testing on localhost.

> I probably should have just kept my email shut until doing some real
> testing - stay tuned.

I look forward to the results. Could you let me know the platform you are on too please? Sometimes Debian etch is a bit weird!

Cheers,
James
Re: Fast Python webserver
James Gardner wrote:
> OK, I was testing on localhost.

So was I, on the laptop. Any testing to the real box was useless at 128k.

> Could you let me know the platform you are on too please? Sometimes
> Debian etch is a bit weird!

Ubuntu 6.06 on both servers and laptops.

Robert
Re: Fast Python webserver
On Sun, 2007-02-18 at 23:42 +, James Gardner wrote:
> I've heard the same rumour in other places too though actually,
> particularly related to rails but also with Pylons eg:
> http://www.rkblog.rk.edu.pl/w/p/pylons-benchmark-various-servers/

This is the benchmark I remember seeing, although I've heard others mention similar results.

> OK, I was testing on localhost.

I don't think testing on localhost is ever going to give accurate results, since the client load is added to the server load on the same machine.

Regards,
Cliff
[Routes] Default protocol?
I'm just switching access to the account management section of the site to use https and have a problem. If I set the protocol in the h.url_for() call on a page, then when the browser navigates to that https URL, all the URLs generated for the destination page (navigation, images, etc.) are also https, as they are all relative. It looks as though it is only possible to specify the protocol in the url_for() call, which means I have to find every instance of it and add protocol='http'/'https' as appropriate. Is it possible to set the protocol in the configuration, i.e. mapper.connect(), rather than in every call to url_for(), or have I missed something?

Robert
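One possible workaround, sketched without the real Routes API: instead of passing protocol='https' at every call site, wrap the URL generator once and decide the scheme from the controller name. Everything here is invented for illustration -- `base_url_for` stands in for h.url_for and just builds a path, and `SECURE_CONTROLLERS` is a hypothetical app-level setting:

```python
# Hypothetical site-wide setting: controllers that must be served over https.
SECURE_CONTROLLERS = {'account'}

def base_url_for(controller, action='index'):
    """Stand-in for the framework's URL generator; builds a bare path."""
    return '/%s/%s' % (controller, action)

def url_for(controller, action='index', host='example.com'):
    """Wrapper that makes generated URLs absolute with the right scheme."""
    scheme = 'https' if controller in SECURE_CONTROLLERS else 'http'
    return '%s://%s%s' % (scheme, host, base_url_for(controller, action))

print(url_for('account', 'login'))  # absolute https URL
print(url_for('news'))              # ordinary http URL
```

Because the wrapper always emits absolute URLs, links generated on an https page for non-secure sections point back to http explicitly, which addresses the "everything relative stays https" problem described above.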
Re: [Paste] problems with paster serve reload
I've been getting this on Gentoo ever since I started using Pylons two months ago, on two different versions of Pylons and Paste (both Python 2.4.x). Only I don't have to modify any files; merely running paster serve --reload and pressing Ctrl-C is enough to leave threads running which have to be manually killed. On the other hand, it *does* reload files properly, I think.

However, it works properly on my Kubuntu 6.10 system, using the system Python 2.5 and a new copy of Pylons 0.9.4.1, and the default paster create application. I also tried a workingenv with Python 2.4.4c1 and it works in that.

-- Mike Orr [EMAIL PROTECTED]
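For context on what --reload does, here is a rough sketch of the mtime-polling idea behind such reloaders. This is not Paste's actual implementation: a monitor thread compares file modification times on each poll and, on a hit, the real reloader exits so a wrapper process can restart the server (the leftover-thread symptom above is what happens when that shutdown doesn't take all threads with it).

```python
import os
import tempfile
import time

class Monitor:
    """Tracks mtimes for a set of files and reports which ones changed."""
    def __init__(self, paths):
        self.mtimes = {p: os.path.getmtime(p) for p in paths}
    def changed(self):
        hits = []
        for path, old in self.mtimes.items():
            new = os.path.getmtime(path)
            if new != old:
                self.mtimes[path] = new  # remember the new mtime
                hits.append(path)
        return hits

# Demonstrate: create a temp file, then simulate an edit by bumping mtime.
fd, path = tempfile.mkstemp()
os.close(fd)
mon = Monitor([path])
first = mon.changed()   # nothing has changed yet
future = time.time() + 10
os.utime(path, (future, future))  # simulate an edit
second = mon.changed()  # now the file is reported
print(first, second)
os.remove(path)
```

In the real setup, the monitor would run in its own daemon thread polling every second or so, and a change would trigger a clean process exit rather than just a report.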
Re: ANN: AuthKit 0.3.0pre4
Cliff Wells wrote:
> On Fri, 2007-02-16 at 15:02 +, James Gardner wrote:
> > I'd encourage anyone who is currently using the dev version to upgrade
> > to 0.3.0pre4 so we can see if there are any issues before 0.3.0 is
> > formally released in a couple of weeks time.
>
> So I rolled back to the previous version and all is working well again,
> but I'm left wondering if this was a bug in AuthKit or if I was doing
> something wrong and the new version suddenly exposed my mistake.

To add another data point, I upgraded to svn version 57 (pre5?) from v43 and it all worked as expected. It even appears to have corrected an intermittent problem someone reported while testing, where the authkit cookie was being set to an empty string (using a host with a leading dot) in addition to an existing authkit cookie set to the correct value (using a host without a leading dot). I hadn't had a chance to try and chase down this issue, so having it go away is a good thing.

FWIW, I'm using forwarding along with RemoteUser() and middleware.

Robert
Re: bug in webhelpers.rails.prototype.form_remote_tag()
Argh! I forgot one more point of evidence... to rule out that Pylons or Paste itself was implicated, I fired up Live HTTP Headers and watched the wire:

    save=save&product=(Not%20a%20Value!)&_=

That's what gets sent. I'm not sure what the trailing underscore is all about either, but it doesn't affect this problem. Thanks!
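The captured form body above appears to have lost its `&` separators in the archive; assuming the usual `key=value&key=value` encoding, the standard library shows exactly what the server receives from it, including the empty trailing `_` parameter:

```python
from urllib.parse import parse_qsl

# Reconstructed wire capture (the & separators are assumed).
body = 'save=save&product=(Not%20a%20Value!)&_='

# keep_blank_values=True is needed, or the empty "_" parameter is dropped.
params = parse_qsl(body, keep_blank_values=True)
print(params)
```

This makes the reported symptom concrete: the `product` field really does arrive as the literal string `(Not a Value!)` rather than the selected value.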
Re: Fast Python webserver
On Feb 18, 6:09 pm, Robert Leftwich [EMAIL PROTECTED] wrote:
> James Gardner wrote:
> > Agreed, there's no point unless FastCGI is significantly faster which
> > would be unexpected. Since the tests show it isn't faster I'm going to
> > carry on using my existing HTTP setup. We can put the FastCGI rumour
> > to bed.
>
> As I'm the one that said it was faster earlier in the thread, I think I
> should be the one to put the rumour to bed :-))
>
> As mentioned in the earlier post, I was playing around with some
> preliminary configurations on my laptop and it was faster using
> flup/fastcgi (via ab, not wall clock).

Benchmarking on a laptop can be misleading, since the CPU scaling makes it impossible to get consistent results.

-- Matt Good
Re: Fast Python webserver
Matt Good wrote:
> Benchmarking on a laptop can be misleading since the CPU scaling makes
> it impossible to get consistent results.

Yep. FWIW I kept on repeating the test until I got consistent results with the CPU maxed out, but I wasn't putting any weight in the results, as testing on the same machine as the server is not going to give meaningful results anyway - it was just a passing comment... honest :-)

Hopefully, all will be revealed later this week with two separate servers to test with.

Robert