Here are the configuration files and the code used for both tests.
--------------------------------------------
Configuration file for Lighttpd
--------------------------------------------
server.modules = (
    "mod_rewrite",
    "mod_fastcgi",
)
server.document-root = "/var/www/"
server.errorlog = "/var/log/lighttpd/error.log"
server.pid-file = "/var/run/lighttpd.pid"
## virtual directory listings
dir-listing.encoding = "utf-8"
server.dir-listing = "enable"
server.username = "www-data"
server.groupname = "www-data"
fastcgi.server = ( "/code-fastcgi.py" =>
    (( "socket"    => "/tmp/fastcgi.socket",
       "bin-path"  => "/var/www/code-fastcgi.py",
       "max-procs" => 1
    ))
)

url.rewrite-once = (
    "^/favicon.ico$" => "/static/favicon.ico",
    "^/static/(.*)$" => "/static/$1",
    "^/(.*)$"        => "/code-fastcgi.py/$1",
)
--------------------------------------------
Configuration file for Nginx
--------------------------------------------
worker_processes 2;
error_log logs/error.log info;
pid logs/nginx.pid;
events {
    worker_connections 1024;
}

env HOME;
env PYTHONPATH=/usr/bin/python;

http {
    include conf/mime.types;
    default_type application/octet-stream;
    sendfile on;
    keepalive_timeout 65;

    wsgi_python_optimize 2;
    wsgi_python_executable /usr/bin/python;
    wsgi_python_home /usr/;
    wsgi_enable_subinterpreters on;

    server {
        listen 80;
        server_name localhost;
        include conf/wsgi_vars;

        location / {
            #client_body_buffer_size 50;
            wsgi_pass /usr/local/nginx/nginx.py;
            wsgi_pass_authorization off;
            wsgi_script_reloading on;
            wsgi_use_main_interpreter on;
        }

        location /wsgi {
            #client_body_buffer_size 50;
            wsgi_var TEST test;
            wsgi_var FOO bar;
            wsgi_var EMPTY "";
            # override existing HTTP_ variables
            wsgi_var HTTP_USER_AGENT "nginx";
            wsgi_var HTTP_COOKIE $http_cookie;
            wsgi_pass /usr/local/nginx/nginx-2.py main;
            wsgi_pass_authorization on;
            wsgi_script_reloading off;
            wsgi_use_main_interpreter off;
        }

        location /wsgi-webpy {
            wsgi_pass /usr/local/nginx/webpy-code.py;
        }
    }
}
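The wsgi_var directives in the /wsgi location inject extra keys into the WSGI environ alongside the usual CGI variables. As a sketch (plain WSGI in Python 3 syntax, for illustration only; not the handler actually benchmarked), an application can read them like any other environ key:

```python
def application(environ, start_response):
    # TEST, FOO, and the overridden HTTP_USER_AGENT from the
    # wsgi_var directives arrive as plain environ entries.
    body = 'TEST=%s FOO=%s UA=%s' % (
        environ.get('TEST', ''),
        environ.get('FOO', ''),
        environ.get('HTTP_USER_AGENT', ''),
    )
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [body.encode('utf-8')]
```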
--------------------------------------------
Code for Lighttpd
--------------------------------------------
#!/usr/bin/env python
import web

urls = (
    '/(.*)', 'hello'
)

class hello:
    def GET(self, name):
        i = web.input(times=1)
        if not name:
            name = 'world'
        for c in xrange(int(i.times)):
            print 'Hello,', name + '!'

if __name__ == "__main__":
    web.run(urls, globals())
--------------------------------------------
Code for Nginx
--------------------------------------------
import web

urls = (
    '/(.*)', 'hello'
)

class hello:
    def GET(self, name):
        i = web.input(times=1)
        if not name:
            name = 'world'
        for c in xrange(int(i.times)):
            print 'Hello,', name + '!'

application = web.wsgifunc(web.webpyfunc(urls, globals()))
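The only difference between the two scripts is the last line: web.run() serves the app over FastCGI for lighttpd, while web.wsgifunc() exposes a plain WSGI callable for nginx's mod_wsgi. For reference, the handler can be sketched as a standalone WSGI callable without web.py at all (a hypothetical Python 3 port for illustration; the benchmarked code used web.py on Python 2):

```python
from urllib.parse import parse_qs

def application(environ, start_response):
    # Greet the path component, repeated ?times=N times,
    # mirroring the web.py hello handler above.
    name = environ.get('PATH_INFO', '/').lstrip('/') or 'world'
    qs = parse_qs(environ.get('QUERY_STRING', ''))
    times = int(qs.get('times', ['1'])[0])
    body = ''.join('Hello, %s!\n' % name for _ in range(times))
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [body.encode('utf-8')]
```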
On Jan 7, 11:29 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> Hi David,
> thanks for the benchmarks.
> Can you share the configuration files for both nginx and lighttpd that you
> used?
>
> On Jan 7, 2008 7:55 AM, David Cancel <[EMAIL PROTECTED]> wrote:
>
>
>
> > I know.. I know... Simple benchmarks mean nothing but I couldn't help
> > playing with the new(ish) mod_wsgi module for my favorite webserver
> > Nginx.
>
> > Nginx:http://nginx.net/
> > Nginx mod_wsgi module:http://wiki.codemongers.com/NginxNgxWSGIModule
>
> > I tested Nginx vs. the recommended setup of Lighttpd/Fastcgi. These
> > very simple and flawed tests were run on Debian Etch running under
> > virtualization (Parallels) on my Macbook Pro. Hey I said they were
> > flawed.. :-)
>
> > The results show Nginx/WSGI performing 3x as fast as Lighttpd/Fastcgi,
> > over 1000 requests per second!!
>
> > I tested both with Keep-Alives on and off. I'm not sure why Nginx/WSGI
> > performed 2x as fast with keep-alives on.
>
> > *********** Full results below *************
>
> > --------------------------------------------
> > Nginx 0.5.34 - Keepalives On
> > ---------------------------------------------
> > ab -c 10 -n 1000 -k http://10.211.55.4/wsgi-webpy/david
> > This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3
>
> > Server Software: nginx/0.5.34
> > Server Hostname: 10.211.55.4
> > Server Port: 80
>
> > Document Path: /wsgi-webpy/david
> > Document Length: 14 bytes
>
> > Concurrency Level: 10
> > Time taken for tests: 0.970 seconds
> > Complete requests: 1000
> > Failed requests: 0
> > Broken pipe errors: 0
> > Keep-Alive requests: 1001
> > Total transferred: 136136 bytes
> > HTML transferred: 14014 bytes
> > ** Requests per second: 1030.93 [#/sec] (mean) **
> > Time per request: 9.70 [ms] (mean)
> > Time per request: 0.97 [ms] (mean, across all concurrent requests)
> > Transfer rate: 140.35 [Kbytes/sec] received
>
> > Connection Times (ms)
> > min mean[+/-sd] median max
> > Connect: 0 0 0.4 0 5
> > Processing: 1 9 4.3 9 26
> > Waiting: 0 9 4.2 9 25
> > Total: 1 9 4.3 9 26
>
> > Percentage of the requests served within a certain time (ms)
> > 50% 9
> > 66% 11
> > 75% 12
> > 80% 13
> > 90% 15
> > 95% 17
> > 98% 20
> > 99% 22
> > 100% 26 (last request)
>
> > --------------------------------------------
> > Nginx 0.5.34 - No Keepalives
> > ---------------------------------------------
> > ab -c 10 -n 1000 http://10.211.55.4/wsgi-webpy/david
> > This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3
>
> > Server Software: nginx/0.5.34
> > Server Hostname: 10.211.55.4
> > Server Port: 80
>
> > Document Path: /wsgi-webpy/david
> > Document Length: 14 bytes
>
> > Concurrency Level: 10
> > Time taken for tests: 2.378 seconds
> > Complete requests: 1000
> > Failed requests: 0
> > Broken pipe errors: 0
> > Total transferred: 131131 bytes
> > HTML transferred: 14014 bytes
> > ** Requests per second: 420.52 [#/sec] (mean) **
> > Time per request: 23.78 [ms] (mean)
> > Time per request: 2.38 [ms] (mean, across all concurrent requests)
> > Transfer rate: 55.14 [Kbytes/sec] received
>
> > Connection Times (ms)
> > min mean[+/-sd] median max
> > Connect: 0 4 2.9 3 26
> > Processing: 8 19 8.8 18 136
> > Waiting: 0 19 8.8 17 135
> > Total: 8 23 8.9 21 142
>
> > Percentage of the requests served within a certain time (ms)
> > 50% 21
> > 66% 24
> > 75% 26
> > 80% 28
> > 90% 34
> > 95% 40
> > 98% 45
> > 99% 47
> > 100% 142 (last request)
>
> > *********************************************************************
>
> > --------------------------------------------
> > Lighttpd 1.4.13 - Keepalives On
> > ---------------------------------------------
> > ab -c 10 -n 1000 -k http://10.211.55.4/david
> > This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3
>
> > Server Software: lighttpd/1.4.13
> > Server Hostname: 10.211.55.4
> > Server Port: 80
>
> > Document Path: /david
> > Document Length: 14 bytes
>
> > Concurrency Level: 10
> > Time taken for tests: 2.901 seconds
> > Complete requests: 1000
> > Failed requests: 1
> > (Connect: 0, Length: 1, Exceptions: 0)
> > Broken pipe errors: 0
> > Keep-Alive requests: 942
> > Total transferred: 138711 bytes
> > HTML transferred: 14001 bytes
> > ** Requests per second: 344.71 [#/sec] (mean) **
> > Time per request: 29.01 [ms] (mean)
> > Time per request: 2.90 [ms] (mean, across all concurrent requests)
> > Transfer rate: 47.81 [Kbytes/sec] received
>
> > Connection Times (ms)
> > min mean[+/-sd] median max
> > Connect: 0 0 1.1 0 21
> > Processing: 3 28 29.3 22 385
> > Waiting: 3 28 29.3 22 385
> > Total: 3 28 29.3 22 385
>
> > Percentage of the requests served within a certain time (ms)
> > 50% 22
> > 66% 26
> > 75% 31
> > 80% 34
> > 90% 48
> > 95% 60
> > 98% 100
> > 99% 164
> > 100% 385 (last request)
>
> > --------------------------------------------
> > Lighttpd 1.4.13 - No Keepalives
> > ---------------------------------------------
> > ab -c 10 -n 1000 http://10.211.55.4/david
> > This is ApacheBench, Version 1.3d <$Revision: 1.73 $> apache-1.3
>
> > Server Software: lighttpd/1.4.13
> > Server Hostname: 10.211.55.4
> > Server Port: 80
>
> > Document Path: /david
> > Document Length: 14 bytes
>
> > Concurrency Level: 10
> > Time taken for tests: 4.017 seconds
> > Complete requests: 1000
> > Failed requests: 1
> > (Connect: 0, Length: 1, Exceptions: 0)
> > Broken pipe errors: 0
> > Total transferred: 134269 bytes
> > HTML transferred: 14029 bytes
> > ** Requests per second: 248.94 [#/sec] (mean) **
> > Time per request: 40.17 [ms] (mean)
> > Time per request: 4.02 [ms] (mean, across all concurrent requests)
> > Transfer rate: 33.43 [Kbytes/sec] received
>
> > Connection Times (ms)
> > min mean[+/-sd] median max
> > Connect: 0 3 4.9 2 68
> > Processing: 3 36 49.6 28 852
> > Waiting: 2 35 49.6 28 852
> > Total: 3 39 50.1 30 855
>
> > Percentage of the requests served within a certain time (ms)
> > 50% 30
> > 66% 36
> > 75% 41
> > 80% 44
> > 90% 61
> > 95% 87
> > 98% 148
> > 99% 252
> > 100% 855 (last request)
You received this message because you are subscribed to the Google Groups "web.py" group.