Ok, a few more tests with PyPy (JIT enabled) in the welcome app with
web2py 1.99.3 (2011-12-09 16:18:03) stable - *not the latest*

*PyPy*
ab -n 50 -c 10 http://127.0.0.1:8000/welcome/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: Rocket
Server Hostname: 127.0.0.1
Server Port: 8000
Document Path: /welcome/
Document Length: 11187 bytes
Concurrency Level: 10
Time taken for tests: 2.197 seconds
Complete requests: 50
Failed requests: 0
Write errors: 0
Total transferred: 580350 bytes
HTML transferred: 559350 bytes
Requests per second: 22.76 [#/sec] (mean)
Time per request: 439.358 [ms] (mean)
Time per request: 43.936 [ms] (mean, across all concurrent requests)
Transfer rate: 257.99 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.4 0 1
Processing: 243 415 66.8 420 545
Waiting: 243 415 66.8 419 544
Total: 244 415 66.7 420 546
Percentage of the requests served within a certain time (ms)
50% 420
66% 439
75% 459
80% 468
90% 515
95% 537
98% 546
99% 546
100% 546 (longest request)
*PYTHON*
ab -n 50 -c 10 http://127.0.0.1:8181/welcome/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: Rocket
Server Hostname: 127.0.0.1
Server Port: 8181
Document Path: /welcome/
Document Length: 11187 bytes
Concurrency Level: 10
Time taken for tests: 2.984 seconds
Complete requests: 50
Failed requests: 0
Write errors: 0
Total transferred: 580400 bytes
HTML transferred: 559350 bytes
Requests per second: 16.75 [#/sec] (mean)
Time per request: 596.850 [ms] (mean)
Time per request: 59.685 [ms] (mean, across all concurrent requests)
Transfer rate: 189.93 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 218 496 368.8 445 2984
Waiting: 211 493 368.9 443 2984
Total: 218 496 368.8 445 2984
Percentage of the requests served within a certain time (ms)
50% 445
66% 467
75% 491
80% 512
90% 558
95% 620
98% 2984
99% 2984
100% 2984 (longest request)
*NGINX + uWSGI*
ab -n 50 -c 10 http://127.0.0.1:80/welcome/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking 127.0.0.1 (be patient).....done
Server Software: nginx/1.1.4
Server Hostname: 127.0.0.1
Server Port: 80
Document Path: /welcome/
Document Length: 11187 bytes
Concurrency Level: 10
Time taken for tests: 0.639 seconds
Complete requests: 50
Failed requests: 0
Write errors: 0
Total transferred: 578450 bytes
HTML transferred: 559350 bytes
Requests per second: 78.23 [#/sec] (mean)
Time per request: 127.829 [ms] (mean)
Time per request: 12.783 [ms] (mean, across all concurrent requests)
Transfer rate: 883.82 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 0
Processing: 24 116 22.9 123 143
Waiting: 24 116 22.9 123 143
Total: 24 116 22.9 123 143
Percentage of the requests served within a certain time (ms)
50% 123
66% 124
75% 127
80% 127
90% 131
95% 131
98% 143
99% 143
100% 143 (longest request)
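To put the three runs side by side, the relative throughput can be computed from the requests-per-second figures reported by ab above (a quick sketch; the numbers in the dict just restate the ab output, using plain CPython/Rocket as the baseline):

```python
# Requests-per-second figures reported by ab in the three runs above.
rps = {"PyPy": 22.76, "CPython": 16.75, "nginx + uWSGI": 78.23}

# Relative throughput, with the plain CPython/Rocket run as baseline.
baseline = rps["CPython"]
for name, value in sorted(rps.items(), key=lambda kv: kv[1]):
    print(f"{name}: {value / baseline:.2f}x")
```

That works out to roughly 1.36x for PyPy over CPython, and about 4.67x for nginx + uWSGI.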
PyPy is faster, but the gain probably isn't justified yet.
I've also included the combination of nginx + uWSGI, which I found to be the
fastest setup so far.
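For anyone wanting to reproduce the nginx + uWSGI run, a minimal sketch would look something like the server block below; note the socket path, web2py directory, and process count here are assumptions, not my exact setup. uWSGI would be started against web2py's bundled wsgihandler module, e.g. `uwsgi --socket /tmp/web2py.sock --chdir /path/to/web2py --module wsgihandler --master --processes 4`.

```nginx
# Assumed nginx server block: forwards everything to the uWSGI socket
server {
    listen 80;
    server_name 127.0.0.1;
    location / {
        include uwsgi_params;                  # standard uWSGI variables
        uwsgi_pass unix:///tmp/web2py.sock;    # must match uwsgi --socket
    }
}
```

For a fair comparison with the Rocket numbers you'd also want static files served by nginx directly, but the block above is enough to hit /welcome/ through uWSGI.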
On Monday, March 12, 2012 9:48:58 PM UTC, Anthony wrote:
>
> On Monday, March 12, 2012 5:02:00 PM UTC-4, Francisco Costa wrote:
>>
>> On Monday, March 12, 2012 7:46:50 PM UTC, Anthony wrote:
>>>
>>> Maybe related to this: https://bugs.pypy.org/issue1051. Have you tried
>>> 1.7?
>>>
>>
>> Nope, I've tried the latest source from Bitbucket
>>
>
> Sorry, didn't notice you said "no jit" -- that's the issue.
>
> Anthony
>