Could you post your code with aiohttp? A 10x difference seems too big.

On Thu, Mar 26, 2015 at 4:36 PM -0700, "Ludovic Gasc" <[email protected]> wrote:
For two weeks, I've been trying to use AsyncIO on top of PyPy3.3.

From my experience, two main elements aren't present in PyPy3.3:
1. pip doesn't work on PyPy3 => for pure-Python libraries, you can install the packages in a CPython pyvenv and point PYTHONPATH at them.
2. The monotonic clock and time.get_clock_info() aren't implemented => the workaround I've found is to use a standard clock (I know it's important to use a monotonic clock; this is only for tests) and hardcode the answer of time.get_clock_info(). At least for me, implementing a monotonic clock in PyPy isn't easy.
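The clock workaround described in point 2 can be sketched like this (a minimal sketch: the hardcoded clock metadata values here are assumptions for illustration, not the exact values used):

```python
import time

# Fallback sketch: on an interpreter that lacks time.monotonic and
# time.get_clock_info (as PyPy3.3 did at the time), substitute the wall
# clock and hardcode the clock metadata. Only acceptable for tests, since
# time.time() can jump backwards when the system clock is adjusted.
if not hasattr(time, "monotonic"):
    time.monotonic = time.time

if not hasattr(time, "get_clock_info"):
    from collections import namedtuple

    _ClockInfo = namedtuple(
        "_ClockInfo", "implementation monotonic adjustable resolution")

    def _get_clock_info(name):
        # Hardcoded answer (hypothetical values): report the wall clock
        # for every requested clock name.
        return _ClockInfo(implementation="time()", monotonic=False,
                          adjustable=True, resolution=1e-6)

    time.get_clock_info = _get_clock_info

# The wall clock now stands in for the monotonic clock.
t0 = time.monotonic()
assert time.monotonic() >= t0
```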
Nevertheless, I've tested:
1. aiotests: all tests passed.
2. Several random scripts from the AsyncIO docs: everything is OK.
3. aiohttp examples: no issues.
4. API-Hour + aiohttp.web: it runs like on CPython 3.4.
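For reference, the general shape of a /json endpoint like the one exercised below can be sketched with only the standard library. This is an assumption-laden sketch, not the API-Hour or aiohttp.web code, and it uses modern async/await where the code discussed here used "yield from" coroutines:

```python
import asyncio
import json

PAYLOAD = json.dumps({"message": "Hello, World!"}).encode()

async def handle(reader, writer):
    # Read and discard the request line; headers are ignored for brevity.
    await reader.readline()
    head = (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: application/json\r\n"
            b"Content-Length: %d\r\n"
            b"Connection: close\r\n\r\n" % len(PAYLOAD))
    writer.write(head + PAYLOAD)
    await writer.drain()
    writer.close()

async def main():
    # Port 0 lets the OS pick a free port, so the sketch is self-testing.
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    host, port = server.sockets[0].getsockname()
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(b"GET /json HTTP/1.1\r\nHost: bench\r\n\r\n")
    await writer.drain()
    response = await reader.read()  # server closes, so read until EOF
    writer.close()
    server.close()
    await server.wait_closed()
    return response

response = asyncio.run(main())
assert response.endswith(PAYLOAD)
```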
It was a good surprise for me; I didn't think it would be possible to use PyPy directly right now. I found only one bug, with "yield from", and it was quickly fixed by the PyPy developers.
As usual, I've done some benchmarks. You should note four points:
1. PyPy 3.3 is not yet released: some improvements will certainly be available by the time it is released.
2. My PyPy benchmark doesn't use ujson, because ujson works only with CPython. Some values would change if ujson were ported to PyPy.
3. The longer I run benchmarks against the PyPy daemon, the better the performance gets, as the JIT warms up. I launched several 5-minute benchmarks before running these 1-minute ones.
4. This use case is a micro-benchmark: there are no connections to a backend like PostgreSQL or Redis. In more realistic use cases, AsyncIO has better results.
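One common way to keep a single code path across both interpreters for point 2 is to try ujson and fall back to the stdlib json module (a sketch, not the benchmark code):

```python
# Prefer ujson when its C extension is importable (CPython), and fall
# back to the pure-Python-compatible stdlib json module elsewhere (PyPy).
try:
    import ujson as json
except ImportError:
    import json

# Either module round-trips the same data, so callers don't care which
# implementation was picked.
data = json.dumps({"hello": "world"})
assert json.loads(data) == {"hello": "world"}
```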
I've redone the same benchmark I told you about a few days ago, with a simple JSON payload.
PyPy + API-Hour + aiohttp.web:
$ wrk -t8 -c256 -d1m http://192.168.2.100:8008/json
Running 1m test @ http://192.168.2.100:8008/json
  8 threads and 256 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    32.89ms   18.58ms 260.08ms   78.10%
    Req/Sec     0.99k   117.11     1.42k    71.97%
  472966 requests in 1.00m, 87.96MB read
Requests/sec:   7883.06
Transfer/sec:      1.47MB
PyPy + API-Hour + AsyncIO:
$ wrk -t8 -c256 -d1m http://192.168.2.100:8009/json
Running 1m test @ http://192.168.2.100:8009/json
  8 threads and 256 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.32ms   11.99ms 224.86ms   95.26%
    Req/Sec    12.63k     4.46k   43.67k    70.53%
  5744657 requests in 1.00m, 0.96GB read
Requests/sec:  95760.07
Transfer/sec:     16.35MB

As a reminder, here are the results I gave you for WSGI and API-Hour on CPython:
WSGI:
$ wrk -t8 -c256 -d1m http://192.168.2.100:8080/json
Running 1m test @ http://192.168.2.100:8080/json
  8 threads and 256 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.81ms    2.24ms  32.56ms   99.04%
    Req/Sec    20.49k     3.09k   52.56k    81.39%
  9300719 requests in 1.00m, 1.59GB read
Requests/sec: 155019.04
Transfer/sec:     27.05MB
API-Hour + aiohttp.web:
$ wrk -t8 -c256 -d1m http://192.168.2.100:8008/json
Running 1m test @ http://192.168.2.100:8008/json
  8 threads and 256 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    18.36ms   11.36ms 117.66ms   67.44%
    Req/Sec     1.79k   238.97     2.65k    74.02%
  854843 requests in 1.00m, 158.16MB read
Requests/sec:  14248.79
Transfer/sec:      2.64MB
API-Hour + AsyncIO:

$ wrk -t8 -c256 -d1m http://192.168.2.100:8009/json
Running 1m test @ http://192.168.2.100:8009/json
  8 threads and 256 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.96ms    3.55ms  60.51ms   99.05%
    Req/Sec    19.77k     3.06k   55.78k    85.49%
  8972814 requests in 1.00m, 1.49GB read
Requests/sec: 149565.74
Transfer/sec:     25.39MB
As you can see, in this specific benchmark with these hit values, PyPy3.3 is slower than CPython with AsyncIO for now. But I imagine the PyPy dev team's goal is more to be fully compliant with CPython 3.3 than to have the best performance.
Finally, if you are interested in helping the PyPy project but have no time, you can donate money: http://pypy.org/py3donate.html
I've set up a recurring donation for the PyPy project.
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/
