Hi,

Mike Bayer, the author of SQLAlchemy, wrote a long article about
asyncio and databases:
http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/

IMO the most interesting part is the benchmark written for this article:
https://bitbucket.org/zzzeek/bigdata/

The benchmark inserts a lot of rows (more than 9 million) into a
PostgreSQL server using psycopg2 and aiopg. It compares the
performance of threads, gevent and asyncio. Bad news: asyncio is much
slower on this benchmark (between 1.3x and 3.3x slower).
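For reference, the asyncio side of such a benchmark boils down to many
workers each awaiting one small INSERT at a time. Here is a minimal
sketch of that loop; fake_execute() is a stand-in I wrote for aiopg's
cursor.execute() so the snippet runs without a PostgreSQL server (the
worker counts and query counts are made-up parameters, not the
benchmark's real ones):

```python
import asyncio
import time

async def fake_execute(query, args):
    # Stand-in for aiopg's "await cur.execute(...)"; a real run would
    # wait for the PostgreSQL server to answer here.
    await asyncio.sleep(0)

async def worker(n_queries, counter):
    # Each worker issues its INSERTs one by one, like one connection
    # taken from an aiopg pool.
    for i in range(n_queries):
        await fake_execute("INSERT INTO t (data) VALUES (%s)", (i,))
        counter[0] += 1

async def main(n_workers=10, n_queries=1000):
    counter = [0]
    start = time.perf_counter()
    # Run all workers concurrently on one event loop.
    await asyncio.gather(*(worker(n_queries, counter)
                           for _ in range(n_workers)))
    elapsed = time.perf_counter() - start
    return counter[0], elapsed

total, elapsed = asyncio.run(main())
print("%d queries in %.3f s" % (total, elapsed))
```

With a stubbed execute() this mostly measures the event loop's
per-query scheduling overhead, which is exactly the cost the article
discusses for workloads made of many tiny queries.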

Setting up the benchmark is not easy: you have to install and
configure a PostgreSQL server, find the best pool size for your setup,
and then analyze the benchmark results (there is no single number at
the end, just a long list of numbers). On my first setup (desktop:
benchmark, laptop: server, slow LAN), I had to stop the benchmark
after 2 hours.

Mike sees between 6,000 and 26,000 SQL INSERT queries per second
depending on his setup and on the benchmark parameters. Ah yes, there
are also options to tune the benchmark, but I don't think you are
supposed to use them.

I'm trying to reproduce the benchmark to check whether I get similar
results, and then to run asyncio in a profiler. I have never used
aiopg or psycopg2, and I don't remember the last time I installed a
PostgreSQL server :-)

Victor
