I have just tried both drivers -- but in an apples-and-oranges comparison. I used pg8000 with PyPy and web2py, because pg8000 is pure Python and so can run under PyPy. I used psycopg2 with CPython 2.7 against the same database and application.
My application begins with a bulk load of a CSV file containing about 450,000 records of roughly 10 fields each.

Inserting the file using psycopg2 and CPython 2.7 took about 4-5 minutes on a quad-core i7 iMac. Memory use was about 20 MB for the largest postgres process and about the same for Python. The task was handled by the web2py scheduler.

The PyPy/pg8000 version of the load took almost an hour, but that number is deceptive: the real problem is that it overwhelmed the computer's 12 GB of memory. Both the PyPy task and the postgres task ran amok with memory requirements; the postgres process grew past 8 GB and forced the machine into swapping, killing response time.

PyPy is known for being somewhat of a memory hog (I was trying version 2.0.2), but it worked darned well in web2py; this was the only problem I encountered. Since my code relies heavily on modules, the speedup under PyPy was noticeable: some of my longer tasks, such as creating PDF files, took about 1/3 to 1/5 the time under PyPy compared to CPython 2.7.1.

I know this is not an accurate comparison (because of the PyPy component), but the runaway memory use of postgres under pg8000 concerned me, so I thought I'd mention it.

-- Joe B.

On Wednesday, May 1, 2013 4:59:26 PM UTC-7, Marco Tulio wrote:
> Are there any advantages to one or the other, or are they basically the same thing?
> I'm using psycopg2 atm.
>
> --
> []'s
> Marco Tulio
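P.S. For anyone who hits a similar memory blow-up on a bulk load, one pattern that bounds memory with any DB-API driver is to commit in fixed-size chunks instead of accumulating the whole file in a single giant transaction. Below is a minimal, hedged sketch of that idea; the helper name, table, and columns are my own invention, and I'm using the stdlib's sqlite3 as a stand-in driver so the snippet is self-contained (with pg8000 or psycopg2 the pattern is the same, but the placeholder style is `%s` rather than `?`):

```python
import csv
import io
import itertools
import sqlite3


def insert_in_chunks(conn, sql, rows, chunk_size=10_000):
    """Insert rows through any DB-API driver, committing every chunk_size rows.

    Committing per chunk keeps the state held by a single transaction bounded,
    instead of letting one giant insert accumulate hundreds of thousands of
    rows' worth of memory on the client and server.
    """
    cur = conn.cursor()
    total = 0
    it = iter(rows)
    while True:
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        cur.executemany(sql, chunk)
        conn.commit()
        total += len(chunk)
    return total


# Demo with an in-memory sqlite3 database standing in for PostgreSQL.
csv_data = io.StringIO("a,1\nb,2\nc,3\n")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT, n INTEGER)")
inserted = insert_in_chunks(
    conn,
    "INSERT INTO t VALUES (?, ?)",  # with pg8000/psycopg2 this would be (%s, %s)
    csv.reader(csv_data),
    chunk_size=2,
)
print(inserted)  # 3
```

Chunked commits trade atomicity of the whole load for bounded memory, which is usually fine for a one-shot CSV import; with psycopg2 against a real PostgreSQL server, the COPY protocol is faster still, but the chunking idea above applies to pure-Python drivers like pg8000 too.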

