D'Arcy J.M. Cain wrote:
> On Tue, 04 Sep 2007 11:37:14 -0400
> Christopher Hilton <[EMAIL PROTECTED]> wrote:
>> I'm trying to find a place to search the list before I ask a question.
>> Is there a place to do this?
>
> Check the link at the bottom of each list message.
>
I see the archive of the list's postings, but I don't see any way to
search it short of manually downloading each month's archive and
running grep over it. I have no problem putting a ksh wrapper around
wget to do just that, but that seems just as rude as asking my question
without searching the list first.
Anywho, here's my question:
I'm using mod_python to accept an XLS file from a web client. The XLS
file has a bunch of data that I need to relate to some rows in a
PostgreSQL database. I need to "SELECT" the result of the relationship
and send it back out as a text/csv file.
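For the send-it-back-out half, the shape I have in mind is roughly the
following; the handler name, columns, and rows are placeholders, not
real code from my app:

import csv
from mod_python import apache

def handler(req):
    # Stand-in for the rows that will eventually come back from the SELECT.
    rows = [('cust_no', 'cust_name'), (1, 'alice'), (2, 'bob')]
    req.content_type = 'text/csv'
    # csv.writer only needs an object with a write() method, and the
    # mod_python request object provides one.
    writer = csv.writer(req)
    for row in rows:
        writer.writerow(row)
    return apache.OK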
This is a pretty straightforward problem. Use the csv module to handle
the input and possibly the output of csv data. Use pg or pgdb to handle
the database work, and use mod_python plus whatever bits I need from os
and the other modules to glue the whole thing together. The problem is
that I have to balance portability against performance. The way I see
it, pgdb, being a DB-API 2.0 compliant interface, is more portable than
pg, the classic PostgreSQL API. However, if I have to insert a lot of
data, then pgdb's executemany() method with an "INSERT" statement will
be slower than pg's "COPY FROM STDIN".
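To make the trade-off concrete, here is a rough sketch of loading the
same rows both ways. The table, columns, data, and connection arguments
are placeholders, and I'm assuming classic pg's inserttable(), which as
I understand it uses COPY under the hood:

import pg      # classic PyGreSQL interface
import pgdb    # DB-API 2.0 interface

# Placeholder data for a table foo(cust_no integer, cust_name text)
# that has exactly these two columns.
inputData = [(1, 'alice'), (2, 'bob')]

# Portable DB-API route: effectively one INSERT per row.
con = pgdb.connect(database='mydb')
cur = con.cursor()
cur.executemany(
    "INSERT INTO foo (cust_no, cust_name) VALUES (%s, %s)",
    inputData)
con.commit()

# Classic route: inserttable() pushes the whole list in a single COPY.
cnx = pg.connect(dbname='mydb')
cnx.inserttable('foo', inputData)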
I was digging through the implementation of the two modules, pg and
pgdb, hoping that pgdb was a "superclass" (sic) of pg. If it were, I
could use pgdb's class structure to extract a connection compatible
with the pg module's calling conventions. That would let me use the
nice shiny portable DB-API code for most of my work while still being
able to use the pg module for the one "COPY FROM" statement that I need
to do.
<breath />
All that being said, my question is: am I forced to use one API or the
other exclusively? Or is there a way either to instantiate a pgdb
connection from a previously connected pg one, or to extract the pg
connection from a previously instantiated pgdb one?
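To put it another way, the sort of thing I'm hoping is possible looks
like the sketch below. The "_cnx" attribute name is purely a guess at
whatever private handle pgdb might keep around; I haven't found
anything documented:

import pgdb

db = pgdb.connect(database='mydb')

# Hypothetical: pull the low-level classic connection back out of the
# DB-API wrapper.  "_cnx" is a guess, not a documented attribute.
raw = getattr(db, '_cnx', None)
if raw is None:
    raise RuntimeError("no obvious way to reach the classic connection")

# If this worked, the classic calls (query, putline, endcopy,
# inserttable) would now be usable alongside db.cursor().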
For more info, I pretty much need to run:
CREATE TEMPORARY TABLE foo (
    foo_id serial NOT NULL PRIMARY KEY,
    [customer_params, ...]
);

-- Load table foo from a list of Python tuples: either
-- [ do_insert(t) for t in inputData ] or a single COPY ... FROM STDIN.

SELECT *
FROM my_relation
INNER JOIN foo ON matching_customer_params
WHERE conditions;
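Put together, the version I'd like to be able to write looks something
like this. It leans on the same "_cnx" guess as above, and the column
names, join condition, and data are placeholders:

import csv
import sys
import pgdb

db = pgdb.connect(database='mydb')
cur = db.cursor()
cur.execute("CREATE TEMPORARY TABLE foo ("
            " foo_id serial NOT NULL PRIMARY KEY,"
            " cust_no integer, cust_name text)")

# Bulk-load through the classic interface, assuming the low-level
# connection really is reachable this way (unconfirmed).
inputData = [(1, 'alice'), (2, 'bob')]
raw = db._cnx
raw.query("COPY foo (cust_no, cust_name) FROM STDIN")
for cust_no, cust_name in inputData:
    raw.putline("%d\t%s\n" % (cust_no, cust_name))
raw.putline("\\.\n")
raw.endcopy()

# Back on the portable side for the join, streaming the result as CSV
# (sys.stdout here; the mod_python request object in the real handler).
cur.execute("SELECT m.* FROM my_relation m"
            " JOIN foo f ON f.cust_no = m.cust_no")
writer = csv.writer(sys.stdout)
for row in cur.fetchall():
    writer.writerow(row)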
Thanks for your time,
-- Chris
--
__o "All I was doing was trying to get home from work."
_`\<,_ -Rosa Parks
___(*)/_(*)___________________________________________________________
Christopher Sean Hilton <chris | at | vindaloo.com>
pgp key: D0957A2D/f5 30 0a e1 55 76 9b 1f 47 0b 07 e9 75 0e 14
_______________________________________________
PyGreSQL mailing list
[email protected]
http://mailman.vex.net/mailman/listinfo/pygresql