On Apr 21, 3:36 pm, Scott David Daniels <scott.dani...@acm.org> wrote:
> Philip Semanchuk wrote:
> > ... If you're doing a mass insert to populate a blank table it also often
> > helps to postpone index creation until after the table is populated....
>
> I forget the name of the SQL Server bulk loader, but for large loads, I
> used to populate a fresh table with the bulk data, then do UPDATEs and
> INSERTs to get the data spread out into the main tables.  You (the OP)
> might try a scheme like that.
>
> --Scott David Daniels
> scott.dani...@acm.org
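For anyone following along, the staging-table scheme Scott describes can be sketched like this. This is only an illustration of the pattern, not his actual code: sqlite3 stands in for SQL Server (where the bulk load itself would be done with bcp or BULK INSERT), and the table and column names are made up.

```python
import sqlite3

# Pattern: bulk-load raw rows into a fresh staging table, then spread
# them into the main table(s) with a single set-based INSERT ... SELECT.
# sqlite3 is used here only so the sketch is runnable; on SQL Server the
# load into staging would be bcp / BULK INSERT.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging (name TEXT, value INTEGER)")
cur.execute("CREATE TABLE main (id INTEGER PRIMARY KEY, name TEXT, value INTEGER)")

rows = [("a", 1), ("b", 2), ("c", 3)]          # stand-in for the bulk data
cur.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# One statement moves everything out of staging; indexes on main can be
# built after this step rather than row by row during the load.
cur.execute("INSERT INTO main (name, value) SELECT name, value FROM staging")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM main").fetchone()[0])
```

The point of the detour through staging is that the expensive per-row work happens against an unindexed throwaway table, and the main tables only see a few large set-based statements.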

Hmm... I think I'm going to move my question over to a SQL forum, because this is starting to feel like a SQL issue rather than a Python issue to me.

Three times now, after letting the system "rest" while I go off and do
other things, running my script completes in 10 seconds.  If I drop
the tables and start fresh immediately after that, it takes 35
seconds.  If I drop the tables and wait an hour before running the
script, it finishes in 10 seconds again.

That makes me think it's a SQL Server configuration or optimization
issue more than a Python issue.

Oh, and the times I listed above were totals from the start of
execution: the string.join() itself took only 0.047 seconds.  It was
taking 9 seconds to get my data from the COM object and format it,
but the join was quite fast.
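That split matches what you'd expect from a quick sketch like the one below. The row data here is synthetic (the original rows came from a COM object), and the SQL-values formatting is just an assumed example; the point is that building the fragments dominates while the single join is nearly free:

```python
import time

# Synthetic stand-in for rows fetched from the COM object.
rows = [("item%d" % i, i) for i in range(100000)]

start = time.time()
fragments = ["('%s', %d)" % row for row in rows]   # per-row formatting
format_time = time.time() - start

start = time.time()
values_clause = ",".join(fragments)                # one fast join at the end
join_time = time.time() - start

print(values_clause[:12])   # first formatted fragment
```

Accumulating with repeated `+=` instead of one final join is where string building usually gets expensive; a single join over a prebuilt list stays cheap even for large batches.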
--
http://mail.python.org/mailman/listinfo/python-list
