There is no commit because I'm only SELECTing the data from Oracle.  I'm
writing to ASCII files.
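The write side is just a fetch loop dumping delimited lines.  A minimal
sketch of that (the sample rows stand in for a real DBI fetch loop, and the
file name is made up; pipe-delimited output suits SQL*Loader downstream):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in rows; in the real job these would come from a DBI
# fetchrow_arrayref() loop against the Oracle SELECT.
my @rows = ( [ 1, 'alpha' ], [ 2, 'beta' ], [ 3, 'gamma' ] );

open my $out, '>', 'extract.dat' or die "open: $!";
for my $row (@rows) {
    # One pipe-delimited record per line.
    print {$out} join('|', @$row), "\n";
}
close $out or die "close: $!";
```

Swapping the sample array for the real fetch loop is the only change needed
for the actual extract.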

-----Original Message-----
From: Steve Sapovits [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 08, 2001 1:02 PM
To: Curt Russell Crandall; Pettit, Chris L
Cc: '[EMAIL PROTECTED]'
Subject: RE: Data Extract



We've also found that committing in blocks speeds up heavy
inserts with Perl/DBI/Oracle.  How big the blocks should be depends
on the job.  Most people do all or nothing: AutoCommit on, or no
commits and a single rollback or commit at the end.  Something in
between is usually better for most insert-heavy jobs.
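A minimal DBI sketch of that middle ground (the DSN, credentials, table
name, and get_next_row() are all made up for illustration; the commit
interval is something to tune per job):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- substitute your own.
my $dbh = DBI->connect('dbi:Oracle:orcl', 'scott', 'tiger',
                       { AutoCommit => 0, RaiseError => 1 });

# Prepare once, execute many times inside the loop.
my $sth = $dbh->prepare('INSERT INTO big_table (id, val) VALUES (?, ?)');

my $batch = 500;   # commit interval -- tune per job
my $count = 0;
while (my $row = get_next_row()) {   # get_next_row() is a stand-in
    $sth->execute(@$row);
    $dbh->commit unless ++$count % $batch;
}
$dbh->commit;      # flush the final partial batch
$dbh->disconnect;
```

This needs a live Oracle connection to run, so treat it as a sketch of the
batching pattern rather than a drop-in script.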

----
Steve Sapovits
Global Sports Interactive
Work Email: [EMAIL PROTECTED]
Home Email: [EMAIL PROTECTED]
Work Phone: 610-491-7087
Cell:       610-574-7706
Pager:      877-239-4003

> -----Original Message-----
> From: Curt Russell Crandall [SMTP:[EMAIL PROTECTED]]
> Sent: Friday, June 08, 2001 12:36 PM
> To:   Pettit, Chris L
> Cc:   '[EMAIL PROTECTED]'
> Subject:      RE: Data Extract
> 
> First of all, I've never used Oracle, so I'm not aware of any fancy tools
> that make a task like this easier.
> From the original post, asking about optimizing Perl to do this task, I
> inferred that speed might be an issue.  With Sybase, the difference
> between using Perl and the DBI versus using C and the OpenClient libraries
> is negligible for almost everything we do... but we aren't trying to
> select and insert 1/2 billion rows.  I recall overhearing a senior
> programmer at this shop remark that the difference between using C and
> Perl for DB apps amounts to less than a 10th of a millisecond per select
> or insert (I don't remember which of the two, or if there's really a
> speed difference between the two ops).
> Given your figures and the ones Michael gave, I would assume moving 1/2
> billion rows would take in the neighborhood of 7+ hours.  If you could
> shave 1x10^-5 seconds off from moving each row, you'd save about an hour
> and a half (5x10^8 rows x 10^-5 s/row = 5,000 s, roughly 83 minutes).
> Depending on how critical the time requirements are, and how often you'd
> need to do something like this, writing the code in C might be worth it,
> and I wouldn't think writing it in C would take a whole lot longer for an
> application like this (unless you know Perl very well and C not so well).
> Of course, if Oracle comes with a nice tool to simplify this, it would
> probably be faster, not just to write the app but also in terms of
> execution speed.
> 
> Now, if you want to optimize the Perl code, I'd just make sure to prepare
> any select/insert statement up front, and in your loop just execute the
> prepared statement.  I've also found that committing every 100-200
> transactions (in Sybase) seems to give me the best performance.
> 
> On Fri, 8 Jun 2001, Pettit, Chris L wrote:
> 
> > 
> > I've used Perl to prepare large input files for SQL*Loader: 1.9 million
> > records.  SQL*Loader is fast; I was able to load the 1.9 million records
> > in less than six minutes.
> > We had to do text extraction for the data.
> > DBI is absolutely awesome for that, IMHO.
> > clp
> > 
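> For the SQL*Loader route, the control file can stay tiny.  A sketch for a
> pipe-delimited extract, with made-up file, table, and column names:
> 
> ```
> LOAD DATA
> INFILE 'extract.dat'
> APPEND
> INTO TABLE big_table
> FIELDS TERMINATED BY '|'
> (id, val)
> ```
> 
> Run it with sqlldr and a userid; direct path loads can go faster still.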
