Oracle export (exp) creates a binary file that can only be loaded with the
import (imp) program.  The problem with imp is that it uses INSERT
statements to load the data, which is slow for large volumes.
 
Oracle's SQL*Loader loads from formatted ASCII files.  SQL*Loader has a
"direct path" option that writes data blocks directly, bypassing the SQL
engine.  This method is *VERY* fast.  Hence the need to extract the data
to a flat file first.
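A direct-path load might look roughly like this (a minimal sketch; the table name `big_table`, its columns, and the file names are hypothetical placeholders, not from the original thread):

```
-- load.ctl: hypothetical SQL*Loader control file
LOAD DATA
INFILE 'big_table.dat'
APPEND
INTO TABLE big_table
FIELDS TERMINATED BY ','
(id, name, created DATE 'YYYY-MM-DD')
```

You would then invoke it with something like `sqlldr userid=scott/tiger control=load.ctl direct=true`, where `direct=true` selects the direct-path engine instead of conventional INSERTs.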

-----Original Message-----
From: Matthew Tedder [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 08, 2001 11:28 AM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: Data Extract


 
You've got to be kidding; a big database like Oracle doesn't have a database
dump utility?  In PostgreSQL, you can dump the whole database's data and/or
dump all the SQL commands needed to rebuild and re-populate a database,
identically.  This is a very safe method of upgrading, or even of switching
to a different RDBMS with a little conversion work on the SQL (if and where
necessary).
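The PostgreSQL workflow described above looks roughly like this (a sketch; the database name `mydb` and the file name are placeholders):

```shell
# Dump schema plus data as plain SQL statements
pg_dump mydb > mydb.sql

# Rebuild an identical database from the dump
createdb mydb_restored
psql -d mydb_restored -f mydb.sql
```

Because the dump is ordinary SQL text, it can be edited or converted before restoring it into another RDBMS.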
 
--Matthew

>>> Curt Russell Crandall <[EMAIL PROTECTED]> 06/08/01 11:23AM >>>
For half a billion rows, I would seriously consider coding this in C
using whatever Oracle libraries are available for accessing the API.  That
would probably be your best choice in terms of speed.  You may also be
able to do a large portion of the work in a stored procedure.
Perl is awesome and DBI is great... but for 500,000,000 rows of data
it may not be the best alternative, depending on your speed requirements.

--Curt



