Jonah,

Thank you for the answer. Good to know about this EnterpriseDB feature.
I'll go ahead and use pgloader.

Regards,
Adonias Malosso

On Sat, Apr 26, 2008 at 10:14 PM, Jonah H. Harris <[EMAIL PROTECTED]> wrote:
> On Sat, Apr 26, 2008 at 9:25 AM, Adonias Malosso <[EMAIL PROTECTED]> wrote:
> > I'd like to know the best practice for loading a 70 million row,
> > 101 column table from Oracle to PostgreSQL.
>
> The fastest and easiest method would be to dump the data from Oracle
> into CSV/delimited format using something like ociuldr
> (http://www.anysql.net/en/ociuldr.html) and load it back into PG using
> pg_bulkload (which is a helluva lot faster than COPY). Of course, you
> could try other things as well... such as setting up generic
> connectivity to PG and inserting the data into a PG table over the
> database link.
>
> Similarly, while I hate to see shameless self-plugs in the community,
> the *fastest* method you could use is dblink_ora_copy, contained in
> EnterpriseDB's PG+ Advanced Server; it uses an optimized OCI
> connection to COPY the data directly from Oracle into Postgres, which
> also saves you the intermediate step of dumping the data.
>
> --
> Jonah H. Harris, Sr. Software Architect | phone: 732.331.1324
> EnterpriseDB Corporation                | fax: 732.331.1301
> 499 Thornall Street, 2nd Floor          | [EMAIL PROTECTED]
> Edison, NJ 08837                        | http://www.enterprisedb.com/
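Whichever loader is used for the dump-then-load pipeline described above (ociuldr on the Oracle side, pg_bulkload or COPY on the Postgres side), the two ends must agree on delimiter, quoting, and NULL representation, or a 70-million-row load will fail partway through. As a rough illustration only (this helper is not from the thread; the function name and the empty-string-means-NULL convention are assumptions), a small Python filter can rewrite a CSV dump into PostgreSQL's COPY text format: tab-delimited fields, `\N` for NULL, with backslash, tab, and newline characters escaped.

```python
import csv
import io

def csv_to_copy_text(csv_text, null_marker=""):
    """Rewrite CSV rows (e.g. as dumped from Oracle) into PostgreSQL
    COPY text format: tab-delimited, \\N for NULL, with backslash,
    tab, newline, and carriage return escaped per the COPY docs.

    Assumes the dump represents NULL as an empty field (null_marker)."""
    out = io.StringIO()
    for row in csv.reader(io.StringIO(csv_text)):
        fields = []
        for value in row:
            if value == null_marker:
                fields.append(r"\N")  # COPY's default NULL marker
            else:
                # Escape characters that are significant in COPY text format.
                fields.append(
                    value.replace("\\", "\\\\")
                         .replace("\t", "\\t")
                         .replace("\n", "\\n")
                         .replace("\r", "\\r")
                )
        out.write("\t".join(fields) + "\n")
    return out.getvalue()

# Example: second field of row 2 contains an embedded tab via CSV quoting.
print(csv_to_copy_text('1,SMITH,\n2,"JON\tES",CLERK\n'))
```

The resulting stream could be piped straight into `psql -c "\copy mytable from stdin"`. For the row counts discussed here, though, a streaming approach (reading the dump in chunks rather than as one string) would be needed, and pg_bulkload's own CSV mode avoids the conversion entirely.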