Re: [PERFORM] Best practice to load a huge table from ORACLE to PG

2008-04-26 Thread Potluri Srikanth
 But how do we link the Oracle trigger to the Postgres trigger?
 I mean: the Oracle trigger will take note of what has been changed,
 but then how do we pass those changes to the Postgres trigger?
 Can you suggest any logic or algorithm?
 Regards, 
 Srikanth k Potluri 
 +63 9177444783(philippines) 
 On Sat 26/04/08 8:40 PM, Joshua D. Drake [EMAIL PROTECTED] sent:
 Adonias Malosso wrote: 
  Hi All, 
   
  I'd like to know the best practice to LOAD a 70 million row, 101
  column table from ORACLE to PGSQL.
 
  The current approach is to dump the data to CSV and then COPY it to
  Postgresql.
 
  Does anyone have a better idea?
 Write a Java trigger in Oracle that notes when a row has been
 added/deleted/updated and does the exact same thing in PostgreSQL.
 Joshua D. Drake 
   
   
  Regards 
  Adonias Malosso 
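
A minimal sketch of the kind of Java trigger body Joshua describes: a Java stored procedure loaded into Oracle (e.g. with loadjava) and invoked from a row-level AFTER trigger through a PL/SQL call specification, replaying each change into PostgreSQL over JDBC. The connection details, table name and columns below are hypothetical, not taken from the thread.

// Sketch only: a Java stored procedure an Oracle row-level trigger could call
// to replay the same change on the PostgreSQL side. Requires the PostgreSQL
// JDBC driver on the classpath; all names and credentials are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PgMirror {

    private static final String PG_URL  = "jdbc:postgresql://pghost:5432/targetdb";
    private static final String PG_USER = "replicator";
    private static final String PG_PASS = "secret";

    /**
     * Replays a single row change in PostgreSQL.
     * op is "I", "U" or "D"; id and payload stand in for the values the
     * Oracle trigger would pass from :NEW / :OLD.
     */
    public static void mirrorRow(String op, long id, String payload) throws SQLException {
        try (Connection pg = DriverManager.getConnection(PG_URL, PG_USER, PG_PASS)) {
            String sql;
            switch (op) {
                case "I": sql = "INSERT INTO big_table (id, payload) VALUES (?, ?)"; break;
                case "U": sql = "UPDATE big_table SET payload = ? WHERE id = ?";     break;
                case "D": sql = "DELETE FROM big_table WHERE id = ?";                break;
                default:  throw new IllegalArgumentException("unknown op " + op);
            }
            try (PreparedStatement ps = pg.prepareStatement(sql)) {
                if ("I".equals(op)) { ps.setLong(1, id); ps.setString(2, payload); }
                else if ("U".equals(op)) { ps.setString(1, payload); ps.setLong(2, id); }
                else { ps.setLong(1, id); }
                ps.executeUpdate();
            }
        }
    }
}

Opening a fresh connection per row, as above, keeps the sketch short but would be slow in practice; a pooled or cached connection and batched statements would be the obvious refinement.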




[PERFORM] bulk data loading

2008-04-07 Thread Potluri Srikanth

Hi all,

I need to do a bulk data load of around 704 GB (log file size) at present, within an 8-hour window (1 am - 9 am). The data file size may increase 3 to 5 times in the future.

Using COPY, it takes 96 hrs to finish the task.
What is the best way to do it?

HARDWARE: SUN THUMPER/ RAID10
OS : SOLARIS 10.
DB: Greenplum/Postgres


Regards,

Srikanth k Potluri

+63 9177444783(philippines)
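
One common way to cut plain COPY time, sketched below under the assumption that the input can be pre-split into chunk files and that the target accepts multiple concurrent COPY sessions: stream several chunks in parallel through the PostgreSQL JDBC driver's CopyManager. Host, credentials, table and file names are made up for illustration; on Greenplum specifically, parallel external tables served by gpfdist are the more usual route for loads of this size.

// Sketch of parallel loading with the PostgreSQL JDBC driver's COPY API.
// Each worker thread streams one pre-split CSV chunk through its own
// connection. All connection and table details are placeholders.
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class ParallelCopyLoader {

    private static final String URL = "jdbc:postgresql://dbhost:5432/warehouse";
    private static final String COPY_SQL = "COPY big_table FROM STDIN WITH CSV";

    public static void main(String[] args) throws Exception {
        // args: one pre-split chunk file per worker, e.g. chunk_00.csv chunk_01.csv ...
        ExecutorService pool = Executors.newFixedThreadPool(args.length);
        List<Future<Long>> results = new ArrayList<>();

        for (String chunk : args) {
            results.add(pool.submit(() -> {
                try (Connection conn = DriverManager.getConnection(URL, "loader", "secret");
                     FileReader reader = new FileReader(chunk)) {
                    CopyManager copier = conn.unwrap(PGConnection.class).getCopyAPI();
                    // copyIn streams the chunk over the COPY protocol and
                    // returns the number of rows written.
                    return copier.copyIn(COPY_SQL, reader);
                }
            }));
        }

        long total = 0;
        for (Future<Long> f : results) {
            total += f.get();   // propagates any failure from a worker
        }
        pool.shutdown();
        System.out.println("rows loaded: " + total);
    }
}

Run it with the chunk files as arguments (java ParallelCopyLoader chunk_00.csv chunk_01.csv ...); the thread count simply matches the number of chunks passed in. Dropping or deferring indexes and constraints on the target table during the load is the other usual lever.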