Re: [PERFORM] How to insert a bulk of data with unique-violations very fast

2010-06-07 Thread Torsten Zühlsdorff
Pierre C wrote: Since you have lots of data, you can use parallel loading. Split your data into several files and then do: CREATE TEMPORARY TABLE loader1 ( ... ); COPY loader1 FROM ...; Use a TEMPORARY TABLE for this: you don't need crash recovery, since if something blows up you can simply COPY it again.
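
A minimal sketch of that parallel-loading step, one session per chunk; the file path and the LIKE-based column definition are illustrative assumptions, not from the original post:

  -- run one of these in each parallel session (loader1, loader2, ...)
  -- target_table is a hypothetical destination table with the final schema
  CREATE TEMPORARY TABLE loader1 (LIKE target_table);
  -- load this session's chunk file into the temp table
  COPY loader1 FROM '/path/to/chunk1.csv' WITH CSV;

Since the loader tables are TEMPORARY, they are unlogged and private to each session, which is what makes the crash-recovery argument above work: a failed load costs nothing but a re-run of the COPY.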

Re: [PERFORM] How to insert a bulk of data with unique-violations very fast

2010-06-07 Thread Pierre C
Within the data to import, most rows have 20 to 50 duplicates; sometimes many more, sometimes fewer. In that case (the source data has lots of redundancy), after importing the data chunks in parallel, you can run a first pass of de-duplication on the chunks, also in parallel, something like :
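
The digest truncates the post before its example; a minimal sketch of such a per-chunk pass, assuming the loader1 temp table from the earlier message (the name dedup1 is hypothetical):

  -- run in each parallel session: collapse the chunk's internal duplicates
  CREATE TEMPORARY TABLE dedup1 AS
      SELECT DISTINCT * FROM loader1;

With 20-50 duplicates per row, this shrinks each chunk by an order of magnitude before the final merge, so the last step (an INSERT ... SELECT into the destination that skips keys already present, e.g. with WHERE NOT EXISTS) touches far fewer rows and holds its locks for much less time.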