Seems similar to the pgloader project on pgfoundry.org.
It is similar and good, but I regard that as a workaround rather than
the way forward.
Yes, your way would be rad :)
If you've ever loaded 100 million rows, you'll know just how annoying it
is to find that you have a duplicate row somewhere in there. Experience
shows that there is always one, whatever oath the analyst swears
beforehand.
It's hard to find out which row is the duplicate, plus you've just
screwed your load.
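A common workaround for the "which row is the duplicate?" problem is to load into a staging table that has no unique constraint, then ask which key values occur more than once. A minimal sketch of that query, using an in-memory SQLite database in place of PostgreSQL (the GROUP BY ... HAVING form is the same in both); the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Staging table deliberately has no PRIMARY KEY, so the load cannot abort.
conn.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")
rows = [(1, "a"), (2, "b"), (2, "b2"), (3, "c")]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# Which key values appear more than once?
dups = conn.execute(
    "SELECT id, COUNT(*) FROM staging GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
print(dups)  # [(2, 2)]
```

The cost of this workaround is a second full pass over the data plus the disk space for the staging copy, which is what makes a built-in error-routing COPY attractive for 100-million-row loads.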
Simon Riggs [EMAIL PROTECTED] writes:
What I'd like to do is add an ERRORTABLE clause to COPY. The main
problem is how we detect a duplicate row violation, yet prevent it from
aborting the transaction.
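The ERRORTABLE clause is only a proposal here, but its intended behaviour can be emulated client-side: insert row by row and divert any row that violates the key into an error table instead of failing the load. A sketch with hypothetical table names, using SQLite for illustration (note that SQLite conveniently does not abort the transaction on a failed statement; in PostgreSQL each failed INSERT would poison the transaction, which is exactly the problem the proposal has to solve, and per-row savepoints are the usual client-side answer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("CREATE TABLE errors (id INTEGER, payload TEXT, reason TEXT)")

for row in [(1, "a"), (2, "b"), (1, "dup"), (3, "c")]:
    try:
        conn.execute("INSERT INTO target VALUES (?, ?)", row)
    except sqlite3.IntegrityError as exc:
        # Duplicate key: record the offending row and its reason, keep going.
        conn.execute("INSERT INTO errors VALUES (?, ?, ?)", (*row, str(exc)))

loaded = conn.execute("SELECT id FROM target ORDER BY id").fetchall()
bad = conn.execute("SELECT id, payload FROM errors").fetchall()
print(loaded)  # [(1,), (2,), (3,)]
print(bad)     # [(1, 'dup')]
```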
If this only solves the problem of duplicate keys, and not any other
kind of COPY error,
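The objection is fair: duplicate keys are not the only way a COPY can fail. A general error table would also have to catch malformed input, such as a wrong field count or an unparsable value. A small client-side sketch of that broader classification, with hypothetical field layout (integer id, text payload, tab-separated):

```python
def load_lines(lines):
    """Split tab-separated lines, routing malformed rows to an error list."""
    good, errors = [], []
    for n, line in enumerate(lines, 1):
        fields = line.rstrip("\n").split("\t")
        if len(fields) != 2:
            errors.append((n, line, "wrong field count"))
            continue
        try:
            good.append((int(fields[0]), fields[1]))
        except ValueError:
            errors.append((n, line, "bad integer"))
    return good, errors

good, errors = load_lines(["1\ta", "x\tb", "3\tc\textra", "4\td"])
print(good)  # [(1, 'a'), (4, 'd')]
print([e[2] for e in errors])  # ['bad integer', 'wrong field count']
```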