On Tue, Sep 30, 2008 at 5:16 AM, Glenn Gillen <[EMAIL PROTECTED]> wrote:
> Hey all,
>
> I've got a table with a unique constraint across a few fields which I
> need to regularly import a batch of data into. Is there a way to do it
> with COPY without getting conflicts on the unique constraint? I have no
> way of being certain that some of the data I'm trying to load isn't in
> the table already.
>
> Ideally I'd like it to operate like MySQL's on_duplicate_key_update
> option, but for now I'll settle for just ignoring existing rows and
> proceeding with everything else.

I ran into a similar problem. I'm using these merge_by_key functions:

http://pgfoundry.org/projects/mbk

Here's a quick example...

CREATE TEMP TABLE foo (LIKE dst INCLUDING DEFAULTS);

COPY foo (c1, c2) FROM STDIN;
(your copy data here)
\.

SELECT * FROM merge_by_key(
        'public', -- table schema
        'dst', -- table name
        'mnew.c2 < mold.c2', -- merge condition
        'SELECT c1, c2 FROM foo' -- source query (new rows from the staging table)
);

Disclaimer: The author is a friend of mine. :-)
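
If all you need for now is to skip the rows that already exist (rather
than merge), a plain INSERT ... SELECT from the staging table also works,
no extra modules required. A minimal sketch along the lines of the example
above; it assumes the unique constraint on dst covers (c1, c2), so adjust
the key columns to match yours:

BEGIN;

-- Stage the incoming batch; LIKE copies dst's column definitions.
CREATE TEMP TABLE staging (LIKE dst INCLUDING DEFAULTS);

COPY staging (c1, c2) FROM STDIN;
(your copy data here)
\.

-- Insert only rows whose key isn't already present in dst.
INSERT INTO dst (c1, c2)
SELECT s.c1, s.c2
FROM staging s
WHERE NOT EXISTS (
    SELECT 1
    FROM dst d
    WHERE d.c1 = s.c1
      AND d.c2 = s.c2
);

COMMIT;

Note this doesn't protect against another session loading the same keys
concurrently; if that can happen you'd still want to lock the table or be
prepared to retry on a unique violation.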

