On Tue, 10 Apr 2001, Joe Johnson wrote:

> I have a table with over 1,000,000 records in it containing names and phone
> numbers, and one of the indexes on the table is a unique index on the phone
> number.  I am trying to copy about 100,000 more records into the table from
> a text file, but I get an error because of duplicate phone numbers in the
> text file, which kills the COPY command without copying anything to the
> table.  Is there some way that I can get Postgres to copy the records from
> the file and just skip records that would duplicate values in the unique
> index?  I found that using PHP scripts to do inserts for a file of this
> size takes MUCH longer than I'd like, so I'd like to avoid having to do it
> that way if I can.  Any help is appreciated.  Thanks!

There are a few options; one common approach is sketched below.

This was discussed yesterday in the thread 'problem with copy command'.
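
For example (just a sketch; the table, column, and file names here are
made up), you can COPY into a staging table that has no unique index,
then INSERT only the rows that don't collide with an existing phone
number:

BEGIN;

-- Staging table with the same layout but no constraints, so COPY
-- can't fail on a duplicate phone number.
CREATE TEMP TABLE phones_load (
    name  text,
    phone text
);

COPY phones_load FROM '/tmp/newrecords.txt';

-- Insert only rows whose phone number isn't already in the real
-- table.  DISTINCT ON also drops duplicates within the file itself.
INSERT INTO phones (name, phone)
SELECT DISTINCT ON (phone) name, phone
FROM phones_load l
WHERE NOT EXISTS (
    SELECT 1 FROM phones p WHERE p.phone = l.phone
);

COMMIT;

(If you aren't a superuser, use psql's \copy instead, since COPY ...
FROM 'file' reads the file on the server.)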

-- 
Joel Burton   <[EMAIL PROTECTED]>
Director of Information Systems, Support Center of Washington

