Marc:

You can delete duplicate rows based on all the columns in the table or only
a few:

DELETE DUPLICATES FROM your_table_name USING your_col_1, your_col_2, ...

With this approach the first duplicate row is retained and the following
duplicate rows are deleted. In your case, the original row would be kept and
the duplicate rows loaded afterward would be deleted.

You can also use the APPEND method described by others, but you still have
to load the data into a temporary table first.

Javier,
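[Editor's note: R:BASE handles this with its built-in DELETE DUPLICATES command. For readers on a generic SQL engine, the same keep-the-first-row behavior can be sketched with Python's sqlite3 module; the table and column names below are invented for illustration, not taken from the thread.]

```python
import sqlite3

# Toy table standing in for the real one (names are made up)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE items (item TEXT, descr TEXT)")
cur.executemany("INSERT INTO items VALUES (?, ?)",
                [("bolt", "M6"), ("bolt", "M6"), ("nut", "M6")])

# Keep the first occurrence (lowest rowid) in each duplicate group,
# matching on the chosen columns -- the same effect as
# DELETE DUPLICATES FROM items USING item, descr
cur.execute("""
    DELETE FROM items
    WHERE rowid NOT IN (SELECT MIN(rowid) FROM items GROUP BY item, descr)
""")
conn.commit()

remaining = cur.execute("SELECT item, descr FROM items ORDER BY rowid").fetchall()
print(remaining)  # [('bolt', 'M6'), ('nut', 'M6')]
```

Because the earliest rowid in each group survives, rows that were already in the table win over duplicates loaded later, which matches the behavior Javier describes.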
Javier Valencia, PE
President
Valencia Technology Group, L.L.C.
14315 S. Twilight Ln, Suite #14
Olathe, Kansas 66062-4578
Office (913)829-0888
Fax (913)649-2904
Cell (913)915-3137
================================================
Attention:
The information contained in this message and/or attachments is intended
only for the person or entity to which it is addressed and may contain
confidential and/or privileged material. Any review, retransmission,
dissemination or other use of, or taking of any action in reliance upon,
this information by persons or entities other than the intended recipient is
prohibited. If you received this in error, please contact the sender and
delete the material from all systems and destroy all copies.
======================================================

-----Original Message-----
From: [email protected] [mailto:[EMAIL PROTECTED] Behalf Of Marc
Sent: Wednesday, June 29, 2005 11:43 AM
To: RBG7-L Mailing List
Subject: [RBG7-L] - RE: Insert

Thanks Javier

I will try that and see which is the easiest option for our user. The only
problem I can see is that they may have the same item name but a different
description, and this may not find it.

Marc

----- Original Message -----
From: "Javier Valencia" <[EMAIL PROTECTED]>
To: "RBG7-L Mailing List" <[email protected]>
Sent: Wednesday, June 29, 2005 9:06 AM
Subject: [RBG7-L] - RE: Insert

> Marc:
>
> How are you loading the data? If you are using LOAD, then you can use the
> NOCHECK option:
>
>     CHECK
>     NOCHECK
>
> CHECK turns on rule checking. When rule checking is on, R:BASE checks
> input against data validation rules. NOCHECK turns off rule checking.
> CHECK and NOCHECK override the current setting of the SET RULES command.
> The default is CHECK.
>
> If you use this option, all the data will be loaded, and then you can just:
>
>     DELETE DUPLICATES FROM ...
>
> Javier,
>
> -----Original Message-----
> From: [email protected] [mailto:[EMAIL PROTECTED] Behalf Of Marc
> Sent: Wednesday, June 29, 2005 8:59 AM
> To: RBG7-L Mailing List
> Subject: [RBG7-L] - RE: Insert
>
> Hi Javier
>
> It is not the error message that is the problem;
> it seems that the INSERT command stops once it
> hits a row of data that violates the rule or PK.
> Or am I missing something else?
>
> Marc
>
> ----- Original Message -----
> From: "Javier Valencia" <[EMAIL PROTECTED]>
> To: "RBG7-L Mailing List" <[email protected]>
> Sent: Wednesday, June 29, 2005 8:45 AM
> Subject: [RBG7-L] - RE: Insert
>
> > Marc:
> >
> > I believe that if you:
> >
> >     SET MESSAGES OFF
> >     SET ERROR MESSAGES OFF
> >
> > the load will proceed without displaying the error codes.
> >
> > Javier,
> >
> > -----Original Message-----
> > From: [email protected] [mailto:[EMAIL PROTECTED] Behalf Of Marc
> > Sent: Wednesday, June 29, 2005 8:39 AM
> > To: RBG7-L Mailing List
> > Subject: [RBG7-L] - Insert
> >
> > Hi
> >
> > I want to insert several hundred rows of data into
> > a table for my users. The problem is some of them
> > already have some of these rows of data, and when
> > the INSERT command tries to add a row that already
> > exists, I get an error and the process stops.
> >
> > * There is a unique rule on the main column.
> >
> > So, is there a better way to "insert" these rows
> > into their table and have it ignore the duplicate
> > rows but keep processing the rest of the rows
> > of data?
> >
> > Thanks
> > Marc
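[Editor's note: Marc's underlying problem, inserting a batch of rows while skipping the ones that violate a unique rule, also has a generic analogue outside R:BASE. A sketch using Python's sqlite3 module, with invented table and column names: OR IGNORE skips the conflicting rows and keeps processing the rest of the batch instead of stopping at the first duplicate.]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# "item" carries the unique rule, like the main column in Marc's table
cur.execute("CREATE TABLE items (item TEXT PRIMARY KEY, descr TEXT)")
cur.execute("INSERT INTO items VALUES ('bolt', 'M6')")  # row the user already has

# OR IGNORE drops any row that would violate the unique constraint and
# continues with the remaining rows, so one duplicate no longer aborts
# the whole insert
new_rows = [("bolt", "M6 hex"), ("nut", "M6"), ("washer", "M6")]
cur.executemany("INSERT OR IGNORE INTO items VALUES (?, ?)", new_rows)
conn.commit()

result = cur.execute("SELECT item, descr FROM items ORDER BY item").fetchall()
print(result)  # [('bolt', 'M6'), ('nut', 'M6'), ('washer', 'M6')]
```

Note that the pre-existing ('bolt', 'M6') row is kept and the incoming duplicate is silently skipped, so no cleanup pass is needed afterward.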
