SELECT DISTINCT wouldn't work in this case because you wouldn't know whether the new data in each row was unique until you compared it with the old data.
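For what it's worth, one way to skip the row-by-row comparison in ColdFusion entirely is to bulk-load the datafile into a staging table and let the database do the matching in a single set-based statement. This is just a sketch under assumptions: main_table, import_staging, and col1 through col15 are placeholder names, since the real schema wasn't posted.

-- Sketch only: table/column names are placeholders for the real 15-column schema.
-- Assumes the new datafile has already been bulk-loaded into import_staging,
-- which has the same columns in the same order as main_table.
INSERT INTO main_table
SELECT s.*
FROM import_staging s
WHERE NOT EXISTS (
    SELECT 1
    FROM main_table m
    WHERE m.col1  = s.col1
      AND m.col2  = s.col2
      AND m.col3  = s.col3
      -- ... repeat the equality test for each remaining column ...
      AND m.col15 = s.col15
);
-- Caveat: plain "=" never matches NULLs, so nullable columns would need
-- "(m.colN = s.colN OR (m.colN IS NULL AND s.colN IS NULL))" instead.

Because only non-matching rows are inserted and existing rows are never touched, this also sidesteps the concern about keeping an original row (and its SEQ key or part number) intact. An index covering the most selective columns would help the subquery on an 85,000-row table.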
---mark

--- Stephen Garrett <[EMAIL PROTECTED]> wrote:
> I think this is a good idea, but doesn't it depend upon whether
> you need to keep an original row intact, as it probably has
> a numeric SEQ key assigned to it?
>
> How could you do that with the distinct method, and keep the original
> row in place? (e.g. it had an assigned part number or some such)
>
> Steve
>
> At 03:05 PM 3/15/2002 -0500, Cottell, Matthew wrote:
> >Couldn't you insert all the records in one fell swoop,
> >then perform a SELECT DISTINCT on all the rows?
> >Insert those records into a new table, and voila, you're done.
> >
> >As I understand what you're saying, either the row is a complete
> >match, or it's a completely unique record. There are no instances
> >of some of the data being the same and having to choose which
> >record to include. Or am I missing something?
> >
> >Matt
> >
> >> -----Original Message-----
> >> From: Mark Warrick [SMTP:[EMAIL PROTECTED]]
> >> Sent: Friday, March 15, 2002 2:45 PM
> >> To: SQL
> >> Subject: Re: need help with large database update
> >>
> >> I have to compare each row of new data with existing
> >> data because there are no primary keys. So I can't
> >> just append new data into the table because then I
> >> might have duplicate data in the table.
> >>
> >> Just to be clear, "new" data doesn't necessarily mean
> >> that the data doesn't already exist in the table. It
> >> just means it's a new datafile.
> >>
> >> ---mark
> >>
> >> --- Douglas Brown <[EMAIL PROTECTED]> wrote:
> >> > Question... Why are you comparing the data before
> >> > updating? If the data that you are updating with is
> >> > the same data, it would not matter, and also if there
> >> > is new data then that would be adjusted accordingly.
> >> > Maybe I'm confused.
> >> >
> >> > "Success is a journey, not a destination!!"
> >> >
> >> > Doug Brown
> >> >
> >> > ----- Original Message -----
> >> > From: "Kelly Matthews" <[EMAIL PROTECTED]>
> >> > To: "SQL" <[EMAIL PROTECTED]>
> >> > Sent: Friday, March 15, 2002 11:05 AM
> >> > Subject: Re: need help with large database update
> >> >
> >> > > Why not do it via a stored procedure... much quicker...
> >> > >
> >> > > ---------- Original Message ----------------------------------
> >> > > From: Mark Warrick <[EMAIL PROTECTED]>
> >> > > Reply-To: [EMAIL PROTECTED]
> >> > > Date: Fri, 15 Mar 2002 11:02:51 -0800 (PST)
> >> > >
> >> > > >Hello All,
> >> > > >
> >> > > >I have a database of about 85,000 records which has 15
> >> > > >columns. I need to update this database with a datafile
> >> > > >that contains the same schema and just as many records.
> >> > > >
> >> > > >For each row that is going to be imported, I have to
> >> > > >compare all 15 columns of data for each row against
> >> > > >all 15 columns of each row in the database to see if
> >> > > >there's a match, and if not, then import the new data.
> >> > > >
> >> > > >Every query I've written with ColdFusion to do this
> >> > > >seems to kill the server. Even comparing one row of
> >> > > >data seems to put extreme load on the server.
> >> > > >
> >> > > >Anyone got a clue as to how I might accomplish this
> >> > > >goal? I may be willing to pay somebody to do this.
> >> > > >
> >> > > >---mark
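For reference, the "insert everything, then SELECT DISTINCT into a new table" idea from the quoted thread above would look roughly like this. Again, main_table, import_staging, and main_table_deduped are placeholder names, and the SELECT ... INTO form is SQL Server / Sybase syntax (other databases would use CREATE TABLE ... AS SELECT).

-- Rough sketch of the load-then-dedupe approach discussed above.
-- Note it produces a deduplicated copy rather than preserving the
-- original rows, which is exactly the issue if a SEQ key or part
-- number on the existing rows has to stay intact.

-- 1. Append the new datafile's rows to the existing table.
INSERT INTO main_table
SELECT * FROM import_staging;

-- 2. Write the distinct rows out to a brand-new table.
SELECT DISTINCT *
INTO   main_table_deduped
FROM   main_table;

Either this or the NOT EXISTS insert could be wrapped in a stored procedure, per Kelly's suggestion, so the whole job runs server-side instead of looping over rows in ColdFusion.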
