That is true. Before loading into the database, we need to do some
validations: if the input file contains more than 5% duplicates, we
don't load it into the database.

That is why I am finding duplicates first.
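
Roughly, that check might look like this (a minimal sketch; the file
name and the whole-record key are placeholders, and this is the
in-memory hash approach that can hit memory limits on very large
inputs, per the thread subject):

#!/usr/bin/perl
use strict;
use warnings;

# Count duplicate records and refuse the load if they exceed 5%.
# 'input.dat' and the whole-line key are placeholders for illustration.
my %seen;
my ($total, $dups) = (0, 0);

open my $fh, '<', 'input.dat' or die "Cannot open input.dat: $!";
while (my $line = <$fh>) {
    chomp $line;
    $total++;
    $dups++ if $seen{$line}++;    # seen before => duplicate
}
close $fh;

my $pct = $total ? 100 * $dups / $total : 0;
if ($pct > 5) {
    die sprintf "Not loading: %.1f%% of %d records are duplicates\n",
        $pct, $total;
}
print "OK to load ($dups duplicates in $total records)\n";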

Thanx
-Madhu

--- "Beau E. Cox" <[EMAIL PROTECTED]> wrote:
> Thanks Peter - good point
> 
> > -----Original Message-----
> > From: Peter Scott [mailto:[EMAIL PROTECTED]
> > Sent: Sunday, February 23, 2003 5:17 AM
> > To: [EMAIL PROTECTED]
> > Subject: RE: Out of memory while finding duplicate rows
> > 
> > 
> > In article <[EMAIL PROTECTED]>,
> > [EMAIL PROTECTED] (Beau E. Cox) writes:
> > >Hi -
> > >
> > >Wait! If you are going to load the data into a database anyway,
> > >why not use the existing database (or the one being created) to
> > >remove duplicates? You don't even have to have an index on the
> > >column you are making unique (but it would be _much_ faster).
> > >Just select on your key, and, if found, reject the datum as a
> > >duplicate. You really shouldn't have to go to any draconian
> > >measures to find duplicates!
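
In DBI terms, that select-then-insert idea might look roughly like
this (the DSN, credentials, and the my_table(id, payload) schema are
made-up placeholders; the real table isn't shown in the thread):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# "Select first, insert if absent" sketch. DSN, credentials, and the
# my_table(id, payload) schema are placeholders, not from the thread.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

my $check  = $dbh->prepare('SELECT 1 FROM my_table WHERE id = ?');
my $insert = $dbh->prepare('INSERT INTO my_table (id, payload) VALUES (?, ?)');

while (my $line = <STDIN>) {
    chomp $line;
    my ($id, $payload) = split /\t/, $line, 2;

    $check->execute($id);
    my ($exists) = $check->fetchrow_array;
    $check->finish;

    if ($exists) {
        warn "duplicate key '$id', rejecting record\n";
        next;
    }
    $insert->execute($id, $payload);
}

$dbh->disconnect;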
> > 
> > No need even to do that.  Just set a primary key constraint on
> > the database table (like all good tables should have anyway)
> > and you're done.  (Or if the "duplicate" criterion involves
> > some other column, put a UNIQUE constraint on it.)  Then all
> > inserts of duplicate records will fail automatically.  Just
> > make sure that RaiseError is set to false in the DBI connect.
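
A rough sketch of that constraint-based approach, assuming my_table
already has a PRIMARY KEY or UNIQUE constraint on id (DSN,
credentials, and names are again placeholders):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Constraint-based sketch: with RaiseError off, an INSERT that violates
# the PRIMARY KEY / UNIQUE constraint simply returns false, so duplicate
# rows can be counted and skipped instead of aborting the load.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { RaiseError => 0, PrintError => 0, AutoCommit => 1 });

my $insert = $dbh->prepare('INSERT INTO my_table (id, payload) VALUES (?, ?)');

my $rejected = 0;
while (my $line = <STDIN>) {
    chomp $line;
    my ($id, $payload) = split /\t/, $line, 2;

    unless ($insert->execute($id, $payload)) {
        $rejected++;                         # most likely a duplicate key
        warn "insert failed for '$id': ", $dbh->errstr, "\n";
    }
}

print "rejected $rejected row(s)\n";
$dbh->disconnect;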
> > 
> > -- 
> > Peter Scott
> > http://www.perldebugged.com
> 
> I was speaking in general terms - I have no idea what the
> structure of his target db is, so my manual way covers all
> bases... :)
> 
> Aloha => Beau;
> 

