-----Original Message-----
From: Ade Olonoh [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, 01 March 2006 12:19 p.m.
To: Miguel Guirao
Cc: php-db@lists.php.net
Subject: Re: [PHP-DB] Duplicate rows
Assuming you're using MySQL, instead of using INSERT INTO, you can use
REPLACE INTO instead.
err columns.. sorry..
On Wednesday 01 March 2006 10:45 am, Micah Stevens wrote:
> Ahh, good point, yes, keep in mind you may have some index rows..
>
> On Wednesday 01 March 2006 10:18 am, [EMAIL PROTECTED] wrote:
> > Haha.. oh yeah.. DISTINCT works too.. in this case you'd get a list of
> > all
Ahh, good point, yes, keep in mind you may have some index rows..
On Wednesday 01 March 2006 10:18 am, [EMAIL PROTECTED] wrote:
> Haha.. oh yeah.. DISTINCT works too.. in this case you'd get a list of all
> totally 100% unique records.
>
> If you had an auto_increment column though, you'd want to exclude it from
> the list.
Haha.. oh yeah.. DISTINCT works too.. in this case you'd get a list of all
totally 100% unique records.
If you had an auto_increment column though, you'd want to exclude it from the
list.
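For example (a sketch; the table and column names here are invented),
if `id` is the auto_increment column:

-- select DISTINCT over the data columns only; including the
-- auto_increment id would make every row look unique
SELECT DISTINCT name, price, description FROM products;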
-TG
= = = Original message = = =
SELECT DISTINCT * FROM `tablename`
On Wednesday 01 March 2006 7:24 am, Miguel Guirao wrote:
Assuming you're using MySQL, instead of using INSERT INTO, you can use
REPLACE INTO instead. If you have unique keys on that table, the new
record will overwrite existing records with the same unique keys instead
of creating a new one.
http://dev.mysql.com/doc/refman/5.0/en/replace.html
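For example (just a sketch; the table and column names are invented),
with a UNIQUE key on `sku`:

CREATE TABLE products (
  sku VARCHAR(32) NOT NULL,
  name VARCHAR(100),
  price DECIMAL(10,2),
  UNIQUE KEY (sku)
);

-- running this a second time with the same sku doesn't add a second
-- row: the REPLACE deletes the old row and inserts the new one
REPLACE INTO products (sku, name, price)
VALUES ('ABC-123', 'Widget', 9.99);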
--Ade
Depends on how you determine if something's a duplicate or not. For example,
if it's just one column that can be used, you can do something like this:
select ItemName, count(ItemName) from ItemListTable group by ItemName
having count(ItemName) > 1
That'll show you if "ItemName" is repeated.
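Once you've found them, one way to actually drop the extra copies (a
sketch; it assumes the rows are complete duplicates, with no
auto_increment column, and the _dedup/_old names are just placeholders)
is to copy the distinct rows into a fresh table and swap it in:

-- keep one copy of each distinct row in a new table
CREATE TABLE ItemListTable_dedup LIKE ItemListTable;
INSERT INTO ItemListTable_dedup SELECT DISTINCT * FROM ItemListTable;
-- swap the tables; keep the old one until you've checked the result
RENAME TABLE ItemListTable TO ItemListTable_old,
             ItemListTable_dedup TO ItemListTable;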
SELECT DISTINCT * FROM `tablename`
On Wednesday 01 March 2006 7:24 am, Miguel Guirao wrote:
> My dear beloved friends,
>
> I have a catalog of products that a product provider gave me. Sadly for
> me, this CSV file contains many duplicated rows.
> I edited the file on my Linux system with the "
My dear beloved friends,
I have a catalog of products that a product provider gave me. Sadly for
me, this CSV file contains many duplicated rows.
I edited the file on my Linux system with the "uniq -u" command, and it
worked somewhat fine: it eliminated some duplicated rows. Originally the
file
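A side note on "uniq -u" that may explain the "somewhat": uniq only
collapses adjacent duplicate lines, and the -u flag goes further and
prints only the lines that were never repeated, silently dropping every
copy of a duplicated line. Sorting first and dropping the -u flag, e.g.
"sort products.csv | uniq" (or "sort -u products.csv"; the file name
here is made up), keeps exactly one copy of each line instead.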