Did you try using the IGNORE keyword with the LOAD DATA INFILE command?
This skips duplicate rows instead of aborting, and the load proceeds.
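A minimal sketch of that suggestion, assuming a hypothetical table `people` loaded from a placeholder file path (table, file, and delimiters are illustrative, not from the original thread):

```sql
-- With IGNORE, rows that duplicate an existing unique/primary key
-- are discarded rather than aborting the whole load.
LOAD DATA INFILE '/tmp/people.csv'
IGNORE INTO TABLE people
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip a header row, if the file has one

-- The statement summary (Records / Skipped / Warnings) reports how
-- many rows were discarded.
```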
On Fri, Jun 15, 2012 at 11:05 AM, Keith Keller
kkel...@wombat.san-francisco.ca.us wrote:
On 2012-06-14, Gary Aitken my...@dreamchaser.org wrote:
- Original Message -
From: Gary Aitken my...@dreamchaser.org
surprising as the source did not enforce uniqueness. My problem is
the load data simply dies without indicating which line of the input
file was in error; the error message refers to line 3, which is not
even the SQL
I think the following might give complete information (I removed some
columns not involved in the problem)
Server version: 5.1.49-3 (Debian)
SET collation_connection = utf8_unicode_ci;
Query OK, 0 rows affected (0.00 sec)
show variables like '%colla%';
hi,
I am biased toward MySQL, and hence I am asking this on the MySQL forum first.
I am designing a solution which will need me to import from CSV; I am using
my Java code to parse. The CSV file has 500K rows, and I need to do it thrice
an hour, for 10 hours a day.
The queries will mainly be updates, but
Hello,
I am designing a solution which will need me to import from CSV; I am using
my Java code to parse. The CSV file has 500K rows, and I need to do it thrice
an hour, for 10 hours a day.
Try using `LOAD DATA INFILE' to import from the CSV file.
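For the repeated bulk import described above, a minimal sketch of that suggestion (file path, table, and column names are placeholders, not from the original thread):

```sql
-- Re-importing the same feed three times an hour: REPLACE makes the
-- load idempotent on the primary key (existing rows are overwritten,
-- new rows are inserted).
LOAD DATA INFILE '/data/feed.csv'
REPLACE INTO TABLE feed
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, name, price);
```

One bulk LOAD DATA per batch is generally far faster than 500K individual INSERT or UPDATE statements from application code.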
On 6/14/2012 5:57 PM, Gary Aitken wrote:
Hi all,
I've looked high and low for what I hope is a trivial answer.
I was trying to load a table using LOAD DATA INFILE. Unfortunately, it craps
out because there are some duplicate primary keys. Not surprising as the
source did not enforce
Hi List
I have (had) a mysql database running on a linux server which crashed
and suffered e2fsck file system corruption. I applied the e2fsck
filesystem checker, which recovered what appears to be most of the
files comprising the data, storing them in the lost+found
directory. This looks
Thanks, Shawn; I knew there was a better way to go about that.
Gary
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/mysql
On 2012-06-14, Gary Aitken my...@dreamchaser.org wrote:
So... I wanted to read the data line at a time and use a plain INSERT
statement. That way I could check for duplicate keys and discover where the
duplicate records are. However, I can't find a way to read input from the
console or
You are very close to a standalone test case. Please create such. Then post
it on bugs.mysql.com.
-Original Message-
From: GF [mailto:gan...@gmail.com]
Sent: Friday, June 15, 2012 12:45 AM
To: Rick James
Cc: Shawn Green; mysql@lists.mysql.com
Subject: Re: Foreign key and
My backups from a mysqldump process are useless, because the dump files are not
escaping single quotes in the data in the fields.
So, O'Brien kills it - instead of spitting out
'O\'Brien'
it spits out
'O'Brien'
I don't see anywhere in the documentation about mysqldump where you can tweak
I have MySQL 5.5.
I am able to use mysqldump to export data with quotes, and the dump had the
escape character, as seen below:
LOCK TABLES `ananda` WRITE;
/*!40000 ALTER TABLE `ananda` DISABLE KEYS */;
INSERT INTO `ananda` VALUES
On 6/15/2012 1:00 PM, Rick James wrote:
You are very close to a standalone test case. Please create such. Then post
it on bugs.mysql.com.
-Original Message-
From: GF [mailto:gan...@gmail.com]
Sent: Friday, June 15, 2012 12:45 AM
To: Rick James
Cc: Shawn Green; mysql@lists.mysql.com
Let's see
SHOW CREATE TABLE ...
SELECT ...
It sounds doable with MySQL; might be too big for NoSQL.
-Original Message-
From: abhishek jain [mailto:abhishek.netj...@gmail.com]
Sent: Friday, June 15, 2012 1:57 AM
To: mysql@lists.mysql.com
Subject: Which Database when lot of insert /
Those refer _only_ to German 'ß' LATIN SMALL LETTER SHARP S. The example GF
gave did not involve that character.
To my knowledge, that is the only case where MySQL changed a collation after
releasing it.
-Original Message-
From: Shawn Green [mailto:shawn.l.gr...@oracle.com]
Sent:
On 6/15/2012 3:19 PM, Rick James wrote:
Those refer _only_ to German 'ß' LATIN SMALL LETTER SHARP S. The example GF
gave did not involve that character.
To my knowledge, that is the only case where MySQL changed a collation after
releasing it.
Yes, it has been the only occurrence.
Are you using an abnormal CHARACTER SET or COLLATION?
SHOW CREATE TABLE
Show us the args to mysqldump.
-Original Message-
From: James W. McNeely [mailto:j...@newcenturydata.com]
Sent: Friday, June 15, 2012 10:19 AM
To: mysql@lists.mysql.com
Subject: mysqldump not escaping single
At 16.40 15/06/2012 -0400, Shawn Green wrote:
On 6/15/2012 3:19 PM, Rick James wrote:
Those refer _only_ to German 'ß' LATIN SMALL LETTER SHARP S. The example GF
gave did not involve that character.
To my knowledge, that is the only case where MySQL changed a collation after
releasing it.
-Original Message-
From: Gary Aitken [mailto:my...@dreamchaser.org]
Sent: Thursday, June 14, 2012 2:58 PM
I can get the table loaded by specifying REPLACE INTO TABLE, but that still
leaves me with not knowing where the duplicate records are.
To find duplicate entries
select
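The quoted reply is cut off after `select`; one common shape for such a duplicate-finding query, with hypothetical table and key-column names, is:

```sql
-- Group by the would-be primary key and keep only groups that occur
-- more than once; run this against a staging copy loaded without the
-- unique constraint.
SELECT id, COUNT(*) AS occurrences
FROM staging_table
GROUP BY id
HAVING COUNT(*) > 1;
```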
2012/06/15 18:14 +0900, Tsubasa Tanaka
try to use `LOAD DATA INFILE' to import from CSV file.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
Try is the operative word: MySQL's character format is _like_ CSV, but not
the same. The treatment of NULL is doubtless the biggest