Re: ignore accents in order by

2009-06-12 Thread Per Jessen
PJ wrote: > Let me put it this way, I am not having the problem. The problem seems > to be with the way that character encoding is set up on the internet - > as confused and inconsistent as most everything else. > You can put whatever charset you want in the header, in the collations > in your data

Mysterious progress after recovery in MySQL Community Edition 5.1.34

2009-06-12 Thread Mike Spreitzer
A colleague had to kill a MySQL server (on RedHat Enterprise Linux 5) because it had some problem shutting down. Later I launched it (with `/usr/share/mysql/mysql.server start`). In its err log I saw the recovery happen, apparently with a successful completion, and then the usual announcement

Re: Mysterious progress after recovery in MySQL Community Edition 5.1.34

2009-06-12 Thread Michael Dykman
It looks to me like you had trouble shutting down because you were in the middle of a HUGE transaction. Having been killed, a rollback of nearly 10 million statements needs to be run. I would suggest that somewhere in your processing, you are holding one connection open a long time, doing a lot of wo

Re: Mysterious progress after recovery in MySQL Community Edition 5.1.34

2009-06-12 Thread Mike Spreitzer
Yes, when the shutdown was initiated there was a huge "LOAD DATA" in progress. Is there some server config change I should make that would cause commits to happen occasionally during that operation? I know of no way to resume such an operation after the server shutdown and eventual restart, t

Re: Mysterious progress after recovery in MySQL Community Edition 5.1.34

2009-06-12 Thread Mike Spreitzer
BTW, I have another instance of this problem right now. I will try breaking that huge table up into chunks, but have not yet done so. I have a "LOAD DATA LOCAL INFILE" in progress, and want to abort it (so I can try the better way). I have ^Ced the `mysql` client twice, killing it. The serv
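The chunked approach Mike mentions can be sketched as splitting the input file beforehand and loading each piece with its own LOAD DATA statement, so each piece commits on its own. A minimal sketch, assuming the file has already been split externally (the chunk file names and the table name `mytable` are hypothetical):

```sql
-- Instead of one LOAD DATA over the whole file, load pre-split pieces.
-- Each LOAD DATA is its own transaction, so a killed server only rolls
-- back the chunk that was in flight, not tens of millions of rows.
LOAD DATA LOCAL INFILE 'chunk_aa' INTO TABLE mytable;
LOAD DATA LOCAL INFILE 'chunk_ab' INTO TABLE mytable;
-- ... one statement per remaining chunk ...
```

A crash between statements also leaves a clear resume point: reload only the chunks that had not yet completed.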

Re: Mysterious progress after recovery in MySQL Community Edition 5.1.34

2009-06-12 Thread Mike Spreitzer
I could afford to completely delete the schema (AKA database) into which the "LOAD DATA LOCAL INFILE" is going. How exactly would I do that, given that the server is still really busy shutting down? If necessary, in some instances, I could afford to lose all the data on a given machine (and I
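Once the server has finished its rollback and is accepting connections again, discarding the whole schema is a single statement (the schema name `load_target` is hypothetical); while the rollback is still running, the server should generally be left alone rather than having its files removed out from under it:

```sql
-- Drops the schema and every table in it in one DDL operation,
-- rather than deleting rows one by one.
DROP DATABASE IF EXISTS load_target;
```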

a possible group issue???

2009-06-12 Thread bruce
Hi... I have the following...

mysql> INSERT INTO ParseScriptTBL VALUES
    -> ('auburnCourse.py',40,1,1),
    -> ('auburnFaculty.py',40,2,2),
    -> ('uofl.py',2,1,3),
    -> ('uky.py',3,1,4),
    -> ('ufl.py',4,1,5)
    -> ;
Query OK, 5 rows affected (0.00 sec)
Records: 5 Duplicates: 0 Warnings: 0

mysql> select * from

Re: a possible group issue???

2009-06-12 Thread Max Bube
Try with GROUP_CONCAT(ScriptName) http://dev.mysql.com/doc/refman/5.0/en/group-by-functions.html#function_group-concat 2009/6/12 bruce > Hi... > > I have the following... > > mysql> INSERT INTO ParseScriptTBL VALUES > -> ('auburnCourse.py',40,1,1), > -> ('auburnFaculty.py',40,2,2), > -> ('uof
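Max's suggestion collapses each group's ScriptName values into one comma-separated string, giving one output row per group instead of one per script. A sketch against the table from the original post (the numeric columns were not named in the excerpt, so `group_col` is a hypothetical stand-in for whichever column bruce wants to group on):

```sql
-- One row per group, with all matching script names joined together.
SELECT group_col,
       GROUP_CONCAT(ScriptName) AS scripts
FROM ParseScriptTBL
GROUP BY group_col;
```

Note that GROUP_CONCAT output is truncated at group_concat_max_len (1024 bytes by default), which may need raising for large groups.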

RE: a possible group issue???

2009-06-12 Thread bruce
hi martin... thanks for the reply... but that still generates two separate rows as well... -Original Message- From: Martin Gainty [mailto:mgai...@hotmail.com] Sent: Friday, June 12, 2009 12:04 PM To: bruce Douglas Subject: RE: a possible group issue??? mysql> select * from ParseScriptTB

Re: ignore accents in order by

2009-06-12 Thread Isart Montane
I agree with Per, I use utf8 and it works fine for me, even with Chinese characters On Fri, Jun 12, 2009 at 8:40 AM, Per Jessen wrote: > PJ wrote: > > > Let me put it this way, I am not having the problem. The problem seems > > to be with the way that character encoding is set up on the internet
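The accent-insensitive ordering discussed in this thread can usually be forced per-query with an explicit collation rather than by changing the column definition; under utf8_general_ci, accented letters sort together with their base letters ('é' with 'e'). A sketch with hypothetical table and column names:

```sql
-- Sort ignoring accents for this query only; the column itself
-- keeps whatever collation it was created with.
SELECT title
FROM books
ORDER BY title COLLATE utf8_general_ci;
```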

BULK DATA HANDLING 0.5TB

2009-06-12 Thread Krishna Chandra Prajapati
Hi guys, I'm working at a telecom company. I have a table called deliverylog into which 30 million records get inserted per day. The table has grown to 0.5TB. I have to keep 60 days of records in the table, so 60 days * 30 million = 1800 million records. The query is taking a lot of time to fetch the resul
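For a fixed 60-day retention window at this volume, one common approach (available since MySQL 5.1) is to range-partition the table by day, so expiring the oldest day is a near-instant DROP PARTITION rather than a huge DELETE scan. A minimal sketch, assuming a DATETIME column named log_time; all column and partition names here are hypothetical:

```sql
-- One partition per day; queries constrained on log_time can also
-- prune to the relevant partitions instead of scanning 0.5TB.
CREATE TABLE deliverylog (
  log_time DATETIME NOT NULL,
  msg VARCHAR(255)
)
PARTITION BY RANGE (TO_DAYS(log_time)) (
  PARTITION p20090612 VALUES LESS THAN (TO_DAYS('2009-06-13')),
  PARTITION p20090613 VALUES LESS THAN (TO_DAYS('2009-06-14'))
  -- ... one partition per day of the 60-day window ...
);

-- Expire the oldest day:
ALTER TABLE deliverylog DROP PARTITION p20090612;
```

New daily partitions would need to be added (and old ones dropped) by a scheduled job.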