Yimin writes:

>I have a script that parses some log files and feeds the data
>into an MSSQL database table, using DBI. The problem is that
>sometimes the log files are generated too fast for my script
>to keep up. I had some success in reducing trips to the database
>by putting multiple SQL INSERT statements in one statement handle.
>However, I am now planning on moving the database to MySQL, which
>does not support multiple SQL statements in one statement handle.
>The script is already running locally on the database server.
>Any suggestions?

Well, MySQL will already do simple table inserts a lot faster than
MSSQL. I'm not sure where MySQL is up to these days with database
features, but to make the inserts quick you should avoid table
constraints such as foreign keys, and keep indexes to a minimum. If you
need an index, drop it before the inserts and recreate it after they
finish, so it isn't being updated on every row - roughly like the
sketch below.
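
As a rough sketch only - the DSN, credentials, table and index names
here are all made up, and the exact index syntax depends on your MySQL
version - the DBI side might look something like this:

  use strict;
  use DBI;

  # DSN, user, password, table and index names are all placeholders.
  my $dbh = DBI->connect('DBI:mysql:database=logs;host=localhost',
                         'user', 'password', { RaiseError => 1 });

  # Drop the index so it isn't maintained row-by-row during the load.
  $dbh->do('ALTER TABLE log_entries DROP INDEX idx_logtime');

  # ... run all the INSERTs here ...

  # Rebuild the index in a single pass afterwards.
  $dbh->do('ALTER TABLE log_entries ADD INDEX idx_logtime (log_time)');

  $dbh->disconnect;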

I've not looked into this sort of stuff, but if you use a particular
cursor type or lock on the recordset, can't you write to the recordset
and have it handle updating the database? That might be quicker than
individual inserts as well. In DBI the closest thing I know of is
preparing the INSERT once with placeholders and calling execute() per
row, which at least saves re-parsing the SQL every time.
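
Something like this (a minimal sketch; the table and column names are
invented, and parse_next_line() is a made-up stand-in for whatever the
log-parsing part of the script returns):

  my $sth = $dbh->prepare(
      'INSERT INTO log_entries (log_time, message) VALUES (?, ?)');
  while (my ($time, $msg) = parse_next_line()) {
      $sth->execute($time, $msg);
  }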

Alternatively, MySQL supports some fairly neat integration with files:
its LOAD DATA INFILE statement reads a whole delimited file into a
table in one go, so you could batch the inserts up that way - see the
sketch below. Check your doco, or maybe ask in some MySQL forum.
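
A minimal sketch, assuming the script has already written the parsed
rows to a tab-delimited file (file name and table layout are
placeholders):

  # Depending on your DBD::mysql and server settings you may need to
  # enable LOCAL INFILE support for this to work.
  $dbh->do(q{LOAD DATA LOCAL INFILE '/tmp/parsed_log.txt'
             INTO TABLE log_entries
             FIELDS TERMINATED BY '\t'
             LINES TERMINATED BY '\n'});

Since the file is loaded in a single statement, this tends to be the
fastest option of the lot for bulk loads.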

-- 
  Jason King
