Rob Sharp wrote:

Sluggers,

I've got a large (2.6G) MySQL dump containing extended SQL inserts which
I need to import onto a server.

Normally, I'd import using something like:

        sudo nice mysql itm_integ < sqldump.sql

but in this case it grinds the server into the ground, presumably
because it loads the file into memory/swap as it imports.

I'm thinking I somehow need to split this file into manageable chunks to
import it, but the script I coded in PHP can't handle files of that size.

Anyone have any pointers on how I can split the file based on the string
'CREATE TABLE' or something like that? A file per table would be fine to
import.

If the dump is of the format (CREATE TABLE ... INSERT, INSERT, INSERT)
for each table, awk would be a nice candidate, but I know very little awk.
Perl or Python should be able to handle it in about 10 lines of code.


cheers
rick



--
_________________________________
Rick Welykochy || Praxis Services

Power corrupts and PowerPoint corrupts absolutely.
     -- Vint Cerf

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html