Hi there:

I'm on a shared machine, trying to get some SQL files of 15 MB or so into MySQL. Doing this at the command line is no problem, of course, but I have several SQL files that need to be imported a few times a month. So, ideally, I'd like to FTP them to a directory, go to a web page, and hit a button to do the import.

The problem is that the PHP script is terminating with a PHP error; the server's support thinks it's running out of memory. According to phpinfo(), the script can use 90M, so the question is why a 20M SQL import would use up 90M.

I'm doing it like this:

$cmd = "/path/to/mysql -h hostname -u username --password=password db_name < sqlfile.sql";
exec( $cmd );

I had thought that shelling out to the command line like this would keep the memory footprint of the import out of PHP entirely, but apparently that's not the case.
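For what it's worth, here's the slightly fuller version I'd try next: redirect stderr to stdout, capture the client's output, and check the exit status, so I can at least see whether the mysql client itself is failing (paths and credentials are placeholders):

<?php
// Sketch: same shell-out, but with output captured and the exit
// status checked. Paths and credentials are placeholders.
$cmd = '/path/to/mysql -h hostname -u username --password=password db_name'
     . ' < ' . escapeshellarg('/path/to/sqlfile.sql') . ' 2>&1';
exec($cmd, $output, $status);

if ($status !== 0) {
    // A non-zero exit means the mysql client reported a failure.
    echo "Import failed (exit $status):\n" . implode("\n", $output);
}
?>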

So... is this doable in PHP without resorting to something like busting the file apart into small chunks? Or would it be better to use another language on the server for this?
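One fallback I've been considering is streaming the file through mysqli one statement at a time, so PHP never holds more than a single statement in memory. A rough sketch, naively assuming each statement ends with a ";" at end of line (which a typical mysqldump file roughly satisfies; connection details are placeholders):

<?php
// Sketch: stream the dump statement by statement so PHP never holds
// the whole file in memory. Assumes statements end with ";" at end
// of line; connection details are placeholders.
$db = new mysqli('hostname', 'username', 'password', 'db_name');

$fh  = fopen('/path/to/sqlfile.sql', 'r');
$sql = '';
while (($line = fgets($fh)) !== false) {
    $trim = trim($line);
    if ($trim === '' || strpos($trim, '--') === 0) {
        continue;                 // skip blank lines and comments
    }
    $sql .= $line;
    if (substr($trim, -1) === ';') {
        $db->query($sql);         // execute one complete statement
        $sql = '';
    }
}
fclose($fh);
?>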

Cheers,
Marc

