I think the post about the command line may be the way to go. People suck terabytes of SQL files into MySQL tables that way. At that level you want to go hardcore and get as close to the raw inputs as possible.
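That said, if you need to keep the web-page button, here's a rough sketch of the same shell-out with the arguments escaped and the exit code checked; the host, credentials, and file path below are just placeholders, not anyone's real values. The shell redirection means the mysql client reads the .sql file itself, so the SQL never passes through PHP's memory, only whatever output the command prints:

<?php
// Rough sketch only: host, user, password, db name, and file path are
// placeholders, not real values from this thread.
$sqlFile = '/path/to/uploads/sqlfile.sql';

$cmd = sprintf(
    '/path/to/mysql -h %s -u %s --password=%s %s < %s 2>&1',
    escapeshellarg('hostname'),
    escapeshellarg('username'),
    escapeshellarg('password'),
    escapeshellarg('db_name'),
    escapeshellarg($sqlFile)
);

// The mysql client reads the file via the shell redirection; PHP only holds
// the command's printed output, one line per array element.
exec($cmd, $output, $exitCode);

if ($exitCode !== 0) {
    echo "Import failed (exit code $exitCode):\n" . implode("\n", $output) . "\n";
} else {
    echo "Import finished.\n";
}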
The command line and VI are your friends. Really! :-) That's about as hardcore as you can go.

On Tue, Oct 20, 2009 at 5:57 PM, Marc Antony Vose <suzer...@suzerain.com> wrote:

> Hi there:
>
> I'm on a shared machine, and trying to get some 15 MB or so SQL files into
> MySQL. No problem doing this at the command line, of course, but I have
> several SQL files that need to be imported a few times a month. So, ideally,
> I'd like to FTP them to a directory and go to a web page and hit a button to
> do the import.
>
> Problem is, the PHP script is terminating with a PHP error; server's
> support thinks it's out of memory. According to phpinfo(), the script can
> use 90M, so the question is why a 20M sql import would use up 90M?
>
> I'm doing it like this:
>
> $cmd = "/path/to/mysql -h hostname -u username --password=password db_name < sqlfile.sql";
> exec( $cmd );
>
> I had thought that using this command-line command would prevent PHP from
> dealing with the memory footprint of the operation, but apparently this is
> not the case.
>
> So...doable in PHP without an approach like busting apart the file into
> small chunks? Or better to use another language on the server for this...?
>
> Cheers,
> Marc

--
IM/iChat: ejpusa
Links: http://del.icio.us/ejpusa
Follow me: http://www.twitter.com/ejpusa
Karma: http://www.coderswithconscience.com
_______________________________________________
New York PHP Users Group Community Talk Mailing List
http://lists.nyphp.org/mailman/listinfo/talk
http://www.nyphp.org/Show-Participation