I agree: use mysqldump to export the data to a text file. If you want to
save time, you could write a PHP script to FTP the text file from the dead
host to the new host, then load the data back into the new database.
I guess if you have a broadband connection, though, you might not be saving
too much time that way.
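A rough sketch of that FTP step in PHP might look like this (the host name, credentials, and paths are all placeholders, and it assumes the ftp extension is enabled):

```php
<?php
// Hypothetical example: fetch the mysqldump output file from the old host.
// Every host name, login, and path below is a placeholder.
function fetch_dump_via_ftp($host, $user, $pass, $remote, $local)
{
    $conn = ftp_connect($host);
    if ($conn === false) {
        return false;
    }
    if (!ftp_login($conn, $user, $pass)) {
        ftp_close($conn);
        return false;
    }
    // Binary mode so the dump file isn't mangled in transit.
    $ok = ftp_get($conn, $local, $remote, FTP_BINARY);
    ftp_close($conn);
    return $ok;
}

// Usage (placeholders):
// fetch_dump_via_ftp('old.example.com', 'leo', 'secret',
//                    '/www/dump.sql', '/tmp/dump.sql');
```

Then on the new host it's just a matter of feeding the file back to mysql.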
Leo G. Divinagracia III wrote:
> on a project I'm working with, his host died or something.
> but phpMyAdmin (on that host) can NOT do a backup of a 200 MB table
> without running out of memory or something weird.
> so what is the best way to text dump the table?
> I thought I would (pseudo-code):
> open DB
> repeat
>   grab 1000 rows from DB
>   write them to a file on the host's www path
>   increment the filename counter (filename000, filename001, ...)
> until EOF
> The user would then FTP the files down to his machine.
> But I figured I could use header() to send them to the user directly.
> I grabbed this from the header() help section:
> $output_file = 'something.txt';
> $content_len = 666;
> @ini_set('zlib.output_compression', 'Off');
> header('Pragma: public');
> header('Content-Transfer-Encoding: none');
> header('Content-Type: application/octet-stream; name="' . $output_file . '"');
> header('Content-Disposition: inline; filename="' . $output_file . '"');
> header("Content-length: $content_len");
> is that the best way to send the txt file to the user?
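If you do go the chunked route, the pseudo-code above might come out roughly like this in PHP (table name, credentials, chunk size, and paths are all placeholders; it uses LIMIT/OFFSET instead of an explicit EOF test, and assumes the mysqli extension):

```php
<?php
// Sketch of the chunked dump loop from the post above.
// Table name, credentials, directory, and chunk size are placeholders.

// Build the numbered filename: chunk_filename('dump', 0) -> "dump000.txt"
function chunk_filename($base, $n)
{
    return sprintf('%s%03d.txt', $base, $n);
}

function dump_table_in_chunks(mysqli $db, $table, $dir, $chunk = 1000)
{
    $offset = 0;
    $n = 0;
    while (true) {
        // "grab 1000 rows from DB"
        $res = $db->query("SELECT * FROM $table LIMIT $chunk OFFSET $offset");
        if ($res === false || $res->num_rows === 0) {
            break; // "until EOF"
        }
        // "write them to a file on the host's www path"
        $fh = fopen($dir . '/' . chunk_filename('dump', $n), 'w');
        while ($row = $res->fetch_assoc()) {
            fputcsv($fh, $row);
        }
        fclose($fh);
        $res->free();
        $offset += $chunk;
        $n++; // "increment the filename counter"
    }
}

// Usage (placeholders):
// $db = new mysqli('localhost', 'user', 'pass', 'mydb');
// dump_table_in_chunks($db, 'big_table', '/www/htdocs/dump');
```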
Try using mysqldump from the command line. If you need a PHP script to
do this, use system(), backticks (``), or one of the other system-call
functions.
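A minimal sketch of the system() route, running every user-supplied piece through escapeshellarg() (all the names below are placeholder examples, not Leo's real credentials):

```php
<?php
// Build a mysqldump command line safely; all arguments shown are placeholders.
function build_dump_command($user, $pass, $db, $table, $outfile)
{
    return 'mysqldump -u ' . escapeshellarg($user)
         . ' -p' . escapeshellarg($pass)
         . ' ' . escapeshellarg($db)
         . ' ' . escapeshellarg($table)
         . ' > ' . escapeshellarg($outfile);
}

// Usage (placeholders):
// system(build_dump_command('leo', 'secret', 'mydb', 'big_table', '/tmp/dump.sql'));
//
// Restoring on the new host is the reverse direction:
// system('mysql mydb < /tmp/dump.sql');
```

The escaping matters if any of those values ever come from user input.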
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php