Sagar C Nannapaneni wrote:
> I've a table in my database with 1 lakh (100,000) records using around
> 50MB of disk space. I wanted to download the records. I used phpMyAdmin
> (because my hosting provider doesn't allow remote access to my db) to
> export the database, but it's not working for larger databases. I thought
> that it might be a problem with the max execution time, so I thought I
> would manually store the records in a text file so that I can download
> it. But when I tried that, it's not creating the file beyond 10MB. I've
> tried these things out:
>
> ini_set('max_execution_time', '300');
> ini_set('mssql.connect_timeout', '300');
> ini_set('memory_limit', '5000000');

I'm not sure, but you may be prohibited by your provider from resetting
some of those.
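
A quick way to find out: ini_set() returns FALSE when it can't change a
setting, and ini_get() reads back the live value. Something like this
(just a sketch, reusing the settings from your post) would tell you:

<?php
// Try to raise each limit and report whether the change actually took.
// ini_set() returns the old value on success and FALSE on failure,
// e.g. when the host has locked the setting down in php.ini.
$settings = array(
    'max_execution_time' => '300',
    'memory_limit'       => '5000000',
);

foreach ($settings as $name => $value) {
    if (ini_set($name, $value) === false) {
        echo "$name: could not be changed (provider may prohibit it)\n";
    } else {
        echo "$name: now set to " . ini_get($name) . "\n";
    }
}
?>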

It's also possible that you only have 10MB left on your server for "free
space" and so when phpMyAdmin tries to make a temporary file for you to
download, it fails.  (Not sure that's how phpMyAdmin works, just that it
*could* work that way.)
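
You can test that theory easily enough, though. disk_free_space() reports
what's left on the partition holding a given directory (here, wherever the
script happens to run from):

<?php
// Report free space on the partition holding the current directory.
// If this comes back at around 10MB, the truncated file has its answer.
$free = disk_free_space(getcwd());
printf("Free space: %.1f MB\n", $free / 1048576);
?>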

You may want to just contact your provider and explain the problem.  They
can probably do a dump for you or work out a solution that fits in with
their setup faster than we can figure out what's going on.

If all else fails, you could write a custom PHP script to spit out N
records at a time, and then another PHP script on another box to fetch
those chunks remotely and concatenate the results into your final file.
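
Not tested, and every name below (host, database, table, credentials) is
made up, but the pair of scripts might look something like this:

<?php
// export_chunk.php -- runs on the hosted box; spits out N records
// per request, starting at whatever offset the caller asks for.
$chunk  = 1000;
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb') or die(mysql_error());

// ORDER BY a unique key so repeated requests page through the table
// consistently; without it, chunks can overlap or skip records.
$result = mysql_query("SELECT * FROM mytable ORDER BY id
                       LIMIT $offset, $chunk") or die(mysql_error());

while ($row = mysql_fetch_row($result)) {
    echo implode("\t", $row) . "\n";   // tab-delimited, one row per line
}
?>

<?php
// fetch_all.php -- runs on the other box; pulls chunks until one comes
// back empty, appending each to the final file. Needs allow_url_fopen.
$chunk = 1000;
$out   = fopen('dump.txt', 'w');

for ($offset = 0; ; $offset += $chunk) {
    $data = file_get_contents(
        "http://www.example.com/export_chunk.php?offset=$offset");
    if ($data === false || $data === '') {
        break;   // server error or no more records
    }
    fwrite($out, $data);
}
fclose($out);
?>

You'd also want some kind of password check on export_chunk.php so the
whole world can't download your table.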

This would only work well for data that's mostly static. On a high-volume
transaction system, each chunk would capture the table at a different
point in time, and stitching those partial snapshots into one consistent
dump is a mess. If you really understand your database and application,
you could maybe still make it work without excessive locking of tables,
but it would take some real work.

-- 
Like Music?
http://l-i-e.com/artists.htm
