I have about 1,000 flat files that I am loading into a mysql table. The
load is done through a shell script that iterates over the files, loading
them sequentially. That is, I initiate a "load data local ..." statement for
each of the 1,000 files.
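For illustration, the loop is essentially the following (the path, database,
and table names here are placeholders, not the real ones, and the tab
delimiter is just an assumption about the file format):

#!/bin/sh
# Load each flat file with its own LOAD DATA LOCAL statement.
# /data/flatfiles, mydb, and big_table are placeholder names.
for f in /data/flatfiles/*.txt
do
    mysql --local-infile=1 mydb \
        -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE big_table FIELDS TERMINATED BY '\t'"
done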
The completed table holds just over 10 million rows, each with about 200
columns.
About three quarters of the way through the process (roughly file 750), mysql responds with:
Lost connection to MySQL server during query
No additional data is loaded by the remaining 250 or so load statements. I
thought perhaps there was a cache or buffer problem, so I started the client
with the following command arguments:
mysql --wait --set-variable=max_allowed_packet=16M
flush query cache;
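For reference, a quick way to see the value the server itself is using (the
--set-variable flag above only changes the client-side limit) is:

SHOW VARIABLES LIKE 'max_allowed_packet';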
This didn't help. I also had the shell script pause a few seconds (a sleep)
between load statements, in desperation; no change - it dies at the same
point.
I finally resorted to breaking the project into two tables instead of one,
but I eventually need to get this solved. The data loads fine into two
separate tables (each table gets half of the rows), which suggests that
queue/buffer issues were not the problem.
It seems I am hitting some kind of table size limit, though that doesn't
seem possible. Any suggestions?
details:
mysql 4.0.20
Linux: Fedora Core 2
file system: ext3
Thanks in advance.
Michaell