Part of the problem was how the server's memory-related settings were
configured. The CONCAT approach would work up to a certain size and
then suddenly fail to insert (NULL was inserted instead). After the
memory configuration change, concatenating the data in pieces with
CONCAT works in my tests up to about 240 MB. There may still be a
problem as the data grows beyond that.
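For reference, this is roughly the pattern I'm using, sketched with a
made-up table docs and a LONGBLOB column named data (the names and
chunk values below are placeholders only):

  -- first chunk: create the row with the initial piece of the data
  INSERT INTO docs (id, data) VALUES (1, '<first chunk>');

  -- remaining chunks: append each piece in order, keeping every
  -- statement comfortably below max_allowed_packet
  UPDATE docs SET data = CONCAT(data, '<next chunk>') WHERE id = 1;
  -- ... repeat the UPDATE for each subsequent chunk ...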
John
Harald Fuchs wrote:
In article <[EMAIL PROTECTED]>,
John Ling <[EMAIL PROTECTED]> writes:
Hello, given that the max_allowed_packet setting limits the size of an
INSERT statement, is there a way around it by chunking the query?
In particular, if the query inserts a large text or blob, can I simply
concatenate smaller pieces of the data in succession using the
CONCAT() function?
My concern is whether this would still cause other MySQL resource
problems in some way.
I want to be able to insert a large text or blob of 200-400 MB or more.
I tried INSERTing in chunks with CONCAT() a few months ago and found
that it didn't work, so effectively your blob size is limited by
max_allowed_packet. But since version 4.0 you can increase
max_allowed_packet up to 1G, so this should no longer be a problem.
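Raising the limit looks roughly like this (the 1G value is just an
example, and the client has its own max_allowed_packet setting too):

  # server side, in my.cnf (restart required):
  [mysqld]
  max_allowed_packet = 1G

  # or at runtime on a recent enough server (affects new connections):
  SET GLOBAL max_allowed_packet = 1073741824;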