Assuming you insert 100,000 rows, you also have to consider that any
indexes on the table will need to be re-analyzed to refresh their
statistics. The inserts themselves will also be slower, since every
index has to be updated as rows are added.
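
For example (just a sketch - the table name "import_data" below is made
up), refreshing the statistics after the load is a single statement:

  ANALYZE TABLE import_data;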


You didn't mention the table type (storage engine) you were planning on
using, but if you use InnoDB and do it all in one transaction (i.e. turn
off auto-commit and commit just once at the end), then any failure
during the mass insert will cause a rollback, so you don't have to worry
about the integrity of your data.
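
Something like this is what I mean (just a sketch - the table and column
names are made up):

  SET autocommit = 0;
  INSERT INTO import_data (col1, col2, col3) VALUES ('a', 'b', 'c');
  -- ... the remaining inserts ...
  COMMIT;

If anything fails partway through, the whole batch is rolled back and
the table is left exactly as it was before you started.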

"Is it then possible to have a SQL statement that would somehow take the lump field data, select it, chop it up, and insert 100,000 rows, in this case, three fields per row?"

I haven't heard of a SQL function like that. In addition, you'd suffer
the pain twice: inserting the data once as one big field, selecting it
out again, breaking it up, and then re-inserting it as rows.

Is the code going to be running on the same machine as the database?
That would help, since each insert wouldn't have to cross the network.

One other thing to consider is using LOAD DATA INFILE to do a bulk load
rather than a ton of individual INSERT statements. You can run it from
the mysql client on the machine where the MySQL server is installed, so
the server can read the file directly. See the docs at
http://www.mysql.com for the LOAD DATA INFILE syntax - it should handle
the file format you specified below.
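
A rough sketch of what that could look like for the format you describe
(the file path, table name and column names are made up):

  LOAD DATA INFILE '/tmp/upload.txt'
  INTO TABLE import_data
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\r'
  (col1, col2, col3);

If the file ends up on the web server rather than the database server,
LOAD DATA LOCAL INFILE reads it from the client side instead.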

David.

Scott Haneda wrote:

Faced with some data requirements that are larger than usual for me, I
thought I would ask some others what they think about my logic.

MySQL 4

In short, I will have a file upload on a web server that will digest a
file, and I will be able to dictate the format of this file. There may
be a few formats, but the main one will be:

data\tdata\tdata\r

So that is some data, a tab, some more data, another tab, a third piece
of data, then a return.

My trouble is that the data file could be 100,000 lines in length, so I
have a few options:

Iterate through the file one line at a time; using INSERT DELAYED I can
put the data into a table with no trouble. This is the simplest method,
but it may have performance issues. In any language, looping 100,000
times will take some time; of course, it will happen in the background,
but it will still take a while. There is also the trouble of a crashing
server: I would have an incomplete set of data inserts and no real
simple way to deal with that.
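
(Roughly, each of those per-line inserts would look something like this -
the table and column names are just for illustration:

  INSERT DELAYED INTO import_data (col1, col2, col3)
  VALUES ('a', 'b', 'c');

repeated once for every line in the file.)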

I was thinking, perhaps MySQL is up to this task: what if I were to
insert the entire file into one field in MySQL? Then I have all the data
in MySQL, and this insert should happen much faster, as it is just data;
it could be a few MB in size, but it still should not take too long to
get into MySQL. It is also only one operation, so the chance of a server
crash interfering is smaller.

Is it then possible to have a SQL statement that would somehow take the lump
field data, select it, chop it up, and insert 100,000 rows, in this case,
three fields per row?

I was thinking I could easily do some simple string replaces on the
data and turn it into one large INSERT statement. This is not all that
attractive to me, as I am not 100% certain each line in the file would
be in the correct format; someone may have put in 5 tabs on one line by
accident. In order to check for this, I am back to looping through each
line server side and testing the line for integrity.
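
(By "one large INSERT statement" I mean something along these lines -
again, the table and column names are just for illustration:

  INSERT INTO import_data (col1, col2, col3) VALUES
    ('a1', 'b1', 'c1'),
    ('a2', 'b2', 'c2'),
    ('a3', 'b3', 'c3');

only with 100,000 value groups instead of three.)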

Any ideas and thoughts?






