Hello All:
    I'm working on an application that has to drop
and then recreate tables programmatically, and the MySQL
account does not have the FILE privilege, i.e., I can't
use LOAD DATA INFILE.

On a smaller scale, I programmatically create a .sql script
file that drops and then re-creates the table, and then
writes a series of INSERT statements that insert the values,
row by row, as imported from a tab-delimited text file.

(Essentially the same as a 'dump')
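For reference, the generation step boils down to something
like the following simplified Python sketch. The table name,
column definitions, and file names here are placeholders, and
the escaping is deliberately naive; the real script handles
types and quoting per column:

    # Sketch: emit DROP/CREATE plus one INSERT per row.
    # Table name, columns, and file names are placeholders.
    TABLE = "mytable"
    COLUMNS = ["col1", "col2", "col3"]  # the real table has 70+ fields

    def sql_quote(value):
        # Naive escaping for the sketch; real code must follow
        # MySQL's string-literal escaping rules.
        return "'" + value.replace("\\", "\\\\").replace("'", "\\'") + "'"

    with open("data.txt") as src, open("table.sql", "w") as out:
        out.write("DROP TABLE IF EXISTS %s;\n" % TABLE)
        out.write("CREATE TABLE %s (%s);\n"
                  % (TABLE, ", ".join("%s VARCHAR(255)" % c for c in COLUMNS)))
        for line in src:
            fields = line.rstrip("\n").split("\t")
            out.write("INSERT INTO %s (%s) VALUES (%s);\n"
                      % (TABLE, ", ".join(COLUMNS),
                         ", ".join(sql_quote(f) for f in fields)))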

Then I make a system call to mysql to load the file,
as in 'mysql --host=host --user=usr --password=pwd --database=db < table.sql'.
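In Python terms, for example, that load step is just the
following (host, credentials, and database are placeholders,
same as above):

    import subprocess

    # Pipe the generated script into the mysql client.
    with open("table.sql") as f:
        subprocess.check_call(
            ["mysql", "--host=host", "--user=usr",
             "--password=pwd", "--database=db"],
            stdin=f)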

Now that is sufficient for under a thousand records, but
next I must rebuild a table of 50-60 thousand records or
more, with 70 or more fields.

I would welcome any helpful hints on how to optimise this,
as well as pointers to docs, or even thoughts on a different
solution entirely.

TIA and best regards
-- 
Tim Johnson <[EMAIL PROTECTED]>
      http://www.alaska-internet-solutions.com
      http://www.johnsons-web.com
