---- Sanjeev N <[EMAIL PROTECTED]> wrote:
> Hi,
> I have written a program which imports a tab-delimited file and inserts
> every line from the file into MySQL, line by line.
> That works, but the problem with this method is that it takes too much
> time to do the inserts.
> The file will be more than 5,000 lines.
>
> Then I tried to build a string of ;-separated queries, as follows:
>
> for($i=1; $i<sizeof($array); $i++){
>     $insert_string .= "insert into tablename (v1, v2..... v6)
>         values('{$array[$i][1]}', '{$array[$i][2]}'..... '{$array[$i][6]}');";
> }
> if(!empty($insert_string)){
>     mysql_query($insert_string, $conn) or die("query failed : ".mysql_error());
> }
>
> It's throwing an error saying "check the manual for the right syntax....."
>
> After investigating on some sites I came to understand that it's a problem
> of limitations on query size.
>
> I also tried "SET GLOBAL max_allowed_packet=30000000;"
> It still throws the same error.
>
> Can anybody tell me how to fix this error and reduce the insert time with
> a single statement instead of writing many insert statements?
>
Sure, check with a MySQL list via http://www.mysql.com
Also note: mysql_query() sends only a single statement per call, so a
;-separated batch will always fail with a syntax error, no matter how large
max_allowed_packet is. Otherwise, split it up into multiple inserts.
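You can also collapse many rows into MySQL's multi-row VALUES syntax, which is one statement and so works with mysql_query(). A minimal sketch; the helper name, the chunk size of 500, and the use of addslashes() are my own illustrative choices (with a live connection you'd use mysql_real_escape_string() instead):

```php
<?php
// Build one INSERT covering many rows: INSERT INTO t (..) VALUES (..), (..), ...
function build_multirow_insert($table, $cols, $rows) {
    $tuples = array();
    foreach ($rows as $row) {
        // addslashes() stands in for mysql_real_escape_string(),
        // which needs a live connection.
        $escaped  = array_map('addslashes', $row);
        $tuples[] = "('" . implode("', '", $escaped) . "')";
    }
    return 'insert into ' . $table
         . ' (' . implode(', ', $cols) . ')'
         . ' values ' . implode(', ', $tuples);
}

$sql = build_multirow_insert('tablename',
                             array('v1', 'v2'),
                             array(array('a', 'b'), array('c', 'd')));

// Send one query per chunk so each statement stays under max_allowed_packet;
// 500 rows per chunk is an arbitrary guess.
// foreach (array_chunk($all_rows, 500) as $chunk) {
//     mysql_query(build_multirow_insert('tablename', $cols, $chunk), $conn)
//         or die(mysql_error());
// }
```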
I currently have a script which reads a 550,000-line CSV file and uses
<?php
// Convert each CSV line into an INSERT statement and write it to a file.
// 'data.csv' and 'inserts.sql' are placeholder names.
$inf  = fopen('data.csv', 'r');
$outf = fopen('inserts.sql', 'w');
while (($in = fgets($inf)) !== false) {
    $line = explode('","', trim($in, "\"\r\n"));
    $out  = "insert into table values('', '$line[0]', .....);\n";
    fwrite($outf, $out);
}
fclose($inf);
fclose($outf);
?>
then I have another file that does
<?php
// Connect to the db, then read the generated file and run each insert.
// 'inserts.sql' is the file written by the script above.
$inf = fopen('inserts.sql', 'r');
while (($inLine = fgets($inf)) !== false) {
    $sql = $inLine;
    mysql_query($sql) or die(mysql_error());
}
fclose($inf);
?>
which all in all takes about 10 seconds to run the conversion and then the
inserts.
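Since the input is tab-delimited, another option is to let MySQL's bulk loader parse the file itself and skip PHP-side statement building entirely. This is a sketch, assuming the server permits LOAD DATA LOCAL; 'data.txt' is a placeholder path, and LOAD DATA's defaults (tab-separated fields, newline-terminated lines) match the original file format:

```php
<?php
// One statement loads the whole tab-delimited file into the table.
$sql = "LOAD DATA LOCAL INFILE 'data.txt' "
     . "INTO TABLE tablename (v1, v2, v3, v4, v5, v6)";
// Run against a live connection:
// mysql_query($sql, $conn) or die(mysql_error());
```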
HTH,
Wolf
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php