Fantastic - thank you so much for this. I will try both options. Funny, I
was considering using a transaction.

Thanks so much.

Celeste.


Dennis Cote wrote:
> 
> gtxy20 wrote:
>> 
>> I can't help but think that the separator is not being escaped properly
>> to indicate a tab - I have tried \t, \\t, "\t", '\t' but no luck.
>> 
> 
> For some reason the -separator option on the command line always sets 
> the separator string to \\t (i.e. the literal string entered on the 
> command line). You can use the .show command to see the current 
> separator string.
> 
> You can use a command script file instead. Enter the following into a 
> text file called testcmd.txt:
> 
>    .separator "\t"
>    .import testdb testtable
> 
> Then use a .read command to process that file.
> 
>    sqlite3 test.db ".read testcmd.txt"
> 
>> Just getting frustrated. I was originally iterating through the data
>> programmatically and using an INSERT query to place the data, but for
>> 5 million rows this was taking far too long.
>> 
> 
> You will probably be better off going back to your program and ensuring 
> that you execute your insert statements inside a transaction.
> 
>    sqlite3_exec(db, "begin", 0, 0, 0);
>    // your code to insert data
>    sqlite3_exec(db, "commit", 0, 0, 0);
> 
> Without the transaction each insert is written to disk separately. Inside 
> the transaction the disk is written much less often, and hence the operation 
> is much faster. I would expect 5 million rows to take about 2 minutes on 
> typical hardware.
> 
> HTH
> Dennis Cote
> 
> 
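For anyone reading this in the archive: below is a minimal C sketch of the
transaction-wrapped bulk insert Dennis describes above, using a prepared
statement so the INSERT is compiled once and reused. The table name
(testtable) and its two-column shape are assumptions for illustration only.

    #include <sqlite3.h>

    /* Bulk insert inside a single transaction (sketch only).
       The table name and its two-column layout are assumed. */
    int bulk_insert(sqlite3 *db)
    {
        sqlite3_stmt *stmt;
        int rc = sqlite3_prepare_v2(db,
                     "INSERT INTO testtable VALUES (?1, ?2)", -1, &stmt, 0);
        if (rc != SQLITE_OK) return rc;

        sqlite3_exec(db, "begin", 0, 0, 0);
        for (int i = 0; i < 5000000; i++) {          /* your data loop   */
            sqlite3_bind_int(stmt, 1, i);            /* bind column 1    */
            sqlite3_bind_text(stmt, 2, "value", -1,  /* bind column 2    */
                              SQLITE_STATIC);
            sqlite3_step(stmt);                      /* run the insert   */
            sqlite3_reset(stmt);                     /* reuse statement  */
        }
        sqlite3_exec(db, "commit", 0, 0, 0);

        sqlite3_finalize(stmt);
        return SQLITE_OK;
    }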


_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
