On 01.02.2006 at 17:10, deBooza (sent by Nabble.com) wrote:
Hi

I'm using SQLite in a C++ program, using the functions sqlite3_open,
sqlite3_exec, sqlite3_close, etc.

I am trying to import a lot of data, probably 100,000+ rows. I have found it
quicker to format the data into SQL statements and then use the shell
command .read to read in the file.
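
The C-API equivalent of the shell's .read is to load the file into memory and
hand it to sqlite3_exec, which executes each statement in the string in turn.
A minimal sketch, assuming illustrative database and file names:

#include <sqlite3.h>
#include <cstdio>
#include <fstream>
#include <sstream>
#include <string>

// Execute every SQL statement in a file, roughly what ".read" does.
// The database and file paths passed in are only example names.
int exec_sql_file(const char *dbpath, const char *sqlpath)
{
    sqlite3 *db = nullptr;
    if (sqlite3_open(dbpath, &db) != SQLITE_OK) {
        std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return 1;
    }

    // Slurp the whole file; sqlite3_exec runs each ;-separated statement.
    std::ifstream in(sqlpath);
    std::ostringstream buf;
    buf << in.rdbuf();
    std::string sql = buf.str();

    char *err = nullptr;
    if (sqlite3_exec(db, sql.c_str(), nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "exec failed: %s\n", err);
        sqlite3_free(err);
        sqlite3_close(db);
        return 1;
    }

    sqlite3_close(db);
    return 0;
}

Calling exec_sql_file("test.db", "data.sql") then behaves much like running
.read data.sql at the shell prompt.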
Hi

Thanks for the responses. I've gone in the direction Dennis described; it's
fast enough for my purposes.

Thanks again
> I am trying to import a lot of data, probably 100,000+ rows. I have found it
> quicker to format the data into SQL statements and then use the shell
> command .read to read in the file.
Something related, but it doesn't really answer the question: if you want to
populate a database with that many rows, you should wrap them in a
transaction (or in a small number of transactions) to speed things up a lot.
That way, when SQLite works synchronously, it doesn't have to flush data to
disk after every single INSERT, only once per COMMIT.
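
For instance, a minimal sketch of that pattern with the C API; the table,
column, and row count are hypothetical:

#include <sqlite3.h>
#include <cstdio>

// Insert many rows inside a single transaction so SQLite syncs to disk
// once at COMMIT instead of after every INSERT. The table "t" and its
// "value" column are made up for the example.
int bulk_insert(sqlite3 *db)
{
    char *err = nullptr;
    if (sqlite3_exec(db, "BEGIN TRANSACTION;", nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "BEGIN failed: %s\n", err);
        sqlite3_free(err);
        return 1;
    }

    // One prepared statement, rebound per row, also avoids re-parsing
    // the INSERT text 100,000 times.
    sqlite3_stmt *stmt = nullptr;
    if (sqlite3_prepare_v2(db, "INSERT INTO t(value) VALUES (?);",
                           -1, &stmt, nullptr) != SQLITE_OK) {
        std::fprintf(stderr, "prepare failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    for (int i = 0; i < 100000; ++i) {
        sqlite3_bind_int(stmt, 1, i);
        if (sqlite3_step(stmt) != SQLITE_DONE) {
            std::fprintf(stderr, "insert failed: %s\n", sqlite3_errmsg(db));
            break;
        }
        sqlite3_reset(stmt);  // make the statement reusable for the next row
    }
    sqlite3_finalize(stmt);

    if (sqlite3_exec(db, "COMMIT;", nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "COMMIT failed: %s\n", err);
        sqlite3_free(err);
        return 1;
    }
    return 0;
}

Splitting the rows into batches of, say, 10,000 per transaction works too, if
one huge transaction is a concern.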