I don't know if it works for your data... but you don't need to do all 5.4M rows
in one batch.
You should test different batch sizes -- say 1000 rows at a time -- and let other
processes do their thing in between batches. That way you won't lock them out. But
I think your other selects need to use the sqlite3_busy_handler function so they
can wait their turn in the queue (a rough sketch follows below).
The total insert time will probably not be much longer.
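Something along these lines, for instance (just a sketch -- the table/column names,
batch size, and retry limits are assumptions, and the same busy handler would also be
registered on each connection doing the SELECTs so they wait instead of failing):

/* Sketch only: commit the inserts in batches of 1000 so other
 * connections get a turn between batches; the busy handler retries
 * instead of failing immediately with SQLITE_BUSY. */
#include <sqlite3.h>

static int busy_cb(void *unused, int tries)
{
    (void)unused;
    if (tries >= 100) return 0;   /* give up: caller sees SQLITE_BUSY */
    sqlite3_sleep(10);            /* wait ~10 ms, then ask SQLite to retry */
    return 1;
}

int bulk_insert(sqlite3 *db, sqlite3_int64 total_rows)
{
    sqlite3_stmt *ins = NULL;
    sqlite3_int64 i;

    sqlite3_busy_handler(db, busy_cb, NULL);

    if (sqlite3_prepare_v2(db, "INSERT INTO mytable(value) VALUES(?)",
                           -1, &ins, NULL) != SQLITE_OK)
        return SQLITE_ERROR;

    for (i = 0; i < total_rows; i++) {
        if (i % 1000 == 0)
            sqlite3_exec(db, "BEGIN", NULL, NULL, NULL);

        sqlite3_bind_int64(ins, 1, i);
        sqlite3_step(ins);
        sqlite3_reset(ins);

        if (i % 1000 == 999)      /* end of batch: let readers in */
            sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);
    }
    if (!sqlite3_get_autocommit(db))   /* flush a final partial batch */
        sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);

    sqlite3_finalize(ins);
    return SQLITE_OK;
}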

Michael D. Black
Senior Scientist
NG Information Systems
Advanced Analytics Directorate



________________________________________
From: sqlite-users-boun...@sqlite.org [sqlite-users-boun...@sqlite.org] on 
behalf of Frank Chang [frank_chan...@hotmail.com]
Sent: Saturday, February 19, 2011 11:25 PM
To: sqlite-users@sqlite.org
Subject: EXT :Re: [sqlite] Is it possible to determine how many open connections
are active for a sqlite database?

   I wanted to thank Roger Binns for solving my problem. Using sqlite3_request,
I was able to determine that the sqlite database was corrupted when I didn't issue a
BEGIN EXCLUSIVE before beginning to insert the 5.4 million rows. Evidently, the
use of BEGIN EXCLUSIVE prevents my transaction from being interrupted by
another connection in the same process. Thank you.
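
(For illustration only, a minimal sketch of what that looks like with the C API --
error handling trimmed, names assumed:)

#include <sqlite3.h>

/* Sketch: take the exclusive lock up front so no other connection,
 * even one in the same process, can touch the database until COMMIT. */
int insert_exclusively(sqlite3 *db)
{
    int rc = sqlite3_exec(db, "BEGIN EXCLUSIVE", NULL, NULL, NULL);
    if (rc != SQLITE_OK)
        return rc;                /* another connection holds a lock */

    /* ... run the 5.4 million INSERTs here ... */

    return sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);
}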
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users