I am considering SQLite for a project that would create a fairly large
database (on the order of 50 million rows, 100 million at most). I'll be
using the C API. The DB is used mostly for queries, with relatively few
inserts and updates. All updates and inserts will be done in batches,
scheduled once or twice per day.
While discussing this with colleagues who have used SQLite in the past, I
heard that one of them experienced sporadic database corruption in the
field. Unfortunately, no further details are available at this point; the
only other information is that the environment was Windows. This is
obviously a source of concern for me.

Has anyone else experienced DB corruption? Are there operations that could
trigger it -- for example, updates to indices on large data sets? In my use
case, because updates are batched, dropping indices before the updates and
inserts and recreating them afterwards is an option (if doing so removes
the possibility of corruption). Any input on this would be very valuable
to me and the community.

Thanks,
PS
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users