On Thu, Apr 21, 2011 at 2:33 PM, Phoenix <phoenix.ki...@gmail.com> wrote:

> Hi,
>
> I have a database with 130 million rows. It's an RDBMS.
>
> I am thinking of using Sqlite3 as a kind of a backup, just for the
> important bits of data.
>
> Questions.
>
> 1. Is this advisable? Will Sqlite3 hold up to this volume?
>

I wrote a smallish application that ran on big iron and read a bunch of data
from flat text files into a sqlite database to support future relational
queries.
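
For the curious, the basic load pattern looks something like this in
Python's stdlib sqlite3 (the file name, delimiter, and schema here are
made up for illustration; the real job had its own):

    import csv
    import sqlite3

    conn = sqlite3.connect("archive.db")
    conn.execute("CREATE TABLE IF NOT EXISTS records (id INTEGER, ts TEXT, val REAL)")

    with open("dump.txt", newline="") as f:
        reader = csv.reader(f, delimiter="\t")
        # One transaction for the whole file: sqlite fsyncs once at
        # commit instead of once per row.
        with conn:
            conn.executemany("INSERT INTO records VALUES (?, ?, ?)", reader)
    conn.close()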

I was also pre-calculating some results that I knew would be asked for and
putting those into a little side table.
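
The side table part is plain SQL -- materialize the aggregate once at
load time instead of scanning 10^9 rows on every query. Schematically,
continuing from the snippet above (table and column names hypothetical):

    # Precompute an aggregate we know users will ask for.
    conn.execute("""
        CREATE TABLE daily_totals AS
        SELECT substr(ts, 1, 10) AS day,
               count(*)          AS n,
               sum(val)          AS total
        FROM records
        GROUP BY day
    """)
    conn.commit()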

The data sets were usually on the order of 1.5 billion* to 2.0 billion
records of around 300 bytes each, spread across maybe 10 or 15 columns.
I was parsing and inserting at roughly 200k rows/sec -- so a couple of
hours to read in a whole data set.
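
If you're chasing similar rates, the standard sqlite knobs are big
batched transactions plus relaxed durability during the load (safe
here, since a crash just means rerunning the import). Illustrative
values rather than tuned ones, again continuing from the snippet above:

    conn.execute("PRAGMA synchronous = OFF")      # no fsync per commit
    conn.execute("PRAGMA journal_mode = OFF")     # no rollback journal
    conn.execute("PRAGMA cache_size = -1000000")  # ~1 GB page cache
    # The single biggest win, though, is not autocommitting: wrap the
    # inserts in one transaction (or commit every ~100k rows).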

So, no problems from sqlite's end in my application, fwiw.

Eric

* I'm American -- "billion" == "thousand million" == 10^9.