On Wed, 16 Oct 2013 07:32:08 -0500, John McKown wrote:
I can't say for sure, but would somebody want to store 20
million rows in a SQLite data base? Why? I really am curious


Hi John,

you might find my own first-hand experience in the
Geographic/GeoSpatial field interesting in some way.

in this very specific environment (certainly not the most
common one, I agree) it's not at all exceptional to deploy
SQLite/SpatiaLite DB-files as big as 20/40 GB (and even more),
containing tens or hundreds of millions of rows spread across
many hundreds of different tables (aka layers, in GIS jargon),
tightly related to one another not only in the "classic"
relational way based on Primary and Foreign Keys, but also in
the "awkward spatial way" based on geometric and topological
relationships.

there are several good technical reasons for using
SQLite/SpatiaLite specifically for processing, validating and
distributing huge geographic datasets:

- SQLite/SpatiaLite offers the same overall level of standard
  Spatial SQL processing capabilities supported by the much more
  sophisticated (and complex) PostgreSQL/PostGIS; but it's far
  simpler to install and configure, and it's usually faster under
  many common working conditions (mainly thanks to its truly
  outstanding R*Tree implementation).
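To give a feel for what the R*Tree buys you, here is a minimal sketch
using plain SQLite's generic R*Tree module (not SpatiaLite itself)
through Python's standard sqlite3 module; it assumes your SQLite build
was compiled with R*Tree support, which is true for most current
builds. The table name and coordinates are invented for the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# A plain SQLite R*Tree virtual table: an id plus a 2D bounding box.
con.execute("""
    CREATE VIRTUAL TABLE demo_index USING rtree(
        id, minx, maxx, miny, maxy
    )
""")

# Index the bounding boxes of a few hypothetical features.
con.executemany(
    "INSERT INTO demo_index VALUES (?, ?, ?, ?, ?)",
    [
        (1, 0.0, 1.0, 0.0, 1.0),   # feature 1
        (2, 5.0, 6.0, 5.0, 6.0),   # feature 2
        (3, 0.5, 2.0, 0.5, 2.0),   # feature 3
    ],
)

# Find every feature whose box intersects the query window
# (0,0)-(1.5,1.5); the R*Tree answers this without a full scan.
hits = [row[0] for row in con.execute(
    "SELECT id FROM demo_index "
    "WHERE maxx >= 0.0 AND minx <= 1.5 "
    "  AND maxy >= 0.0 AND miny <= 1.5"
)]
print(hits)  # features 1 and 3 intersect the window
```

A real SpatiaLite layer works much the same way underneath: the R*Tree
stores the bounding box of every geometry, so a spatial query only has
to examine the few candidates whose boxes overlap the search window.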

- not to mention SQLite's "forbidden weapon": if your hardware
  has an adequate amount of RAM you can load a whole DB into
  memory at once, and under such a configuration you can easily
  reach dramatically impressive speeds.
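A minimal sketch of that in-memory trick, again via Python's standard
sqlite3 module (its Connection.backup method copies a whole database
in one shot); the table name and tiny row count are invented, and the
real payoff obviously shows up with far bigger databases:

```python
import os
import sqlite3
import tempfile

# A small on-disk database, standing in for a real, much larger one.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE pts (id INTEGER PRIMARY KEY, x REAL, y REAL)")
disk.executemany(
    "INSERT INTO pts (x, y) VALUES (?, ?)",
    [(i * 0.5, i * 0.25) for i in range(1000)],
)
disk.commit()

# Load the whole database into RAM at once: every later query runs
# against the in-memory copy, with no disk I/O at all.
mem = sqlite3.connect(":memory:")
disk.backup(mem)
disk.close()

count = mem.execute("SELECT COUNT(*) FROM pts").fetchone()[0]
print(count)  # 1000
```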

- you can directly copy / send a whole DB (even a really huge
  one) from one location to another in a single shot and in the
  most painless way, because after all it is simply an ordinary
  file.

- if something goes completely wrong during a complex Spatial
  data processing operation (it happens ... nobody's perfect),
  you simply have to remove a single file and then patiently
  restart from scratch after correcting your buggy Spatial SQL
  scripts.
  Under the same conditions, fully recovering a client/server
  Spatial DBMS left in an inconsistent state can be a far less
  pleasant and easy affair.

- you can freely ATTACH and DETACH many DB-files depending on
  your specific and highly variable requirements, thus achieving
  outstanding, unconstrained flexibility.
  Quite often this is a really useful feature, e.g. when you
  have to collect, integrate and merge many different datasets
  with very loose standardization, because they were produced
  over a long period of time by many different parties, each
  with different initial purposes.
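The ATTACH trick looks like this in practice; a tiny sketch via
Python's standard sqlite3 module, where the two file names and the
"sites" table are invented stand-ins for independently produced
datasets:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
# Two independent single-file databases, e.g. from different sources.
a_path = os.path.join(tmp, "survey_2010.db")
b_path = os.path.join(tmp, "survey_2020.db")

for path, rows in ((a_path, [("alpha",), ("beta",)]),
                   (b_path, [("gamma",)])):
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE sites (name TEXT)")
    con.executemany("INSERT INTO sites VALUES (?)", rows)
    con.commit()
    con.close()

# ATTACH the second file to the first connection, then merge rows
# from both databases in a single SQL statement.
con = sqlite3.connect(a_path)
con.execute("ATTACH DATABASE ? AS survey_2020", (b_path,))
merged = [r[0] for r in con.execute(
    "SELECT name FROM main.sites "
    "UNION ALL SELECT name FROM survey_2020.sites"
)]
con.execute("DETACH DATABASE survey_2020")
print(merged)  # ['alpha', 'beta', 'gamma']
```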

all this considered, my answer is: "oh yes, it makes perfect
sense to store 20 million rows in a SQLite DB; and it works
absolutely well" :-D

by Sandro
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
