There is no reason for SQLite to be slow. Are you using transactions on your local db? What are you using for import? I'm importing the Czech Republic (50 MB bzipped) in one or two minutes, so the 5.4 GB planet would take roughly 108 minutes (about two hours). But I'm importing into a modified OSM schema (something like the Mapnik schema in Postgres), because of query speed.
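The transaction point is usually the whole story: without an explicit transaction, SQLite syncs to disk after every single INSERT. A minimal sketch of the difference, using Python's stdlib sqlite3 module (the `nodes` table and its columns are illustrative, not the schema from this thread):

```python
import sqlite3

# Illustrative table only -- not the actual OSM/Mapnik schema discussed above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE nodes (id INTEGER PRIMARY KEY, lat REAL, lon REAL)"
)

rows = [(i, 50.0 + i * 1e-6, 14.0 + i * 1e-6) for i in range(100_000)]

# Wrapping the whole batch in one transaction avoids a disk sync per row,
# which is what makes naive row-at-a-time SQLite imports crawl.
with conn:  # BEGIN ... COMMIT around the block
    conn.executemany("INSERT INTO nodes VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM nodes").fetchone()[0]
print(count)
```

On-disk databases see the largest gain; an import that takes hours in autocommit mode often drops to minutes with batched transactions.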

I've been playing with SQLite and Mapnik. SQLite has an R-Tree spatial index, and with a small modification to Mapnik (the SQLite datasource) it runs fine. If you render 8x8 tiles per pass, it is quite fast.

The R-Tree in SQLite is fast, but there is overhead when you use one bounding rect per entity, and also with Mapnik (it queries the same area many times per render). But if you do not need speed, it works.
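The one-bounding-rect-per-entity setup looks roughly like this (a sketch, assuming SQLite was built with the R-Tree module, which modern builds usually include; the `way_index` table name and coordinates are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# R-Tree virtual table: one row = one entity's bounding rectangle.
conn.execute("""
    CREATE VIRTUAL TABLE way_index USING rtree(
        id,                -- entity id
        min_lon, max_lon,  -- bounding rect, X axis
        min_lat, max_lat   -- bounding rect, Y axis
    )
""")

# Hypothetical ways: one near Prague, one near Brno.
conn.execute("INSERT INTO way_index VALUES (1, 14.2, 14.6, 49.9, 50.2)")
conn.execute("INSERT INTO way_index VALUES (2, 16.5, 16.7, 49.1, 49.3)")

# A Mapnik-style bbox query: everything intersecting the render area.
hits = conn.execute("""
    SELECT id FROM way_index
    WHERE max_lon >= ? AND min_lon <= ?
      AND max_lat >= ? AND min_lat <= ?
""", (14.0, 15.0, 49.5, 50.5)).fetchall()
print(hits)
```

Since Mapnik issues a query like this per layer for the same bbox, the same index lookup is repeated many times in one render, which is where the overhead mentioned above comes from.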

Tomas

Shaun McDonald wrote:
On 28 Apr 2009, at 00:16, Kelly Jones wrote:

I've seen many posts saying that SQLite2 can't handle OpenStreetMap's
large planet.osm data file:

http://planet.openstreetmap.org/planet-090421.osm.bz2

which is 5.4G bzip2 compressed, about 150G uncompressed.

Can SQLite3 handle this? Has anyone tried?

I tried to do this myself, but I'm on a slow machine and it's taking
too long (several days so far). If someone's done this, can I get a copy?


SQLite isn't designed for huge databases like OpenStreetMap. You could get away with a city or small region, but beyond that you will see the slowness you're describing.
http://www.sqlite.org/faq.html#q19 and some other FAQs on that page.
You are much better off using Postgres, or even MySQL.

Shaun


_______________________________________________
dev mailing list
dev@openstreetmap.org
http://lists.openstreetmap.org/listinfo/dev
