Hi Serge
You wrote earlier:
2) We don't really have a universal benchmark.
I've defined a generic spatial benchmark called the HSR Texas Geo Database Benchmark:
http://www.gis.hsr.ch/wiki/HSR_Texas_Geo_Database_Benchmark
This includes scripts for PostGIS, Spatialite and GeoCouch.
Feel free to add
The code is here: http://github.com/iandees/mongosm
I just had a quick look at it because I've never used MongoDB before
and was interested.
This line[1] makes me want to cry (not your fault). There's always
this one catch with a solution that looks perfect otherwise. Now
you've got to escape
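The referenced line isn't visible here, but if the pain point is MongoDB's rule that field names must not contain '.' or start with '$', a minimal escaping scheme (the encoding and the function names are my own sketch, not what mongosm does) could look like this:

```python
def escape_key(key):
    # MongoDB field names must not contain '.' or start with '$',
    # so percent-encode those characters before using an OSM tag
    # key as a field name. '%' itself is escaped first so the
    # mapping stays reversible.
    return key.replace("%", "%25").replace(".", "%2E").replace("$", "%24")

def unescape_key(key):
    # Reverse the substitutions; '%25' must be decoded last.
    return key.replace("%2E", ".").replace("%24", "$").replace("%25", "%")
```

So `escape_key("addr.street")` yields `"addr%2Estreet"`, and round-tripping any tag key gives the original back.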
What about rewriting this stuff in C? I have written a MySQL importer in
C some time ago, so what about reusing its XML parsing part?
If you are interested, I'll put it on Github.
Andi
On 03.07.10 22:43, Ian Dees wrote:
On Sat, Jul 3, 2010 at 1:17 PM, Nolan Darilek no...@thewordnerd.info wrote:
I'd be happy to look at the code, but XML parsing is not the slow part:
writing to the format that Mongo expects (BSON) is.
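One way to settle where the time actually goes is to time the two stages separately. A stdlib-only sketch (with json standing in for the BSON encoder, so the absolute numbers won't match a real import, only the method is the point):

```python
import json
import timeit
import xml.etree.ElementTree as ET

# A tiny synthetic .osm fragment, repeated to get measurable timings.
NODE = '<node id="%d" lat="51.5" lon="-0.1"><tag k="name" v="X"/></node>'
XML = "<osm>%s</osm>" % "".join(NODE % i for i in range(1000))

def parse_stage():
    # XML parsing and document building only.
    return [
        {"id": n.get("id"), "lat": n.get("lat"), "lon": n.get("lon"),
         "tags": {t.get("k"): t.get("v") for t in n}}
        for n in ET.fromstring(XML)
    ]

DOCS = parse_stage()

def encode_stage():
    # Serialization only (json as a stand-in for BSON encoding).
    for doc in DOCS:
        json.dumps(doc)

parse_t = timeit.timeit(parse_stage, number=20)
encode_t = timeit.timeit(encode_stage, number=20)
print("parse: %.3fs  encode: %.3fs" % (parse_t, encode_t))
```

Swapping `json.dumps` for the driver's actual BSON encoder on the same `DOCS` list would give the comparison being argued about here.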
On Mon, Jul 5, 2010 at 1:43 PM, Andreas Kalsch andreaskal...@gmx.de wrote:
What about rewriting this stuff in C? I have written a MySQL importer in C
some time ago, so
On Mon, Jul 5, 2010 at 8:48 PM, Ian Dees ian.d...@gmail.com wrote:
I'd be happy to look at the code, but XML parsing is not the slow part:
writing to the format that Mongo expects (BSON) is.
To be fair, we never entirely established that (did we?). The BSON
encoder part of the MongoDB library
On Mon, Jul 5, 2010 at 2:02 PM, Serge Wroclawski emac...@gmail.com wrote:
To be fair, we never
Hmm, I just checked out your code. Out of curiosity, what hardware did
you run it on where importing a planet took several days?
It looks like our schemas are mostly identical, but in my experience,
MongoDB used more and more time importing the index as the import continued.
An import of the dataset for TX took several hours, but import speeds
dropped off markedly as the import continued and, presumably, as the
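That slowdown pattern is what incremental index maintenance tends to produce: every insert pays to keep the index ordered, and the cost grows with the index. A rough stdlib analogy (a sorted Python list standing in for a real B-tree; the effect, not the constants, is the point):

```python
import bisect
import random
import time

random.seed(0)
keys = [random.random() for _ in range(50_000)]

# Keep the "index" sorted while loading, as happens when the index
# exists before the import: every insert shifts part of the list.
t0 = time.perf_counter()
live_index = []
for k in keys:
    bisect.insort(live_index, k)
incremental = time.perf_counter() - t0

# Build the index once after loading: a single sort.
t0 = time.perf_counter()
deferred_index = sorted(keys)
bulk = time.perf_counter() - t0

print("incremental: %.3fs  bulk: %.3fs" % (incremental, bulk))
```

In MongoDB terms this roughly corresponds to the usual bulk-load advice of creating indexes after the import rather than before it, though whether that explains the Texas numbers above is an open question.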
On Fri, Jul 2, 2010 at 20:52, Serge Wroclawski emac...@gmail.com wrote:
Similarly, Ian Dees and I have written a server using MongoDB, which
also provides functionality such as auto-sharding and built in
map/reduce.
Ah I remember that we've talked about this. Great!
Will any of you be at the
I'll be at SOTM and we'll both be at SOTM US.
I suggest a BoF (Birds of a Feather) session.
- Serge
On 07/03/2010 01:09 PM, Nolan Darilek wrote:
On 07/02/2010 01:52 PM, Serge Wroclawski wrote:
Similarly, Ian Dees and I have written a server using MongoDB, which
also provides functionality such as auto-sharding and built in
map/reduce.
Is this work available anywhere? How did you find
On Sat, Jul 3, 2010 at 8:17 PM, Nolan Darilek no...@thewordnerd.info wrote:
Is this work available anywhere? How did you find performance to be, and
to what uses did you put it?
There's Ian and my github accounts, and you can download it there, but:
1) Right now the only hardware we've tested
The web frontend is mainly for searching: it needs fast reads and
horizontal scaling across several servers.
In the backend it needs functions that check for geometric relations or
that compute new geometries. It must be able to handle huge, complex
geometries.
Fast geo indexes for both frontend
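"Functions that check for geometric relations" covers a lot of ground; as one concrete example of the kind of primitive such a backend needs, here is a ray-casting point-in-polygon test (a plain-Python sketch that ignores boundary edge cases and the antimeridian):

```python
def point_in_ring(lon, lat, ring):
    """Ray-casting test: is (lon, lat) inside the closed ring of
    (lon, lat) vertices? Points exactly on the boundary may go
    either way."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        # Does this edge straddle the horizontal line at `lat`?
        if (y1 > lat) != (y2 > lat):
            # Longitude where the edge crosses that line.
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside
```

A real backend would of course delegate this to an indexed geometry engine; the sketch just shows the relation check the index has to answer quickly for huge rings.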
Similarly, Ian Dees and I have written a server using MongoDB, which
also provides functionality such as auto-sharding and built in
map/reduce.
- Serge
On Fri, Jul 2, 2010 at 12:58 AM, Lars Francke lars.fran...@gmail.com wrote:
were there any successful attempts to read OSM data into CouchDB
Hi,
were there any successful attempts to read OSM data into CouchDB and
Geocouch? Does somebody know of a backend?
Andi
_______________________________________________
dev mailing list
dev@openstreetmap.org
http://lists.openstreetmap.org/listinfo/dev
were there any successful attempts to read OSM data into CouchDB and
Geocouch? Does somebody know of a backend?
I have done something like that and can provide some code at the end
of July (I won't be back home before then). It really is just a
different kind of schema. But the exact schema
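For what it's worth, this is the general shape such a schema usually takes for nodes; the layout below is illustrative only, not the schema from either import discussed here:

```python
def node_to_doc(osm_id, lat, lon, tags, version):
    # One OSM node as a document. [lon, lat] ordering matches what
    # 2D geo indexes typically expect; tags become a subdocument
    # (keys may need escaping if the store restricts field names).
    return {
        "_id": osm_id,
        "loc": [lon, lat],
        "tags": tags,
        "version": version,
    }
```

For example, `node_to_doc(123, 51.5, -0.1, {"amenity": "pub"}, 2)` produces a document keyed by the OSM id with a geo-indexable `loc` field.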