Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-11 Thread Tels
Moin,

On Thursday 07 May 2009 01:46:36 you wrote:
 d) oh, and localStorage. I've partially implemented that but haven't
 had much testing... other work... ugh. So caching on a few levels,
 basically.

Ah, I think I get it now. (yeah, took a long time :) localStorage could 
be used when you are offline, so you can load a map, store it, 
disconnect and then go into the wilderness and still view your map?

That would be cool, because currently one would need to install my proxy 
locally to keep the proxy-side data and that is cumbersome.

All the best,

Tels

-- 
 Signed on Mon May 11 08:19:16 2009 with key 0x93B84C15.
 Get one of my photo posters: http://bloodgate.com/posters
 PGP key on http://bloodgate.com/tels.asc or per email.

 Neulich in Dresden gehört: Gundach. Schbindadoni. Isvleisch
 dadidada?

  -- Ex-Kahl-Libur


signature.asc
Description: This is a digitally signed message part.
___
dev mailing list
dev@openstreetmap.org
http://lists.openstreetmap.org/listinfo/dev


Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-09 Thread Tels
Moin,

On Friday 08 May 2009 22:09:17 you wrote:
 Great, this is a good discussion. I've put up a wiki page with some
 of the things we've covered, with pros/cons. I hope we can continue
 to talk about our approaches and as we optimize for different
 problems post some of it back up here:
 http://code.google.com/p/cartagen/wiki/FeatureTradeoff

 I put in what I could gather about Temap, but feel free to update and
 add more pros and cons... this is just my thought process so far. We
 might also add a status column so we can annotate what we learn
 from each approach.

Ah, interesting. Since I don't have a google account, I reply here 
instead:

* Server-side proxy and filter   Yes ? 

Temap has a server-side proxy and filter, called osmapi, and it is 
running at http://bloodgate.com/cgi-bin/osmapi :)

* loading data live  direct from an API server 

Under "Makes editor possible": I think editing would be possible even if 
you do not load live from the API server, because you could instruct 
the proxy to bypass its cache and reload from the API server, and then 
upload the changes back to the proxy, which in turn uploads them to the 
API server. (Due to cross-domain security, the JS cannot talk to the 
API server directly unless a bit of JS is served from there, too. There 
is a workaround, but it only works in Firefox 3.5 IIRC.)

* Pruning datasets before handing to JavaScript 
Under cons, "Loses important metadata" - I would say non-important 
metadata. The only real con is that if you need that metadata (f.i. 
for editing), you have to download it for the features the user wants 
to touch and edit. In other words, you do not need to download every 
FIXME and attribution just to render the map, but when the user 
inspects features, you need the data.

Like the live-loading above, I think basically it will boil down to 
hybrid approaches. E.g. you load the bulk in a pruned way from the 
proxy, but if the user wants to change a street, you download the data 
for that street directly and 100%, then upload the change.

* Serve reduced polygons for lower zoom levels 

I would add "unclear how much it speeds things up". It might be that the 
entire coastline of Europe is less data than, let's say, the inner parts 
of Washington DC :)

* use localStorage to persist a cache in the browser 

Again, I don't see what localStorage adds over the traditional browser 
cache - if I download 7.3,50.4,7.4,50.4.json.gz it will get cached and 
the browser will fetch it from the cache next time (when the server 
says the data is not too old). That automatically limits the storage 
(user settable!), validates the freshness of the data etc. The 
traditional cache is very good at solving these things, so implementing 
it manually in localstorage seems like re-inventing the wheel to me. 
(You can prove me wrong, tho :)

Con: Doesn't work in all browsers (which ones do have it btw?)
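For what it's worth, if one did go the localStorage route, an expiring 
cache is small enough to sketch. This is just my own illustration, not 
anyone's actual code; `store` is anything with getItem/setItem, i.e. 
window.localStorage in the browser, and the expiry policy is an 
assumption:

```javascript
// Store a tile's JSON together with the time it was fetched.
function cachePut(store, key, json, now) {
  store.setItem(key, JSON.stringify({ at: now, data: json }));
}

// Return the cached JSON, or null if missing or older than maxAgeMs
// (so the caller knows it must refetch from the proxy).
function cacheGet(store, key, now, maxAgeMs) {
  var raw = store.getItem(key);
  if (raw === null) return null;               // never cached
  var entry = JSON.parse(raw);
  if (now - entry.at > maxAgeMs) return null;  // stale
  return entry.data;
}
```

As noted above, this mostly re-implements what the HTTP cache already 
does; the one thing it adds is explicit, scriptable control over expiry.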

* Request plots filtered by tag  Yes ? 

temap does this: it loads reduced datasets for zoom level 11, and for 10 
and below. When you zoom in, it loads the full data set. (What it 
doesn't do is reduce the dataset once you already have a certain level 
of data; zooming out will just reduce the data during rendering by 
skipping things. That's because I figured that pre-filtering the data 
client-side would be as much work as just skipping parts. However, that 
might be improved upon.)


In summary I would also like to add that all these 
various pre-computation and caching strategies are quite nice and 
helpful, but they are also all premature optimizations in the sense 
that I'd first get it to work at all, then toy around with reducing the 
work. E.g. rendering labels on a canvas is a problem that is not solved 
in all browsers, and no matter how much or little you cache, it won't 
change the fact that Opera 9.6 has no labels on the map :(

Btw, Jeffrey:

* There is a talk I proposed for State of the Map and I don't
want to spoil everything before :)
  
   yes, me too! so if you want to discuss off-list that's fine.
 
  Heh, you have a talk scheduled, too? :) That sounds like fun :)

Will you be at the conference? :)

All the best,

Tels

-- 
 Signed on Sat May  9 13:37:44 2009 with key 0x93B84C15.
 Get one of my photo posters: http://bloodgate.com/posters
 PGP key on http://bloodgate.com/tels.asc or per email.

 Build a man a fire, and he'll be warm for a day. Set a man on fire,
 and he'll be warm for the rest of his life.

  -- Terry Pratchett




Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-09 Thread Graham Jones (Physics)
I have been thinking about how to do real-time rendering for a little while
(but unlike you guys, I haven't had a chance to actually do anything... one
day).

The way I was planning on dealing with the problem of too much data at low
zoom levels was to extend the idea of the tile data server (
http://wiki.openstreetmap.org/index.php/OJW%27s_tile_data_server), but
simplify the data on the server before it is passed to the client (this will
have to be done in advance because real time will be difficult...).

The approach I was thinking of taking was to reduce the number of nodes by
recognising that each tile is only a 256x256 pixel area, so if two nodes
share the same pixel, merge them.
You would also prune out any ways that are less than 1 pixel long (this will
get rid of a lot at low zoom levels - you could probably be even more
vicious than this without losing anything).
You would probably prune out some of the points of interest too (e.g. bus
stops, park benches) that are irrelevant at low zoom levels, but this would
depend on what you are going to do with the data.
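A rough sketch of that per-tile simplification (my own illustration of 
the idea, assuming coordinates have already been projected into the 
tile's 256x256 pixel space):

```javascript
// Snap each node to the pixel grid, merge nodes that land on the same
// pixel, and drop ways that collapse to a single pixel (i.e. ways less
// than one pixel long). `ways` is an array of [x, y] point arrays.
function simplifyTile(ways) {
  var out = [];
  for (var i = 0; i < ways.length; i++) {
    var seen = {}; // pixel key -> true
    var pts = [];
    for (var j = 0; j < ways[i].length; j++) {
      var px = Math.floor(ways[i][j][0]);
      var py = Math.floor(ways[i][j][1]);
      var key = px + "," + py;
      if (seen[key]) continue; // merge: same pixel as an earlier node
      seen[key] = true;
      pts.push([px, py]);
    }
    if (pts.length >= 2) out.push(pts); // sub-pixel ways vanish
  }
  return out;
}
```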

I accept that doing this will preclude you writing an editor, because the
data you receive will be very different to that in the real database.

As I say, this is just an idea - I haven't tried it yet, but I think it
would help with the 'rendering a whole city' problem.  Of course there is a
lot of pre-processing to be done, but it doesn't sound much worse than
rendering tiles into raster images.

If this sounds vaguely useful to anyone I could move it up my list of things
to do.


Graham.


Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Tels
Moin,

On Thursday 07 May 2009 01:46:36 you wrote:
 Hi, Tels -

   It's not been optimized yet, so loading is a little slow, but I'm
  optimistic
   that it will scale.
 
  Based on my experience, I can tell you right away it won't scale :)
  Not to discourage you, but:
 
  * the amount of data is really huge. Throwing a few dozen megabytes of
  XML or even JSON at the browser will bring it to its knees. Not to
  mention the data you need to render even a small city.
  * re-rendering everything takes a long time. You want to avoid that
  :)

 I was actually talking about server-side load time. I'm running it
 off the 0.6 API, so it packs up XML, sends it to my server, i unpack,
 re-encode to JSON, send to the browser, render. Obviously that's
 SUPER inefficient, so I'm looking forward to cutting a lot of that
 out in the next week or so.

Heh, that sounds like my setup :)

My client requests data in (currently) 0.1° tiles, like 7.4,50.1 - 7.5, 
50.2 from a proxy. The public proxy runs at 
http://bloodgate.com/cgi-bin/osmapi and serves these requests as 
gzipped JSON. It is also possible to run one locally so you don't need 
internet access.

(I intend to document it, please give me a day or two).
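For illustration, snapping a coordinate to such a 0.1-degree tile key 
might look like this (just a sketch; the actual osmapi request format 
may differ):

```javascript
// Build a "w,s,e,n" bbox key for the 0.1-degree tile containing
// (lon, lat), e.g. "7.4,50.1,7.5,50.2". toFixed(1) keeps the key
// stable despite binary floating point.
function tileKey(lon, lat) {
  var w = Math.floor(lon * 10) / 10; // snap down to the 0.1-degree grid
  var s = Math.floor(lat * 10) / 10;
  return [
    w.toFixed(1), s.toFixed(1),
    (w + 0.1).toFixed(1), (s + 0.1).toFixed(1)
  ].join(",");
}
```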

* The proxy receives XML from the api or xapi server. Currently it 
requests the full dataset. 
* Then it removes unnecessary tags (like note, fixme, attribution and a whole 
bunch of others that are not needed for rendering). Some of them are 
very minor, but nodes with attribution="very very long string 
here" can make up like 40% of all the data, and just clog the line and 
browser :)
* The data is then converted into a structure that is easier to work 
with (nodes as hash, ways as list, multipolygon ways attached to the 
outer polygon to avoid issues with the render order etc.).
* The data is then pruned into (currently 3) levels and stored in a 
cache:
  * level 0 - full
  * level 1 - no POI, no paths, streams, tracks etc. used for zoom 11
  * level 2 - no tertiary roads etc. used for zoom 10 and below
* The client is served the level it currently requested as JSON.gz.

That scheme can surely be improved but it works for now.
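Sketched in JS for illustration (the real proxy is server-side, and the 
exact tag and feature lists here are my guesses, not Temap's actual 
rules), the tag-stripping and level-pruning step might look like:

```javascript
// Tags dropped at every level because they are never needed for
// rendering (illustrative list).
var DROP_TAGS = { note: 1, fixme: 1, attribution: 1, created_by: 1 };

// level 0: full; level 1: no paths/tracks/streams; level 2: additionally
// no tertiary roads. Each way is { nodes: [...], tags: {...} }.
function pruneLevel(ways, level) {
  return ways.filter(function (w) {
    var hw = w.tags.highway;
    if (level >= 1 && (hw === "path" || hw === "track" ||
        w.tags.waterway === "stream")) return false;
    if (level >= 2 && hw === "tertiary") return false;
    return true;
  }).map(function (w) {
    var tags = {};
    for (var k in w.tags) if (!DROP_TAGS[k]) tags[k] = w.tags[k];
    return { nodes: w.nodes, tags: tags };
  });
}
```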

The problem with that scheme is that:

* There are three servers in the list (api.openstreetmap, 
xapi.informationfreeway and tagwatch) and a lot of the requests do not 
complete (internal error, not implemented etc. etc.). It can take a 
lot of retries to finally get the data.
* Even when you get the data, it takes seconds (10..40 seconds 
is normal) to minutes - upwards of 360 seconds just to serve one 
request.

So currently all received data is stored in the cache for 7 days to 
avoid the very very long loading times.

Ideas of fetching the full dataset and pre-computing the cache simply 
don't work because I don't have a big enough machine, nor a big enough 
online account to store the resulting JSON :(

 Actually, rendering in the browser's been pretty good - for example
 this page loaded with no noticeable slowdown, and I haven't even
 begun optimizing:

 http://www.flickr.com/photos/jeffreywarren/3476532351/

 But you're right, it's a challenge. I'm impressed that you rendered a
 whole city like Berlin - do you have some code online so I can see,
 or a screenshot? I bet it looks great...

http://www.bloodgate.com/wiki/index.php?title=Temap_-_Screenshots :)

Actually, I managed to render Bonn, Germany, but Berlin, Germany is out 
because the amount of data exceeds Firefox's stack limit. Oops.

I'll upload a new screenshot soon. 

 What I'm looking at now is:

 a) rendering only some tags per zoom-level, so no rendering footpaths
 and buildings as you zoom out... but that's dependent on the xapi,
 which I haven't been able to fetch from reliably (help anyone?)

I already have that implemented: each render rule specifies for which 
zoom level that feature applies. Plus a few other hard-coded rules 
like "no borders from zoom X upwards", but these should be pushed into 
the ruleset, just like the default map background should be pushed 
there. (Nice idea :)

 b) cutting the API out of the loop and running direct from a
 planet.osm, but then you can't use it to view live edits, like you
 can here: http://vimeo.com/4435969

Also, somehow processing 150 GByte of XML into JSON will prove to be a 
challenge :)

 c) trying to serve partial polygons... I'd like to try plotting only
 every 3rd or 10th node... do the polygons collapse? Can i cull nodes
 in a more intelligent way? Someone on this list or geowanking pointed
 to a company that can serve lower-res polys over an API. I'm sure
 folks have worked on this in tile systems, so if you know anything
 about it and are willing to share, I'm all ears. This becomes really
 relevant as you zoom out... don't want to render every node for the
 coast of Argentina, for example.

Yes, but reducing the polygons is also a lot of work :) I haven't 
started on this yet, because on zoom 12 or higher you need to render 
almost everything anyway. Plus, you 

Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Tels
On Thursday 07 May 2009 02:51:34 you wrote:
 Jeffrey Warren wrote:
 Since an area will always be closed, any segment can be taken to be
 built upon [as start point]. Even without the routing data the object
 can be fully connected, based on the start and endpoints.

How does that scheme deal with multipolygons? (e.g. areas with holes in 
it, as these seem to be getting popular :)

All the best,

Tels

-- 
 Signed on Fri May  8 15:35:17 2009 with key 0x93B84C15.
 View my photo gallery: http://bloodgate.com/photos
 PGP key on http://bloodgate.com/tels.asc or per email.

 Sacrificing minions: Is there any problem it CAN'T solve?

  -- Lord Xykon




Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Stefan de Konink
Tels wrote:
 On Thursday 07 May 2009 02:51:34 you wrote:
 Jeffrey Warren wrote:
  Since an area will always be closed, any segment can be taken to be
  built upon [as start point]. Even without the routing data the object
  can be fully connected, based on the start and endpoints.
 
 How does that scheme deal with multipolygons? (e.g. areas with holes in 
 it, as these seem to be getting popular :)

I have to look that up, because one of the written-about, but not yet 
implemented, accelerators can use this information. But currently I 
consider multipolygons outside the scope.


Stefan



Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Jeffrey Warren

 * The proxy receives XML from the api or xapi server. Currently it
 requests the full dataset.
 * Then it removes unnecessary tags (like note, fixme, attribution and a whole
 bunch of others that are not needed for rendering). Some of them are
 very minor, but nodes with attribution="very very long string
 here" can make up like 40% of all the data, and just clog the line and
 browser :)


Yes, I'm thinking of trying to cache locally but still request changesets if
the ?live=true tag is set... caching locally is great for more static data
but for the live viewer, I'm trying to not use caching, but increase
efficiency in the requests.

 * The data is then pruned into (currently 3) levels and stored in a
 cache:
  * level 0 - full
  * level 1 - no POI, no paths, streams, tracks etc. used for zoom 11
  * level 2 - no tertiary roads etc. used for zoom 10 and below
 * The client is served the level it currently requested as JSON.gz.


Great, this is what I'm working on too. I'm thinking a ruleset about what
features are relevant for what zoom levels could be something to work
together on? I was also thinking of correlating tags with a certain zoom
level. But maybe each tag should be associated with a range of zoom levels,
like way: { zoom_outer: 3, zoom_inner: 1 }. Thoughts?
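One way such a zoom-range ruleset could look (purely illustrative names 
and values, not an agreed format; here I treat zoom_outer as the 
outermost, i.e. lowest, zoom at which a feature is still drawn):

```javascript
// Per-feature zoom ranges: the feature is rendered when the current
// zoom is within [zoom_outer, zoom_inner].
var rules = {
  motorway: { zoom_outer: 5,  zoom_inner: 18 },
  tertiary: { zoom_outer: 11, zoom_inner: 18 },
  footpath: { zoom_outer: 14, zoom_inner: 18 }
};

// True if a feature with this tag should be drawn at this zoom level.
function visibleAt(tag, zoom) {
  var r = rules[tag];
  return !!r && zoom >= r.zoom_outer && zoom <= r.zoom_inner;
}
```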


 * There are three servers in the list (api.openstreetmap,
 xapi.informationfreeway and tagwatch) and a lot of them do not complete
 the request (internal error, not implemented etc. etc.). It can take a
 lot of retries to finally get the data.
 * Even when you get the data, it takes seconds (10..40 seconds
 is normal) to minutes - upwards to 360 seconds just to serve one
 request.

 So currently all received data is stored in the cache for 7 days to
 avoid the very very long loading times.

 Ideas of fetching the full dataset and pre-computing the cache
 simply don't work because I don't have a big enough machine, nor a
 big enough online account to store the resulting JSON :(



 Also, somehow processing 150 GByte of XML into JSON will prove to be a
 challenge :)


So I'm having the same problems with the APIs. The standard 0.6 api has been
pretty good but of course it serves XML, not JSON. The xapi is not very
responsive to me, it seems. I thought parsing XML in JS would be molasses,
so if you're interested, we should put up our own XAPI or custom api off the
planet.osm file, and send JSON?

I have a quad-core Intel Mac Pro with 1.5 TB and a bunch of RAM we can
dedicate to this effort, with plenty of bandwidth. And perhaps when Stefan's
work is published, we could run it as well, since it seems to be a great
solution to requesting fewer nodes for large ways... but for now do you
think you could use an XAPI? i think all my requests fit into that api.

Alternatively, Stefan points out that the dbslayer patch for the Cherokee
server allows direct JSON requests to a database. So some very thin db
wrapper might serve us for now? This isn't my area of expertise, so if you
have better ideas on how to generate JSON direct from the db, like GeoServer
or something, and still have tag-based requests, i'm all ears.


 Yes, but reducing the polygons is also a lot of work :) I haven't
 started on this yet, because on zoom 12 or higher you need to render
 almost everything anyway. Plus, you would then need to cache the partial
 data somehow (computing it is expensive in JS..)


Seems like Stefan's work may address this, no? Or if we did cache it, seems
like we'd calculate it on the server side.
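The naive "plot only every 3rd or 10th node" culling mentioned earlier 
in the thread could be sketched like this (first and last node always 
kept, so open ways stay anchored and closed rings stay closed; a real 
simplifier such as Douglas-Peucker preserves shape far better, this is 
just the cheapest possible cut):

```javascript
// Keep every n-th node of a way, plus its endpoints.
function cullNodes(nodes, n) {
  if (nodes.length <= 2) return nodes.slice(); // nothing to cull
  var out = [nodes[0]];
  for (var i = 1; i < nodes.length - 1; i++) {
    if (i % n === 0) out.push(nodes[i]);
  }
  out.push(nodes[nodes.length - 1]);
  return out;
}
```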


  d) oh, and localStorage. I've partially implemented that but haven't
  had much testing... other work... ugh. So caching on a few levels,
  basically.

 I fail to see what localstorage actually gains, as the delivered JSON is
 put into the browser cache, anyway and the rest is cached in memory.
 Could you maybe explain what your idea was?



Yes, localStorage persists across sessions so you could build up a permanent
local cache and have more control (in JS) over requesting it and
timestamping when you cached it, not to mention applying only changesets and
not complete cache flushes. This has some advantages over the browser cache,
although that does of course persist across sessions too.



 * There is a talk I proposed for State of the Map and I don't want to
 spoil everything before :)


yes, me too! so if you want to discuss off-list that's fine.

 Of course, semi-dynamic rules like "color them according to feature X by
 formula Y" are still useful and fun, and avoid the problems above.
 (Like: use maxspeed as the color index ranging from red over green to
 yellow :)


Yes, this is an exciting area to me, for example the "color by authorship"
stylesheet i posted before:

http://map.cartagen.org/find?id=paris&gss=http://unterbahn.com/cartagen/authors.gss

or this one i threw together yesterday, based on the tags of measured width
instead of on a width rule:

http://map.cartagen.org?gss=http://unterbahn.com/cartagen/width.gss
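A sketch of the maxspeed-coloring idea quoted above (the red-green-yellow 
ramp endpoints and the 0..130 km/h range are my own choices, not anything 
from an actual stylesheet):

```javascript
// Map a maxspeed tag value onto an HSL color: red (slow) through green
// to yellow (fast). Untagged or unparseable values get grey.
function maxspeedColor(maxspeed) {
  var v = parseInt(maxspeed, 10);
  if (isNaN(v)) return "hsl(0, 0%, 50%)";     // untagged: grey
  var t = Math.max(0, Math.min(1, v / 130));  // clamp to [0, 1]
  // hue 0 (red) -> 120 (green) over the first half, 120 -> 60 (yellow)
  // over the second half
  var hue = t < 0.5 ? t * 2 * 120 : 120 - (t - 0.5) * 2 * 60;
  return "hsl(" + Math.round(hue) + ", 100%, 50%)";
}
```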

A more fully-rendered screenshot 

Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Tels
On Friday 08 May 2009 15:34:13 Tels wrote:
 Moin,

  But you're right, it's a challenge. I'm impressed that you rendered
  a whole city like Berlin - do you have some code online so I can
  see, or a screenshot? I bet it looks great...

 http://www.bloodgate.com/wiki/index.php?title=Temap_-_Screenshots :)

 Actually, I managed to render Bonn, Germany, but Berlin, Germany is
 out because the amount of data exceeds Firefox stack limit. Oops.

 I'll upload a new screenshot soon.

Done:

http://www.bloodgate.com/wiki/index.php?title=Temap_-_Screenshots

The page also contains a few details on the data size and timings.

The red areas were features where I had missing render rules; I've fixed 
a few of them now. The rest are wrongly tagged or untagged ways.

All the best,

Tels

-- 
 Signed on Fri May  8 21:02:52 2009 with key 0x93B84C15.
 Get one of my photo posters: http://bloodgate.com/posters
 PGP key on http://bloodgate.com/tels.asc or per email.

 Wo die Schoschonen schön wohnen.





Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-08 Thread Jeffrey Warren
Great, this is a good discussion. I've put up a wiki page with some of the
things we've covered, with pros/cons. I hope we can continue to talk about
our approaches and as we optimize for different problems post some of it
back up here:
http://code.google.com/p/cartagen/wiki/FeatureTradeoff

I put in what I could gather about Temap, but feel free to update and add
more pros and cons... this is just my thought process so far. We might also
add a status column so we can annotate what we learn from each approach.

Best,
Jeff

On Fri, May 8, 2009 at 3:00 PM, Tels nospam-ab...@bloodgate.com wrote:

 Moin,

 On Friday 08 May 2009 20:04:48 you wrote:
   * The proxy receives XML from the api or xapi server. Currently it
   requests the full dataset.
   * Then it removes unnecessary tags (like note, fixme, attribution and a
   whole bunch of others that are not needed for rendering). Some of
   them are very minor, but nodes with attribution="very very
   long string here" can make up like 40% of all the data, and just
   clog the line and browser :)
 
  Yes, I'm thinking of trying to cache locally but still request
  changesets if the ?live=true tag is set... caching locally is great
  for more static data but for the live viewer, I'm trying to not use
  caching, but increase efficiency in the requests.

 I fear loading data live from the API server is just not feasible,
 unless you:

 * only load diffs (minute-diffs?) and update your data already cached
 at the proxy with that. OTOH I read that importing a one-hour diff into
 a postgres database can take 40..70 minutes, i.e. depending on load you
 might not even manage to update your DB with the diffs fast enough...
 * invent an API server that is about 1000 times faster :)
 * never zoom out from level 18; anything below will request so much
 data that you can't get it live :)

 Currently I consider live-view not an achievable goal; I am happy if I
 can render data that is about 1 day or so old.

  * The data is then pruned into (currently 3) levels and stored in a
  cache:
   * level 0 - full
   * level 1 - no POI, no paths, streams, tracks etc. used for zoom 11
   * level 2 - no tertiary roads etc. used for zoom 10 and below
  * The client is served the level it currently requested as JSON.gz.
 
  Great, this is what I'm working on too. I'm thinking a ruleset about
  what features are relevant for what zoom levels could be something to
  work together on? I was also thinking of correlating tags with a
  certain zoom level. But maybe each tag should be associated with a
  range of zoom levels, like way: { zoom_outer: 3, zoom_inner: 1 }.
  Thoughts?

 My rules do have a minimum zoom level; below that, they are not
 rendered. The levels are inspired by the osmarenderer and mapnik
 outputs, but I moved a few of them down so you can render really high
 resolution maps.

 However, the pruning at the proxy is something else and not connected to
 that. For instance, somebody might not want to see tertiary roads on
 level 13, but others might. So I make sure that I only prune out data
 that can never be seen on that level at all, i.e. conservative
 pruning.

 Also, about 90% of the data-pruning is about removing unwanted data
 (like note=blah :) and not about the smaller zoom levels, because
 currently it is simply not feasible to render below 10, and even for
 zoom 10 you need a really really beefy machine and a long wait time.

   * There are three servers in the list (api.openstreetmap,
   xapi.informationfreeway and tagwatch) and a lot of them do not
   complete the request (internal error, not implemented etc. etc.).
   It can take a lot of retries to finally get the data.
   * Even when you get the data, it takes seconds (10..40 seconds
   is normal) to minutes - upwards to 360 seconds just to serve one
   request.
  
   So currently all received data is stored in the cache for 7 days to
   avoid the very very long loading times.
  
   Ideas of fetching the full dataset and pre-computing the cache
   simple don't work because I don't have a big enough machine and no
   big enough online account to store the resulting JSON :(
  
  
  
   Also, somehow processing 150 Gbyte XML into JSON will prove to be a
   challange :)
 
  So I'm having the same problems with the APIs. The standard 0.6 api
  has been pretty good but of course it serves XML, not JSON. The xapi
  is not very responsive to me, it seems.

 Neither for me, but the API server is very slow, too. It seems it can't
 manage to send me more than 17Kbyte/s (but maybe it is bandwidth
 limited?).

  I thought parsing XML in JS
  would be molasses,

 When I tried it, it used ungodly amounts of memory (because the data
 structure is not useful for rendering and it contains so much cruft),
 and I also never managed to extract the actual node data for rendering
 from it...

  so if you're interested, we should put up our own
  XAPI or custom api off the planet.osm file, and send JSON?

 Yeah, 

Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-07 Thread Stefan de Konink
Jeffrey Warren wrote:
 Wow, this is fantastic and very exciting! Stefan - by "javascript 
 sockets" were you thinking of the thru-flash technique for sockets? 

I have two options; I was indeed looking at the flash technique for 
native protocol access. Alternatively, the Cherokee Webserver has its own 
'dbslayer' implementation; this allows an SQL database to be queried over
HTTP and output in the common formats [json/xml/csv/...]

 I'd love to 
 help out if I can. Right now, Cartagen doesn't use sockets, but small 
 Ajax requests every 1/3 of a second. Latency is low but not quite 
 realtime... it's within about a second but depends on the size of the 
 plot requested of course. I've been avoiding Flash because it's not 
 available on the iPhone or Android, and it'd be nice to have almost the 
 same renderer/editor codebase on mobile devices and PC.

I don't want to reinvent the wheel, so I will checkout how you build 
this thing and probably hook in some code.

 Just curious, how does Mapnik or Osmarender do it? Surely they don't 
 actually render every node of a way for very large bboxes?

In the default Mapnik case polylines are fetched, so that is basically 
an operation that already has a materialized linestring. Since the 
tables know what type the line will be, it is just a "draw this line at 
that point". [A bit more complex due to layer = ...]


Stefan



Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-06 Thread Tels
Moin,

2009/4/25 Jeffrey Warren war...@mit.edu:
 I'm working on a Javascript map renderer, non tile-based. It's really
 early-stage alpha, and not publicly released yet, but I'd love some feedback
 from folks as I'm continuing to develop it.

Sorry for not replying directly or earlier, I wasn't subscribed to this list
until 5 minutes ago :)

Initially I didn't want to spread my project too far, as it was (and is) still
quite beta. However, now that so many people are starting to pursue the same
idea... ;)

Jeffrey, you wrote:

 It's not been optimized yet, so loading is a little slow, but I'm optimistic
 that it will scale.

Based on my experience, I can tell you right away it won't scale :) Not to
discourage you, but:

* the amount of data is really huge. Throwing a few dozen megabytes of XML or
even JSON at the browser will bring it to its knees. Not to mention the data
you need to render even a small city.
* re-rendering everything takes a long time. You want to avoid that :)

My app has already quite a few optimizations, and it still chokes at big cities
like Berlin or London. However, I am confident that things can be improved :)

(Browser limitations notwithstanding. Single-threaded, dead-slow JS and an
incomplete Canvas spec without dashes, I hate thee... :(
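To make the re-rendering point concrete, here is a hedged sketch of the usual remedy: memoize the expensive render of a viewport by a bbox+zoom key, so panning back over known ground skips the full redraw. Names are hypothetical, and a real cache would hold offscreen canvases, not strings.

```javascript
// Hedged sketch: avoid re-rendering everything by caching per viewport.
const renderCache = new Map();
let renders = 0; // counts how often the expensive path actually runs

function renderViewport(bbox, zoom) {
  const k = bbox.join(",") + "@" + zoom;
  if (!renderCache.has(k)) {
    renders++; // expensive path: walk and draw all features
    renderCache.set(k, "rendered " + k);
  }
  return renderCache.get(k);
}

renderViewport([13.3, 52.4, 13.5, 52.6], 12);
renderViewport([13.3, 52.4, 13.5, 52.6], 12); // cache hit, no redraw
console.log(renders); // 1
```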

Regarding the rule sets and CSS:

I've already considered adding a different rule-set (just to show that it can
be done). However, from a technical viewpoint, that is not that spectacular.
As long as the renderer is flexible enough to handle the wanted features, it
doesn't really matter what format the rules are in (CSS, GSS, JSON, XML, you
name it) or where they come from (hard-coded, web, URI, user input); as long
as you can load, parse and convert them, it can display them.
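The point about interchangeable rule formats can be sketched like this; a hedged, made-up example (selector syntax, property names and both parser functions are illustrative, not any project's real API):

```javascript
// Hedged sketch: two rule formats normalizing to one internal style object.
function parseCssLike(text) {
  // e.g. "way[highway] { strokeStyle: #ccc; lineWidth: 2 }"
  const m = text.match(/^\s*(\S+)\s*\{([^}]*)\}/);
  const style = {};
  for (const decl of m[2].split(";")) {
    const [k, v] = decl.split(":").map((s) => s && s.trim());
    if (k) style[k] = v;
  }
  return { selector: m[1], style };
}

function parseJsonRule(obj) {
  return { selector: obj.selector, style: obj.style };
}

const a = parseCssLike("way[highway] { strokeStyle: #ccc; lineWidth: 2 }");
const b = parseJsonRule({ selector: "way[highway]",
                          style: { strokeStyle: "#ccc", lineWidth: "2" } });
// Both arrive at the same internal form the renderer consumes.
```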

In my eyes the much bigger impact is that you no longer need different sets
of tiles or tile providers - just the data and the rules to display it, and
the map can morph in real-time from mapnik to cyclemap to whatever-you-want.
One more button click and the user can save it locally. That's at least the
vision I'm working towards :)

All the best,

Tels



Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-06 Thread Jeffrey Warren
Hi, Tels -

  It's not been optimized yet, so loading is a little slow, but I'm
 optimistic
  that it will scale.

 Based on my experience, I can tell you right away it won't scale :) Not to
 discourage you, but:

 * the amount of data is really huge. Throwing a few dozend megabyte XML or
 even
 JSON at the browser will bring it to its knees. Not to mention the data you
 need
 to render even a small city.
 * re-rendering everything takes a long time. You want to avoid that :)


I was actually talking about server-side load time. I'm running it off the
0.6 API, so it packs up XML, sends it to my server, I unpack, re-encode to
JSON, send to the browser, and render. Obviously that's SUPER inefficient, so
I'm looking forward to cutting a lot of that out in the next week or so.
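The re-encoding step of that pipeline might look roughly like this; a hedged, minimal sketch (a real proxy would use a streaming XML parser, and this regex only handles `<node>` elements with attributes in this exact order):

```javascript
// Hedged sketch: unpack OSM XML on the proxy and re-encode as JSON
// for the browser. Deliberately minimal; not robust to real-world XML.
function osmXmlToJson(xml) {
  const nodes = [];
  const re = /<node\s+id="(\d+)"\s+lat="([-\d.]+)"\s+lon="([-\d.]+)"/g;
  let m;
  while ((m = re.exec(xml)) !== null) {
    nodes.push({ id: Number(m[1]), lat: Number(m[2]), lon: Number(m[3]) });
  }
  return JSON.stringify({ osm: { node: nodes } });
}

const xml = '<osm><node id="1" lat="52.5" lon="13.4"/></osm>';
console.log(osmXmlToJson(xml));
```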

Actually, rendering in the browser's been pretty good - for example this
page loaded with no noticeable slowdown, and I haven't even begun
optimizing:

http://www.flickr.com/photos/jeffreywarren/3476532351/

But you're right, it's a challenge. I'm impressed that you rendered a whole
city like Berlin - do you have some code online so I can see, or a
screenshot? I bet it looks great...

What I'm looking at now is:

a) rendering only some tags per zoom-level, so no rendering of footpaths and
buildings as you zoom out... but that's dependent on the XAPI, which I
haven't been able to fetch from reliably (help, anyone?)

b) cutting the API out of the loop and running direct from a planet.osm, but
then you can't use it to view live edits, like you can here:
http://vimeo.com/4435969

c) trying to serve partial polygons... I'd like to try plotting only every
3rd or 10th node... do the polygons collapse? Can I cull nodes in a more
intelligent way? Someone on this list or geowanking pointed to a company
that can serve lower-res polys over an API. I'm sure folks have worked on
this in tile systems, so if you know anything about it and are willing to
share, I'm all ears. This becomes really relevant as you zoom out... don't
want to render every node for the coast of Argentina, for example.

d) oh, and localStorage. I've partially implemented that but haven't had
much testing... other work... ugh. So caching on a few levels, basically.
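The naive version of idea (c) can be sketched as follows; a hedged illustration (function name made up), keeping every Nth node but always the endpoints so ways stay connected and rings stay closed. Error-tolerance simplification (e.g. Douglas-Peucker) preserves shape better; this is just the baseline:

```javascript
// Hedged sketch: keep every nth node of a way, plus the last node.
function cullNodes(nodes, n) {
  if (nodes.length <= 2) return nodes.slice();
  const kept = nodes.filter((_, i) => i % n === 0); // always keeps index 0
  const last = nodes[nodes.length - 1];
  if (kept[kept.length - 1] !== last) kept.push(last); // keep the endpoint
  return kept;
}

const way = [[0,0],[1,0],[2,1],[3,1],[4,2],[5,2],[6,3]];
console.log(cullNodes(way, 3).length); // 3 of 7 nodes survive
```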

What strategies have you employed, if you're willing to share?

Also agreed that GSS is not technically spectacular - the driving motivation
is that it is legible to those new to mapping, being CSS-like. So really an
adoption decision, though the JavaScript-ability of it is a nice bonus -
dynamic rules are fun.

Anyways, I'm excited to hear you've been working on this kind of stuff too.
I'm happy to collaborate or just share information, the whole codebase is
at http://code.google.com/p/cartagen/.

Best,
Jeff







Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-05-06 Thread Stefan de Konink
Jeffrey Warren wrote:
   c) trying to serve partial polygons... I'd like to try plotting only
 every 3rd or 10th node... do the polygons collapse? Can i cull nodes in 
 a more intelligent way? Someone on this list or geowanking pointed to a 
 company that can serve lower-res polys over an API. I'm sure folks have 
 worked on this in tile systems, so if you know anything about it and are 
 willing to share, I'm all ears. This becomes really relevant as you zoom 
 out... don't want to render every node for the coast of Argentina, for 
 example.

I currently have an alternative data format for storing the Planet. I 
personally consider it the best method to store data for rendering and 
routing, and it only requires a single table to store all geoconcepts.

The basic functionality is built on the concept we had here in OSM before, 
called segments. Each segment is materialized by the database, and thus will 
return a segment that can be directly plotted. For all *non*-areas this is 
sufficient. If, on the other hand, you want to render an area, you will be 
forced to build a list from the results that come back [in GIS terms, a 
circular linestring]. Using the data also used for routing, the exact 
sequence can be restored by the renderer.

Since an area will always be closed, any segment can be taken to build upon 
[as a start point]. Even without the routing data the object can be fully 
connected, based on the start and end points.
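That reassembly can be sketched as follows; a hedged illustration (names made up): the area comes back as an unordered bag of segments, and since the ring is closed you can start from any segment and chain on matching endpoints until you return to the start.

```javascript
// Hedged sketch: rebuild a closed ring from unordered, possibly flipped
// segments by matching endpoint coordinates.
function key(pt) { return pt.join(","); }

function assembleRing(segments) {
  const pool = segments.slice(1);
  const ring = [segments[0][0], segments[0][1]]; // arbitrary start segment
  while (pool.length > 0) {
    const tail = key(ring[ring.length - 1]);
    const i = pool.findIndex((s) => key(s[0]) === tail || key(s[1]) === tail);
    const [a, b] = pool.splice(i, 1)[0];
    ring.push(key(a) === tail ? b : a); // append the far endpoint
  }
  return ring; // first and last point coincide for a closed area
}

// Segments of a unit square, deliberately shuffled and flipped:
const segs = [[[0,0],[1,0]], [[1,1],[0,1]], [[1,0],[1,1]], [[0,1],[0,0]]];
console.log(assembleRing(segs).length); // 5 points, closed
```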


For the visual people:

n1---n2

n1 is stored as lat,long
n2 is stored as lat,long


For a renderer this is more than enough. The extra database features 
come with constraints to make the following possible:


n1----v----n2
      |
      |
      n3

Where v is actually a constraint: the line to n3 is constrained to the 
center (50%) of line n1..n2.
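The constraint in the diagram amounts to deriving the junction as a fraction along the segment rather than storing it; a hedged sketch (function name made up):

```javascript
// Hedged sketch: v is not stored as its own coordinate but derived as a
// fraction t (here 50%) along n1..n2, so moving n1 or n2 also moves v.
function constrainedPoint(n1, n2, t) {
  return [n1[0] + (n2[0] - n1[0]) * t,
          n1[1] + (n2[1] - n1[1]) * t];
}

const v = constrainedPoint([0, 0], [4, 2], 0.5);
console.log(v); // [2, 1]
```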


...what is available is done in plain SQL. [Commercial break]Of course 
MonetDB was used for storing the data[/Commercial break].


I was looking at implementing native rendering using JavaScript sockets 
(i.e. just fetching tuples directly from the database), because I want live 
editing :)



Stefan



Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-04-27 Thread Colin Marquardt
2009/4/27 Jeffrey Warren war...@mit.edu:
 One thing I really
 like about the osmarender frontend project is that beyond trying to be a
 wysiwyg map editor, it uses a style markup similar to what I've been doing
 in Cartagen... the rules file:
 http://wiki.openstreetmap.org/wiki/Osmarender/Rules#Example_of_using_embedded_CSS_styles

You might also be interested in Cascadenik then:

http://mike.teczno.com/notes/cascadenik.html
http://teczno.com/cascadenik/doc/

Cheers
  Colin



Re: [OSM-dev] Cartagen - client-side vector based map renderer, dynamic maps

2009-04-26 Thread Mario Ferraro
Here I am :)

Going to write Jeffrey to join our efforts if possible :)

Recently I've tried to retrieve data from an SQLite DB in Mozilla 
Firefox (which is slow, but perhaps not too slow, considering it doesn't 
need to be the fastest renderer in the world, just something that doesn't 
need any installation client-side beyond a Firefox extension). I've found 
some benchmarks good and some bad (Osmarender is quite demanding in its 
number of queries), but it's worth further analysis to me.

Cheers,

Mario Ferraro

Ian Dees wrote:
 
 
 On Sat, Apr 25, 2009 at 11:49 AM, Colin Marquardt 
 cmarq...@googlemail.com wrote:
 
 2009/4/25 Jeffrey Warren war...@mit.edu:
   I'm working on a Javascript map renderer, non tile-based. It's really
   early-stage alpha, and not publicly released yet, but I'd love
 some feedback
   from folks as I'm continuing to develop it.
 
 Interesting, this is already the second JS renderer after I came across
 http://bloodgate.com/wiki/index.php?title=Temap
 a few days ago. Maybe you could join forces?
 
 
 One of the Google Summer of Code applicants submitted an application for 
 finishing his JavaScript-based OSM renderer. Check it out: 
 http://osmarenderfrontend.wordpress.com/
 
 
 
 

