Hi,

Hans F. Nordhaug wrote:
> Me: I have just installed the ti...@home client on my server - nice to be
> able to contribute some CPU cycles to the OpenStreetMap project.
>
> Shaun McDonald: um do you know that one machine can deal with the
> rendering of the whole world whilst t...@h needs lots of machines and
> bandwidth?
That's just the snotty London cabal thinking they are better than the rest ;)

> So what exactly are we contributing to when we participate in the
> ti...@home project?

Everyone probably has their own reasons. It is true that ti...@home attracts a certain kind of contributor who cares perhaps not so much about the quality of the data in OSM but more about how fast he or she can climb the contributor ladder. I myself have often made fun of people who boast about their tile rendering farm but have never mapped a single street for OSM. Tile rendering is definitely not the main objective of OSM.

It is also true that a proper Mapnik rendering engine can produce a map tile in a fraction of the time needed by the osmarender+svg2png software that ti...@home uses. This, too, is a constant cause of derision from those who use the Mapnik rendering.

A third downside of ti...@home is that each time a client renders a tile, it needs to download the data for it from somewhere. In olden times that source was the API, and ti...@home was one of the main data downloaders from the API, slowing things down for interactive users.

To understand ti...@home better, you have to look at the history. There was a time, not too long ago actually, when the central Mapnik database was only updated once a week, and even then not all tiles were rendered anew, so any change you made to the data would only show up on the Mapnik layer after a week or two. ti...@home was vastly better, achieving an update within an hour or two, sometimes even within 15 minutes. That was a time when ti...@home could afford to make fun of the Mapnik map, and when ti...@home really was the tool of choice for every mapper to check (and bask in) their new edits. ti...@home deserves a lot of credit for improving mapper motivation over a long time.

But ever since the central database could be updated quickly, there has not been much technological reason for the distributed rendering that ti...@home does.
The rendering engine used by ti...@home is different and may have some advantages (in particular, the ti...@home styles are less strictly controlled, so people can add their pet features more easily), and the tile server is operated independently of the OSM Foundation, which some may consider an advantage. But other than that, the concept is a bit outdated.

Still, the infrastructure should not be discarded just yet. Shaun is right when he says that ti...@home is currently wasting a lot of computing power trying to do what the Mapnik renderer does more easily. But you can't waste something you don't have - the computing power available to ti...@home is probably at least a hundred times more than what is available to all the central OSM servers taken together. This means that if we should ever come up with something interesting and computationally expensive that we want to compute from OSM data, then the ti...@home infrastructure (and its contributors) might come in very handy.

That's my thinking at least, but as I said, everyone probably has their own motivation.

Bye
Frederik

--
Frederik Ramm  ##  eMail [email protected]  ##  N49°00'09" E008°23'33"

_______________________________________________
Tilesathome mailing list
[email protected]
http://lists.openstreetmap.org/listinfo/tilesathome
