I think people have already touched on most of the things I want to mention, so I'll be brief.

- An "off switch" in tilesAtHome.conf would probably be good, for those who genuinely want unlimited maximum complexity or who have set their complexity for reasons other than CPU speed.

- Using the complexity of the first tileset when the system value for MaxTilesetComplexity is 0 results in very low maximum complexity values when that first tileset happens not to be very complex. I just made a change in my local copy to use the ($tilecomplexity * 900 / $deltaT) value as a first guess if MaxTilesetComplexity is 0 and the tile has a non-zero complexity.

- A decaying average weighted 1% new value, 99% old value might not adapt fast enough if the initial value (whether a user's guess, the actual first tileset's complexity, or some other value) is much too high or too low. However, this is one of those points that could probably be debated endlessly.
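To make the second and third points concrete, here is a minimal sketch of the recalibration idea. tilesAtHome itself is Perl; this is Python purely for illustration, and everything except $tilecomplexity, $deltaT, and MaxTilesetComplexity (the function name, the 900-second interpretation as a render-time target, and the zero-guards) is my assumption, not the actual t@h code.

```python
# Illustrative sketch, NOT the tilesAtHome implementation (which is Perl).
# Assumed: 900 is a target render time in seconds, per the formula above.
TARGET_SECONDS = 900


def recalibrate(max_complexity, tile_complexity, delta_t):
    """Return an updated MaxTilesetComplexity estimate.

    max_complexity  -- current MaxTilesetComplexity (0 means "not yet set")
    tile_complexity -- complexity of the tileset just rendered
    delta_t         -- seconds the render took
    """
    if delta_t <= 0 or tile_complexity <= 0:
        return max_complexity  # nothing useful to learn from this render

    # Scale the observed complexity to the complexity we could handle
    # in TARGET_SECONDS at the same rendering rate.
    observed = tile_complexity * TARGET_SECONDS / delta_t

    if max_complexity == 0:
        # First guess: use the scaled observation rather than the raw
        # tileset complexity, which may be far too low.
        return observed

    # Decaying average: 1% new observation, 99% old value.
    return 0.99 * max_complexity + 0.01 * observed
```

On the convergence concern: with a 1%/99% weighting it takes about 69 updates to close half the gap between a bad initial value and the steady-state one (0.99^69 ≈ 0.5), which is why a poor first guess lingers for a long time.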
On Mon, Oct 13, 2008 at 04:38, Patrick Kilian <[EMAIL PROTECTED]> wrote:
> Hi,
>
>> Looking at the code for _unstable (which is identical to stable at the
>> moment, IIRC), there's a function to recalibrate maxTilesetComplexity
>> based on processing time. IMO, it's fixing one problem (people not
>> having a good idea of where their complexity should be) but causing
>> others (I see a couple of things I consider logic errors, and it ignores
>> the fact that, for some of us, the bigger problem with complexity is
>> available RAM, not available CPU cycles.)
> I'm the one who is to blame for that code. Input is welcome.
>
> Patrick "Petschge" Kilian

--
David J. Lynch
[EMAIL PROTECTED]

_______________________________________________
Tilesathome mailing list
[email protected]
http://lists.openstreetmap.org/listinfo/tilesathome
