--- Toad <[EMAIL PROTECTED]> wrote:
> On Fri, Oct 24, 2003 at 11:34:50AM +0200, Some Guy wrote:
> > --- Toad <[EMAIL PROTECTED]> wrote:
> > > On Thu, Oct 23, 2003 at 02:12:55PM +0200, Some Guy wrote:
> > > > Was NGR tested as extensively as Freenet's original routing? If not,
> > > > then why?
> > > >
> > > > You could have reused some of the same code from the first Freenet
> > > > paper. Even now this might still be a good idea. Sorry if you guys
> > > > have done this and I haven't heard, but I never was all that
> > > > convinced by NGR.
> > >
> > > Nope. Ian was dazzled by its beauty. But the main problem was that it's
> > > virtually impossible to simulate NGRouting.
> >
> > Hmmmm, dazzled by its beauty, eh.
> >
> > I don't see why testing it would be any harder than testing the old
> > routing, as was done in the Freenet paper (freenet.pdf, 1999). For the
> > paper they simulated 1000 nodes. They didn't move actual data around.
> > The NG 10 points seems like less state per node to simulate than the
> > old system.
>
> It's not 10 points. It's 16 IIRC. But that's per estimator, and we have a
> bunch of estimators and a bunch of decaying averages per node, which are
> then combined to get an estimate for a given key, which we use to route.
> But the big issue is that it's virtually impossible to simulate it because
> it involves timings and is intentionally highly sensitive to network
> conditions.
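For readers trying to picture the per-node state Toad describes, here is a minimal sketch in Python of the general shape: a decaying average, and a per-estimator set of reference points over the keyspace that are nudged toward observations and interpolated between for an estimate. The class names, point count, and update rule are illustrative guesses at the structure, not Fred's actual code.

```python
# Hedged sketch: assumes each estimator holds a small fixed set of reference
# points over a circular [0, 1) keyspace, interpolates between them to
# estimate a value for a key, and folds each new observation into the point
# nearest the observed key. Names and parameters are hypothetical.

class DecayingAverage:
    """Exponentially decaying running average."""
    def __init__(self, initial=0.0, decay=0.95):
        self.value = initial
        self.decay = decay

    def report(self, sample):
        self.value = self.decay * self.value + (1 - self.decay) * sample


class KeyspaceEstimator:
    """Piecewise-linear estimator over a circular [0, 1) keyspace."""
    def __init__(self, n_points=16, initial=1000.0, decay=0.9):
        # n_points reference values at evenly spaced keys -- the
        # "16 points per estimator" state Toad mentions.
        self.points = [DecayingAverage(initial, decay) for _ in range(n_points)]
        self.n = n_points

    def report(self, key, observed):
        # Fold the observation into the reference point nearest the key.
        i = int(key * self.n) % self.n
        self.points[i].report(observed)

    def estimate(self, key):
        # Linearly interpolate between the two neighbouring reference points.
        x = key * self.n
        i = int(x) % self.n
        j = (i + 1) % self.n
        frac = x - int(x)
        return (1 - frac) * self.points[i].value + frac * self.points[j].value
```

Even in this toy form you can see why a node's state is more than "10 points": it is one such curve per estimated quantity, per neighbour, plus the decaying averages.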
Ok, you're right, it's a couple of points per vertex. That's still not a lot
of state. Most of the simulation's memory will probably go to storing keys in
caches (not the data, though). I'm not talking about a serious simulation of:

* network latency
* congestion
* packet loss
* faulty nodes
* bandwidth limitations
* nodes disconnecting
* nodes requesting things by taste
* random bits of alchemy
* reconnection costs
* node resource overload

The first Freenet paper ignored these too. I'm just suggesting a similar
simulation that shows that if nodes use NG to estimate hop count and pick
their optimal route, the system will specialize and be efficient. If NG works
for a homogeneous random graph of nodes that never disconnect and have
unlimited bandwidth, we can all feel a bit more comfortable about it. If this
isn't the case, well... we might need to figure out why.

> > Boy, it sounds like I'm volunteering, aren't I! I'm not as dazzled by its
> > beauty though. Still, if nobody has time to do this, maybe I should. Is
> > there any interest in writing a paper about NG, as there was for the
> > original routing? Does anyone else think this is a good idea???
>
> There was one. Have you read it? On the project web site.

You're talking about this, right?
http://www.freenetproject.org/index.php?page=ngrouting

It's nice. It gives you an idea of how NGR should work. It might be nice to
have a paper that shows that it works. Maybe I can figure out what O(f(N)) NG
routes in, or what O(g(N)) it takes for a completely nonspecialized net to
converge. Can you think of anything else that should be looked for?

I can think of one interesting experiment where you take some functions and
see how well NG fits to them -- maybe the routing time of an ideal Plaxton
routing node: an upside-down spike.
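The idealized simulation proposed above (homogeneous random graph, no churn, no bandwidth limits, nodes routing greedily on estimated hop count) can be sketched in a few dozen lines. Everything here is invented for illustration -- the bucketed estimator, the graph parameters, the "stores the key" threshold -- and is nothing like Fred's real 16-point estimator curves; it only shows the shape of the experiment.

```python
# Hedged sketch of the proposed NG simulation: random graph, per-neighbour
# hop-count estimators, greedy routing, and backpropagation of observed hop
# counts on success. All names and parameters are hypothetical.
import random

random.seed(7)

N, DEGREE, TTL, POINTS = 150, 5, 25, 8

def dist(a, b):
    """Circular distance on the [0, 1) keyspace."""
    d = abs(a - b)
    return min(d, 1.0 - d)

class Estimator:
    """Tiny stand-in for an NGR estimator: a few bucketed decaying
    averages of remaining-hop counts over the keyspace."""
    def __init__(self):
        self.buckets = [float(TTL)] * POINTS   # pessimistic initial estimates

    def report(self, key, hops):
        i = int(key * POINTS) % POINTS
        self.buckets[i] = 0.8 * self.buckets[i] + 0.2 * hops

    def estimate(self, key):
        return self.buckets[int(key * POINTS) % POINTS]

# Homogeneous random graph: random locations, random edges up to DEGREE.
locations = [random.random() for _ in range(N)]
neighbours = [set() for _ in range(N)]
while any(len(s) < DEGREE for s in neighbours):
    a, b = random.randrange(N), random.randrange(N)
    if a != b:
        neighbours[a].add(b)
        neighbours[b].add(a)
estimators = [{m: Estimator() for m in neighbours[n]} for n in range(N)]

def route(key, start):
    """Greedy NG-style routing: forward to the neighbour whose estimator
    predicts the fewest remaining hops for this key."""
    path, here = [start], start
    while len(path) < TTL:
        if dist(locations[here], key) < 0.01:      # node "stores" the key
            return path, True
        nxt = min(neighbours[here] - set(path),
                  key=lambda m: estimators[here][m].estimate(key),
                  default=None)
        if nxt is None:
            return path, False
        path.append(nxt)
        here = nxt
    return path, False

def run(requests):
    successes = 0
    for _ in range(requests):
        key, start = random.random(), random.randrange(N)
        path, ok = route(key, start)
        if ok:
            successes += 1
            # Feed actual remaining hop counts back into the estimators.
            for i in range(len(path) - 1):
                estimators[path[i]][path[i + 1]].report(key, len(path) - 1 - i)
    return successes / requests

early = run(500)
late = run(500)
```

The interesting measurement is whether `late` beats `early` (and how path lengths shrink) as estimators specialize; on a graph this small and idealized, that is exactly the "does NG converge at all" question the paper would answer.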
_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/devl
