On Tue, Oct 28, 2003 at 05:44:43PM -0500, Nick Tarleton spake thusly:
> > Right now with the totally random routing due to no specialization
> > freenet can only store as much retrievable data as 25*n where n is
> > the average size of the datastores on freenet and 25 is the current
> > max HTL. No bueno.
> I would think it'd be greater because although the amount of
> retrievable data on one path is 25*n, each node has a choice of many
> other nodes to go to.
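Extra paths do raise the odds of a hit, but each one costs another full
traversal (more on that below). To put rough numbers on the 25*n ceiling
itself (the datastore size and node count here are invented, purely for
illustration):

    # Back-of-the-envelope for the 25*n ceiling. HTL is real; the
    # datastore size and node count are hypothetical.
    HTL = 25
    n = 256 * 2**20          # assume a 256 MiB average datastore
    N = 10_000               # assume a 10,000-node network
    reachable = HTL * n      # the most one request path can possibly see
    total = N * n
    print(f"per-request ceiling: {reachable / 2**30:.1f} GiB, "
          f"i.e. {reachable / total:.2%} of {total / 2**40:.2f} TiB total")

However big we make the network, one random path still sees at most
25 datastores' worth of it.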
Ideally a request should have to try only one path, and proper routing
should approach that ideal. Every additional path tried greatly
increases CPU and bandwidth load on the network: a single failure
followed by a retry on another path doubles the load for that request.

With proper routing we have "a place for everything and everything in
its place", like a file cabinet kept in alphabetical order or a library
arranged by the Dewey Decimal System. That makes the network much more
efficient at both finding and storing data: we find data more quickly
and we can store more of it. Once we have good routing and high degrees
of specialization, I think we will see overall network load drop
significantly while we find more information, faster. Right now the
network is doing a huge amount of work for relatively little result,
even with the better-performing builds of late. In fact, if the routing
issues had been tackled first, perhaps we would not have needed NIO and
the many other performance improvements so soon, although we certainly
would have needed them eventually.
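To make the retry cost concrete (the per-path success probabilities
below are invented; perfect routing would correspond to p = 1): if each
independent path succeeds with probability p, a requester tries 1/p
paths on average, so the expected work per successful fetch is HTL/p
hops.

    # Expected total hops per successful fetch, if each random path of
    # up to HTL hops independently succeeds with probability p
    # (hypothetical values; ideal routing corresponds to p = 1).
    HTL = 25
    for p in (1.0, 0.5, 0.25, 0.1):
        expected_paths = 1 / p               # geometric distribution mean
        print(f"p = {p:4}: ~{expected_paths * HTL:5.0f} hops per success")

Halve the per-path success rate and you double the hops burned per
successful fetch, which is exactly the load we should be routing away.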
-- 
Tracy Reed
http://copilotconsulting.com
