--- Tom Kaitchuck <[EMAIL PROTECTED]> wrote: 
> On Sunday 19 October 2003 12:42 pm, Frank v Waveren wrote:
> > On Sat, Oct 18, 2003 at 02:43:28PM -0400, Ken Corson wrote:
> > > 20,000 . Of course, trying to estimate the size of Freenet is next
> > > to impossible, but I'll take a stab :) Any other gamblers out there ?
> >
> > A few months ago I collected noderefs from my own and several other
> > people's nodes and counted the number of unique, reachable ones. Judging
> > from the rate of growth of the number of reachable nodes, I strongly
> > doubt there being more than 2000 nodes, and probably less than half
> > that number (I think I ended up with having 400 reachable nodes or
> > something like that, check the list archives).
I think that would be a pretty good method; if you plot the number of nodeIds
known versus time and the graph climbs and then slams into some maximum, that
maximum is probably close to the number of nodes.
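The saturation idea can be sketched with a toy simulation (hypothetical Python, not actual Freenet code): if you collect refs by repeatedly sampling nodes at random from a fixed population, the cumulative count of distinct refs climbs fast and then flattens near the true population size.

```python
import random

def unique_refs_over_time(population_size, samples, seed=0):
    """Simulate collecting noderefs by sampling nodes uniformly at random.

    Returns the cumulative count of distinct refs seen after each sample;
    the curve saturates near population_size."""
    rng = random.Random(seed)
    seen = set()
    curve = []
    for _ in range(samples):
        seen.add(rng.randrange(population_size))
        curve.append(len(seen))
    return curve

# With a population of 400 (the figure quoted above), the curve
# flattens out near 400 long before sampling stops.
curve = unique_refs_over_time(400, 10000)
```

Real collection is messier (nodes join and leave, and refs are not sampled uniformly), so this only illustrates why the plateau is a plausible size estimate.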

 
> I don't know about your methods, however your conclusion makes no sense at 
> all. Try setting your max connections to 2000 or so. If what you are saying 
> is correct you should not be able to fill it. 
Actually, yeah, I think that's what would happen: if he only knows of X
nodeIds, he won't be able to connect to more than X nodes.

> Second, this would mean that, because the default max connections is 512, 
> and AFAIK most nodes' connection lists are full, most nodes are connected 
> to a sizeable portion of the network. If this were true, queries should not 
> take more than a few hops to find the data on the network, if it is there at all. 
You're assuming the routing works that well.

> Also consider this: how much data is 
> available and retrievable on Freenet? All that data must be on a computer 
> somewhere. Given all the sites that have ISOs, movies, image archives, etc. 
> that are very large, and the very small default datastore size, I can't see 
> how that would work out unless very large numbers of people were upping it 
> substantially. 
Sure, if that data were reachable. It's been a while since I tried, but I
haven't been able to download an entire movie.

> Finally, the number of downloads for any given VERSION of 
> Freenet is higher than this. Even though not all those people are running 
> permanent nodes, there are surely at least that many users.
That is a cool thing to count. One problem may be that people have scripted
the update function to run repeatedly and are causing extra downloads. Does
update.sh check whether it actually needs to update?

> Back in the day, when some network simulations were done, the code at the time 
> seemed to scale either logarithmically or as x^.27 (either model fit the data). So, 
> sometime during 5xx I did my own test by downloading lots of files and 
> recording the average HTL where I found them. It was just under 15 at the 
> time. So if you either assume that it scales with respect to x^.27 or 
> logarithmically, and even then is no more efficient than a binary search tree 
> (very broken), you are still talking 20,000-30,000 nodes easily, and that was 
> back then. 
If you can't tell the difference between O(log(n)) and O(n^.27), you needed
more data (a larger simulation).
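For what it's worth, the arithmetic in the quoted estimate does roughly check out under its own assumptions. A binary-search-like route found at depth 15 implies n = 2^15; inverting hops = n^0.27 instead gives a similar order of magnitude. A back-of-envelope sketch, not a measurement:

```python
hops = 15  # average HTL at which files were found, per the quoted test

# If routing is no better than a balanced binary search tree,
# depth = log2(n), so n = 2**hops:
n_log = 2 ** hops

# If hop count instead scales as n**0.27, invert: n = hops**(1/0.27):
n_pow = hops ** (1 / 0.27)

print(n_log, round(n_pow))  # 32768 and roughly 22,700
```

Both assumptions land in the low tens of thousands, which is where the quoted 20,000-30,000 figure comes from; the real question is whether average HTL is a trustworthy proxy for route depth at all.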

Those simulations are great, but they may miss a bunch of real-world peculiarities:
* nodes coming and going
* requests and inserts not being random but ordered by "taste"
* changes in routing like NG
* all the unknown things nobody has even thought about yet

_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/devl
