Bob Harris wrote:

    But you can't make a division between "small-world networks"
    and DHTs, because all DHTs are, as has been noted, small-world networks,
    just with varying degrees of randomness.


Oskar: I don't think you realize that your use of the word "small world" is quite
different from everyone else's, and from the graph-theoretic definition. If you
think CAN defines a small world in any sense, we will not be able to
communicate.

I don't know if somebody using the term "small-world network" got your grant money or something, but you need to get over your fixation on having your own definition for this term, and be more specific about what it is you object to. As far as I can tell, your problem seems to be with using the Kleinberg model (and further developments on it) to build probabilistic DHT networks. That is a particular model, which really has nothing to do with Watts and Strogatz, "theoretician wannabes", or "crackpot physicists".

I guess the success of the "small worlds" meme owes a lot to its having a catchy
name, a loose allusion to human behavior, and a model simple enough for
anyone, including theoretician wannabes and even crackpot physicists, to get
into the p2p game. When performance is not an issue, or when one cannot tell
X from X^2, every approach is as good as every other.

I don't know what part of "log^2 scaling when the node degree is constant" it is that you do not understand, but repeating it yet again feels like a lost cause. For the benefit of any other readers, however, I reran the same simulations I posted before, this time scaling the degree up with the size of the network (2 log_2 N edges per node):

N       mean path length
1000    4.6811
2000    5.0803
4000    5.446
8000    5.7923
16000   6.1494
32000   6.561
64000   6.8694
128000  7.2843
256000  7.6469
512000  8.0424
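The deltas can be read straight off the table; here is a quick script using the numbers exactly as posted:

```python
# Differences between successive rows of the table above (N doubles each
# row).  If the mean path length grows as log N, each doubling of N
# should add a roughly constant increment.
data = [
    (1000, 4.6811), (2000, 5.0803), (4000, 5.446), (8000, 5.7923),
    (16000, 6.1494), (32000, 6.561), (64000, 6.8694), (128000, 7.2843),
    (256000, 7.6469), (512000, 8.0424),
]
deltas = [b - a for (_, a), (_, b) in zip(data, data[1:])]
for (n, _), d in zip(data[1:], deltas):
    print(f"{n:>6}: +{d:.3f}")
```

Every doubling adds roughly 0.3 to 0.4 hops, i.e. a constant increment per doubling of N.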

For those who are too lazy to calculate the deltas, here is a plot:

http://www.math.chalmers.se/~ossa/temp/llog.png

What kind of scaling does that look like to you? (Hint: what function gives a straight line when the x-axis increases exponentially?)
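For anyone who wants to play with this themselves, here is a minimal sketch of this kind of simulation: a Kleinberg-style ring with harmonic long-range links and greedy routing. This is a reconstruction, not the code behind the table above; the exact link distribution and sampling details are simplifications.

```python
import math
import random

def build_network(n, seed=0):
    """Ring of n nodes; each node gets its two ring neighbours plus
    ~2*log2(n) directed long-range links, with link distance d drawn
    with probability roughly proportional to 1/d (harmonic), as in
    Kleinberg's model."""
    rng = random.Random(seed)
    k = max(1, int(2 * math.log2(n)))
    links = []
    for u in range(n):
        out = {(u - 1) % n, (u + 1) % n}  # ring neighbours
        while len(out) < 2 + k:
            # inverse-CDF sample: density ~ 1/d on [1, n/2)
            d = int(math.exp(rng.random() * math.log(n // 2)))
            d = max(1, min(d, n // 2))
            out.add((u + rng.choice((-1, 1)) * d) % n)
        links.append(out)
    return links

def ring_dist(a, b, n):
    return min((a - b) % n, (b - a) % n)

def greedy_hops(links, src, dst, n):
    """Greedy routing: always forward to the neighbour closest to dst.
    The ring neighbours guarantee progress, so this terminates."""
    hops, cur = 0, src
    while cur != dst:
        cur = min(links[cur], key=lambda v: ring_dist(v, dst, n))
        hops += 1
    return hops

def mean_path(n, trials=200, seed=1):
    rng = random.Random(seed)
    links = build_network(n, seed)
    total = sum(greedy_hops(links, rng.randrange(n), rng.randrange(n), n)
                for _ in range(trials))
    return total / trials

print(mean_path(1024))
```

Scale n up in powers of two and watch how slowly the mean grows.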

But the fact that the number of edges can be scaled independently of the size of the network is a STRENGTH of the randomized networks. Prescribing a certain number of edges for a certain N is dumb, because more edges can mean quicker lookups, so there is no reason for nodes to have fewer edges than they can manage. In the randomized networks every edge is helpful, but no edge is necessary.

The bigger point, however, is that fixating on the asymptotic order of the scaling is the silly behavior of people whose reasoning is limited to "5 << 25". One should look at the size window that one can expect the network to grow to, consider acceptable values for other parameters like node degree, and pick the algorithm that performs well in that window.
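To put numbers on that, here is a back-of-the-envelope comparison of the two growth rates over a plausible deployment window (unit constants, purely illustrative; real hop counts carry implementation-dependent factors, which is exactly why the window and the constants matter):

```python
import math

# The two scalings being argued about, evaluated across a realistic
# window of network sizes.  Within any fixed window the gap is a
# bounded constant factor, so degree, maintenance cost, etc. can
# easily dominate the asymptotic difference.
for n in (10**3, 10**6, 10**9):
    L = math.log2(n)
    print(f"N=10^{round(math.log10(n))}: log2 N = {L:5.1f}, (log2 N)^2 = {L*L:7.1f}")
```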

// oskar
_______________________________________________
p2p-hackers mailing list
[email protected]
http://zgp.org/mailman/listinfo/p2p-hackers
_______________________________________________
Here is a web page listing P2P Conferences:
http://www.neurogrid.net/twiki/bin/view/Main/PeerToPeerConferences