On Thursday 01 June 2006 16:00, Ian Clarke wrote:
> 
> On 1 Jun 2006, at 10:52, Matthew Toseland wrote:
> 
> > Maybe we should have the initial backoff time dynamically calculated?
> > Discuss.
> >
> > [13:30] <edt> toad_ simpler idea use max(n*myping,targetNodes*ping)*2 as target nodes min window
> > [13:30] <edt> toad_ simpler idea use max(myping,targetNodes*ping)*2 as target nodes min window
> > [13:31] * edt sighs
> > [13:31] <edt> here it is correctly expressed
> > [13:31] <edt> toad_ simpler idea use max(myPingTime,targetNodesPingTime)*2 as target nodes min window
> 
> Can you explain the rationale behind this formula?

My thoughts are like this. Right now we have picked a number out of the air to
use as a minimum window value (5 seconds). The ping time gives us the absolute
fastest time we are going to get a reply from a given node. The *2 is a guess
that, if a packet contains data, it will take longer to process - '2' may not
be optimal... Why should we not use a minimum window that is based on how fast
a given connection is actually working? Note this does not change the backoff
logic, it just allows the node to send more packets (a smaller window) IF the
connection is fast enough.
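
For what it's worth, here is a rough sketch of the idea in Java. It is only an
illustration: the class and method names, the example ping values, and the
5000ms comparison are assumptions of mine, not the actual node code.

    public class BackoffWindowSketch {
        // Guess that a data-carrying packet takes longer to handle than a
        // bare ping; the factor of 2 may not be optimal.
        private static final long PROCESSING_FACTOR = 2;

        // Dynamic minimum window: max(myPingTime, targetNodePingTime) * 2,
        // instead of the fixed 5-second value picked out of the air.
        static long minWindowMillis(long myPingTimeMs, long targetNodePingTimeMs) {
            return Math.max(myPingTimeMs, targetNodePingTimeMs) * PROCESSING_FACTOR;
        }

        public static void main(String[] args) {
            // Fast link: 80ms local ping, 150ms peer ping -> 300ms window
            // rather than the current 5000ms floor.
            System.out.println(minWindowMillis(80, 150));
            // Slow link: 3000ms peer ping -> 6000ms window, so a slow node
            // actually gets a longer minimum window than it does today.
            System.out.println(minWindowMillis(80, 3000));
        }
    }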

Ed 
