Hi Arjuna,

I'm not quite sure what you mean here. There is no text in the RFCs, as far as I know, that recommends this procedure. What the RFCs say is that the rate is not to be *reduced below* W_init as the result of an idle period. This doesn't change "min_rate", exactly, insofar as that parameter exists. For instance, while the sender is idle [so X_recv must be 0], the implementation might do something like "X_recv := max(X_recv/2, W_init/(2*R))" on every nofeedback timer, rather than the straightforward "X_recv := X_recv/2" used when the sender is sending. Can you tell us where you got the calculation you listed below?
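[As an aside, the idle-period handling above might be sketched roughly like this; the function name, the units (bytes/sec for X_recv, bytes for W_init, seconds for R), and the double arithmetic are all illustrative, not from any RFC:]

```c
/* Hypothetical sketch of the nofeedback-timer update described above.
 * x_recv is the receive rate estimate, w_init the initial window in
 * bytes, rtt the round-trip time in seconds. */
static double nofeedback_update(double x_recv, double w_init, double rtt,
                                int sender_idle)
{
    if (sender_idle) {
        /* While idle, halve X_recv but never let it fall below
         * W_init/(2*R), so the restart rate stays at least W_init/R. */
        double floor_rate = w_init / (2 * rtt);
        return (x_recv / 2 > floor_rate) ? x_recv / 2 : floor_rate;
    }
    /* Ordinary case while sending: halve X_recv on every
     * nofeedback-timer expiry. */
    return x_recv / 2;
}
```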

Eddie


Arjuna Sathiaseelan wrote:
If the sender had been idle or data-limited, then min_rate is calculated as:

If (sender has been idle or data-limited)
    min_rate = max(2*X_recv, W_init/R);
Else
    min_rate = 2*X_recv;

But I guess we have overlooked the possibility that the loss event rate p
could be greater than 0 after an idle or data-limited period. So in the
presence of loss, it may not be wise to set min_rate to the
maximum of 2*X_recv and W_init/R.

If that is the case, then I guess we have to add something like this:

If ((p==0) && sender has been idle or data-limited)
    min_rate = max(2*X_recv, W_init/R);
Else
    min_rate = 2*X_recv;
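[The revised rule above could be sketched like this; the function name, units, and parameter types are illustrative only, and this is Arjuna's proposed change, not text from any RFC:]

```c
/* Sketch of the proposed min_rate rule: only apply the W_init/R floor
 * after an idle or data-limited period when no loss has been seen
 * (p == 0); otherwise fall back to the plain 2*X_recv limit. */
static double min_rate(double x_recv, double w_init, double rtt,
                       double p, int idle_or_datalimited)
{
    if (p == 0 && idle_or_datalimited) {
        double floor_rate = w_init / rtt;
        return (2 * x_recv > floor_rate) ? 2 * x_recv : floor_rate;
    }
    return 2 * x_recv;
}
```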

Am I making sense here?

Regards
Arjuna
