I was wondering if someone could explain why I would be getting an avg value 
within RED of -1.94, give or take a hundredth of a point.  I am trying to force 
RED to drop packets at the user level, but the condition if avg > max_thresh 
never evaluates to true because of this value.  Could someone explain exactly 
how avg is calculated, and why I might be seeing these values?  I am throwing 
approximately 12 MB/sec at the router with a 2 MB/sec outgoing line rate.  I am 
just looking for a little help.
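For reference, here is a sketch of the textbook RED average-queue computation
(the exponentially weighted moving average from Floyd and Jacobson's RED paper).
Click's RED element may differ in its fixed-point scaling details, so this is
only an illustration of the standard formula, not Click's exact code; the
function name and the weight value are mine:

```python
def update_avg(avg, q_len, w=1.0 / 512):
    """One step of RED's EWMA of the instantaneous queue length.

    avg   -- previous average queue length
    q_len -- current instantaneous queue length (packets)
    w     -- queue weight w_q (a small constant, e.g. 1/512)
    """
    return (1.0 - w) * avg + w * q_len

avg = 0.0
for q in [10, 12, 15, 20, 18]:
    avg = update_avg(avg, q)

# With q_len >= 0 at every step, avg can never go negative here.
# A persistently negative avg like -1.94 therefore suggests a
# fixed-point/scaling or sign issue in the implementation rather
# than a property of the EWMA itself.
```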

Thank you,
John C Stille

_______________________________________________
click mailing list
[email protected]
https://amsterdam.lcs.mit.edu/mailman/listinfo/click