Hi all,

I have a few queries:

1) When I define the CBR rate as "$cbr_(0) set rate_ 0.1Mb" and the packet
size as "$cbr_(0) set packetSize_ 512", I expect the node to send
approximately 195 packets per second (0.1*1000*1000/512 = 195.3 packets --
please correct me if I am wrong). But when I run my simulation for 1 second,
I observe that only 25 packets are sent.

Why is this so?
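For what it's worth, the two readings of that formula can be checked with a
one-line awk. In ns-2, rate_ is in bits per second but packetSize_ is in
bytes, so each packet carries 512*8 = 4096 bits:

```shell
# Compare packets/s if packetSize_ is read in bytes vs in bits.
awk 'BEGIN {
    rate = 0.1 * 1000 * 1000                            # 100000 bits/s
    printf "bytes: %.1f pkts/s\n", rate / (512 * 8)     # ~24.4, matches the ~25 observed
    printf "bits:  %.1f pkts/s\n", rate / 512           # ~195.3, the bits reading
}'
```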

2) If I define the CBR rate in Kb instead of Mb, or just as "0.1" with no
unit, then the number of packets sent in 1 second is only 1.

Why is this so?
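A quick sanity check on the inter-packet interval, assuming ns-2's bandwidth
parser treats a bare number (no Mb/Kb suffix) as plain bits per second:

```shell
# Seconds between successive 512-byte (4096-bit) CBR packets per rate spelling.
awk 'BEGIN {
    bits = 512 * 8                                  # bits per packet
    printf "0.1Mb: %g s/pkt\n", bits / 100000       # 0.04096 s -> ~24 pkts in 1 s
    printf "0.1Kb: %g s/pkt\n", bits / 100          # 40.96 s -> only the first packet
    printf "0.1:   %g s/pkt\n", bits / 0.1          # 40960 s -> likewise just one
}'
```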

3) While calculating the average delay of the network, should we take the
dropped packets into account as well? And if yes, what end time should we
use for a dropped packet?
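One common convention is to average delay over delivered packets only, since
a dropped packet never produces a receive event and so has no end time; drops
are then reported separately as a loss ratio. A minimal awk sketch, with a
toy two-packet trace standing in for a real out.tr and field numbers assuming
the classic wired trace format ($1 = event, $2 = time, $12 = unique packet id
-- adjust for your own trace):

```shell
# Toy trace: packet 1 is delivered, packet 2 is dropped.
cat > out.tr <<'EOF'
+ 0.1 0 1 cbr 512 ------- 0 0.0 1.0 0 1
r 0.3 1 2 cbr 512 ------- 0 0.0 1.0 0 1
+ 0.2 0 1 cbr 512 ------- 0 0.0 1.0 0 2
d 0.25 0 1 cbr 512 ------- 0 0.0 1.0 0 2
EOF

# Average end-to-end delay over delivered packets only.
awk '
$1 == "+" && !($12 in sent) { sent[$12] = $2 }   # first enqueue time
$1 == "r"                   { recv[$12] = $2 }   # receive time
END {
    for (id in recv)
        if (id in sent) { sum += recv[id] - sent[id]; n++ }
    if (n) printf "avg delay = %.6f s over %d pkts\n", sum / n, n
}' out.tr
# prints: avg delay = 0.200000 s over 1 pkts
```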

4) How can we put the varying data rate and the average network delay
together in one file (either through an awk script or through the Tcl file),
so that I can plot the curve of average network delay against network load?
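One way to collect both numbers in a single file is a shell loop around ns
plus a delay-averaging awk script, appending one "rate delay" line per run
for gnuplot or xgraph. A sketch only: sim.tcl, out.tr, avg_delay.awk, and
the rate-as-first-argument convention are all placeholders for your own
setup:

```shell
# Sweep the CBR rate, run one simulation per load level, and append a
# "rate  avg_delay" line to a single plottable data file.
# Assumes sim.tcl reads the rate from its first argument and writes its
# trace to out.tr, and avg_delay.awk prints just the average delay.
: > load_vs_delay.dat
for rate in 0.05Mb 0.1Mb 0.2Mb 0.4Mb 0.8Mb; do
    ns sim.tcl "$rate"                      # one run per load level
    delay=$(awk -f avg_delay.awk out.tr)    # average delay for this run
    echo "$rate $delay" >> load_vs_delay.dat
done
```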

Please help me with the above queries.

