Hello all - I have to create a program in C, but I have very little experience with C. Could somebody help me write this program?

Here are the details:

Suppose N stations are waiting for another packet to finish on an Ethernet. All transmit at once when the other packet is finished, and collide. Write a program to simulate the continuation of these attempts, and to determine how long it takes before one succeeds. Make the following simplifications: ignore interframe spacing, ignore variability in collision times (so that retransmission is always after an exact integral multiple of the 51.2 microsecond slot time), and assume that each collision uses up exactly one slot time. Model the time, T, in units of slot times, so a collision at time T followed by a backoff of k=0 would result in a retransmission attempt at time T+1.
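
In case a concrete starting point helps, here is a minimal sketch of a single trial, assuming the standard Ethernet binary exponential backoff rule (after a station's c-th collision it waits k slot times, with k drawn uniformly from 0 to 2^min(c,10) - 1; the cap of 10 matches real Ethernet but is my assumption, and the names run_one_trial and MAX_EXP are made up for this sketch):

#include <stdlib.h>

#define MAX_EXP 10   /* assumed cap on the backoff exponent, as in real Ethernet */

/* One trial: n stations all transmit at T = 0 and collide; simulate
 * binary exponential backoff until some station transmits alone.
 * Returns the slot time T of the first success. */
static int run_one_trial(int n)
{
    int next[n];        /* slot in which each station next transmits */
    int collisions[n];  /* collisions suffered by each station so far */

    for (int i = 0; i < n; i++) {
        next[i] = 0;    /* everyone fires at T = 0 */
        collisions[i] = 0;
    }

    for (int t = 0; ; t++) {
        int transmitters = 0;
        for (int i = 0; i < n; i++)
            if (next[i] == t)
                transmitters++;

        if (transmitters == 1)
            return t;   /* exactly one station sent: success at time t */

        if (transmitters > 1) {
            /* Collision burns this slot; each colliding station picks
             * k uniform in [0, 2^min(c,MAX_EXP)) and retries at t+k+1. */
            for (int i = 0; i < n; i++) {
                if (next[i] == t) {
                    int c = ++collisions[i];
                    int e = c < MAX_EXP ? c : MAX_EXP;
                    next[i] = t + rand() % (1 << e) + 1;
                }
            }
        }
    }
}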

Find out the average delay for N=20, N=40, N=100, and maybe some larger values too. Do your data support the notion that the delay is linear in N? (It should.)
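
For the averaging part, a hypothetical driver could repeat the trial many times for each N and report the mean (this assumes the run_one_trial sketch above is in the same file; the trial count of 10000 and the extra size 200 are arbitrary choices of mine):

#include <stdio.h>
#include <time.h>

int main(void)
{
    int sizes[] = { 20, 40, 100, 200 };  /* 200 is an arbitrary "larger" value */
    const int trials = 10000;            /* arbitrary; more trials smooth the average */

    srand((unsigned) time(NULL));
    for (size_t s = 0; s < sizeof sizes / sizeof sizes[0]; s++) {
        long total = 0;
        for (int j = 0; j < trials; j++)
            total += run_one_trial(sizes[s]);
        printf("N = %3d: average delay = %.2f slot times\n",
               sizes[s], (double) total / trials);
    }
    return 0;
}

If the averages roughly double when N doubles, that would support the claim that the delay is linear in N.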

I would appreciate any guidelines or help. Please walk me through the steps involved in this task.

Thanks
