The constant timer creates a delay between samples that is the same for 
every sample.
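
For example, a constant timer set to 300 milliseconds waits exactly 300 ms 
before every sample. In rough Java terms (a sketch of the idea, not JMeter's 
actual source):

    // Constant timer: the same delay before every sample.
    long constantDelay(long delayMs) {
        return delayMs;   // e.g. delayMs = 300 -> always 300 ms
    }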

The Gaussian random timer creates a delay that averages X milliseconds, but 
in such a way that the individual delays form a bell-shaped (Gaussian) 
distribution around that average. Each delay is the sum of the offset (the 
fixed base delay) and a random deviation (which controls the range of 
values).
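
In rough Java terms, the idea is this (a sketch only, not JMeter's actual 
implementation, and the parameter names are mine):

    // Gaussian random timer: offset plus a normally-distributed deviation,
    // so the individual delays form a bell curve.
    java.util.Random random = new java.util.Random();

    long gaussianDelay(long offsetMs, long deviationMs) {
        // nextGaussian() has mean 0 and standard deviation 1; scaling it
        // by deviationMs spreads the delays around offsetMs.
        long delay = offsetMs + Math.round(random.nextGaussian() * deviationMs);
        return Math.max(0, delay);   // never sleep a negative amount
    }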

The uniform random timer also creates a delay that averages out to a given 
value, but the individual delays are spread evenly (a uniform distribution) 
across the range rather than clustered around the average.
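
Again as a rough Java sketch (parameter names are mine):

    // Uniform random timer: offset plus a delay drawn evenly from
    // [0, rangeMs), so every value in the range is equally likely.
    long uniformDelay(long offsetMs, long rangeMs) {
        return offsetMs + (long)(Math.random() * rangeMs);
    }

So with the same offset and range, the Gaussian timer clusters delays near 
the middle while the uniform timer spreads them evenly across the range.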

-Mike

On 3 Sep 2002 at 10:25, Amir Nashat wrote:

> Hello all,
> 
> I asked this on Friday but had no response, so I will ask again. Can anyone 
> explain the difference between the timer options? The documentation is pretty 
> sparse. Any examples of which to use in what situation would be helpful. I 
> have searched the FAQ, the help, and Google, but not much has been explained. 
> Any help would be appreciated.
> 
> 
> TIA
> amir
> 



--
Michael Stover
[EMAIL PROTECTED]
Yahoo IM: mstover_ya
ICQ: 152975688

