Hi, all

I'm simulating some noise to try to improve my somewhat sketchy
understanding of what goes on with the various noise types as shown on an
ADEV plot. Nothing fancy: ~3600 points of Gaussian random numbers between 0
and 1 in Excel, imported into TimeLab as phase data, scaled to ns.
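
In case anyone wants to poke at the same thing outside Excel, here is a
minimal NumPy sketch of the equivalent steps (a zero-mean Gaussian is used
here so the double integration below isn't dominated by a deterministic
drift term; the file names are my own, and as far as I know TimeLab will
import plain one-column ASCII files as phase data):

import numpy as np

N = 3600      # roughly the number of points in the Excel sheet
TAU0 = 1.0    # assumed sample interval, seconds

rng = np.random.default_rng()
x = rng.standard_normal(N) * 1e-9   # white phase noise, ~1 ns RMS

white_pm = x                     # raw sequence: white PM
white_fm = np.cumsum(x)          # integrated once: white FM
rw_fm = np.cumsum(white_fm)      # integrated twice: random-walk FM

# One phase value per line, for import into TimeLab as phase data.
for name, data in [("white_pm.txt", white_pm),
                   ("white_fm.txt", white_fm),
                   ("rw_fm.txt", rw_fm)]:
    np.savetxt(name, data)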

I mostly get what I expect: "pure" random noise gives the expected slope
for W/F PM, -1. Integrating the same random data gives the expected slope
for W FM, -1/2. Integrating the same random data yet again gives a slope
of +1/2, again as expected for RW FM.
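
For anyone who'd rather check the slopes numerically than by eye,
allantools (if you have it installed) can compute the ADEV straight from
the arrays in the sketch above; the five-point log-log fit is just a rough
slope estimate, not anything TimeLab itself does:

import numpy as np
import allantools

# white_pm, white_fm, rw_fm as in the earlier sketch, tau0 = 1 s
for label, data in [("W PM", white_pm),
                    ("W FM", white_fm),
                    ("RW FM", rw_fm)]:
    taus, adevs, errs, ns = allantools.adev(data, rate=1.0,
                                            data_type="phase",
                                            taus="octave")
    # Slope of log(ADEV) vs log(tau) over the first few octaves
    slope = np.polyfit(np.log10(taus[:5]), np.log10(adevs[:5]), 1)[0]
    print(f"{label}: ADEV(tau0) = {adevs[0]:.3g}, slope ~ {slope:+.2f}")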

However, looking at the data, I am somewhat baffled by a difference in the
starting point of the slopes. Given that this is exactly the same random
sequence, I would expect the curves to have the same starting point at
tau0. Clearly not (see attached), but I do not understand why. Any clues?

Is this some elementary effect of integration (sqrt(n) or some such), or am
I seeing the effects of bandwidth and/or bias functions or other esoterica?
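
To make the puzzle easy to reproduce: the first ADEV point is just the RMS
second difference of the phase divided by sqrt(2)*tau0, so the three
starting values can be computed directly from the arrays above (a
back-of-the-envelope check, not TimeLab's exact estimator):

import numpy as np

def adev_tau0(x, tau0=1.0):
    """First ADEV point from the second difference of the phase data."""
    d2 = x[2:] - 2 * x[1:-1] + x[:-2]
    return np.sqrt(np.mean(d2 ** 2) / 2) / tau0

# white_pm, white_fm, rw_fm as in the earlier sketch
for label, data in [("W PM", white_pm),
                    ("W FM", white_fm),
                    ("RW FM", rw_fm)]:
    print(label, adev_tau0(data))

If I've done the algebra on those second differences right, the three
starting points should land in the ratio sqrt(3) : 1 : 1/sqrt(2), i.e.
about 1.73 : 1 : 0.71, which looks suspiciously like the numbers below.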

In case the screenshot does not make it through:
W PM starts at 1.69e-9
W FM starts at 9.74e-10
RW FM starts at 6.92e-10

Thanks for any help!
Ole