> -----Original Message-----
> From: [email protected] [mailto:[email protected]] On
> Behalf Of Robert Darby
> Sent: Sunday, July 07, 2013 3:33 PM
> To: [email protected]
> Subject: Re: [time-nuts] Question about effect of sample interval on ADEV
>
> Thanks to all who responded. I didn't phrase my question very well, I'm
> afraid. The periodic noise/beat-note issue is what led me to try a
> different sample interval. Previously I had always let TimeLab set the
> sample interval (about 0.07 s for the 5370B).
>
> I was surprised by the difference in the ADEV traces when I varied the
> sample interval from 0.07 s to 0.25 s to 0.5 s to 1.0 s, and that is
> what I hoped someone could explain.
>
> When I edit a trace and change the sample interval there is no
> substantial change to the Tau / Sigma(Tau) values, yet when I actually
> run at the different sample intervals I get minimum values for each
> run: 400 s, 5.60e-13; 600 s, 1.70e-13; 1000 s, 9.55e-14; 3000 s,
> 4.15e-14. The traces are totally different; same oscillators and
> counter, just different sample intervals. That's what I was hoping one
> of you could explain.
So the only difference between the test setups is the setting of the
Display Rate control on the 5370, correct? You're allowing TimeLab to
estimate the sample rate automatically, and giving it enough time to
converge on a stable reading before hitting 'Start Measurement'?

You're correct that changing the real-world sample rate should yield
results that are identical (or at least very similar) to resampling the
phase data after the fact. In frequency mode, dead time between readings
would make that an iffy proposition, but for data taken in TI mode the
outcomes should be close.

-- john, KE5FX
Miles Design LLC

_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to
https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
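[Editor's note: the resampling equivalence John describes — that decimating TI-mode phase data after the fact should give nearly the same ADEV as measuring at the slower rate — can be checked numerically. Below is a minimal Python sketch; the white-FM noise model, its 1e-12 level, and the decimation factor are illustrative assumptions, not data from this thread. Only the ~0.07 s base interval comes from the messages above.]

```python
import numpy as np

def overlapping_adev(x, tau0, m):
    """Overlapping Allan deviation at tau = m*tau0 from phase data x (seconds)."""
    x = np.asarray(x)
    # Second differences of phase over span m samples, at every start index
    d = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    tau = m * tau0
    return np.sqrt(np.mean(d**2) / (2.0 * tau**2))

rng = np.random.default_rng(42)
tau0 = 0.07                                  # ~5370B reading interval from the thread
y = 1e-12 * rng.standard_normal(20000)       # simulated white-FM fractional frequency
x = np.cumsum(y) * tau0                      # integrate frequency to phase (seconds)

k = 4                                        # decimate 0.07 s data down to 0.28 s
x_dec = x[::k]

full = overlapping_adev(x, tau0, 8)          # sigma_y(0.56 s) from full-rate phase
dec  = overlapping_adev(x_dec, k * tau0, 2)  # sigma_y(0.56 s) from decimated phase
print(full, dec)
```

With no dead time between readings, the decimated series uses a strict subset of the same phase differences, so the two estimates at a common tau agree closely; the full-rate set merely has more overlapping samples and hence tighter confidence intervals. A large disagreement between runs at different real sample intervals therefore points at the measurement setup rather than the math.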
