Hi Don:

If you want to average many (100, 1000) reads in one second you need to use the 
ARM function as described in appendix B of the PRS10 manual.
http://www.prc68.com/I/PRS10.shtml
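The payoff of averaging N reads is the usual ~sqrt(N) reduction in white-noise scatter. A quick illustration with synthetic data (not actual PRS10 output; the noise model is an assumption):

```python
import random
import statistics

random.seed(1)

# Simulate 100000 one-shot readings with unit white noise, then
# average them in blocks of 100, as the instrument would when armed
# for multi-sample averages. Block means should scatter about
# sqrt(100) = 10x less than single readings.
readings = [random.gauss(0.0, 1.0) for _ in range(100000)]
blocks = [statistics.mean(readings[i:i + 100])
          for i in range(0, len(readings), 100)]

single_sd = statistics.stdev(readings)   # ~1.0
block_sd = statistics.stdev(blocks)      # ~0.1
```

The ratio single_sd / block_sd comes out near 10, matching the sqrt(N) expectation for uncorrelated noise.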

Have Fun,

Brooke Clarke
http://www.prc68.com/P/Prod.html  Products I make and sell
http://www.prc68.com/Alpha.shtml  All my web pages listed based on html name
http://www.PRC68.com
http://www.precisionclock.com
http://www.prc68.com/I/WebCam2.shtml 24/7 Sky-Weather-Astronomy Web Cam

Don @ True-Cal wrote:
> Fellow Time-Nuts,
> 
> I am having great fun with Ulrich's EZGPIB and Plotter programs to automate 
> my ADEV and TI measurements. Wow, what a nice set of programs, thanks Ulrich!
> 
> I use the SR620 TIC with a Fury board as an external reference. The Fury 
> disciplines a 10811-60168 external oscillator. I can go unlocked to improve 
> the range around Tau 100s if and when necessary. For a series of tests, I 
> used an LPRO-101 10 MHz signal to drive B-Ch (Stop) of the SR620; the A-Ch 
> (Start) was set to Ref. for a Zero-Crossing TIME measurement on the TIC. I 
> streamlined the EZGPIB SR620 query program and experimented with counter 
> settings to minimize the inevitable and inherent latencies of the computer 
> layers, network, GPIB-Enet/100 bridge and the counter (counter being the 
> worst). With the counter set to 100 samples and the 1 kHz "Ref" being used as 
> the START, I was expecting a new 100-sample TI average every 0.1 seconds. 
> My first evidence of something not being ideal was embedded in the details of 
> the EZGPIB output console and accompanying file. Sometimes there were 7, 8 or 
> 9 samples per second of time and never 10. Also, the total time span of a 
> large collection of samples was always slightly longer than the product of 
> the sample rate and count. I used Excel to scan 18000, 0.1s TI samples to 
> determine what the actual statistics might be:
> 
> Average = 0.122302796 sec
> Min = 0.108984648 sec
> Max = 0.188015099 sec
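The spreadsheet pass over those 18000 samples can be reproduced in a few lines. A minimal sketch (the synthetic timestamps stand in for the logged data; real use would read one timestamp per line from the EZGPIB output file, a layout that is an assumption here):

```python
def interval_stats(timestamps):
    """Return (average, min, max) of the gaps between successive
    timestamps (seconds). This is the same reduction the spreadsheet
    performs on the logged sample times."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return (sum(intervals) / len(intervals),
            min(intervals),
            max(intervals))

# Synthetic example: readings nominally 0.1 s apart with some jitter.
ts = [0.0, 0.11, 0.21, 0.33, 0.43]
avg, lo, hi = interval_stats(ts)
```

For the example above the average gap is 0.1075 s even though the nominal spacing is 0.1 s, which is exactly the kind of discrepancy seen in the real log.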
> 
> Since the ADEV function as well as Ulrich's Plotter program requires a 
> constant Tau-0, I experimented with the nominal 0.1s and the real "average" 
> of 0.1223s Tau-0 setting and attached a graph that illustrates the variance 
> across Tau. My question is: what is "acceptable" practice for defining Tau-0 
> when a stable sampling interval is unlikely? It was 
> rather simple to specify a more accurate time sample interval once determined 
> by the extra step of spreadsheet analysis and the effect on the results is 
> obvious. But that is still only an average. What about the effect of the 
> deviation about the average value? It would seem that would be a much more 
> complex issue to deal with.
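For the first-order part of the question (the average Tau-0), note that tau enters the overlapping-ADEV estimate only through the 1/tau scaling of the phase second differences, so for a fixed data set and fixed averaging factor m, relabeling Tau-0 rescales the curve by the ratio of the taus. A minimal sketch of overlapping ADEV from evenly spaced time-interval readings (the even spacing is precisely the assumption in question; it does not model the jitter about the average):

```python
import math

def overlapping_adev(phase, tau0, m=1):
    """Overlapping Allan deviation at tau = m * tau0, computed from
    phase data (time-interval readings in seconds), assuming the
    readings are evenly spaced by tau0."""
    n = len(phase)
    tau = m * tau0
    # Second differences of phase over span m, all overlapping starts.
    terms = [(phase[i + 2 * m] - 2 * phase[i + m] + phase[i]) ** 2
             for i in range(n - 2 * m)]
    return math.sqrt(sum(terms) / (2 * tau ** 2 * len(terms)))

# Synthetic phase record: linear drift plus alternating jitter.
phase = [i * 1e-9 + ((-1) ** i) * 2e-10 for i in range(1000)]

# Same data, two Tau-0 labels: the ADEV values differ by exactly
# 0.1223 / 0.1 = 1.223, i.e. the ~22 % shift seen in the graph.
a_nominal = overlapping_adev(phase, 0.1)
a_measured = overlapping_adev(phase, 0.1223)
```

The second-order effect, the deviation of individual intervals about that average, is not captured by any single Tau-0 and is indeed the harder problem.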
> 
> See attached export or Plotter graphic.
> 
> Regards...
> Don
> 
> 
> ------------------------------------------------------------------------
> 
> _______________________________________________
> time-nuts mailing list -- [email protected]
> To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
> and follow the instructions there.
