Hi,

On 12/18/2017 11:27 PM, Tom Van Baak wrote:
Rick,

The 53132A is a "12 digit/s" counter, unless the frequency is really close to 
10 MHz, in which case it becomes an 11 digit/s counter. This is because it uses 
oversampling (IIRC, 200k samples/s) and it relies to some extent on statistics 
for its 12 digit resolution.

This technique does not do as well when the DUT is too closely aligned in phase 
and frequency with the REF. I mean, you can oversample all you want, but when 
the two clocks appear locked most of those samples are redundant; they offer no 
statistical advantage. Hence the reduced resolution. By a factor of 10!
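To make that concrete, here's a toy Python sketch (made-up numbers, not the 
actual 53132A algorithm; `lsb` stands in for the counter's single-shot 
resolution). When the samples dither across quantization levels, averaging 
converges on the true value; when DUT and REF are locked, all 200k samples 
quantize identically and averaging buys nothing:

```python
import numpy as np

rng = np.random.default_rng(1)
lsb = 1.0           # counter single-shot resolution, arbitrary units
x_true = 0.3 * lsb  # true value, sitting between two quantization levels
N = 200_000         # samples per reading

# Unlocked case: sample-to-sample jitter spans several LSBs, so the
# quantized samples dither and their mean converges on the true value.
unlocked = np.round((x_true + rng.normal(0, 2 * lsb, N)) / lsb) * lsb
# Locked case: jitter is tiny, every sample quantizes to the same level,
# and averaging 200k identical samples gains nothing.
locked = np.round((x_true + rng.normal(0, 0.001 * lsb, N)) / lsb) * lsb

err_unlocked = abs(unlocked.mean() - x_true)
err_locked = abs(locked.mean() - x_true)
print(err_unlocked, err_locked)
```

The unlocked mean lands within a small fraction of an LSB of the truth; the 
locked mean is stuck a full 0.3 LSB away no matter how many samples you take.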

The nice thing about the 53131/53132 is that this condition is recognized in 
f/w and the output resolution is pruned automatically. If you have long log 
files of an oscillator warming up you can see it quite nicely.

Note also that it's not just when the DUT is at or near 10 MHz; there are 
hundreds of magic frequencies where reduced resolution occurs: any rational 
fraction or multiple that is within about 7 digits of 10 MHz. This is not 
undocumented. Buried in the manual is:

http://leapsecond.com/pages/53132/53132-reduced-resolution.gif

Also, the issue isn't unique to the 53132A. Any counter or software that uses 
oversampling has to face this effect [1]. That is, you can't blindly assume 
your resolution always improves by sqrt(N). As obscure as this effect is, I'm 
really impressed hp put so much thought into it. It's one reason I have a lot 
of trust in the 53132A.

It's not really due to oversampling; it is an inherent effect of all counters. The oversampling and filtering done to improve the frequency reading is just not algorithmically efficient when the systematic sampling spread goes outside the time window that the filter represents.

It does not matter whether the averaging is done in the counter or in post-processing; the effect will be there. It will also bite you as you compare ADEV and MDEV, because MDEV does an averaging before the ADEV processing core.
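For reference, here is a minimal Python sketch of that difference, using the 
textbook overlapping estimators on simulated white phase noise; the moving 
average inside mdev() is exactly the extra averaging step before the ADEV core:

```python
import numpy as np

def adev(x, tau0, m):
    """Overlapping Allan deviation at tau = m*tau0 from phase samples x."""
    d = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    return np.sqrt(np.mean(d**2) / (2.0 * (m * tau0)**2))

def mdev(x, tau0, m):
    """Modified Allan deviation: same second-difference core, but the
    differences are averaged over m samples first."""
    d = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    s = np.convolve(d, np.ones(m) / m, mode="valid")
    return np.sqrt(np.mean(s**2) / (2.0 * (m * tau0)**2))

rng = np.random.default_rng(2)
x = rng.normal(0, 1e-9, 100_000)  # 1 ns rms white phase noise, tau0 = 1 s
a16 = adev(x, 1.0, 16)
m16 = mdev(x, 1.0, 16)
print(a16, m16)  # for white PM, MDEV sits well below ADEV at the same tau
```

That 1/sqrt(m)-ish gap for white PM is why comparing ADEV from an averaging 
counter against true ADEV can mislead you.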

Finally, at the risk of mentioning noise, measurement, and ADEV here, you can 
also guess that this clever oversampling measurement technique has 
ramifications on the fidelity of ADEV calculations made from frequency 
readings. Check previous posts, probably from Magnus, that discuss this [2].

Yeah, I keep studying obscurities like these. :-)

The references given are good food for thought. I also made a paper and poster presentation for this summer's event on the interaction of noise and time quantization, and it doesn't quite work as people assume. I think only a few got the full lecture I was giving there. More to be written, more to be explained.

If I only had more time to do this stuff.

Anyway, the filtering the counters do as they average causes a bias to ADEV, but its main purpose is to filter out noise. The trouble is that the quantization noise isn't random, so your mileage may vary in fighting it. It turns out that more actual noise helps to get better resolution at higher tau, allowing MDEV or PDEV to go deeper than ADEV. Things just don't work the way you expect. Thus, ADEV can be "captured" by the systematics, and short-tau MDEV and PDEV too.
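A small Python illustration of the "quantization noise isn't random" point 
(toy numbers): the quantization error of a near-coherent signal is an almost 
deterministic sawtooth, strongly correlated sample to sample, while about an 
LSB of real noise whitens it so that averaging can actually work:

```python
import numpy as np

rng = np.random.default_rng(3)
lsb = 1.0
# Slowly drifting phase: DUT nearly coherent with REF, 0.01 LSB per sample.
ramp = np.cumsum(np.full(100_000, 0.01))

def lag1_autocorr_of_quant_error(x):
    e = np.round(x / lsb) * lsb - x  # quantization error sequence
    e = e - e.mean()
    return np.dot(e[:-1], e[1:]) / np.dot(e, e)

r_coherent = lag1_autocorr_of_quant_error(ramp)
r_dithered = lag1_autocorr_of_quant_error(ramp + rng.normal(0, lsb, ramp.size))
print(r_coherent, r_dithered)  # near 1 (systematic) vs. near 0 (whitened)
```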

/tvb

[1] One way to avoid or reduce the chances is to use an obscure frequency for 
REF. Another way is to deliberately apply carefully characterized jitter to DUT 
or REF during measurement. You can see the connection with DMTD systems, or 
TimePod.

[2] See papers like:

     "On temporal correlations in high–resolution frequency counting", Dunker, 
Hauglin, Ole Petter Rønningen (!!!)
     https://arxiv.org/pdf/1604.05076.pdf

Look closely at what they have.
I had to point people to the poster-presentation of this at EFTF in York.

     "High resolution frequency counters", E. Rubiola
     http://rubiola.org/pdf-slides/2012T-IFCS-Counters.pdf

Another good read.

     "The Ω counter, a frequency counter based on the Linear Regression", 
Rubiola, Lenczner, Bourgeois, Vernotte
     https://arxiv.org/pdf/1506.05009.pdf

The Delta counter is also a good read-up. The Omega counter is the linear regression / least-squares estimator. I presented an accelerated version of this last year at EFTF and IFCS.
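As a sketch of what the Omega estimator buys you (simulated numbers, white 
phase noise assumed): the Pi counter uses only the two endpoint phase samples, 
while the Omega counter fits a least-squares slope through all of them, which 
cuts the estimator scatter considerably under white PM:

```python
import numpy as np

rng = np.random.default_rng(4)
N, tau0 = 1000, 1e-3  # 1000 phase samples, 1 ms apart (made-up numbers)
f_off = 1e-9          # fractional frequency offset to estimate
t = np.arange(N) * tau0

pi_est, omega_est = [], []
for _ in range(500):
    x = f_off * t + rng.normal(0, 1e-9, N)          # phase ramp + white PM
    pi_est.append((x[-1] - x[0]) / (t[-1] - t[0]))  # Pi: endpoints only
    slope, _intercept = np.polyfit(t, x, 1)         # Omega: least squares
    omega_est.append(slope)

std_pi, std_omega = np.std(pi_est), np.std(omega_est)
print(std_pi, std_omega)  # Omega scatter is an order of magnitude smaller
```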

     "Isolating Frequency Measurement Error and Sourcing Frequency Error near the 
Reference Frequency Harmonics"
     http://literature.cdn.keysight.com/litweb/pdf/5990-9189EN.pdf

Thanks for that reference, I had lost track of that paper and I need it.

Now, why does it look like DDS noise? ;-)

Cheers,
Magnus



----- Original Message -----
From: "Richard (Rick) Karlquist" <[email protected]>
To: "Discussion of precise time and frequency measurement" <[email protected]>; "Pete 
Lancashire" <[email protected]>
Sent: Monday, December 18, 2017 1:33 PM
Subject: Re: [time-nuts] Recently acquired 53132A


I worked in the HP Santa Clara Division frequency counter
section at the time of the development of the 53132A series, which had
the internal code name of "Major League Baseball".  IIRC, the external
reference circuit in it was designed by a couple of
engineers who had no background in time nuttery and
did a mediocre job.  Someone else commented on a problem
with it not wanting to measure 10 MHz correctly.  I
never heard of that before, but it would not surprise
me, because the main measurement engine was designed
by an excellent FPGA engineer without an extensive
background in time nuttery.  The problem mentioned might
have been too subtle.

The 53132 has many good points but is not perfect.

Rick


_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
