My point was that a DSO is basically an ADC. Therefore, there is some amount
of noise, nonlinearity, and drift limiting the jitter measurement. Do you
think any method can extract more information from the given data than sinc()
interpolation and zero-crossing computation?
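For concreteness, here is a minimal sketch of what I mean by sinc() interpolation plus zero-crossing computation, assuming uniformly sampled, noise-free data; the function names, the 100 MHz test tone, and the 64x upsampling factor are my own illustrative choices, not from any particular scope:

```python
import numpy as np

def sinc_interpolate(samples, t_sample, t_query):
    # Whittaker-Shannon reconstruction: each query point is a
    # sinc-weighted sum over all samples (uniform spacing assumed).
    dt = t_sample[1] - t_sample[0]
    k = (t_query[:, None] - t_sample[None, :]) / dt
    return np.sinc(k) @ samples

def zero_crossing_time(samples, t_sample, upsample=64):
    # Find the first rising zero crossing: sinc-interpolate onto a
    # fine grid, then linearly interpolate between the two fine-grid
    # points that straddle zero.
    t_fine = np.linspace(t_sample[0], t_sample[-1],
                         upsample * len(t_sample))
    y = sinc_interpolate(samples, t_sample, t_fine)
    i = np.where((y[:-1] < 0) & (y[1:] >= 0))[0][0]
    frac = -y[i] / (y[i + 1] - y[i])
    return t_fine[i] + frac * (t_fine[i + 1] - t_fine[i])

# Example: a 100 MHz sine sampled at 1 GS/s for 50 samples,
# with its first rising crossing at 2.8 ns.
t = np.arange(50) * 1e-9
v = np.sin(2 * np.pi * 1e8 * (t - 2.8e-9))
# Should land close to 2.8e-9 (record truncation limits accuracy).
print(zero_crossing_time(v, t))
```

In practice the record is finite, so truncation of the sinc tails adds a small systematic error on top of the scope's own noise.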
Cross-spectrum averaging does indeed do just that, relying on two
ADCs to produce uncorrelated noise, which can be averaged out.
Or am I misunderstanding your point?
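To illustrate the idea in a toy simulation (my own sketch, not anyone's actual measurement setup; the segment count, noise levels, and the weak tone at bin 50 are arbitrary assumptions): two channels digitize the same signal, each adding independent noise, and averaging the per-segment cross-spectra drives the uncorrelated noise floor down roughly as 1/sqrt(M) while the common signal survives.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 1024, 400            # samples per segment, number of segments
t = np.arange(n)
signal = 0.1 * np.sin(2 * np.pi * 50 * t / n)  # weak common tone, bin 50

cross = np.zeros(n // 2 + 1, dtype=complex)
auto = np.zeros(n // 2 + 1)
for _ in range(m):
    a = np.fft.rfft(signal + rng.normal(0.0, 1.0, n))  # channel A
    b = np.fft.rfft(signal + rng.normal(0.0, 1.0, n))  # channel B
    cross += a * np.conj(b)   # noise terms have random phase: average out
    auto += np.abs(a) ** 2    # noise power adds: floor stays put

# Compare noise floors away from DC and the tone bin.
mask = np.ones(n // 2 + 1, bool)
mask[[0, 50]] = False
print(np.abs(cross[mask]).mean() / m)  # averaged cross-spectrum floor
print((auto[mask] / m).mean())         # single-channel auto-spectrum floor
```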
Nothing against that. It depends on what noise level you require after
averaging. I only posted my experience with a very low-quality DSO, which
achieves 100 ps RMS single-shot using sinc() interpolation. My point was
that I suppose there is no way to obtain better single-shot performance
than this. Since averaging improves as 1/sqrt(N), averaging 100 ps RMS down
to, say, 1 ps RMS would require 10^4 edges (under the assumption that the
100 ps RMS is well-behaved noise).
What performance could it yield with a better scope? I hope to try an
LC584AL some day; I guess it might give something like 10 ps RMS single-shot...
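That 1/sqrt(N) arithmetic is easy to check numerically, assuming the single-shot error really is white Gaussian noise (the run count and seed below are just my illustrative choices):

```python
import numpy as np

# Assumption: the 100 ps RMS single-shot error is white Gaussian
# noise, so averaging N edges improves the estimate by sqrt(N):
# 100 ps / sqrt(10^4) = 1 ps.
rng = np.random.default_rng(0)
single_shot_rms = 100e-12
n_edges = 10_000

# Simulate 400 independent runs of 10^4 edge timestamps each.
runs = rng.normal(0.0, single_shot_rms, size=(400, n_edges))
averaged = runs.mean(axis=1)   # one averaged timestamp per run

print(f"single-shot RMS: {runs.std():.3g} s")
print(f"RMS after averaging {n_edges} edges: {averaged.std():.3g} s")
```

If the noise is not well behaved (flicker, drift), the improvement stalls well before sqrt(N), which is exactly the caveat above.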
Regards,
Marek
_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.