Two part question:

I'd like to test the effect of small signal-level changes on the phase output
of a high-resolution linear phase detector at ADEV values below 1e-16 and tau
> 1000 s.
I'm looking for suggestions on how I can manually vary the signal amplitude of
one of its 10 MHz sine-wave inputs by up to, say, 3 dB in ~0.1 dB steps,
while keeping the signal's phase constant in the sub-0.1 picosecond range.
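(For scale, here is a quick back-of-the-envelope sketch of what that phase
budget implies for the attenuator; the 3 dB range, 0.1 dB step, and 10 MHz
carrier are taken from the question, everything else is my own framing:)

```python
# Rough numbers for a step attenuator that must not move the
# phase of a 10 MHz carrier by more than 0.1 ps.

F0 = 10e6              # carrier frequency, Hz
PHASE_LIMIT = 0.1e-12  # allowed phase wander, seconds

# 0.1 ps at 10 MHz expressed as a phase angle
cycles = PHASE_LIMIT * F0   # fraction of a carrier cycle
deg = cycles * 360.0        # in degrees
print(f"0.1 ps at 10 MHz = {deg:.2e} degrees")  # 3.60e-04 degrees

# Voltage ratio corresponding to one 0.1 dB amplitude step
step_db = 0.1
ratio = 10 ** (-step_db / 20)
print(f"0.1 dB step = voltage ratio {ratio:.5f}")  # ~0.98855

# If the attenuator's phase shift varies linearly with setting,
# its AM-to-PM-like phase change must stay below this over the
# full 3 dB range to keep within the 0.1 ps budget:
max_deg_per_db = deg / 3.0
print(f"phase change must stay below {max_deg_per_db:.2e} deg/dB")
```

In other words, 0.1 ps at 10 MHz is only about 360 microdegrees of carrier
phase, which is why an ordinary switched attenuator (whose insertion phase
changes with setting) is unlikely to meet the requirement without
characterization.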

Is there any data available on how much the output amplitude of an LPRO
rubidium oscillator or an HP 10811-class oscillator changes over temperature
and time when driving a constant resistive load?

ws
  
_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
