Dave wrote:

I'm not sure how to avoid the change in input impedance with the clipper involved unless I were to increase R1 to a very much higher value.

The voltage noise density of a 3.3k resistor is about 7.5nV/rootHz, for a total noise of about 23uVrms in a 10MHz bandwidth (compared to 2.7nV/rootHz and 9uV for a 475 ohm resistor). I doubt that changing R1 to 3.3k would raise the jitter measurably.
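
In case anyone wants to check the arithmetic, here is a quick Python sketch of the Johnson-noise numbers above (my assumptions: T = 290 K and a flat 10 MHz noise bandwidth; the resistor values are the ones from the paragraph above):

  import math

  k_B = 1.380649e-23   # Boltzmann constant, J/K
  T   = 290.0          # assumed temperature, K (roughly room temperature)
  BW  = 10e6           # assumed flat noise bandwidth, Hz

  def noise_density(R):
      """Thermal (Johnson) noise voltage density, V/rootHz: sqrt(4*k*T*R)."""
      return math.sqrt(4.0 * k_B * T * R)

  def total_noise(R, bw=BW):
      """RMS noise voltage integrated over a flat bandwidth bw."""
      return noise_density(R) * math.sqrt(bw)

  for R in (3300.0, 475.0):
      print(f"{R:6.0f} ohm: {noise_density(R)*1e9:4.1f} nV/rootHz, "
            f"{total_noise(R)*1e6:4.1f} uVrms in 10 MHz")
  # 3300 ohm: ~7.3 nV/rootHz, ~23 uVrms
  #  475 ohm: ~2.8 nV/rootHz, ~8.7 uVrms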

With R1 = 3.3k and R5/R6 = 33k (to reduce input attenuation with the increased R1), the input impedance for the first ~600mV (+/-) of input voltage would be ~50 ohms (100||100||33k), and for input voltages >600mV it would be ~49.2 ohms (100||100||3.3k||33k). (Again, with a transition zone of ~100mV as the diodes turn on.)
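
And a similar sketch for the impedance figures, just evaluating the parallel combinations named above (the network topology is as described earlier in the thread; the ~600mV threshold and the ~100mV transition zone are not modeled here):

  def par(*rs):
      """Equivalent resistance of resistors connected in parallel."""
      return 1.0 / sum(1.0 / r for r in rs)

  # Below the diode threshold (clipper off): source sees 100 || 100 || 33k
  print(par(100, 100, 33e3))         # ~49.9 ohms
  # Above ~600 mV (clipper conducting): R1 = 3.3k also appears in parallel
  print(par(100, 100, 3.3e3, 33e3))  # ~49.2 ohms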

Best regards,

Charles



