I think the 10 Meg default became a de facto standard in the days of VTVMs (vacuum-tube voltmeters): it was a convenient value that reduced input-circuit loading while remaining compatible with the grid current of the input triode. Designers of early solid-state voltmeters simply decided not to change a good thing.
Just my $0.02 worth!

Joel Setton


On 10/04/2014 18:55, Steven J Banaska wrote:
As Tom said, the 10M input impedance is used for the high-voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high voltages
without much drift. Caddock THV or HVD networks are fairly common in precision DMMs.
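To make the 100:1 ratio of that 9.9M/100k divider concrete, here is a minimal sketch (the function name and values are mine, just illustrating the two-resistor arithmetic, not any particular meter's implementation):

```python
# Sketch of the 10M input divider described above: 9.9M on top,
# 100k on the bottom, measured at the 100k tap.
R_TOP = 9.9e6   # 9.9 Mohm high-voltage leg
R_BOT = 100e3   # 100 kohm tap leg (R_TOP + R_BOT = 10 Mohm total)

def divider_output(v_in):
    """Voltage at the 100k tap for a given input voltage."""
    return v_in * R_BOT / (R_TOP + R_BOT)

# 1000 V at the input appears as 10 V at the tap: a 100:1 attenuation
# that keeps the high voltage off the sensitive input circuitry.
print(divider_output(1000.0))
```

The same 10M string sets the input resistance the source sees, which is why the divider and the input impedance are really one and the same component.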

Typically you will find a high-impedance (10G) path that can be used for
the 10V and lower ranges, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As you
mentioned, there can be an accuracy sacrifice when your source has a high
output impedance. I'm not sure why 10M is the default, other than that it
may extend the life of the relay that switches the 10M divider in or out.
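The accuracy sacrifice mentioned above is ordinary loading error: the meter's finite input resistance forms a divider with the source's output resistance. A rough sketch of the effect (my own example numbers, not from any specific instrument):

```python
# Loading error: the meter input resistance forms a divider with the
# source's output resistance, so the meter reads slightly low.
def measured_voltage(v_source, r_source, r_meter):
    """Voltage actually seen by the meter (two-resistor divider)."""
    return v_source * r_meter / (r_source + r_meter)

# A 1 V source with 10 kohm output resistance:
v_10M = measured_voltage(1.0, 10e3, 10e6)  # 10M input: reads ~0.1 % low
v_10G = measured_voltage(1.0, 10e3, 10e9)  # 10G input: ~1 ppm low
print(v_10M, v_10G)
```

This is why the high-impedance (10G) path matters for volt-nut work on the low ranges, while the 10M default is harmless for stiff, low-impedance sources.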

Steve



_______________________________________________
volt-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.
