I built a digital thermometer 15 years ago with a transistor as the
probe. I calibrated it in Celsius: ice water for 0°C and boiling
water for 100°C. It worked very well.
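The two-point calibration described above can be sketched in a few lines: take a raw reading in ice water and one in boiling water, then map every later reading onto the straight line through those two points. The raw ADC counts below are made-up examples, not values from the actual device.

```python
# Two-point linear calibration: record the raw sensor reading at two
# known temperatures, then interpolate linearly between them.
# The raw counts are hypothetical, for illustration only.

RAW_ICE = 512      # hypothetical raw reading in ice water (0 C)
RAW_BOIL = 1820    # hypothetical raw reading in boiling water (100 C)

def to_celsius(raw):
    """Map a raw reading onto the line through the two calibration points."""
    return (raw - RAW_ICE) * 100.0 / (RAW_BOIL - RAW_ICE)

print(to_celsius(512))    # 0.0
print(to_celsius(1820))   # 100.0
```

This assumes the transistor's response is linear over the range, which is roughly true for a diode-connected transistor junction and is presumably why the two fixed points were enough.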
On Sat, 2006-03-18 at 20:20 +0000, Martin Vlietstra wrote:
> I did an interesting experiment today.  I have a digital thermometer with a
> probe on the end of a three-metre cable.  The thermometer also has a C/F
> switch.
> 
> I set the thermometer to °C and held the probe between my thumb and finger.
> It registered a temperature rise.  Once the temperature moved above 30°C,
> the temperature continued to increase in 0.1°C increments, the time between
> each increment becoming larger as the temperature approached body
> temperature.
> 
> I allowed the probe to cool and then repeated the experiment, this time with
> the thermometer switched to °F mode.  Once the temperature moved above 80°F,
> it continued to rise, but the increments followed what, to an outsider,
> would appear to be an illogical sequence - four increments of 0.2°F,
> then one increment of 0.1°F, another four increments of 0.2°F, then
> one increment of 0.1°F, and so on.  The result was that one would get
> five consecutive readings in which the tenths-of-a-degree digit was
> even, followed by five in which it was odd.
> 
> I can see those who understand the technology itching to tell me the reason.
> It is quite simple - the thermometer's electronics are designed around a
> counter (within an ADC) whose value advances whenever the temperature
> rises by 0.1°C.  The C/F switch merely converts the Celsius display
> (already rounded to the nearest 0.1°C) to the closest Fahrenheit value
> thereby giving a further level of approximation.
> 
> I suspect that most digital measuring instruments are designed around metric
> units and that a Customary or Imperial Unit switch merely converts the
> output rather than switching in alternative electronics.  Any comments
> anybody?
> 

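The mechanism Martin describes is easy to verify in software. The sketch below assumes exactly what his explanation says: the counter advances in exact 0.1°C steps, and the °F display is just that Celsius value converted and rounded to the nearest 0.1°F. Working in integer tenths and hundredths keeps the arithmetic exact (the function name is mine).

```python
# Simulate a counter stepping in exact 0.1 C increments, with the F
# display computed by converting and rounding to the nearest 0.1 F.
# All arithmetic is in integer tenths/hundredths of a degree, so there
# is no floating-point rounding to muddy the picture.

def fahrenheit_display(c_tenths):
    """Displayed F reading, in tenths of a degree, for a counter value in 0.1 C."""
    f_hundredths = c_tenths * 18 + 3200   # F*100 = C_tenths*18 + 3200, exactly
    return (f_hundredths + 5) // 10       # round to the nearest 0.1 F

prev = fahrenheit_display(300)            # start the counter at 30.0 C
for c_tenths in range(301, 311):
    cur = fahrenheit_display(c_tenths)
    step = (cur - prev) / 10
    print(f"{c_tenths / 10:.1f} C -> {cur / 10:.1f} F  (step {step:.1f} F)")
    prev = cur
```

Since 0.1°C is 0.18°F, five counter steps cover 0.9°F, and the rounded display has to fit five readings into nine tenths: the run prints four steps of 0.2°F and one of 0.1°F in every five, reproducing the "illogical" sequence and the even/odd alternation of the tenths digit.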