At 11:39 PM 8/10/2011, Jed Rothwell wrote:
Abd ul-Rahman Lomax <a...@lomaxdesign.com> wrote:
The maximum error in the actual measurement, then, will be +/- 0.1
degree, plus a little, so that it *might* be off by another digit
under some circumstances. I.e., suppose the calibration reads 100.0,
but the internals of the meter are saying 100.0499. We then have a
systematic error of -0.0499 degree. Then we go to measure a
temperature of 100.0998 degrees. The meter will "read" 100.0499,
rounding down to 100.0: an error of almost 0.1 degree.
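The rounding scenario above can be sketched in a few lines. This is my own illustration, not anything from the posts: the helper name `display` and the round-to-nearest rule are assumptions, and real meters may truncate or round differently.

```python
# Illustrative sketch (assumed rounding behavior): how a meter with a
# fixed 0.1-degree display can hide almost a full digit of error.

def display(internal_value, step=0.1):
    """Round an internal reading to the display resolution (0.1 degree)."""
    return round(internal_value / step) * step

# Calibration: true temperature is 100.0, internals say 100.0499.
# The display shows 100.0, so the -0.0499 degree offset is invisible.
assert display(100.0499) == 100.0

# Now measure a true temperature of 100.0998 degrees.
# With the -0.0499 degree systematic error, the internals read 100.0499 ...
internal = 100.0998 - 0.0499
shown = display(internal)    # ... which again displays as 100.0
error = 100.0998 - shown     # almost 0.1 degree of total error
print(shown, round(error, 4))
```

The point is that the hidden offset and the display rounding stack up, so the worst-case error is nearly a full display digit even though calibration "looked" perfect.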
Good point. On a meter with a fixed display, you cannot calibrate
any finer than the last digit displayed, minus a tad. McKubre can
calibrate RTDs (I think they are) to a fraction of a degree because
he is looking at a computer screen with as many digits as you like.
Thanks, Jed. You can display as many digits as you want, but the
limiting factor for resolution will be the resolution of the A/D
converter in the data capture device. It can get complex. I'll have
to deal with the resolution of the A/D converter in the LabJack,
unless I build or use an external amplifier. The signal itself from
the thermocouple is analog, so theoretical resolution is infinite;
however, there is also noise to consider. By averaging many readings,
noise can effectively be cancelled....
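The averaging point can be demonstrated with a quick simulation. This is my own sketch with made-up numbers: the 0.05-degree noise level is an assumed figure, not a LabJack or thermocouple specification.

```python
# Sketch (assumed noise figures): averaging N readings shrinks random
# Gaussian noise by roughly sqrt(N).
import random
import statistics

random.seed(1)
TRUE_TEMP = 100.0
NOISE_SD = 0.05   # assumed per-reading noise, in degrees

def reading():
    """One simulated noisy thermocouple sample."""
    return TRUE_TEMP + random.gauss(0.0, NOISE_SD)

singles = [reading() for _ in range(1000)]
averages = [statistics.mean(reading() for _ in range(100))
            for _ in range(1000)]

# Scatter of single readings vs. 100-sample averages; the averaged
# scatter should come out roughly sqrt(100) = 10 times smaller.
print(statistics.stdev(singles))
print(statistics.stdev(averages))
```

Of course this only cancels *random* noise; a systematic offset like the calibration error discussed above survives any amount of averaging.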
If they wanted to really know the pressure accurately, and the true
temperature behavior, they'd need to use something more sophisticated
than what they did.
The value in all this is in preparing for truly conclusive
demonstrations. Being thorough in understanding errors and possible
errors in the early demonstrations is an important part of this.