Am Sat, 6 Jun 2020 15:58:55 +0000
schrieb Miguel Yepes <[email protected]>:

> But in calibration you have to make some type of corrections, and in
> verification you don't; you just compare to be within the
> manufacturer's ranges. Now, for instance, Tektronix has verification
> manuals to check their equipment; before, they called them calibration
> manuals and included adjustment to meet the manufacturer's
> specifications. Now the user can't adjust easily because most
> adjustments are done via software that is not commercially available.
> 
That's a very popular misconception here: Calibration by itself does
not include adjustment. You can have a calibrated instrument which is
one order of magnitude off, or an instrument which fails manufacturer
specifications for measurement accuracy, and still the calibration is
valid. Also you can have an instrument which meets specification but
is not calibrated. As Ilya mentioned, calibration is essentially just a
quantified comparison to a known good reference, giving you an estimate
of how much the instrument reading is off under well-defined conditions.

Of course performing a calibration requires you to take all known
systematic effects into account, and it would be wise to do the same
for verification, but there's no need at all to touch the instrument.
If you happen to know through calibration that your volt meter has an
offset of 1.8236V in all ranges, there's no need to change what the
instrument displays. You just take this offset into account whenever
you use the value you read. The same goes for all the other types of
corrections: if you know the instrument has a temperature dependency of
-50mV/K, and it has been calibrated at 23°C, but you're doing the
measurements in summer at 28°C, you better add 250mV to your reading to
cancel that temperature drift. And you do exactly the same for a
verification. In the end it doesn't matter if these influences are
taken into account before the instrument displays its reading or before
you write down the measurement result in the report.
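As a minimal sketch of what applying such corrections outside the instrument might look like, here is the example above in Python. The offset (1.8236 V), tempco (-50 mV/K), and calibration temperature (23 °C) are the figures from the text; the function name and the raw reading are illustrative:

```python
# Known systematic effects, determined through calibration (values
# taken from the example in the text above).
CAL_OFFSET_V = 1.8236      # constant offset, present in all ranges
TEMPCO_V_PER_K = -0.050    # temperature dependency: -50 mV/K
CAL_TEMP_C = 23.0          # ambient temperature during calibration

def corrected_reading(raw_v, ambient_c):
    """Remove the known offset and temperature drift from a raw reading."""
    temp_drift_v = TEMPCO_V_PER_K * (ambient_c - CAL_TEMP_C)
    return raw_v - CAL_OFFSET_V - temp_drift_v

# At 28 °C the drift is -50 mV/K * 5 K = -250 mV, so 250 mV is added
# back, exactly as in the text.
print(corrected_reading(10.0000, 28.0))  # → 8.4264
```

The instrument itself is never touched; the correction lives entirely in the report-side arithmetic, so the same raw readings stay comparable over time.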

Personally, I'd rather include these considerations in measurement
reports than adjusting the instrument for two reasons: a) it
demonstrates that you're aware of these effects and can compensate for
them, and b) as Ilya mentioned it doesn't change the instrument. Even
though nowadays I consider argument b) only half valid: it's true that
there's a certain risk of changing the instrument's behaviour by turning
trim pots and the like, thereby invalidating prior knowledge about the
instrument's behaviour. But purely mathematical corrections do not
change the behaviour of the instrument. There's still the issue of
comparing readings taken at different points in time: the same reading
will mean different things before and after an adjustment, while
properly calculated corrections should lead to the same result at any
point in time.

I consider it good measurement practice to not only identify the
sources of error and their respective contributions, but also the
uncertainties associated with these, as it clearly shows where
improvements have the most effect on your quality of measurement.
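A sketch of that practice: combining independent uncertainty contributions in quadrature (the GUM-style root-sum-square) and listing them largest first, so the dominant contributor is obvious. The contributions and their values here are made-up placeholders, not numbers from this thread:

```python
import math

# Illustrative, made-up uncertainty budget for a single voltage reading.
contributions_v = {
    "reference uncertainty": 50e-6,   # 50 µV
    "reading noise":         20e-6,   # 20 µV
    "tempco residual":       80e-6,   # 80 µV
}

# Combine independent contributions in quadrature (root-sum-square).
combined = math.sqrt(sum(u**2 for u in contributions_v.values()))

# Largest contributor first: this is where improvement pays off most.
for name, u in sorted(contributions_v.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {u * 1e6:.0f} µV")
print(f"combined standard uncertainty: {combined * 1e6:.1f} µV")
```

Because the contributions add in quadrature, halving the largest one shrinks the combined uncertainty far more than eliminating the smallest one entirely.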

And if you implement all these corrections outside your instrument,
there's no need to buy a calibration manual as you take the instrument
as is. 

HTH,
Florian
> ________________________________
> From: volt-nuts <[email protected]> on behalf of
> Dr. David Kirkby <[email protected]>
> Sent: Saturday, June 6, 2020 9:19 AM
> To: Discussion of precise voltage measurement
> <[email protected]>
> Subject: Re: [volt-nuts] What if “verification” in metrology?
> 
> On Sat, 6 Jun 2020 at 12:18, Florian Teply <[email protected]> wrote:
> 
> > Funny thing how things work out time-wise: I had a discussion
> > yesterday on the very topic during re-audit for ISO 9001.
> >  
> 
> Yes, it is.
> 
> In basic terms, verification in metrology is a very slimmed down
> > calibration: For a calibration, you essentially check every range of
> > your instrument at usually five or more spots within that range in
> > order to determine accurracy of your instrument in each range.
> > For a verification, you do this only at the spot where you intend to
> > measure. So if you were to measure a nominal 7.2V source, you'd
> > compare the reading of your meter with your, say, known good 7.5V
> > reference instead of doing a full calibration of the meter.
> >  
> 
> 
> > So, in order to determine whether or not your chinese voltage
> > reference meets its specs, you'd check your meter against, say, the
> > well-characterized LTZ1000A you happen to have in your lab.
> >
> > Strictly speaking, you still have to do it as carefully as you
> > would do a real calibration, taking all known effects into account,
> > but it's still much less time-consuming than a full calibration as
> > you check only one single point instead of all possible ranges with
> > five points each.
> >
> > Does this help answer your questions or did I just bring up more
> > questions than answers?
> >  
> 
> Yes. I wish VIM was a bit more explicit about it. A single sentence
> just does not do it justice.
> 
That's true; it also took me some time and many discussions with
colleagues to wrap my head around the concept. Verification can be a
pretty powerful tool to save cost: at work we use it to significantly
extend the calibration periods of our test equipment. And it helps keep
measurement uncertainty in check.

Florian

_______________________________________________
volt-nuts mailing list -- [email protected]
To unsubscribe, go to 
http://lists.febo.com/mailman/listinfo/volt-nuts_lists.febo.com
and follow the instructions there.