--------
In message <[email protected]>, Florian Teply 
writes:

>Now, as far as I understand, calibration at first sight is merely a
>comparison between what the meter actually reads and what it is supposed
>to read. As long as the difference between the two is smaller than what
>the manufacturer specifies as maximum error, everything is fine, put
>a new sticker to the instrument and send it back to the owner.

What the sticker really says is that you have credible statistical
reasons to think the meter will be inside spec until the date on
the sticker.

This is why you can go longer between calibrations if you have
the calibration history for the instrument.

If for instance your instrument has been found to show 0.10, 0.15,
0.20, 0.25 and 0.30 over the last five yearly calibrations, then
there is every statistical reason to expect it to show 0.35, 0.40,
0.45 and 0.50 at the next four yearly calibrations, barring any
unforeseen defects or mishaps, and the date for the next calibration
can be chosen accordingly.
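The projection above is just a linear fit and extrapolation; a minimal
sketch, using the example numbers from the text and an assumed
(hypothetical) maximum-error spec of 0.55:

```python
# Yearly calibration readings from the example above.
readings = [0.10, 0.15, 0.20, 0.25, 0.30]

# Average drift per year over the history (linear model).
drift_per_year = (readings[-1] - readings[0]) / (len(readings) - 1)

# Hypothetical manufacturer's maximum-error spec.
spec_limit = 0.55

# Years until the projected reading crosses the spec limit.
years_left = (spec_limit - readings[-1]) / drift_per_year

print(f"drift: {drift_per_year:+.3f}/year, "
      f"~{years_left:.1f} years until out of spec")
```

With these numbers the drift is +0.05/year, so the instrument is
projected to reach 0.55 in about five years, consistent with the
0.35/0.40/0.45/0.50 sequence above.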

If on the other hand its calibration history contains something
like ... +0.25, -0.35 ... you know it can change by 0.6 in one year,
and you may have to pull the date on the sticker in accordingly.

If the instrument has no history and reads 0.35, you will have to
consult the manufacturer's drift specs, project forward to find the
earliest date the instrument could be out of spec, and write a date
on the sticker that is conservative relative to that estimate.
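That worst-case projection is simple arithmetic; a sketch, assuming a
hypothetical drift spec of 0.10/year and the same assumed maximum-error
spec of 0.55:

```python
# Single reading with no calibration history (from the example above).
reading = 0.35

# Hypothetical manufacturer's specs: worst-case drift rate and
# maximum permitted error.
max_drift_per_year = 0.10
spec_limit = 0.55

# Worst case: the instrument drifts at the full spec'd rate
# toward the limit.
earliest_out_of_spec_years = (spec_limit - reading) / max_drift_per_year

print(f"could be out of spec in as little as "
      f"~{earliest_out_of_spec_years:.1f} years")
```

With these assumed numbers the instrument could be out of spec in as
little as two years, so the sticker date should be comfortably inside
that.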

>Background of my questions is me wondering if it would be feasible to
>do the calibration in house instead of sending equipment out for
>calibration.

The biggest advantage of in-house calibration is that you can do it
much more often, and therefore don't need to do it as precisely
as the cal-lab, because the sticker only needs a date some months
ahead.

The second biggest advantage is that you can perform the calibrations
in the target environment, rather than under some artificial
environmental conditions that don't apply in real life.

The third biggest advantage is that the calibration doesn't take
the instruments out of commission for several days due to transport
and scheduling, and they don't get damaged and lost in transit.

The biggest disadvantage is that you need to maintain suitable
cal-standards in-house.

If it is just DC and AC voltage/current/resistance in the audio
range, an HP 3458A will handsomely pay for itself.

Up to a few hundred MHz you can do something similar with
a good vector network analyzer.

In GHz territory it gets nasty.

-- 
Poul-Henning Kamp       | UNIX since Zilog Zeus 3.20
[email protected]         | TCP/IP since RFC 956
FreeBSD committer       | BSD since 4.3-tahoe    
Never attribute to malice what can adequately be explained by incompetence.