All of the following is conjecture, based on the dielectric strength equipment
that I used in 1961 working at UL. We had a large unit, about 24 inches square
and 36 inches high, all enclosed in a wooden enclosure, mounted on wheels so
that it could be wheeled about to wherever the test needed to be done, and with
an AC power cord so that we could plug it into the wall. It had a large wheel
in front (about 10 to 12 inches in diameter) and a single-pole circuit breaker,
also in front, both at about 2 1/2 feet off the floor so that they were easy to
get to. On the top rear, there was a panel with a manual (crank-type) 60-second
timer, and an outlet receptacle for a voltmeter to plug into. I think that
there was also an outlet for the equipment to be plugged into, but I can't be
positive about that; there might have been cables out the back.
When it was time to run the hipot test, we wheeled the unit over to the
equipment, plugged in a voltmeter, plugged in the equipment under test, plugged
the tester into the wall, and turned the circuit breaker on. Then we would turn
the wheel until the voltmeter read 900 volts (the test voltage at the time),
and when it was reached, we would turn the timer to 60 seconds. At the end of
the 60 seconds, the timer would turn the circuit breaker off. The circuit
breaker did not have a current rating on it, but I did ask and was told that it
was set for about 6 to 10 mA. It was a very touchy circuit breaker, and if
there was any breakdown it would trip.
The way that the test is described in the standards follows the equipment that
I used in '61; at that time the unit was already old, and the standards had
been in existence for decades. That pretty much follows the way that many
standards are written: first you find a method of testing that works, then you
document the method. But I digress.
The reason for the 500 VA transformer requirement is that that was the size of
the transformer in the test unit. The reason the test is AC is that the unit
could be plugged into the wall, and at that time there was no need for DC
(remember, this was before transistors). The reason there is no clearly
defined failure criterion is that we relied on a circuit breaker to trip, and
the breaker was not marked with a rating. The reason for requiring a voltage
indicator is that we had to plug one into the unit. The reason for one
minute is that the timer we used ran for 60 seconds.
Maybe someone even older than me can fill us in on exactly where the 900 volts
came from. Maybe Reed Keyes at Santa Clara would know. Wally Wedikind would
have known, but he's long gone.
Gabriel Roy
Hughes Network Systems
MD
The opinions expressed are those of Jim Eichner's invisible friend.
------ snip ------
Jon Bertrand wrote:
I'm a little off topic here:
Does anyone know the "history" of hipot testing?
Say someone tells you to "hipot" your cable or wire harness at
1000VAC, measure the current, and decide if it's low enough. Where
did 1000VAC come from, and why is it AC?
I'm guessing it all started in World War Two, building B-17s and such.
Since AC was easy to make (you'd just use a transformer), people
probably picked it for the supply voltage. I'd also guess that 0.1 to
5.0 mA was the common current range because it was the lowest amount
your meter could measure (and it was just low enough not to kill you).
Does this sound correct?
Anybody know for sure?
Jon Bertrand
[email protected]