To the group: I am hoping that someone has some good suggestions for injecting harmonic distortion onto the ac line input of an instrument under test. The nominal input is 120 Vac/60 Hz. The distortion frequencies are 120, 180, 240, and 300 Hz, each at 5% applied harmonic distortion.
The plan is to place the secondary winding of a 10:1 step-down transformer in series with one phase of the instrument under test. The distortion frequency will feed the primary of the transformer. Both frequencies will be generated by variable-frequency ac sources. One source will be offset +/- 0.5 Hz from the exact harmonic frequency in order to sweep through all possible phase relationships over time. I plan to use true-RMS meters to monitor voltages, and I intend to first measure the open-circuit secondary voltage of the transformer in order to determine the exact winding ratio.

Another possible method might be to use a combination of 50-ohm attenuators and a splitter to sum the outputs of two oscillators into the external (50-ohm) input of an ac source.

My question is: will controlling voltage ratios be enough to guarantee 5% distortion, or must I obtain a distortion analyzer and actually measure the output? I hope someone in the group has had experience with this type of test. This test cannot be waived, as the instrument being tested uses the ac waveform as a reference signal.

Thanks in advance for any help.

Scott B. Lacey
[email protected]
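For what it's worth, here is my own back-of-the-envelope check of the numbers (a sketch, not part of the plan above, with the 180 Hz case and a 20 kS/s capture rate chosen as illustrative assumptions): 5% of 120 Vrms is 6 Vrms of harmonic on the series secondary, which through the 10:1 step-down means driving the primary at roughly 60 Vrms. The same snippet then simulates a one-second waveform capture and recovers the distortion ratio from an FFT, which is essentially what a distortion analyzer or a digitizing meter would do to verify the level:

```python
import numpy as np

# Assumed test parameters: 120 Vrms fundamental, 5% injected harmonic at 180 Hz
V_FUND_RMS = 120.0
HARMONIC_HZ = 180.0
DISTORTION = 0.05          # 5% of the fundamental
TURNS_RATIO = 10.0         # 10:1 step-down transformer

# Required harmonic voltage on the secondary, and the primary drive needed
v_harm_rms = DISTORTION * V_FUND_RMS      # 6 Vrms in series with the line
v_primary_rms = v_harm_rms * TURNS_RATIO  # ~60 Vrms into the primary

# Synthesize one second of the composite waveform and verify the
# distortion level from its spectrum, as an analyzer would.
fs = 20000                                # sample rate, Hz (assumed)
t = np.arange(fs) / fs                    # exactly 1 s -> 1 Hz FFT bin spacing
wave = (np.sqrt(2) * V_FUND_RMS * np.sin(2 * np.pi * 60 * t)
        + np.sqrt(2) * v_harm_rms * np.sin(2 * np.pi * HARMONIC_HZ * t))

spectrum = np.abs(np.fft.rfft(wave)) / len(t) * 2  # peak amplitude per bin
fund_pk = spectrum[60]                    # bin 60 = 60 Hz fundamental
harm_pk = spectrum[int(HARMONIC_HZ)]      # bin 180 = injected harmonic

print(f"primary drive: {v_primary_rms:.1f} Vrms")
print(f"measured distortion: {harm_pk / fund_pk:.3%}")
```

The caveat, of course, is that this assumes the transformer and source are ideal; loading of the secondary by the instrument and the source impedance will shift the actual injected level, which is exactly why a direct measurement at the instrument terminals may still be necessary.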

