>-----Original Message----- 
>From: Gordon,Ian [mailto:ian.gor...@bocedwards.com] 
>Sent: Monday, August 11, 2003 8:33 AM 
>To: 'IEEE EMC-PSTC GROUP' 
>Subject: Temperature effects on conducted emissions? 
> 
> 
> 
>All 
>Can anyone suggest a means by which the indicated signal from a LISN + 
>transient limiter + receiver combination can result in a 30 dB change 
>over one month? I used the same "reference source" and test configuration 
>on both occasions to generate emissions. However, the source is merely a 
>piece of standard equipment and not intended as a calibration reference. 
>The temperature has varied considerably over the last month, but I would 
>not have thought this could result in a 30 dB variation. 
>Alternatively, can anyone suggest a means of constructing a reference 
>source which may be connected to the LISN input? 
>Thanks 
>Ian Gordon 
> 



Ian: 


You didn't say whether the latest measurements were higher or lower than the
original ones. Either way, 30 dB is a huge variation (a factor of more than 30
in voltage, or 1000 in power), and room-temperature shifts shouldn't cause
anything like that.

You need to build confidence in your setup. 

First, verify the cal of your analyzer. The easiest way is to use the front-panel
cal output at, say, 100 MHz or 300 MHz; alternatively, inject a signal from a
signal generator into the analyzer input. One possible thermal effect is that you
have fried the analyzer's input attenuator.
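
If you want a quick sanity check on the numbers, here's a rough Python sketch.
The -20 dBm cal level and the 0.5 dB tolerance are only assumptions; use your
analyzer's actual cal-output level and whatever acceptance limit your lab applies.

    # Sketch: compare the analyzer's marker reading against its cal-output level.
    # The cal level and tolerance are assumptions -- use your analyzer's values.

    def dbm_to_dbuv(dbm: float) -> float:
        """Convert dBm to dBuV in a 50-ohm system (0 dBm = 107 dBuV)."""
        return dbm + 107.0

    CAL_OUTPUT_DBM = -20.0   # assumed cal-output level; check your analyzer's spec
    TOLERANCE_DB = 0.5       # assumed acceptance limit

    measured_dbuv = 86.6     # example marker reading at the cal frequency
    expected_dbuv = dbm_to_dbuv(CAL_OUTPUT_DBM)   # -20 dBm -> 87 dBuV

    error_db = measured_dbuv - expected_dbuv
    print(f"expected {expected_dbuv:.1f} dBuV, read {measured_dbuv:.1f} dBuV, "
          f"error {error_db:+.1f} dB")
    if abs(error_db) > TOLERANCE_DB:
        print("Out of tolerance -- suspect the input attenuator or the cal.")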

Now, put the limiter on, and repeat the above. 

Now inject into the head end of your coax. You should be down only by the
expected coax loss. 

Now, inject into the input of the attenuator that you normally use on the
LISN. You should be down by 10 dB. 
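
To keep the bookkeeping straight as you move the injection point along the chain,
here's a rough Python sketch of the expected cumulative levels. The generator
level, coax loss, and limiter insertion loss are placeholder assumptions; plug in
your own measured or data-sheet numbers. The 10 dB attenuator figure is the one
from your setup.

    # Sketch: expected indicated level at each injection point in the chain.
    # Generator level, coax loss, and limiter loss are assumptions;
    # substitute your own values.

    GEN_LEVEL_DBUV  = 100.0  # assumed signal-generator output level
    LIMITER_LOSS_DB = 10.0   # assumed transient-limiter insertion loss (check its data sheet)
    COAX_LOSS_DB    = 1.5    # assumed cable loss at the test frequency
    ATTEN_DB        = 10.0   # the attenuator normally used on the LISN

    checkpoints = [
        ("analyzer input (direct)",   0.0),
        ("through limiter",           LIMITER_LOSS_DB),
        ("head end of coax",          LIMITER_LOSS_DB + COAX_LOSS_DB),
        ("input of 10 dB attenuator", LIMITER_LOSS_DB + COAX_LOSS_DB + ATTEN_DB),
    ]

    for name, loss in checkpoints:
        print(f"{name:28s} expect about {GEN_LEVEL_DBUV - loss:6.1f} dBuV "
              f"({loss:.1f} dB below the generator)")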

Last, check your LISN. That's relatively easy. Assuming that you are using an
LISN that works from 9 kHz to 10 MHz or so, just connect a 50-ohm signal
source to the LISN power output terminal and the case ground. (You did
disconnect the LISN input power?) Also connect a high impedance oscilloscope
probe to this point. Now connect the second oscilloscope channel to the LISN
signal output port, using the 50-ohm termination option in the oscilloscope.
(I use a Tek TDS640A.) Inject enough RF signal to get a nominal value on
Channel 1, typically 1 Vrms. Now look at the Channel 2 50-ohm signal port
level. It should also be almost exactly 1 Vrms, except for the frequency range
below about 100 kHz.

If this reading is significantly lower than it should be, then you are seeing
a bad coupling capacitor in the LISN. If too high, then a blown LISN resistor
is the cause.

BTW, as you sweep the injected signal down towards 9 kHz, you should see a
rise in the signal-port loss, reaching about 5 dB at 9 kHz. This is normal, and
something that you should have been adding back into your acquired data. (It's
the typical loss associated with the RC voltage divider; the loss becomes very
high at the power frequency, and infinite at DC.)
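
If you want to see where that 5 dB comes from, here's a small Python sketch of
the divider loss, assuming a 0.25 uF coupling capacitor into the 50-ohm measuring
port (typical of a 50 uH / 50 ohm LISN; check your unit's schematic, since a
0.1 uF cap would move the corner frequency up).

    # Sketch: loss of the LISN coupling capacitor into the 50-ohm measuring port.
    # Assumes a 0.25 uF series coupling capacitor -- check your LISN's schematic.
    import math

    C = 0.25e-6    # coupling capacitance, farads (assumption)
    R = 50.0       # receiver / termination resistance, ohms

    def divider_loss_db(freq_hz: float) -> float:
        """Loss of the series-C, shunt-R high-pass relative to the injected level."""
        xc = 1.0 / (2.0 * math.pi * freq_hz * C)
        return 20.0 * math.log10(math.sqrt(R**2 + xc**2) / R)

    for f in (9e3, 20e3, 50e3, 100e3, 1e6):
        print(f"{f/1e3:8.0f} kHz : {divider_loss_db(f):4.1f} dB loss")

    # This works out to roughly 5 dB at 9 kHz, falling to well under 1 dB
    # above about 100 kHz, which matches the behavior described above.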

The last things to consider are the attenuator and the limiter. Either of these
may have been subjected to extreme physical shock (as in, "I dropped it on the
concrete floor"), which could cause an intermittent problem. You might want to
tap or slap them while you watch the signal loss through them. Also try wiggling
and moderately pulling on the various coax fittings and connectors.

Regards, 

Ed 


Ed Price 
ed.pr...@cubic.com         WB6WSN 
NARTE Certified EMC Engineer & Technician 
Electromagnetic Compatibility Lab 
Cubic Defense Systems 
San Diego, CA  USA 
858-505-2780  (Voice) 
858-505-1583  (Fax) 
Military & Avionics EMC Is Our Specialty 