EMC Group Members:

The following questions came to mind after looking at the HDMI electrical
interface specifications. Especially, questions regarding the potential of
EMI generation for the technique that is specified in HDMI 1.2, assuming
1.3 is similar.

The spec looked like it could cause problems: instead of matching rise
and fall times, it leaves tr and tf VERY unbalanced, at least
potentially, i.e., rise time < 75 ps and fall time < 0.4 Tbit (where
Tbit is the bit period). Doesn't that yield a huge spike of common mode
at each transition?
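To put a rough number on that spike, here is a back-of-envelope sketch. The 1.65 Gbps rate, 0.5 V swing, and linear-ramp edges are my assumptions for illustration, not values taken from the spec text above:

```python
# Back-of-envelope: common-mode glitch when a differential pair's
# rising and falling edges have unequal slew times.
# Assumed (not from the spec quoted above): 1.65 Gbps line rate,
# 0.5 V single-ended swing, linear ramps, tr = 75 ps, tf = 0.4 * Tbit.

t_bit = 1.0 / 1.65e9          # ~606 ps bit period
tr = 75e-12                   # fast rising edge
tf = 0.4 * t_bit              # slow falling edge, ~242 ps
vswing = 0.5                  # volts, single-ended swing

def ramp(t, t_edge, rising):
    """Linear 0->1 (or 1->0) edge of duration t_edge."""
    x = min(max(t / t_edge, 0.0), 1.0)
    return x if rising else 1.0 - x

# Sweep across one transition; one leg rises in tr, the other falls in tf.
n = 1000
peak_cm = 0.0
for i in range(n + 1):
    t = (i / n) * tf          # cover the full duration of the slower edge
    vp = vswing * ramp(t, tr, rising=True)
    vn = vswing * ramp(t, tf, rising=False)
    cm = (vp + vn) / 2.0      # instantaneous common-mode voltage
    peak_cm = max(peak_cm, abs(cm - vswing / 2.0))

print(f"peak common-mode excursion ~= {peak_cm * 1000:.0f} mV "
      f"(of a {vswing * 1000:.0f} mV swing)")
```

Under those assumptions the common-mode excursion peaks around 170 mV, i.e., a large fraction of the signal swing, at every edge where the two legs slew at the worst-case mismatched rates.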

It was my understanding from reading the spec that the transmitter
consists of a pair of current switches to shield ground, one of which
is always on, much like driving the cable with ECL logic, except that
in the HDMI transmitter the output impedance is kept as high as
possible, approaching infinity.  I realize using current switches may
give you a 6 dB larger signal than LVDS for the same power supply,
but...here's the big question: wouldn't steering current through one
conductor or the other have more potential for EMI generation, given
physical-reality limitations, than if the same cabling were driven
using LVDS with matched impedances?  And can one only drive over
shorter distances using current switches?  Plus, there has to be
ringing: after all, you are only load-terminating the line, never
source-terminating it.  Shouldn't long lines be terminated at both
ends?
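The ringing concern can be made concrete with reflection coefficients. The impedance values below are illustrative assumptions, not HDMI spec numbers:

```python
# Sketch: why source termination matters. With a near-ideal current
# source (very high output impedance) the source-end reflection
# coefficient approaches +1, so any energy reflected by an imperfect
# load bounces back and forth, decaying only by the load mismatch.
# The impedance values here are illustrative assumptions.

z0 = 50.0        # line impedance, single-ended (ohms)
z_load = 55.0    # slightly mismatched load termination
z_src = 5000.0   # current-switch output, nearly open

gamma_l = (z_load - z0) / (z_load + z0)   # load reflection coefficient
gamma_s = (z_src - z0) / (z_src + z0)     # source reflection, ~ +1

print(f"gamma_load = {gamma_l:+.3f}, gamma_source = {gamma_s:+.3f}")

# Amplitude of the k-th re-reflection arriving back at the load,
# relative to the incident step:
for k in range(1, 4):
    echo = (gamma_l * gamma_s) ** k
    print(f"round trip {k}: residual {echo:+.4%}")

# With a matched (source-terminated) driver, gamma_s ~ 0 and the first
# reflection is absorbed at the source; the ringing terms above vanish.
```

With a high-impedance driver, each echo decays only by the small load mismatch, so the line rings; a source-terminated driver would absorb the first reflection outright.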

Also, consider a standard 100 ohm differential shielded cable.  It is
my understanding that most 100 ohm 'differential' cables barely make it
to 75 ohm from each conductor to the shield, with 300 ohms between the
two conductors.  I believe some of the Belden balanced-twisted-pair
models show 61 ohm, 61 ohm, and only 560 ohm between.  Is it even
possible to get better 'forgiveness' on the balance, say 100 ohm, 100
ohm, and 200 ohm between them?
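If I read those figures through a simple three-impedance model (each conductor to shield, plus a direct conductor-to-conductor path, with the two to-shield paths appearing in series through the shield for differential drive), all three combinations land near 100 ohm differential. The model itself is my assumption:

```python
# Three-impedance cable model: z1 and z2 from each conductor to the
# shield, z12 directly between the conductors. For differential drive,
# the to-shield paths are in series (return via the shield), and that
# series path is in parallel with z12.

def zdiff(z1, z2, z12):
    series = z1 + z2
    return series * z12 / (series + z12)

for z1, z2, z12 in [(75, 75, 300), (61, 61, 560), (100, 100, 200)]:
    print(f"{z1}/{z2}/{z12} ohm -> Zdiff = {zdiff(z1, z2, z12):.1f} ohm")
```

All three come out at roughly 100 ohm differential, so the differential spec alone does not distinguish a tightly coupled pair from a loosely coupled one; the split between to-shield and between-conductor impedance is what varies.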

Now consider a cable unbalanced by manufacturing tolerances.  Any
unbalance in the cable will exacerbate EMI generation, because the
shield current is the difference of two large numbers that should
cancel to zero.  In other words, driving a slightly off-balance pair of
conductors CAUSES EMI when driven with LVDS.  But is that effect much
smaller when driving with switched currents, where only the voltage
levels become unbalanced?
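Quick numbers for the "subtracting large numbers to get zero" point, with illustrative values of my choosing:

```python
# Voltage drive into unequal to-shield impedances: each leg pushes
# v / z into its path, so a few percent of impedance imbalance leaves
# a residual (common-mode) shield current. An ideal current switch
# forces the leg currents equal by construction.
# Values below are illustrative assumptions.

v = 0.25             # volts per leg
z1, z2 = 75.0, 71.0  # slightly unbalanced to-shield impedances (ohms)

i1 = v / z1
i2 = v / z2
residual = abs(i1 - i2)            # net shield (common-mode) current
print(f"leg currents: {i1 * 1e3:.2f} mA vs {i2 * 1e3:.2f} mA")
print(f"residual CM current: {residual * 1e6:.0f} uA "
      f"({residual / i1:.1%} of one leg)")

# An ideal current switch sets i1 = i2 exactly, so this particular
# residual goes to zero -- though the node voltages then become
# unbalanced instead, as noted above.
```

A ~5% impedance imbalance leaves a residual current of the same order, flowing on the shield, which is exactly the common-mode antenna current that generates EMI.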

Any papers covering this in detail out there?

What happens when you go through a connector, where the structure
usually becomes 50 ohm to shield, 50 ohm to shield, and essentially no
controlled impedance between the conductors during the transition?
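One way to see the trouble: through such a connector the differential impedance can stay near 100 ohm, but the common-mode impedance steps, reflecting common-mode energy. A sketch, using the 75 ohm-to-shield cable figure mentioned above and treating the to-shield paths as parallel for common mode:

```python
# Common-mode impedance step at a connector. For common-mode
# excitation the two to-shield impedances appear in parallel.
# Cable value (75 ohm to shield) and connector value (50 ohm to
# shield) are taken as illustrative assumptions.

def z_parallel(a, b):
    return a * b / (a + b)

z_cm_cable = z_parallel(75.0, 75.0)   # 37.5 ohm in the cable
z_cm_conn = z_parallel(50.0, 50.0)    # 25.0 ohm in the connector

gamma_cm = (z_cm_conn - z_cm_cable) / (z_cm_conn + z_cm_cable)
print(f"common-mode step {z_cm_cable:.1f} -> {z_cm_conn:.1f} ohm, "
      f"reflection coefficient {gamma_cm:+.2f}")
```

So even a connector that looks transparent differentially can present a 20% common-mode reflection at each end, trapping common-mode energy in the cable.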

Has anyone compared the two driving techniques, current switching
versus LVDS?

Has anyone compared the maximum distance obtainable for the two techniques?

Regards,
Robert

-

This message is from the IEEE Product Safety Engineering Society emc-pstc
discussion list. To post a message to the list, send your e-mail to
<[email protected]>

All emc-pstc postings are archived and searchable on the web at:
http://www.ieeecommunities.org/emc-pstc
Graphics (in well-used formats), large files, etc. can be posted to that URL.

Website:  http://www.ieee-pses.org/
Instructions:  http://listserv.ieee.org/request/user-guide.html
List rules: http://www.ieee-pses.org/listrules.html

For help, send mail to the list administrators:
Scott Douglas <[email protected]>
Mike Cantwell <[email protected]>

For policy questions, send mail to:
Jim Bacher:  <[email protected]>
David Heald: <[email protected]>
