Timothy Normand Miller wrote:
On 11/29/07, James Richard Tyrer <[EMAIL PROTECTED]> wrote:
This doesn't quite add up.  IAC, are we talking about analog or digital
display?

Digital.  Analog has visible problems at much higher dot clocks.  I
haven't seen it myself, but reportedly, what you see is some
medium-frequency jitter that causes vertical lines to wiggle
horizontally so that they move on and off center of the aperture
grill, making it look like it's twinkling.

For digital, it's a clock-recovery problem.  The receiver has a PLL
that it's using to rephase its clock so it can sample the data.  It
can't keep the clock in line with the data, and since the data is
coming in at like a GHz, you don't have to be far off to sample at a
transition.
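[A back-of-the-envelope check of how tight that margin is. The numbers here are illustrative assumptions, not measurements: single-link DVI serializes 10 bits per pixel per channel, so a 165 MHz pixel clock gives a 1.65 Gbps bit stream and a unit interval of roughly 600 ps -- against which the ~400 ps jitter figure mentioned below is enormous.]

```python
# Back-of-the-envelope jitter budget for a DVI/TMDS link.
# All numbers are assumptions for illustration, not measured values.

pixel_clock_hz = 165e6           # single-link DVI maximum pixel clock
bits_per_pixel = 10              # TMDS serializes 10 bits per pixel per channel
bit_rate = pixel_clock_hz * bits_per_pixel   # 1.65 Gbps
unit_interval_s = 1.0 / bit_rate             # ~606 ps per bit

clock_jitter_s = 400e-12         # the ~400 ps figure quoted for the FPGA clock generator

print(f"bit rate: {bit_rate / 1e9:.2f} Gbps")
print(f"unit interval: {unit_interval_s * 1e12:.0f} ps")
print(f"jitter consumes {100 * clock_jitter_s / unit_interval_s:.0f}% of the eye")
```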

So, basically, the clock jitter is making the TMDS output so dirty that the receiver can't recover a good clock once the eye closes a little due to a long cable.  I really hate it when that happens.

The hard question is whether the data we are getting is clean enough that it will transmit correctly if the TMDS transmitter has a better clock.

One option is to use the clock generators in the Lattice part, but
even they have like 400ps of jitter, and they also severely limit the
range of frequencies we can generate.
IIUC, we have a clock generator for the pixel clock -- meaning that the
frequency will have to be changed for different formats, but while
running it will have a constant frequency.  However, the two channels
will have different frequencies.

On the surface of this, it appears that you don't need to use a new
clock generator, but rather de-jitter the pixel clocks which you have.

We could take our output clock and dejitter it through a PLL, yes.
The problem is that our data is being produced on the jittery clock,
and you can't dejitter all of those signals.

Or we could feed the jittery clock out, through a PLL, and then back
in to drive everything.  That would work.

On the surface of this, it appears that a PLL for each channel would do
this.  The phase jitter will average out -- it is just a matter of
having a sufficiently stable VCO and a low enough corner frequency (F0)
for the loop filter.
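[A rough sense of how much cleanup a loop filter buys. This models the PLL's input-jitter transfer as a simple one-pole low-pass -- a textbook first-order approximation, not a claim about any particular part. The corner frequency and jitter frequency below are assumptions for illustration.]

```python
import math

# First-order model of how a PLL loop filter averages out phase jitter:
# phase-noise components above the loop corner frequency F0 are rolled
# off roughly like a one-pole low-pass filter.

def jitter_attenuation_db(f_jitter_hz, f0_hz):
    """Approximate attenuation of an input phase-jitter component at
    f_jitter_hz, for a loop corner at f0_hz (one-pole model)."""
    h = 1.0 / math.sqrt(1.0 + (f_jitter_hz / f0_hz) ** 2)
    return 20.0 * math.log10(h)

# Example: a 10 kHz loop corner vs. jitter components at 1 MHz.
print(f"{jitter_attenuation_db(1e6, 10e3):.1f} dB")
```

The design tension is visible in the formula: the lower F0 goes, the more input jitter is filtered, but the more the loop depends on the VCO's own stability inside the loop bandwidth.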

I will look and see what is available.  I need to know your specs:

        Maximum frequency

The frequency range is huge, although we can divide by integers in the
FPGA without introducing jitter.  Max is like 300MHz.

Perhaps 330MHz IIRC.
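[A quick sketch of why jitter-free integer division alone doesn't solve the format problem. The base clock and target pixel clocks below are assumptions picked for illustration: standard VESA pixel clocks are generally not integer divisions of one fixed base frequency, which is why the generator's limited frequency range matters.]

```python
# Which standard pixel clocks can an integer divider hit exactly from
# one fixed base clock?  Base and targets are illustrative assumptions.

base_hz = 330e6  # "perhaps 330MHz IIRC"
targets_hz = {
    "VGA 640x480@60":    25.175e6,
    "XGA 1024x768@60":   65e6,
    "SXGA 1280x1024@60": 108e6,
}

for name, f in targets_hz.items():
    div = base_hz / f
    exact = abs(div - round(div)) < 1e-9
    print(f"{name}: /{div:.3f} -> {'exact' if exact else 'not an integer divide'}")
```

None of these comes out as an integer, so the pixel-clock generator itself has to retune per format, as described above.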

        Maximum jitter

Howard will have to give you a good answer on that.  Shoot for 100ps.
I'm just guessing.

        Does duty cycle of the output clock matter?

Yes, because we drive the DVI transmitters DDR.

Basically, that means that you need to run the VCO at twice the output frequency and divide by two with a flip-flop, since the VCO isn't going to have a 50% duty cycle on its own.
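[Why the flip-flop trick works: a toggle flip-flop switches on every rising edge of the 2x clock, so its output period is exactly two input periods and its duty cycle is 50% no matter how asymmetric the VCO waveform is. A toy event-based sketch, not RTL:]

```python
# Toy model of a divide-by-two toggle flip-flop.  Only the rising-edge
# times of the 2x clock matter; the input duty cycle is irrelevant.

def divide_by_two(rising_edge_times):
    """Return (time, q) pairs: the divided clock's value after each
    rising edge of the 2x input clock."""
    q, transitions = 0, []
    for t in rising_edge_times:
        q ^= 1                      # toggle on every input rising edge
        transitions.append((t, q))
    return transitions

# Evenly spaced rising edges (here every 10 ns, i.e. a 100 MHz input)
# give a 50 MHz output that is high for exactly half of each period,
# even if the input waveform itself had a 30/70 duty cycle.
print(divide_by_two([0, 10, 20, 30, 40]))
```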

        Does the phase of the clock matter relative to the input clock?

Not really.  I think the way we generally output a clock is to use a
DDR flipflop in the IOB that has its inputs tied to 0 and 1.  Then the
clock comes out of its pin exactly in phase with the data.  Of course,
sometimes, we want the clock to be slightly behind so that the edge is
in the middle of the data eye, so that's a concern.  One trick,
perhaps, is to lower the slew rate on the clock, but you can only do
that so much before you don't get a proper clock.

Probably not relevant, since PLL ICs with state-machine phase-frequency detectors are now common.

NOTE: I am not having a good day today but will try to look for ICs tomorrow.

--
JRT
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)