Timothy Normand Miller wrote:
> Sorry about the cross-post. We're -><- THIS close to getting OGD1
> done, with artwork in the hands of board makers who are working on
> quotes, and we've discovered a problem that could make the video
> output unacceptable.
>
> We've discovered that the clock generators in the Xilinx FPGA part are
> lousy for generating video clocks. We're seeing like 900 ps of jitter,
> which causes artifacts on DVI monitors at resolutions as low as
> 1280x1024 when the cable gets beyond a certain length. (I don't
> recall all the details.)
This doesn't quite add up. IAC, are we talking about analog or digital
display?
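If it is digital, a quick back-of-the-envelope check shows why 900 ps would hurt. The sketch below assumes the standard VESA 1280x1024@60 pixel clock of 108 MHz and DVI's 10-bit-per-pixel TMDS serialization; the jitter figure is the one quoted above, not a measurement of the OGD1 board itself.

```python
# How 900 ps of clock jitter compares with the TMDS bit period at
# 1280x1024@60 (VESA DMT pixel clock: 108 MHz). Illustrative numbers only.

pixel_clock_hz = 108e6          # 1280x1024@60, VESA DMT timing
tmds_bits_per_pixel = 10        # DVI serializes 10 bits per pixel per channel
bit_rate = pixel_clock_hz * tmds_bits_per_pixel   # 1.08 Gb/s per channel
bit_period_ps = 1e12 / bit_rate                   # ~926 ps

jitter_ps = 900.0               # jitter figure quoted above
fraction_of_bit = jitter_ps / bit_period_ps

print(f"TMDS bit period: {bit_period_ps:.0f} ps")
print(f"900 ps jitter consumes {fraction_of_bit:.0%} of the bit period")
```

In other words, the quoted jitter is nearly a full TMDS bit period, so the receiver's margin is gone once cable losses are added -- consistent with it failing only beyond a certain cable length.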
> One option is to use the clock generators in the Lattice part, but
> even they have like 400 ps of jitter, and they also severely limit the
> range of frequencies we can generate.
IIUC, we have a clock generator for the pixel clock -- meaning that the
frequency will have to be changed for different formats, but while a
format is running the frequency is constant. The two channels, though,
may be running at different frequencies.
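To make that concrete: the pixel clock for a format is just the total (active plus blanking) pixels per frame times the refresh rate. The totals below are the VESA DMT figures for two common modes; treat them as illustrative, since OGD1 may use different blanking.

```python
# Sketch: the pixel clock is fixed while a given format runs, and is
# derived from the total (active + blanking) frame timing.

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame x frames per second."""
    return h_total * v_total * refresh_hz

formats = {
    "640x480@60":   (800, 525, 60),     # DMT totals: ~25.2 MHz (nominal 25.175)
    "1280x1024@60": (1688, 1066, 60),   # DMT totals: ~108 MHz
}

for name, (ht, vt, hz) in formats.items():
    print(f"{name}: {pixel_clock_hz(ht, vt, hz) / 1e6:.1f} MHz")
```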
On the surface of it, you don't need a new clock generator so much as a
way to de-jitter the pixel clocks you already have. A cleanup PLL on
each channel should do this: the phase jitter will average out -- it is
just a matter of having a sufficiently stable VFO and a low enough
corner frequency (F0) in the loop filter.
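A toy simulation of that averaging argument: model the cleanup PLL as a first-order low-pass acting on the input clock's phase error, sampled once per edge. White jitter above the loop corner F0 is attenuated; the VFO's own noise is ignored. The parameter values (108 MHz clock, 50 kHz corner) are made up for illustration, not a proposed design.

```python
# Toy model: first-order phase low-pass as a stand-in for a cleanup PLL.
import math
import random

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

random.seed(0)
fs = 108e6            # one phase sample per input clock edge
f0 = 50e3             # loop-filter corner frequency, well below fs
alpha = 2 * math.pi * f0 / fs   # one-pole IIR coefficient (valid for f0 << fs)

sigma_in = 900e-12    # 900 ps RMS white edge jitter on the input
phase_in = [random.gauss(0.0, sigma_in) for _ in range(200_000)]

phase_out = []
acc = 0.0
for p in phase_in:
    acc += alpha * (p - acc)   # low-pass: the VFO tracks only slow phase drift
    phase_out.append(acc)

print(f"input jitter:  {rms(phase_in) * 1e12:.0f} ps RMS")
print(f"output jitter: {rms(phase_out) * 1e12:.0f} ps RMS")
```

With a corner that low, the white component of the jitter drops by well over an order of magnitude; what survives is whatever phase noise sits below F0, plus the VFO noise this toy model leaves out.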
I will look and see what is available. I need to know your specs:
Maximum frequency
Maximum jitter
Does duty cycle of the output clock matter?
Does the phase of the clock matter relative to the input clock?
--
JRT
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)