> - I can imagine that LCD displays *need* timings since video chips have
>   a "raster" paradigm, but *why* would timing even matter? Those devices
>   are always-on, "parallel" (as opposed to scan-line) devices, right?
> 
> Seems to me that as long as the LCD displays can pick the video signals
> out of the composite stream, timings/scan-rates shouldn't matter.

There is little difference between an LCD panel and a
monitor when it comes to scanning. BOTH are scanned. The
differences come in the rates. 

Most analogue LCD panels will lock quite happily to most
monitor timings and store the information in a frame store
of some type, so that the more parallel nature of the 'glass'
can be refreshed ( think of it as a large number of thin
monitors stacked on top of one another ). The computer cannot
be expected to generate that larger number of signals,
one for each strip of the display <g>.
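The frame-store idea above can be sketched in a few lines. This is a toy
illustration under my own assumptions (the names, the toy panel size and
the pixel values are all invented): the video arrives as a serial stream
of scan lines, the frame store captures it, and the 'glass' is then
refreshed row-parallel from the stored copy rather than from the live
signal.

```python
WIDTH, HEIGHT = 8, 4  # toy panel size, purely illustrative

def incoming_scanlines(frame_number):
    """Serial source: yields one scan line at a time, top to bottom."""
    for y in range(HEIGHT):
        yield y, [(frame_number + y + x) % 256 for x in range(WIDTH)]

frame_store = [[0] * WIDTH for _ in range(HEIGHT)]

def capture(frame_number):
    """Lock to the serial stream and fill the frame store line by line."""
    for y, line in incoming_scanlines(frame_number):
        frame_store[y] = line

def refresh_glass():
    """Panel side: every row is driven from the store 'in parallel',
    independent of the rate at which the input was scanned in."""
    return [row[:] for row in frame_store]

capture(frame_number=1)
panel = refresh_glass()
```

The point of the split is that `capture` runs at whatever rate the
computer scans out, while `refresh_glass` runs at the panel's own rate.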

As long as the analogue rates are within the range of the
'convertor' there will not be a problem.
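The "within range" test the convertor applies amounts to simple
arithmetic on the mode's totals. A minimal sketch, where the range
limits are hypothetical convertor specs (not figures for any real
panel); the 1024x768 totals used in the example are the standard VESA
65 MHz modeline:

```python
# Hypothetical convertor limits -- placeholders, not a real datasheet.
H_SYNC_RANGE_KHZ = (30.0, 80.0)   # horizontal scan rates it accepts
V_SYNC_RANGE_HZ  = (50.0, 85.0)   # vertical refresh rates it accepts

def rates_from_modeline(pixel_clock_mhz, h_total, v_total):
    """Derive scan rates from a modeline's totals (active + blanking)."""
    h_khz = pixel_clock_mhz * 1000.0 / h_total
    v_hz = h_khz * 1000.0 / v_total
    return h_khz, v_hz

def convertor_can_lock(pixel_clock_mhz, h_total, v_total):
    h_khz, v_hz = rates_from_modeline(pixel_clock_mhz, h_total, v_total)
    return (H_SYNC_RANGE_KHZ[0] <= h_khz <= H_SYNC_RANGE_KHZ[1]
            and V_SYNC_RANGE_HZ[0] <= v_hz <= V_SYNC_RANGE_HZ[1])

# VESA 1024x768@60: 65 MHz clock, 1344x806 totals -> ~48.4 kHz / 60 Hz
print(convertor_can_lock(65.0, 1344, 806))   # True
```

A mode whose derived rates fall outside either window is one the
convertor simply cannot lock to, whatever the panel behind it can do.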

The digital interfaces simply eliminate the RGB analogue-to-digital
conversion, delivering individual pixels for display.
This is where the rates are more critical, as the clock rate
must match the pixel stream; but then the clock comes along with
the data, so there is even less for the framestore to do. In
most cases you will see no difference between an analogue
panel and a digital panel until you get to the high
resolutions.
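Some back-of-envelope arithmetic shows why the high resolutions are
where it bites: the pixel clock the interface must track grows roughly
with total pixels times refresh rate. The ~25% blanking overhead below
is my own round-number assumption for the estimate, not a spec:

```python
def approx_pixel_clock_mhz(width, height, refresh_hz,
                           blanking_overhead=1.25):
    """Rough pixel clock estimate: active pixels x refresh, padded by an
    assumed 25% for horizontal/vertical blanking."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(640, 480), (1024, 768), (1600, 1200)]:
    mhz = approx_pixel_clock_mhz(w, h, 60)
    print(f"{w}x{h}@60: ~{mhz:.0f} MHz pixel clock")
```

Going from 640x480 to 1600x1200 at the same refresh multiplies the
clock the convertor or digital receiver must follow by about six.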

I am writing this looking at an analogue 1024x768 panel, and
when I change resolution or scan rate it usually fills the
screen, unlike the monitor it replaced, where the picture
jumped around. All the normal monitor controls are available,
but the convertor usually gets it right.

This was a long-winded way of agreeing, but hopefully it has
provided some more insight.

-- 
Lester Caine
-----------------------------
L.S.Caine Electronic Services
_______________________________________________
Xpert mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/xpert