Don't forget that modern LCD screens have only the native resolution they are rated for. So anything you send needs to be an exact division of that, or pixels will be lost or merged with others as they fall between displayable pixels. CRTs had more points of light than the highest resolution they were rated for, so they could handle odd multiples better.

On 1/28/2011 2:57 AM, Stuart Morris wrote:

Standard definition video is going to be harder than I thought.
I used xrandr to set this mode via HDMI to my LCD TV:
# 1440x576i @ 50Hz (EIA/CEA-861B)
ModeLine "1440x576" 27.000 1440 1464 1590 1728 576 581 587 625 -hsync -vsync 
The TV reported mode 576i ok, but the desktop graphics were unreadable.
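For anyone wanting to reproduce this, registering and activating such a modeline with xrandr looks roughly like the sketch below (HDMI-0 is an assumed output name, check `xrandr -q` for yours; the timing numbers are copied from the 576i modeline above):

```shell
# Register the custom 576i modeline with the X server
# (timings as quoted above; output name HDMI-0 is an assumption).
xrandr --newmode "1440x576" 27.000 1440 1464 1590 1728 576 581 587 625 -hsync -vsync

# Attach the new mode to the HDMI output, then switch to it.
xrandr --addmode HDMI-0 "1440x576"
xrandr --output HDMI-0 --mode "1440x576"
```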
I tried to view an interlaced standard def video using my little test 
application and it looked awful.
However the 1080i mode worked very well:
# 1920x1080i @ 50Hz (EIA/CEA-861B)
Modeline "1920x1080" 74.250 1920 2448 2492 2640 1080 1085 1095 1125 +hsync 
+vsync Interlace
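As a sanity check on both modelines: the frame rate implied by a modeline is the pixel clock divided by the total (including blanking) horizontal and vertical pixel counts, and an interlaced mode delivers two fields per frame. A quick sketch (function name is mine):

```python
def modeline_rate_hz(clock_mhz, htotal, vtotal, interlaced=False):
    # Frame rate = pixel clock / (total horizontal * total vertical).
    frame_hz = clock_mhz * 1e6 / (htotal * vtotal)
    # An interlaced modeline shows two fields per frame, so the
    # advertised (field) rate is twice the frame rate.
    return frame_hz * 2 if interlaced else frame_hz

# 1440x576i: 27.000 MHz clock, 1728 x 625 totals -> 50.0 Hz
print(modeline_rate_hz(27.000, 1728, 625, interlaced=True))
# 1920x1080i: 74.250 MHz clock, 2640 x 1125 totals -> 50.0 Hz
print(modeline_rate_hz(74.250, 2640, 1125, interlaced=True))
```

Both work out to exactly 50 Hz field rate, matching the EIA/CEA-861B comments.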

I think for standard definition video via HDMI there will be a need to
upscale to a resolution better supported by HDMI and that requires
inverse telecine and deinterlacing. This may still be within the
capabilities of today's low-power systems.

My little test has satisfied me that 1080i or 1080p video can be displayed
with interlaced output.


BTW my hardware setup was an old Sony KDL32V2000 TV and AMD HD4200 integrated
graphics with the AMD closed-source driver.

vdr mailing list