>> is there really no recommendation for a board not using Nvidia 
>> graphics components? It would really be great to not depend on 
>> proprietary drivers.

> Hardware decoding through VA-API is working on some Intel chipsets 
> and CPUs, but I haven't seen any usable GPU deinterlacing implementations 
> besides those in Nvidia's VDPAU.

I would also like to point out the framerate issues. Naturally, you decide
what counts as enough precision and quality for you.

Computer hardware usually cannot provide exact 50.000Hz, 59.940Hz or 23.976Hz
output to your TV/monitor. This causes judder on the display, because the
MPEG/AVC input stream is not synchronized to the output framerate.

For example, the dedicated Blu-ray player Philips BDP3000 had a 24.000Hz
output (before a firmware fix), which resulted in a jumped frame roughly every
42 seconds, as the real input stream is 23.976Hz. It was annoying once you
noticed it. Luckily it has now been fixed to a true 23.976Hz output.

Which brings us to VGA hardware, where the closest mode to a 50Hz signal might
be 50.01Hz. So every 100 seconds we get a jump in picture sync. Ever seen a
camera pan jump while watching a film and been annoyed by it?
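To make the arithmetic behind both examples concrete, here is a small sketch
of my own (not from any VDR tool): the time until one full frame of drift
accumulates is simply the reciprocal of the rate difference.

```python
# Sketch: how long until a source/display framerate mismatch
# accumulates one full frame of drift (a dropped or repeated frame).

def slip_interval_seconds(source_fps: float, output_fps: float) -> float:
    """Seconds between visible frame jumps caused by the rate mismatch."""
    return 1.0 / abs(output_fps - source_fps)

# Philips BDP3000 before the firmware fix: 24.000Hz output, 23.976Hz stream.
print(round(slip_interval_seconds(23.976, 24.000), 1))  # -> 41.7 (a jump every ~42 s)

# VGA hardware outputting 50.01Hz for a 50Hz broadcast stream.
print(round(slip_interval_seconds(50.0, 50.01), 1))     # -> 100.0 (a jump every ~100 s)
```

A smaller rate error only stretches the interval between jumps; it never
eliminates them, which is why the dynamic fixes below matter.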

For ATI cards there is a dynamic framerate fix; unfortunately there is none
for Nvidia cards. Nvidia has good HW acceleration but potentially bad
framerate precision; with ATI it is vice versa.

This ATI fix corrects the 50.01Hz by dynamically reprogramming the VGA timers
so the real output is 50.000Hz. (A general description; the author can
describe it in more detail if needed.)
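I don't know the internals of that patch, but the general idea behind such
timer reprogramming can be sketched as follows (my own illustration with a
hypothetical 108MHz pixel clock, not the actual fix): the refresh rate is
pixel clock divided by total horizontal and vertical timing, so scaling the
pixel clock scales the refresh rate by the same factor.

```python
# Sketch of the principle: refresh_hz = pixclock / (htotal * vtotal),
# so nudging the pixel clock proportionally nudges the refresh rate.

def corrected_pixel_clock(pixclock_mhz: float,
                          measured_hz: float,
                          target_hz: float) -> float:
    """Pixel clock (MHz) needed to shift a mode from measured_hz to target_hz."""
    return pixclock_mhz * (target_hz / measured_hz)

# Hypothetical mode measured at 50.01Hz with a 108MHz pixel clock:
print(round(corrected_pixel_clock(108.0, 50.01, 50.0), 4))  # -> 107.9784 (MHz)
```

The correction is tiny (about 0.02%), which is why it can be applied on the
fly without the display losing sync.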

So if you aim for HD with full HD quality, I'd be careful with computer
hardware. My answer was to set up a Popcorn Hour to output 50.000Hz for TV
material. You lose some ease of use compared to VDR, as you need to set up a
separate layer etc.

Also, as the original question came from Finland: if you use commercial HD
channels, they are (or at least should be) paired to your receiver.

Happy hunting for HW & SW..


vdr mailing list