On 15/01/11 21:49, VDR User wrote:
> On Sat, Jan 15, 2011 at 1:09 PM, Goga777 <goga...@bk.ru> wrote:
>>> In general, get a gt220, as it has built in audio hardware, so that
>>> you should get audio without clock drift relative to the hdmi output.
>>> It is also powerful enough to do temporal spatial deinterlacing on
>> what do you think about
>> NVIDIA's GeForce GT 430
> It's a nice card but I'm not sure why you think it's the best choice
> for VDR/htpc. It's not going to give you any better image quality on
> HD content than you get from a gt220 at half the price. I don't see
> any advantage for most users in spending the extra money for one.
Even if it does run cooler than a GT220, it can't be by much, judging by
the size of the heatsinks. Ones with fans might be too noisy in an HTPC,
and ones without will need a well-ventilated case, bearing in mind they
might be working quite hard decoding HD for long periods. So...
I wonder whether it might be possible to use a more economical card which
is only powerful enough to decode 1080i without deinterlacing it, and to
take advantage of the abundant CPU power most people have nowadays to
perform software deinterlacing. It may not be possible to have something
as sophisticated as NVidia's temporal + spatial, but some of the
existing software filters should scale up to HD without overloading the
CPU, seeing as it wouldn't also be doing the decoding.
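
For illustration, the two simplest software approaches, weave (interleave
the two fields of a frame) and bob (line-double each field on its own),
can be sketched in a few lines of Python. The function names and the
rows-as-lists layout are my own for the sketch, not from any particular
VDR or driver filter:

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one full-resolution frame.

    Each field is a list of rows; the result has twice as many rows,
    but the output frame rate is half the field rate.
    """
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)
        frame.append(b)
    return frame

def bob(field):
    """Line-double a single field into a full-height frame.

    Keeps the full field rate, but each output frame carries only the
    vertical detail of one field (rows are simply repeated here; real
    filters interpolate between lines).
    """
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)
    return frame

# A toy 4-line frame split into 2-line fields:
top = [[1, 1], [3, 3]]      # picture lines 0 and 2
bottom = [[2, 2], [4, 4]]   # picture lines 1 and 3
assert weave(top, bottom) == [[1, 1], [2, 2], [3, 3], [4, 4]]
assert bob(top) == [[1, 1], [1, 1], [3, 3], [3, 3]]
```

Real motion-adaptive filters (yadif etc.) blend between these two
extremes per pixel, which is where the CPU cost comes from.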
Alternatively, use software decoding, and hardware deinterlacing.
Somewhere on linuxtv.org there's an article about using fairly simple
OpenGL to mimic what happens to interlaced video on a CRT, but I don't
know how good the results would look.
BTW, speaking of temporal and spatial deinterlacing: AFAICT one means
combining fields to provide maximum resolution with half the frame rate
of the interlaced fields, and the other maximises the frame rate while
discarding resolution; but which is which? And does NVidia's temporal +
spatial try to give the best of both worlds through some sort of
combination of the two?
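
For concreteness, here is the arithmetic for a 1080i50 stream under the
two simple approaches (weave keeps resolution, bob keeps field rate).
How NVidia's "temporal" and "spatial" names map onto these is exactly
the question above, so the labels here are purely descriptive:

```python
fields_per_sec = 50          # 1080i50: 50 fields per second
field_height = 540           # each field carries half the picture lines

# Weave: pair successive fields into full frames
weave_fps = fields_per_sec // 2        # half the temporal rate
weave_height = field_height * 2        # full vertical resolution

# Bob: line-double every field
bob_fps = fields_per_sec               # full temporal rate
bob_height = field_height              # half the real vertical detail

print(weave_fps, weave_height)  # 25 1080
print(bob_fps, bob_height)      # 50 540
```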
TH * http://www.realh.co.uk
vdr mailing list