Terry Barnaby wrote:

<snip>

Sigh. Must I really go and buy a relatively expensive PCI-based nVidia card to get a proper 50 Hz image? I am open to all of your ideas! Thanks a lot in advance,

-- Jeroen
_______________________________________________
mythtv-dev mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev


Hi Jeroen,

I had issues like this with MythTv on a Via M10K box (see: http://www.kingcot.eclipse.co.uk/unichrome/unichromeTvOut.html).

In my case I now have a good quality display on a TV with MythTv.
To do this:
1. The Via driver provides a VSYNC interrupt at 50Hz synchronised
    to the output field rate, but not to the frame rate.
2. MythTv is set to bob-deinterlace via the VldXvMC driver.

In my case this causes MythTv to send fields at 50Hz to the display
frame buffer, synchronised with the TV display output. i.e. The individual
fields within the output frame buffer are updated independently.

The TV output takes the fields from the display memory and outputs them
to the TV. Note that the TV output is not synchronised to the frame rate,
just the field rate.
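For anyone unfamiliar with what bob-deinterlacing actually does per field, here is a minimal sketch in plain Python (an illustration only, not MythTv's actual implementation): each interlaced frame is split into its top and bottom fields, and each field is line-doubled to full height so it can be presented on its own at the 50Hz field rate.

```python
# Sketch of bob deinterlacing (illustration, not MythTv code):
# split an interlaced frame into its two fields and line-double
# each field back to full frame height.

def bob_fields(frame):
    """frame: list of scanlines, top field on even lines, bottom on odd.
    Returns (top, bottom), each line-doubled to the full frame height."""
    top_field = frame[0::2]      # even scanlines
    bottom_field = frame[1::2]   # odd scanlines
    # Line-double: repeat each field line so the field fills the frame.
    top = [line for line in top_field for _ in (0, 1)]
    bottom = [line for line in bottom_field for _ in (0, 1)]
    return top, bottom

# Example: a 4-line interlaced frame yields two 4-line progressive
# fields, which would then be presented one after the other at the
# 50Hz field rate, paced by the driver's VSYNC interrupt.
frame = ["T0", "B0", "T1", "B1"]  # alternating top/bottom scanlines
top, bottom = bob_fields(frame)
```

This is why the frame buffer's fields can be updated independently, as described above: each presented image is built from only one field of the source frame.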

So, in general, MythTv can do what you want, but possibly not with the
SiS drivers ...

Terry

I found out now that I already visited your site before without knowing; I downloaded your interlace_test.mpg clip to, well, test interlacing :)


I am *very* close to buying an nVidia card now; this is taking too much time, really. What I cannot understand either is that, up till now, no one seems to have marked this issue as important. Is everybody just taking slow frame rates and a blurry image for granted, not wanting the best picture quality possible...?
I am really frustrated by the fact that bob-deinterlace is the only filter that gives smooth video AND the only one that doesn't work properly. I am looking forward to some others' insights ;)


-- Jeroen

