People,

Please read the following: http://www.100fps.com/why_bobbing.htm
I am seriously doubting the correctness of the 'Field Bob' filter implementation (as that website calls it) in MythTV's bobdeint, because the problem described below occurs in every single show I have watched so far, be it a recording or Live TV. I have had discussions with several people on this mailing list already to find the origin of this behaviour, but never got a satisfying answer, let alone a solution.

THE PROBLEM: TV output at the full frame rate (50 Hz) is only possible with 'bobdeint', but it is never stable. Here are my thoughts:

0. I am in Europe (The Netherlands), so we use PAL @ 50 Hz.

1. I use an ASUS Pundit barebone with a SiS graphics chip, which can't feed interlaced material to its video out (see http://www.winischhofer.at/linuxsispart3.shtml#faq, the 14th question).

2. To bypass this problem, I built a VGA-to-SCART converter that connects my Pundit directly to the TV set; hence no interlacing incompatibilities - or so you would think. The output still doesn't look as smooth as 'normal' TV. The video is captured at 25 fps (2 fields per frame!) and played back at 25 fps, with both fields woven together in each frame. With a 720x576 interlaced modeline (see below) the output *should* be fluid 50 Hz, but it isn't, and I don't understand why. By the way, the GUI graphics look horrible because there is no flicker filtering.
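
For reference, the modeline I use is something along these lines (the exact numbers in my config may differ a little, this is from memory):

    ModeLine "720x576PAL" 13.875  720 744 808 888  576 581 585 625  -HSync -VSync Interlace

That is a 13.875 MHz pixel clock and 888 total pixels per line, so 15.625 kHz horizontal; with 625 total lines that gives 25 frames, i.e. 50 fields, per second thanks to the Interlace flag.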

3. On top of that, I concluded that the S-Video TV-out image looks way better than the direct RGB output - and the RGB route doesn't even solve my primary problem.

4. Using the Bob deinterlacer one *can* achieve the full 50 Hz frame rate (actually: field rate!), but only if all fields are played back in the correct order. Explanation: the recorded NuppelVideo files are 25 fps and contain 2 fields per frame, so the playback order is [ a+b, a+b, a+b ], where a and b are the even and odd fields and every comma marks a frame.
Bobdeint *should* split each frame into a and b and display them at double rate to get [ a, b, a, b, a, b ]; the result is 50 Hz and fluid. <-- I want this!
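
To make sure we are talking about the same thing, here is my understanding of what bob does, as a rough C++ sketch. This is NOT MythTV's actual code, the Frame struct and bob_field() are just names I made up for illustration:

    // My understanding of bob deinterlacing - a 25 fps interlaced frame
    // holds two fields: even lines = field a, odd lines = field b.
    // Bob shows them one after the other at 50 Hz, line-doubling each.
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Frame {
        int width  = 720;
        int height = 576;
        std::vector<uint8_t> luma;        // one byte per pixel, row-major
    };

    // Extract one field (parity 0 = even/top lines, 1 = odd/bottom lines)
    // and stretch it back to full height by simple line doubling.
    Frame bob_field(const Frame& in, int parity)
    {
        Frame out = in;
        for (int y = 0; y < in.height; ++y) {
            int src = (y & ~1) | parity;  // nearest line of the wanted field
            std::copy(in.luma.begin() + src * in.width,
                      in.luma.begin() + (src + 1) * in.width,
                      out.luma.begin() + y * in.width);
        }
        return out;
    }

Feeding bob_field(frame, 0), bob_field(frame, 1), bob_field(next_frame, 0), ... to the screen at 50 Hz is exactly the [ a, b, a, b ] sequence I am after.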

5. What I actually experience is that every now and then the output of bobdeint gets out of (vertical) sync for a while (!), like this: [ a, b, b, a, a, b ]. It loses the sequence and starts swapping fields, or maybe even whole frames; I am not 100% sure which. Some people pointed out that this is because the video driver does not know the monitor's Vsync, but judging by the website linked at the top, a faulty 'Field Bob' filter implementation could also be the cause.
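
Just to illustrate what I *think* is happening (again my own sketch, not anybody's real code): if the player alternates fields blindly, without knowing where the monitor's vertical retrace is, a single repeated refresh flips the field parity and everything after it comes out swapped:

    #include <cstdio>

    int main()
    {
        // Correct bob: refresh n shows field (n % 2) of frame (n / 2).
        std::printf("correct: ");
        for (int n = 0; n < 6; ++n)
            std::printf("%c ", n % 2 == 0 ? 'a' : 'b');   // a b a b a b

        // Suppose refresh 2 re-shows the previous field because the new
        // one wasn't ready in time: from then on the fields are swapped.
        std::printf("\ndrifted: ");
        int shown = 0;                     // which field we are up to
        for (int n = 0; n < 6; ++n) {
            if (n != 2) ++shown;           // refresh 2: repeat previous field
            std::printf("%c ", (shown - 1) % 2 == 0 ? 'a' : 'b');
        }
        std::printf("\n");
        return 0;
    }

The first loop prints a b a b a b, the second prints a b b a b a - which is the kind of swapped order I see on screen.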

6. NVidia users, for example, have hardware OpenGL acceleration that can take care of Vsyncing; I do not. Unfortunately, the SiS X driver does not support hardware DRI and/or OpenGL. NVidia users can also benefit from X-Video Motion Compensation (XvMC), which presumably also gives good quality (does it output the full frame rate?). I guess I am running out of options.

7. MythTV's output is decoded by the IVTV driver (correct me if I'm wrong). I was told to blame the IVTV developers for bobdeint not doing its job well without knowing the Vsync. Can anybody confirm this?

8. Other software like tvtime, MPlayer and Xine deinterlaces very well. For instance, playing recordings through MPlayer delivers excellent image quality. Why can't MythTV do the same? For Windows there is DScaler (http://deinterlace.sourceforge.net), which also deinterlaces very well.

Sigh. Must I really go and buy a relatively expensive PCI-based nVidia card to get a proper 50 Hz image? I am open to all your ideas! Thanks a lot in advance,

-- Jeroen
