On Wed, Jun 24, 2009 at 03:23:44PM +0200, Krzysztof Halasa wrote:
> I wonder what is the difference between the on-air frame rate and
> your card's (fixed) one? 100 ppm would need one second of additional
> (initial) buffering for ca. 3 hours of playback. I think the players
> initially buffer more than 1 second, don't they?
The problem is not the absolute accuracy of your graphics card. The problem is that there are always differences between the on-air frame rate and the card's fixed one as long as the two are not synchronized. Even if it were possible to set your graphics card to exactly 50 Hz it would not help, because the on-air frame rate always floats around a little, and the graphics card of course never runs at exactly 50 Hz either; maybe somewhere between 49.90 and 50.10 Hz if you are very optimistic. In practice (after experimenting heavily with this) the mismatch leads to field/frame drops or duplicates at least every 45 seconds.

The only way to avoid this judder is to synchronize the graphics card to the base clock of your software player. That is what the vga-sync-fields patch does. Buffering won't help, because the problem arises between the software player's base clock and the graphics card, not between the TV station and the software player.

Theoretically you could synchronize the software player's clock to the (fixed) graphics card clock for playback of recordings. Even that is not common practice under Linux, because the software player has no idea what the graphics card's actual frame rate is. For live TV it is not possible at all.

Your assumption of 1 second of buffer per 3 hours of playback is way too optimistic.

- Thomas
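
A quick back-of-the-envelope sketch in C of the numbers above. The 50.05 Hz card refresh is only an assumed example inside the 49.90..50.10 Hz range mentioned in the post; the 100 ppm and 3 hour figures come from the quoted message.

/*
 * Rough drift arithmetic for a free-running display vs. a 50 fps source.
 * 50.05 Hz is an assumed example refresh rate, not a measured value.
 */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double on_air_fps   = 50.00;     /* nominal source field/frame rate     */
    const double card_hz      = 50.05;     /* assumed free-running refresh rate   */
    const double playback_sec = 3 * 3600;  /* 3 hours, as in the quoted message   */

    /* One field/frame must be dropped or duplicated each time the two
     * clocks drift apart by a full frame period. */
    double drift_hz        = fabs(card_hz - on_air_fps);
    double judder_interval = 1.0 / drift_hz;            /* seconds between drops  */

    /* Buffer needed to absorb the drift over the whole playback instead. */
    double buffer_100ppm = 100e-6 * playback_sec;       /* Krzysztof's 100 ppm    */
    double buffer_card   = (drift_hz / on_air_fps) * playback_sec;

    printf("drop/duplicate roughly every %.0f s at %.2f Hz vs %.2f fps\n",
           judder_interval, card_hz, on_air_fps);
    printf("buffer for 3 h at 100 ppm drift: %.1f s\n", buffer_100ppm);
    printf("buffer for 3 h at %.2f Hz offset: %.0f s\n", drift_hz, buffer_card);
    return 0;
}

At 50.05 Hz against a 50.00 fps source this gives a drop or duplicate about every 20 seconds, and roughly 11 seconds of accumulated drift over 3 hours, an order of magnitude more than the 1 second that follows from the 100 ppm assumption.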
