Dieter wrote:
No. FIR filtering (usually called linear/cubic blend in this context)
is one method. There are others that use the optical flow or try to emulate
the scanning and afterglow behaviour of interlaced CRT devices (aka TVs).
I first note that
what you have just described is one specific case of this general
method. I presume that you would do odd field 1 & even field 1, then
even field 1 & odd field 2, and then odd field 2 & even field 2, etc.
That works if, and only if, the display device is 100% synchronized
to the video stream. And you need to decrease the luminance of the
field that you show a second time to half or a quarter of its original
luminance to get the right effect; otherwise you'll get horrible
combing. That's, BTW, basically the "emulation" algorithm I mentioned
above.
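A minimal sketch of that emulation idea (my own illustration under stated assumptions, not code from the thread): each output frame carries the freshly scanned field at full luminance, while the lines of the previous field linger at a configurable fraction (1/2 or 1/4) of theirs:

```python
import numpy as np

def crt_emulate(fields, decay=0.5):
    """Emulate interlaced CRT afterglow: each output frame shows the
    newly scanned field at full luminance, while the previous field's
    lines persist at `decay` (e.g. 1/2 or 1/4) of their luminance."""
    frames = []
    prev = None  # (parity, field) of the previously shown field
    for parity, field in fields:  # parity 0 = even lines, 1 = odd lines
        frame = np.zeros((2 * field.shape[0], field.shape[1]))
        frame[parity::2] = field                  # fresh field, full brightness
        if prev is not None:
            p_parity, p_field = prev
            frame[p_parity::2] = decay * p_field  # decayed previous field
        frames.append(frame)
        prev = (parity, field)
    return frames
```

Without the `decay` attenuation, both fields would appear at full brightness and any motion between them would show up as combing, which is the point made above.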
I couldn't see that there was a field that was being shown a second
time. Only the interpolated progressive frames would be shown.
Does decreasing the luminance to 1/2 or 1/4 look better than reducing it
all the way to black? Black would more closely emulate an interlaced
CRT display which interlaced video is designed for, right?
Actually, the phosphor in a TV CRT does have some persistence, though perhaps
newer sets don't. And some LCDs have even more persistence. But the
theory is that the system isn't displaying the odd and even fields.
With temporal interpolation it displays virtual progressive frames which
are halfway between the fields in time.
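As a sketch of that linear-blend idea (my own assumption, not the poster's actual pipeline, and it assumes equal field heights): line-double each field to full frame height and average the two, which yields a virtual progressive frame sitting at the temporal midpoint between the fields:

```python
import numpy as np

def blend_fields(even_field, odd_field):
    """Linear-blend deinterlace: line-double each field to full frame
    height, then average them, giving a virtual progressive frame at
    the temporal midpoint between the two fields."""
    h, w = even_field.shape[0] + odd_field.shape[0], even_field.shape[1]
    up_even = np.empty((h, w))
    up_odd = np.empty((h, w))
    up_even[0::2] = even_field   # even field occupies lines 0, 2, 4, ...
    up_even[1::2] = even_field   # crude line doubling fills the gaps
    up_odd[1::2] = odd_field     # odd field occupies lines 1, 3, 5, ...
    up_odd[0::2] = odd_field
    return 0.5 * (up_even + up_odd)
```

A real implementation would use cubic rather than nearest-line interpolation, but the temporal averaging is the same.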
http://www.xbitlabs.com/articles/other/display/lcd-parameters.html
See photos on page 2.
This is a computer monitor (and probably a fairly new one based on 100
Hz vertical scan) so it isn't surprising that it has very low
persistence phosphors. He didn't say what type of camera he was using
but if it is a vertical focal plane shutter, you need to account for the
fact that the scan and the shutter are going in opposite directions at
similar rates of speed. Turn the camera upside down and you will get
different results.
Interlacing is needed for all TV based systems because they operate
in a (surprise!) interlaced mode, showing half frames (fields) at
double rate.
Traditional CRT TVs are interlaced, but what about the newer LCD, plasma,
DLP, ... TVs?
There are/were a few wide screen HD 16:9 CRT sets that would show 720p.
There is/was a 32 inch Sony. Huge, heavy and expensive. And there
are quite a few digital tuner 4:3 CRT sets -- not really sure how HD
they are and what type of scan they use.
Actually, my new TV displays 480p and 720p and these signals are
broadcast along with 1080i (and the old NTSC 480i analog).
The 720p advocates say that progressive is better, but I find serious
motion artifacts that I don't see with 1080i. Go figure.
What kind of display is it? CRT? LCD? plasma? DLP? ...
OOPS, left that out. I have a 768 line LCD TV.
720p should have 60 full frames per second, vs 60 fields for 1080i.
(50 in Europe, I suppose.) If you see more motion artifacts with
720p than with 1080i, something is wrong.
They look like motion artifacts. Most noticeable is when a person in a
close head shot nods their head. It can look seriously strange.
Could you be seeing compression artifacts? Scaling artifacts?
Display artifacts? Most LCDs are not fast enough, so the faster
rate could make things worse instead of better.
But, only on CBS which is 720p. The problem does not occur on NBC or ABC.
OTOH, I haven't looked into LCD monitors that much so I don't know if
they only accept certain fixed frequencies or if they accept variable
frequency sync input over a range like most current CRT monitors do.
LCDs are usually 60 Hz fixed frequency.
Yes. They might be faster in the future.
http://www.behardware.com/articles/641-1/1rst-lcd-at-100-hz-the-end-of-afterglow.html
It isn't a large problem if they are 60 Hz. The problem occurs when the
vertical scan rate falls between 60 Hz and 90 Hz but isn't actually one of
them, or 72 Hz. That is going to be like showing US 24 f/s film on 25 f/s
PAL TV. From what I have read, it isn't very pretty to look at.
The basic math is simple: the closer the two frequencies (or a multiple, in
this case) are together, the longer that rationalization cycle is going
to be and the worse it looks. But the exact match at 72 Hz is a
singularity in the function and it works perfectly. 60 Hz and 90 Hz can
use some type of reverse telecine. But I don't know how other frequencies
are to be handled.
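That cycle can be computed directly (a small illustration of the arithmetic above, not code from the thread): give each film frame an integer number of refreshes and see how long the pattern takes to repeat:

```python
from math import gcd

def pulldown_cadence(film_fps, refresh_hz):
    """Refresh counts per film frame over one repeating cycle.
    24 -> 60 gives [2, 3] (classic 3:2 pulldown); 24 -> 72 gives [3]
    (the exact-match 'singularity'); a near miss like 24 -> 25 yields
    a long, uneven cycle -- the judder described above."""
    frames_per_cycle = film_fps // gcd(film_fps, refresh_hz)
    cadence = []
    for i in range(frames_per_cycle):
        start = i * refresh_hz // film_fps      # first refresh of frame i
        end = (i + 1) * refresh_hz // film_fps  # first refresh of frame i+1
        cadence.append(end - start)
    return cadence
```

For example, `pulldown_cadence(24, 90)` comes out as `[3, 4, 4, 4]`, so 90 Hz also admits a short, regular cadence, while 24 f/s on 25 Hz repeats only once per second, with one frame held twice as long.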
So, it would be better to have X11 set the vertical rate to 60 Hz, 72 Hz,
or 90 Hz when showing 24 f/s film-based content. IIUC, that can now be
done while X11 is running (in user space) if the driver supports XRandR.
--
JRT
_______________________________________________
Open-hardware mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-hardware