Attila Kinali wrote:
Good evening,
On Thu, 19 Jul 2007 03:05:56 -0700
James Richard Tyrer <[EMAIL PROTECTED]> wrote:
Doesn't make sense. The video is generated at the RAMDAC, so we already
need a scaled version there. But since we have scalers anyway for 3D, we
can recycle those (just map the 2D data onto a 3D object and scale it).
Duh? You know: video, like the output from the MPEG decoder. That needs
YUV-to-RGB conversion, and then it needs to be scaled to fit whatever size
window it is being displayed in.
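As a point of reference, the YUV-to-RGB step mentioned here is a small fixed matrix per pixel. Below is a minimal Python sketch, assuming 8-bit "studio swing" ITU-R BT.601 YCbCr (Y in [16, 235], Cb/Cr centered at 128); an actual hardware implementation would use fixed-point arithmetic, but the coefficients are the standard BT.601 ones.

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one 8-bit BT.601 studio-swing YCbCr pixel to 8-bit RGB."""
    c = y - 16     # remove luma offset
    d = cb - 128   # center blue-difference chroma
    e = cr - 128   # center red-difference chroma
    r = 1.164 * c + 1.596 * e
    g = 1.164 * c - 0.392 * d - 0.813 * e
    b = 1.164 * c + 2.017 * d
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)
```

For example, studio black (16, 128, 128) maps to RGB (0, 0, 0) and studio white (235, 128, 128) maps to (255, 255, 255).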
Yes, but we have to integrate the video we output with the output of the
rest of the graphics card. Thus it's easier to use the frame buffer of the
OGA for the output. And if we use that, we can also use the scaler in the
OGA.
Yes, and we do that by scaling the output of the decoder and writing it
somewhere in frame buffer memory. This is the only way that I know of to
do it; it would be the same whether we use existing chips or design our own.
<SNIP>
Upsampling from 4:2:2 to 4:4:4 is nothing difficult either (a simple
FIR filter operating on scanlines), but upsampling from 4:2:0 to
4:2:2 is (it requires upsampling in the vertical direction; guess why they
don't do it).
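To make the vertical upsampling concrete, here is a deliberately simple Python sketch of a 4:2:0 to 4:2:2 conversion: the number of chroma lines is doubled by linear interpolation between adjacent stored lines. This ignores the exact chroma siting that MPEG-2 specifies (real decoders must account for it) and is only meant to show the shape of the operation.

```python
def upsample_chroma_420_to_422(chroma):
    """chroma: list of chroma rows (lists of ints), one row per 2 luma lines.
    Returns twice as many rows: one chroma row per luma line."""
    out = []
    for i, row in enumerate(chroma):
        out.append(row[:])  # keep the stored chroma line
        if i + 1 < len(chroma):
            nxt = chroma[i + 1]
            # synthesize the missing line halfway between two stored lines
            out.append([(a + b) // 2 for a, b in zip(row, nxt)])
        else:
            out.append(row[:])  # no line below: replicate the last one
    return out
```

The 4:2:2 to 4:4:4 step would be the same idea applied horizontally within each scanline, which is why it is the easy half.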
Perhaps it is because it isn't needed. Which decoders output 4:2:0?
Uhmm... MPEG-1, MPEG-2, MPEG-4, H.264... 4:2:0 is the de facto
standard for video subsampling. The only case I know of that regularly
uses 4:2:2 is video cameras, because it is a lot easier to implement.
I think that I am missing something here. Why does it matter whether the
4:2:0 or the 4:2:2 chroma sampling method is used, since the decoder is
going to output YUV data for every 2 pixels on each line, isn't it? If so,
then the decoder converts it from 4:2:0 to 4:2:2.
What decoders are you talking about? Any example of a video
decoder that accepts 4:2:0 encoded video and outputs 4:2:2 decoded video?
I was speaking of a decoder in the abstract sense. A black box that
accepts encoded video and outputs a digital signal that can be directly
displayed, perhaps after D/A conversion.
I only know software video decoders, and those don't
do a 4:2:0-to-4:2:2 up-conversion unless explicitly
asked to do so.
AFAIK, the decoder must do this since there is no way to *directly*
display 4:2:0 chroma-sampled video. I don't know how 4:2:0 digital
video data would be formatted, but you would need to remember more
than one pixel to display it. IIUC, the display would have to remember
a whole line.
I found this link interesting since we are going to be designing this:
http://www.hometheaterhifi.com/volume_8_2/dvd-benchmark-special-report-chroma-bug-4-2001.html
The mentioned MPEG code appears to be available here:
ftp://ftp.cesnet.cz/MultiMedia/Video-TV/mpeg/SSG/mpeg2v12.zip
Deinterlacing is something that cannot really be done without
some information from the video source (or some assumptions about the video
output device), and if only a simple implementation is used (which I
assume), then the quality will suck.
Are we talking about motion compensating deinterlacing (for sources
originally shot with an interlaced video camera and recorded on tape)?
No, we are talking about plain deinterlacing without any motion
compensation. The one needed if you display interlaced content
produced for TV consumption on a progressive display.
Plain deinterlacing is just writes to and reads from memory. I don't
know how that could "suck".
The algorithm you apply between the read and the write.
Or, perhaps you are referring to line doubling by vertical interpolation.
NO! That's the most awful way to deinterlace.
IIUC, you are suggesting running one odd and one even field through a 2D
FIR filter (actually a convolution) to "deinterlace". I first note that
what you have just called awful is one specific case of this general
method. I presume that you would do odd field 1 & even field 1, then
even field 1 & odd field 2, and then odd field 2 & even field 2, etc.
The simplest algorithm is to use vertical interpolation on each field to
obtain the missing scan lines, and then interpolate between the line-doubled
images (temporal interpolation) to try to approximate a
progressive frame halfway in time between the two fields. The simplest
choice in both cases is linear interpolation.
Sounds nice, doesn't it? But it is just a 3x1 convolution filter:
PixelOut(N, M) =
PixelIn(N, M - 1)/4 + PixelIn(N, M)/2 + PixelIn(N, M + 1)/4
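The filter above can be transcribed directly into code. Here is a minimal Python sketch that applies the [1/4, 1/2, 1/4] vertical kernel to one column of pixels, replicating the edge lines (the edge handling is my assumption, not stated above):

```python
def vfilter_3tap(column):
    """Apply PixelOut(M) = In(M-1)/4 + In(M)/2 + In(M+1)/4 down one column."""
    out = []
    for m in range(len(column)):
        above = column[m - 1] if m > 0 else column[m]            # clamp top edge
        below = column[m + 1] if m < len(column) - 1 else column[m]  # clamp bottom edge
        out.append(above / 4 + column[m] / 2 + below / 4)
    return out
```

Note how a single bright line [0, 100, 0] spreads to [25, 50, 25]: exactly the loss of vertical sharpness discussed next.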
I presume that this simple filter sucks. It probably does, since it is
going to degrade the sharpness of the image. You could use a square (2D)
filter, or a larger square, perhaps even go to a 3D filter. These
would need to be based on a maximum spatial and temporal frequency
response, since polynomial interpolation will cause aliasing. No
matter how you slice it, this means a low-pass filter, and if it filters
out anything below the Nyquist frequency then you will lose resolution.
So, it depends on what you mean by quality.
OTOH, is this really a concern for content originally shot on film? No
actual deinterlacing is needed there.
<SNIP>
Of course. But interlacing has nothing to do with telecining.
FIR temporal filtering has nothing to do with telecining, but telecining
produces frames (every third frame) where the odd and even fields are
not from the same film frame. So, it is related to deinterlacing
without filtering since you have to combine odd and even fields to
create a whole frame.
Hmm... after reading again what you wrote in your previous mail,
I have the impression that you have a major mix-up between
telecine and interlacing.
Please read:
http://en.wikipedia.org/wiki/Telecine
I find the illustrations here to be better:
http://www.theprojectorpros.com/learn.php?s=learn&p=theater_pulldown_deinterlacing
http://en.wikipedia.org/wiki/Interlaced
http://www.mplayerhq.hu/DOCS/HTML/en/menc-feat-dvd-mpeg4.html#menc-feat-dvd-mpeg4-interlacing
http://www.mplayerhq.hu/DOCS/HTML/en/menc-feat-telecine.html
But I will put the MPlayer links on my to-read list.
Short summary:
Telecine is meant to change the frame rate from ~24 fps
to ~30 fps, which is only needed to show movies on NTSC-based TVs.
It works by inserting an additional frame after every four frames,
generated from two succeeding frames.
I know how it works. To match 60 fields per second, a 24-frame-per-second
movie needs to show each frame for 2.5 fields, so this is
emulated by showing one frame for 3 fields and the next for 2 fields,
etc. The result is that every 3rd frame has fields that don't match.
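The 3:2 cadence described here is easy to enumerate in code. Below is a Python sketch (my own illustration, with top/bottom field parity as an assumption) that maps film frames to the telecined field sequence; pairing consecutive fields into output frames is what produces the frames with mismatched fields.

```python
def pulldown_32(frames):
    """Map film frames to a 3:2 pulldown field sequence.
    Returns (frame_index, parity) pairs; parity 0 = top field, 1 = bottom."""
    fields = []
    parity = 0
    for i, _ in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2  # show frames for 3, 2, 3, 2, ... fields
        for _ in range(repeat):
            fields.append((i, parity))
            parity ^= 1  # fields alternate top/bottom
    return fields
```

Every two film frames become five fields, so 24 frames per second come out as exactly 60 fields per second.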
Interlacing is needed for all TV-based systems because they operate
in (surprise!) an interlaced mode, showing half frames (fields) at
double the rate.
Actually, my new TV displays 480p and 720p, and these signals are
broadcast along with 1080i (and the old analog NTSC 480i).
The 720p advocates say that progressive is better, but I find serious
motion artifacts that I don't see with 1080i. Go figure.
IIUC, the reason that we need to worry about this is that we need to
show interlaced video along with our progressive video.
<SNIP>
With a computer monitor, if you can run at 72 f/s this gets much easier.
Computer monitors don't run at 72 Hz. They run at arbitrary frequencies.
And this frequency cannot be changed at will by the graphics card
just because the video it's showing suggests a different refresh rate.
Most CRT monitors sold for personal computers currently (and for the past
10 years) are variable-frequency in both horizontal and
vertical. My old NEC MultiSync XV15 says that it will run at 72 f/s,
although I don't know how many horizontal lines you could display at
that vertical frame rate. I think that it is currently running at only
about 70 f/s. Newer monitors have higher horizontal scan rates than my
old monitor, so they should have no problem displaying 72 Hz vertical; the
limiting factor is actually the horizontal scanning frequency (my
monitor's spec says up to 120 f/s vertical, but you wouldn't get many
horizontal lines at that frequency).
OTOH, I haven't looked into LCD monitors that much, so I don't know
whether they accept only certain fixed frequencies or a variable-frequency
sync input over a range like most current CRT monitors do.
--
JRT
_______________________________________________
Open-hardware mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-hardware