Michael T. Dean wrote:
Tim McClarren wrote:
I had posted a while back about a problem I was having with video
capture.
Basically, I get about 10 black pixels on the left edge of the captured
stream, and about 20 pixels at the bottom, which are mostly black but
have some grey blocks mixed in. I'm pretty sure it's entirely an issue
with my very cheap SAA7130-based capture card.
No. It's an issue with NTSC (or PAL). These analog formats have the
unfortunate problem of having "ragged edges," sometimes with "rainbow
stripes." Therefore, when the specifications were designed, the chosen
solution was to factor in an amount of "overscan"--extra image area
that would fall outside the visible part of the screen. Because of this
designed-in overscan--which had a bit of an engineering margin built
in--some of the extra lines were used to carry non-visible data (e.g.
closed captions in the vertical blanking interval). So you may also see
grey lines at the top of your video that aren't a problem with your
capture card.
Yes, I know... I just assumed that a GOOD capture card would understand
that this part of the NTSC signal is not part of the visible image, and
would "auto-calibrate". I don't have much experience with different
capture cards, but I do know that if I plop this card in a Windows box
and use the regular Windows driver, I don't need to mess with
cropping... the driver seems to handle it "out of the box", as it were.
That's how the specification was designed, though--the ragged edges are
"corrected" at the display by covering part of the picture tube with a
bezel. Therefore, on a computer outputting to a non-overscanned screen,
the correction should be done by the player.
which means I have to configure each front end separately, and I have
to set up the transcoder to crop out those annoying bits if I want to
watch my stuff in some other format).
But each frontend typically has different playback characteristics
requiring different settings, anyway. If you crop the video at capture,
it looks great when displayed on a digital projector or computer monitor
without overscan, but you lose part of the image when displayed on a TV
with overscan...
I'm not sure I understand this. With Myth, you're no longer viewing the
raw NTSC signal, so overscan/underscan isn't an issue. Are you saying
that people make their X desktop start some ways above and to the left
of the TV's top left corner, and have the bottom right corner not
visible because it's beyond the display area? I have my MythTV front
end connected to a plasma using WXGA on an RGB cable, so I'm not sure
what happens when you use S-Video or composite -- I thought the video
out of a graphics card would emit an NTSC signal that made all of the
desktop visible, which means that MythTV's front end output would all be
visible. But, it sounds like you have to mess with image size and
offsets if you're displaying S-Video or composite out to make the image
look right on a TV.
Oh, and by the by, if anyone has a modeline to make the Radeon driver do
138x by 768, I'd love to have it :)
Am I the only one who has this issue? Do other cards capture pretty
much perfectly, right out of the box?
Only if they're not capturing NTSC/PAL. :) Note, though, that the
source format (e.g. analog cable, digital cable/satellite with a
receiver using S-Video output, etc.) you're using has a big effect on
how noticeable the issue is.
Would it be worth adding UI to the "Capture Card" settings dialog to
let you specify a crop rectangle?
Not for me, but it wouldn't hurt me, either. Just make sure you
consider cards that don't support cropping. If you do, and create the
patch with defaults that work the same way as current recordings (i.e.
no cropping), it's quite possible it will be accepted into the trunk.
Okay. Thanks for your comments. It's easy to make V4L2 API calls to
determine the cropping capability of the device.
_______________________________________________
mythtv-dev mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-dev