MacNean Tyrrell wrote:
On 10/30/05, Michael T. Dean <[EMAIL PROTECTED]> wrote:
MacNean Tyrrell wrote:
I have an nvidia GeForce 5200, using S-Video out. MythTV looks sharp
and very nice looking. However, playback is still not that great. It's not
really sharp like the menu. I've played with the XV picture color
controls, and tried kernel, bob, and linear deinterlacing techniques,
and nothing looks as sharp as that. Is it because it's deinterlacing
the video? Or because of the ivtv driver for the 2 PVR-250's I have?
In a manner of speaking. It's likely the NTSC--both your PVR-250's
recording NTSC and your video card outputting NTSC. You can fix the
video card outputting NTSC by using a better connection (like
DVI/VGA/component), but you can only fix the PVR-250's recording NTSC by
upgrading to high-def capture and playback.
So playback could look better over DVI, that would be nice.
Yep. Or the VGA connection you found.
So how do you go
to high def capture,
pcHDTV HD-{2,3}000 or Air2PC or HD-5000
and aren't there only like 6 channels in high def,
Yeah. And it varies significantly depending on your location in the
country (world?).
I
have HD on my cable box, but only Fox, CBS, NBC, ABC, TNT, ESPN, and one
other channel come in HD, so wouldn't high-def capture capture the same
things for the other channels as the PVR-250's?
Yes. As a matter of fact, you probably can't get ESPN or any other
"cable" channels in HD because your cable company is probably encrypting
these "premium" channels, so really this will only help for local
channels, at least in the short term.
Or is there a card out there that captures regular tv better?
No. Some of the "high-def" channels actually transmit "regular" TV
(480i = 640x480 interlaced) for some shows, but they transmit it digitally,
so the picture quality will be better on those shows--you're capturing
the same MPEG stream they broadcast instead of letting your cable box
decode the MPEG and send an analog NTSC signal to your PVR-250, which
re-encodes it to MPEG for your Myth box to decode and send to your TV. But
as you mentioned above, this will only help you on about 6 channels.
However, you can still fix the other NTSC issue--Myth's outputting
NTSC--by switching to VGA/DVI, and doing so will significantly improve
video quality (because it's easier to configure--it's just like setting
up a computer monitor).
Also, I was thinking that a better ivtv
driver than the one I'm using would actually look better. It
seems my old TiVo (I have the original with no subscription) looks better
than a lot of the recordings from my PVR-250's.
Have you tried playing with your recording settings? On a 52" DLP, any
artifacts in the encoded images will be magnified, so you will probably
need to use a pretty good bitrate/resolution, and make sure you leave
around 1/3 extra headroom in the bitrate (i.e. max bitrate =~ 1.33 * avg
bitrate). Also, make sure you're using OpenGL VSync for better frame
delivery.
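
To put rough numbers on that headroom rule (the 6 Mbit/s average below is
just an example for illustration, not a recommendation for your setup), the
math is simply:

    # Rough sketch of the ~1/3 headroom rule; the average bitrate here is
    # an example value, not a recommendation.
    avg_bitrate = 6000000                  # 6 Mbit/s average (example only)
    max_bitrate = int(avg_bitrate * 1.33)  # leave ~1/3 extra for busy scenes
    print(max_bitrate)                     # 7980000, i.e. roughly 8 Mbit/s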
I've seen 5200's with DVI outs (like this one, though it's a 5500:
http://www.newegg.com/Product/Product.asp?Item=N82E16814130197). I
have a 52" LG DLP TV with a DVI input; my cable and my DVD player go to
the 2 composite inputs. If I got this video card, could I then run
it in progressive so no deinterlacing is required?
You will want to deinterlace all interlaced video you pass over the DVI
output. Normally, your TV would deinterlace the NTSC and any interlaced
ATSC video coming in through its tuner/component/S-Video/composite
inputs, but it simply displays the DVI signal as-is. So, if you
don't deinterlace, you'll see interlacing artifacts since your TV's
display is not interlaced. The problem is the interlaced input--all your
recordings off PVR-x50's are interlaced because NTSC is interlaced.
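
If it helps to see what a deinterlacer is actually doing, here's a toy
sketch of the simplest kind, a linear blend that averages the two fields of
each frame. It's only an illustration of the idea, not how Myth's filters
are actually implemented:

    import numpy as np

    # Toy linear-blend deinterlacer: average each scanline with the one
    # below it, mixing the two interlaced fields into one progressive frame.
    def linear_blend_deinterlace(frame):
        out = frame.astype(np.float32)
        out[:-1] = (out[:-1] + out[1:]) / 2.0
        return out.astype(frame.dtype)

    # Fake 480-line luma frame, just to show the call:
    interlaced = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    progressive = linear_blend_deinterlace(interlaced)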
Then I'm confused. My TV can display 1080i and 720p (480p and others, but
those are the two highest resolutions), and I thought 1080i was an interlaced
display, so if the video going over DVI was interlaced, wouldn't that show
fine as long as the DVI was set to 1080i? I'll admit I'm a bit slow when
it comes to all this stuff with high-def resolutions between TVs and
computers, so thanks for the help.
1080i and 720p are broadcast video formats that are transmitted using
MPEG-2 video compression. These are used when the TV receives and decodes
the MPEG-2 for you. Generally, that only happens via the TV's
"high-definition tuner" or the TV's firewire port. Note, also, that because
your DLP has a fixed resolution (probably 1280x720 or possibly 1920x1080
pixels), whenever the TV receives a video signal at any other resolution, it
will scale it to fit your screen. Also, if your TV receives
interlaced video, it will deinterlace it for its display.
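
To make the scaling part concrete, here's a rough sketch of how a source
resolution gets fit onto a fixed panel while keeping its aspect ratio (the
1280x720 panel size here is an assumption about your DLP; check your TV's
specs):

    # Fit a source resolution onto a fixed-resolution panel, preserving
    # aspect ratio (letterbox/pillarbox as needed). Panel size is assumed.
    def fit_to_panel(src_w, src_h, panel_w=1280, panel_h=720):
        scale = min(panel_w / float(src_w), panel_h / float(src_h))
        return int(round(src_w * scale)), int(round(src_h * scale))

    print(fit_to_panel(1920, 1080))  # 1080i source -> (1280, 720)
    print(fit_to_panel(640, 480))    # 480i source  -> (960, 720), pillarboxed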
The DVI connection, on the other hand, makes the TV act as a "computer
monitor," so it just takes the image sent and presents it--without
modification (for the most part--my TV scales it to take up 92% of the
screen to compensate for overscan, but you get the idea).
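
Continuing that sketch, the 92% overscan compensation just shrinks the
usable drawing area (again assuming a 1280x720 panel):

    # 92% overscan compensation on an assumed 1280x720 panel:
    usable_w = int(round(1280 * 0.92))   # 1178 pixels
    usable_h = int(round(720 * 0.92))    # 662 pixels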
Although it's theoretically possible to send a signal over the firewire
port to allow the TV to do the decoding/deinterlacing/scaling for you,
to make this work, you would have to create a valid ATSC stream to pass
over the firewire port. That means encoding--in real time--the Myth
display in one of the valid ATSC formats (
http://www.hdtvprimer.com/ISSUES/what_is_ATSC.html ). Unfortunately,
we're still several years away from having general-purpose CPU's
powerful enough to do real-time encoding of high-definition video (720p
or 1080i)--it takes a pretty powerful CPU to just decode an MPEG-2
high-definition stream. Also, your GUI quality would probably be
reduced by the encoding, but you would have good video quality.
However, as long as Myth is doing equally well with the
decoding/deinterlacing/scaling, there's no benefit to using the TV to do
that work for you--especially since the TV is a black box and Myth gives
you full control over the processing (even allowing things like time
stretch and filtering).
Mike