It seems like every time I feel like I have a good grasp on how bitrates, resolutions, and codecs play together, something comes up that confuses me.  Like just about everyone, I'm trying to get the best quality for the least space, and I thought I was doing alright.  My WAF doesn't tolerate graininess very well, and blockiness is a killer.  I have a PVR-150 that I record with at 352x480 3500 / 720x480 5000 / 720x480 6000, by quality level.  The HQ ones I leave alone, but the low-quality ones I autotranscode to MPEG-4 at 352x480 2200 with the four high-quality options turned on.  This seems to give good quality, with a half-hour show coming in at around 700MB.

Last night, though, I ripped a few DVDs to take with me on a trip this weekend, and I realized that I was ripping a full 140-minute DVD to about 1GB with perfectly good quality, at only around 1000 kbps VBR.  If I tried to transcode my shows at that rate, they'd look terrible.  I'm sure I'm missing a basic principle somewhere, but I can't figure out what it is.  I would think changing the format from MPEG-2 to MPEG-4 would have something to do with it, but wouldn't that apply to the DVD as well?  Two-pass encoding might be part of the answer too (I don't think Myth can do that, can it?)
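To sanity-check my numbers: file size is just average bitrate times duration, no matter what codec is involved, so the sizes at least are consistent with the bitrates (the helper below is just my own back-of-envelope script, assuming those figures are kbps):

```python
# Sanity check: average bitrate * running time = file size, codec-independent.
def size_gb(kbps, minutes):
    """Approximate file size in GB at a given average bitrate and duration."""
    return kbps * 1000 * minutes * 60 / 8 / 1e9

print(size_gb(1000, 140))  # the 140-minute DVD rip: ~1.05 GB, matching the ~1GB I saw
print(size_gb(2200, 30))   # a half-hour transcode: ~0.5 GB of video, before audio/overhead
```

So the mystery isn't the sizes, it's why the same bitrate looks so much better on the DVD material than it would on my recordings.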

Can someone explain this effect to me?

Thanks,
-- Ryan
_______________________________________________
mythtv-users mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
