Dieter wrote:
> Xbitlabs did some measurements of how much CPU it takes to
> play video:
> http://xbitlabs.com/articles/video/display/video-playback.html
> Of course they did this with binary drivers for virus-server.
> And they didn't hunt down high bitrate sources, or tell us
> what the bitrate of the sources is. And they used an X2 "FX"
> (expensive) CPU. Some of the tests would have failed using a
> more normal CPU, much less a low end one or an older model.
> They do provide some maximum bitrates:
> - DVD MPEG-2: max bitrate approx. 10 Mbps
> - MPEG-2 HD: up to 80 Mbps
> - MPEG-4 (DivX) HD: capabilities are formally limited to 1280x720
>   resolution at 30 fps and a bitrate of 20 Mbps [seems odd]
> - MPEG-4 AVC/H.264: up to 40 Mbps (HD DVD and Blu-ray)
> H.264 takes 2-3 times as much CPU as MPEG-2 (for whatever bitrate
> sources they used).
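To put those maximum bitrates in perspective, here is a quick back-of-the-envelope sketch converting each one into the amount of compressed data the decoder has to parse per frame. The 30 fps frame rates are my assumption, not figures from the article:

```python
# Back-of-the-envelope: compressed data the decoder must parse per frame
# at each codec's stated maximum bitrate (30 fps assumed, not from the article).

def bits_per_frame(bitrate_mbps, fps):
    """Compressed bits arriving per frame at a given bitrate and frame rate."""
    return bitrate_mbps * 1_000_000 / fps

sources = {
    "DVD MPEG-2 (10 Mbps)": bits_per_frame(10, 30),
    "MPEG-2 HD (80 Mbps)":  bits_per_frame(80, 30),
    "DivX HD (20 Mbps)":    bits_per_frame(20, 30),
    "H.264 HD (40 Mbps)":   bits_per_frame(40, 30),
}

for name, bits in sources.items():
    print(f"{name}: {bits / 8 / 1024:.0f} KiB of compressed data per frame")
```

Note that bits-per-frame only measures parsing load; H.264 costs more CPU per bit than MPEG-2 because of CABAC and the deblocking filter, which is consistent with the 2-3x figure above.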
> A "rather low bit-rate" H.264 stream would nearly max out a single FX
> CPU, so a higher bitrate H.264 stream would fail on a single CPU, even
> with a "premium" graphics card. Forget using a normal CPU, much less a
> low end or older model. With an "entry level" card, their "rather low
> bit-rate" H.264 needed 67.2% of the X2 FX CPU. A high bitrate H.264
> stream might well fail even with their X2 FX CPU.
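If decode cost scales roughly linearly with bitrate (a simplifying assumption on my part; the article doesn't say), the 67.2% figure lets us estimate how little headroom is left:

```python
# Rough headroom estimate: assuming CPU load scales ~linearly with bitrate,
# how much can the bitrate grow before decode alone saturates the CPU?

measured_load = 67.2   # % of the X2 FX CPU for the "rather low bit-rate" H.264
budget = 100.0         # % available if the CPU does nothing else at all

headroom_factor = budget / measured_load
print(f"Bitrate can grow ~{headroom_factor:.2f}x before decode saturates the CPU")
print(f"2x bitrate => ~{2 * measured_load:.0f}% CPU, i.e. dropped frames")
```

Even under this optimistic linear model, a stream at twice the tested bitrate is already out of reach for the top-end CPU, which is the point being made above.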
> Questions:
>
> How much hardware assist are these graphics cards providing? I get the
> impression that ATI and Nvidia concentrate on gaming; perhaps we can
> beat them at video decoding?
If we offer an add-on card with a media processor chip, then yes.
> How much CPU is it acceptable to require? CPUs have other things to do
> besides decoding video, so using 100% of the CPU is not going to work.
> Is it really acceptable to require a high end X2 FX CPU?
I don't know if that is the question. Decoding 1080p/30 does require a
high-end dual-core CPU.
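The raw pixel throughput gap backs that up: 1080p/30 pushes roughly six times the pixels of a DVD stream, before accounting for H.264's higher per-pixel cost. This is my own arithmetic, not a figure from the thread:

```python
# Pixel-throughput comparison: 1080p/30 vs NTSC DVD (my arithmetic,
# not a figure from the Xbitlabs article).

def mpixels_per_sec(width, height, fps):
    """Decoded pixel throughput in megapixels per second."""
    return width * height * fps / 1e6

dvd = mpixels_per_sec(720, 480, 30)     # NTSC DVD
hd  = mpixels_per_sec(1920, 1080, 30)   # 1080p at 30 fps

print(f"DVD: {dvd:.1f} Mpixel/s, 1080p/30: {hd:.1f} Mpixel/s, "
      f"ratio {hd / dvd:.1f}x")
```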
--
JRT
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)