According to AnandTech,

The $100 GeForce 8500 supposedly does 100% of the HD H.264 decode.

But that doesn't include decrypting the data, which takes about 40% of a
dual-core CPU, so figure roughly 80% of a single core.
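Rough math on that conversion, assuming the article's 40% figure is
averaged across both cores of a dual-core CPU:

\[
0.40 \times 2~\text{cores} \approx 0.80~\text{of one core}
\]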

Without HW assist, CPU load was pegged in the high 90s percent, so I assume
it was actually dropping frames.  Decoding with the CPU also used more power
(watts) than decoding with the GPU.

I don't see any mention of bitrate, but some of their images don't load;
perhaps the useful data is hiding there.

I assume their video-decode hardware is undocumented.

"the NVIDIA GPUs don't handle CAVLC/CABAC for VC1 decode"

"ATI uses its shader units to handle video decode"

"H.264 offload is absolutely necessary for good Blu-ray/HD-DVD playback."