Joshua Penix wrote:
> On Jul 11, 2006, at 6:34 PM, DJA wrote:
>> Who really cares if it's a closed source driver? It works! I've had
>> almost no problems using nVidia cards in Linux since RH6. I have a
>> 256MB GeForce Go 6600 in my laptop and have experienced zero problems
>> in Linux. It also works well enough in Linux to play FarCry.
> Agreed, if gaming is a priority, then who cares. But there are people
> who *do* care about openness. And there are people who have
> experienced problems and have been shunned by the kernel developers due
> to tainted modules.
People who bitch and moan about the binary drivers from NVIDIA, etc., do not understand 3D
at all. They are not being realistic; they are caught up in the quest for their own perfect
little world where everything is free, no one needs to make a buck, and competitors never
steal from one another.
I worked on video compression hardware for a company that produced the first MPEG-2
digital cable and satellite equipment. A competitor paid an IT employee to steal a
computer and a hard drive - my test station hard drive, in fact - so that they could get
hold of our software and algorithms. I've also had a rival game developer steal my 3D
source code from a "private" Windows FTP server and use the algorithms and ideas in their
game. Openness does not work in all situations, especially in such a competitive area.
Those who think it should work everywhere need to get over it and enter the real world.
>> As for heat, again, who cares! If you want a good performing video
>> chipset then heat is a given.
> Not so much these days. With 90nm GPUs aimed at mobile applications,
> I'd expect them not to get ridiculously hot or battery hungry. 3D
> chipsets these days are so wicked good that no game uses all of the
> capability.
Mobile GPUs are, in general, lower performing than their desktop counterparts. They have
to be in order to produce less heat and draw less power. Aside from space limitations in
the laptop, that's one reason they generally carry less video memory (or use shared system
memory). Less memory in a 3D application means lower performance, because textures and
geometry that don't fit in video RAM have to be fetched over the much slower system bus.
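To put rough numbers on it, here's a back-of-the-envelope sketch in Python. Every figure
is assumed for illustration (the resolution, texture count, and texture size are made up,
not measured from any real card):

  # Rough VRAM budget for a double-buffered scene plus a depth buffer.
  def framebuffer_bytes(width, height, bytes_per_pixel=4, surfaces=3):
      # front buffer + back buffer + depth/stencil buffer
      return width * height * bytes_per_pixel * surfaces

  def texture_bytes(count, size=512, bytes_per_texel=4, mipmapped=True):
      base = count * size * size * bytes_per_texel
      # a full mipmap chain adds roughly 1/3 on top of the base level
      return base * 4 // 3 if mipmapped else base

  MB = 1024 * 1024
  fb = framebuffer_bytes(1600, 1200)   # ~22 MB at 1600x1200
  tex = texture_bytes(count=200)       # ~267 MB of 512x512 textures
  print(f"framebuffer: {fb / MB:.0f} MB, textures: {tex / MB:.0f} MB")

A 256MB desktop card keeps most of that resident; a 128MB mobile part has to page the
overflow across the bus every frame it's needed, and the frame rate shows it.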
> Five years ago you could've said the same thing about laptop CPUs -
> heat is a given if you want performance.
> Not really true anymore.
With the recent advent of asynchronous (clockless) processors - ARM announced one, the
ARM996HS, just a couple of months ago - I would expect performance to increase and heat
dissipation to decrease in the near future. Without a global clock tree driving every
flip-flop on every cycle, the logic only switches when it has actual work to do. It may
take some time to redesign GPUs and CPUs into an asynchronous architecture, but I suspect
that's the way the industry will have to go.
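As a toy illustration only (no real async design works exactly like this): in a clockless
pipeline, stages exchange data through request/acknowledge handshakes instead of marching
to a shared clock, so a stage sits completely idle until work actually arrives. A depth-1
queue between two Python threads models that rendezvous:

  import queue
  import threading

  def stage(inbox, outbox, work):
      # An asynchronous pipeline stage: idle until a "request" arrives.
      while True:
          item = inbox.get()         # blocks - no clock, no busy polling
          if item is None:           # shutdown token
              outbox.put(None)
              return
          outbox.put(work(item))     # passing the result on plays the
                                     # role of the "acknowledge"

  link = queue.Queue(maxsize=1)      # depth-1 channel ~ a req/ack wire pair
  results = queue.Queue()
  threading.Thread(target=stage, args=(link, results, lambda x: x * 2)).start()

  for value in (1, 2, 3, None):
      link.put(value)                # put() blocks until the stage is ready
  while (item := results.get()) is not None:
      print("result:", item)         # prints 2, 4, 6

The power argument falls out of the blocking get(): nothing toggles unless there is data
to move, whereas a synchronous design drives its entire clock tree every cycle regardless.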
(I knew this discussion would ensue as soon as I read the first post. It always
does. :) )
PGA
--
Paul G. Allen
Owner, Sr. Engineer, BSIT/SE
Random Logic Consulting Services
www.randomlogic.com
--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list