On Tuesday 03 October 2006 17:50, Jason Papadopoulos wrote:
> Quoting John R Pierce <[EMAIL PROTECTED]>:
> > how many people have pentium/athlon CPUs?
> >
> > what percentage of those people have X1900's ? (noting that these are
> > $400 video cards, bought almost exclusively by the 'extreme gamer' crowd
> > who are willing to spend twice the money for tiny increments of game
> > performance)
>
> How many of those graphics cards implement double-precision floating point?
>
> While it's possible to implement an LL test using only 32-bit integers,
> unless those graphics cards can do wide integer multiplies, that scheme
> would be impossible to port as well.
>
> Using graphics cards for finding primes is a long-standing FAQ, and the
> answer is always no, but as time goes on there are fewer and fewer
> reasons it's impossible.
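
For reference, the Lucas-Lehmer test mentioned above is only a few lines if
you have arbitrary-precision arithmetic available - which is exactly what the
graphics cards of the time lacked. A minimal sketch in Python (Python's native
big integers sidestep the wide-multiply problem; real LL implementations use
FFT-based multiplication instead of schoolbook big-integer arithmetic):

```python
def lucas_lehmer(p):
    """Return True if the Mersenne number 2**p - 1 is prime.

    p itself must be an odd prime for the test to be valid.
    """
    m = (1 << p) - 1            # the Mersenne number 2^p - 1
    s = 4
    for _ in range(p - 2):
        # The squaring here is the step that needs wide multiplies;
        # Python's big integers handle it transparently.
        s = (s * s - 2) % m
    return s == 0

print(lucas_lehmer(7))   # True: 2^7 - 1 = 127 is prime
print(lucas_lehmer(11))  # False: 2^11 - 1 = 2047 = 23 * 89
```

For the exponents GIMPS cares about, the modular squaring is done with an
IBDWT-based FFT in double precision, which is why double-precision floating
point support on the card matters.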

Surely the overriding problem is that the manufacturers of graphics cards tend 
to keep the driver code confidential. The Linux drivers, for example - if 
provided at all - "work", but by no means expose the full capabilities of the 
graphics card.

If you don't have access to the hardware except through a proprietary driver 
that responds only to graphics-specific commands - and doesn't even have a 
published API unless you're using the crippled Linux version - you're going 
to have great difficulty persuading the thing to do any useful computation, 
whatever its actual capabilities may be.

Regards
Brian Beesley
_______________________________________________
Prime mailing list
[email protected]
http://hogranch.com/mailman/listinfo/prime
