Intel and Nvidia have been having a tiff for the last couple of years.
Recently, Intel didn't allow Nvidia access to its chipset for the latest
generation of CPUs, which forced Nvidia into a roundabout way of
processing, and it's still fast.

It's like saying, "Hey, let's make a cake." You've been making cakes
together for five years, then all of a sudden the other person doesn't
want to share the recipe.
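
For a sense of what the "14x" claim in the article below actually measures,
here's a toy CUDA sketch that times the same kernel on the CPU and on the
GPU. To be clear, this is not Intel's benchmark code; the SAXPY kernel, the
problem size, and the launch configuration are all made-up assumptions to
keep the example short:

// saxpy_compare.cu -- toy GPU-vs-CPU kernel timing, illustration only.
// The kernel (SAXPY), problem size, and block size are assumptions,
// not the workloads from Intel's ISCA paper.
#include <cstdio>
#include <cstdlib>
#include <ctime>
#include <cuda_runtime.h>

__global__ void saxpy_gpu(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}

static void saxpy_cpu(int n, float a, const float *x, float *y) {
    for (int i = 0; i < n; ++i)  // plain serial loop, no SSE, no threading
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 24;  // ~16M floats, ~64 MB per array
    const size_t bytes = n * sizeof(float);
    float *x = (float *)malloc(bytes);
    float *y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Ship the inputs to the GPU up front so both runs see the same data.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    // Time the CPU version with the C clock.
    clock_t t0 = clock();
    saxpy_cpu(n, 2.0f, x, y);
    double cpu_ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;

    // Time just the GPU kernel with CUDA events (transfers excluded).
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    saxpy_gpu<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float gpu_ms = 0.0f;
    cudaEventElapsedTime(&gpu_ms, start, stop);

    printf("CPU: %.2f ms  GPU: %.2f ms  speedup: %.1fx\n",
           cpu_ms, gpu_ms, cpu_ms / gpu_ms);

    cudaFree(dx); cudaFree(dy); free(x); free(y);
    return 0;
}

Note that the sketch times only the kernel itself and leaves out the
host-to-device copies; choices like that are exactly the kind of thing
these dueling benchmark numbers tend to hinge on.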

On Fri, Jun 25, 2010 at 5:07 AM, Martin Baxter <martinbaxt...@gmail.com> wrote:

>
>
> Oughtta be fun...
>
>
> On Thu, Jun 24, 2010 at 5:32 PM, Mr. Worf <hellomahog...@gmail.com> wrote:
>
>>
>>
>> I think that in the next year or two Intel and Nvidia will be competing
>> head to head in the CPU market.
>> NVIDIA thanks Intel for saying GPUs are 'only' 14 times faster than CPUs
>> By Donald Melanson, posted Jun 24th 2010 1:52PM
>>
>> http://www.engadget.com/2010/06/24/nvidia-thanks-intel-for-saying-gpus-are-only-14-times-faster-t/
>> Well, we've gone a full month since the last episode
>> (http://www.engadget.com/2010/05/24/nvidia-intels-moorestown-is-like-an-elephant-on-a-diet-ipad-s/)
>> of NVIDIA's and Intel's ongoing public feud, but it looks like Intel has
>> now stoked the flames once again (albeit inadvertently) in a paper
>> presented at the recent International Symposium on Computer Architecture.
>> The paper attempted to debunk the "100X GPU vs. CPU myth," but it also
>> contained the tidbit that GPUs are "only" up to 14 times faster than CPUs
>> at running application kernels, which NVIDIA has more than happily
>> latched onto. In a blog post, NVIDIA's Andy Keane says that it's a "rare
>> day" when a competitor states that its technology is *only* 14x faster,
>> and that he can't recall another time when he's "seen a company promote
>> competitive benchmarks that are an order of magnitude slower." Of course,
>> he then goes on to note that Intel's tests were done with NVIDIA's
>> previous-generation GeForce GTX 280, and that the code was simply run out
>> of the box without any optimization -- but, still, he seems more than
>> happy to accept this bit of "recognition." In Intel's defense, however,
>> the overall finding of the paper (linked below) is that the performance
>> gap between a GTX 280 GPU and a Core i7 960 processor is actually just
>> 2.5X "on average," which NVIDIA hasn't highlighted for some reason.
>>
>>
>
>
>
> --
> "If all the world's a stage and we are merely players, who the bloody hell
> wrote the script?" -- Charles E Grant
>
> http://www.youtube.com/watch?v=fQUxw9aUVik
>
>
> 
>



-- 
Celebrating 10 years of bringing diversity to perversity!
Mahogany at: http://groups.yahoo.com/group/mahogany_pleasures_of_darkness/
