On Thu, 10 Sep 2015 12:20:39 +0000 (UTC) james <wirel...@tampabay.rr.com> wrote:

> Fernando Rodriguez <frodriguez.developer <at> outlook.com> writes:
> 
> > > albeit in its infancy. Naturally it's going to take a while to 
> > > become mainstream useful; but that's more like a year or two, at most.
> > 
> > The value I see in that technology for desktop computing is that we
> > get to use the GPUs for what they're made for (graphics processing),
> > since their resources go unused by most applications; not in buying
> > powerful GPUs for the purpose of offloading general purpose code. If
> > that's the goal, you're better off investing in more general purpose
> > cores that are better suited for the task.

That is true.
 
> I think most folks, when purchasing a workstation, include a graphics
> card on the list of items to include. So my suggestions were geared
> towards informing folks about some of the new features of gcc that
> may entice them to consider the graphics card's resources in an
> expanded vision of general resources for their workstation.
> 
> > To truly take advantage of the GPU, the actual algorithms need to be
> > rewritten to use features like SIMD and other advanced parallelization
> > features; most desktop workloads don't lend themselves to that kind
> > of parallelization.

And that is also true.
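
To make that concrete, below is a minimal sketch of the kind of loop that
actually maps well onto a GPU. The pragma is standard OpenACC; the gcc
invocation assumes a 5.x-or-later build configured with an offload target
such as nvptx, which most distro compilers are not:

/* saxpy.c -- every iteration here is independent, which is what lets
 * the compiler spread the loop across the GPU; most branchy desktop
 * code has no loop like this to offload.
 *
 * Build (assuming an offload-enabled gcc): gcc -fopenacc saxpy.c
 */
#include <stdio.h>

#define N 1000000

static float x[N], y[N];

int main(void)
{
    for (int i = 0; i < N; i++) {
        x[i] = 1.0f;
        y[i] = 2.0f;
    }

    /* Ask the compiler to offload this loop to an accelerator;
     * without one, it simply runs on the host. */
    #pragma acc parallel loop copyin(x) copy(y)
    for (int i = 0; i < N; i++)
        y[i] = 2.0f * x[i] + y[i];

    printf("y[0] = %f\n", y[0]);   /* expect 4.000000 */
    return 0;
}

Anything that can't be rewritten in roughly this shape (independent
iterations over large arrays) gains little from offloading, which is
exactly the point above.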

> Not true if what OpenACC hopes to achieve indeed does become a reality.

Hopes almost never become reality.

> Currently, you are most correct.

Absolutely correct.

...

> 
> When folks buy new hardware, it is often a good time to look at what
> is on the horizon for computers they use.

I also considered "what is on the horizon" when I bought a brand new
ATI Radeon HD 4770 graphics card about 6 years ago for computing purposes.

Within half a year it was discovered that it had much worse performance than
the ATI engineers had hoped for, and that, to improve it, they would have to
rewrite their proprietary driver for this graphics card.

Instead of doing that, they just shamelessly dropped support for the parallel
computing features of this graphics card in all subsequent versions of their
driver.

And as far as I know, no open source driver has ever supported the parallel
computing features of this graphics card either.

So it was just a waste of money. Even worse: I have almost never worked at the
almost 7-year-old 4-core AMD computer I assembled with this graphics card,
because for all other purposes I prefer to work at my 10-year-old 2-core AMD
computer with a very cheap on-board video card, just to avoid the extra heat
and fan noise produced by the HD 4770.

So, Rich Freeman was absolutely right when he wrote, in reply to your words
above:

> If all you need today is a $30 graphics card, then you probably should
> just spend $30.  If you think that software will be able to use all
> kinds of fancy features on a $300 graphics card in two years, you
> should just spend $30 today, and then wait two years and buy the fancy
> graphics card on clearance for $10.

> It is pretty rare that it is a wise move to spend money today on
> computer hardware that you don't have immediate plans to use.

