Andi and Ben,

On 12/12/08, wann...@ababian.com <wann...@ababian.com> wrote:
>
> I don't remember what references there were earlier in this thread, but I
> just saw a link on reddit to some guys in Israel using a GPU to greatly
> accelerate a Bayesian net.  That's certainly an AI application:
>
> http://www.cs.technion.ac.il/~marks/docs/SumProductPaper.pdf
>
> http://www.reddit.com/r/programming/comments/7j1gr/accelerating_bayesian_network_200x_using_a_gpu/


My son was trying to get me interested in doing this ~3 years ago, but I
blew him off because I couldn't see a workable business model around it. It
is 100% dependent on pasting together a bunch of hardware that is designed
to do something ELSE, and even a tiny product change would throw software
compatibility and other things out the window.

Also, the architecture I am proposing promises ~3 orders of magnitude more
speed, along with a really fast global memory that completely obviates the
complex caching they are proposing.
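For anyone skimming the linked paper: the kernel being GPU-accelerated there is the sum-product message update of belief propagation. Here is a minimal pure-Python sketch of one such update, just to show the sum-over-products structure that parallelizes well; the function name and the toy factor are illustrative, not taken from the paper:

```python
# Sketch of a single sum-product message update in belief propagation,
# the operation the linked paper runs on the GPU. Illustrative only.

def sum_product_message(factor, incoming, target_axis):
    """Message from a two-variable factor to one of its variables.

    factor:      2D list, factor[i][j] = f(x=i, y=j)
    incoming:    message from the *other* variable (list of floats)
    target_axis: 0 to send the message to x, 1 to send it to y
    """
    rows, cols = len(factor), len(factor[0])
    if target_axis == 0:
        # m(x) = sum_y f(x, y) * incoming(y)
        return [sum(factor[i][j] * incoming[j] for j in range(cols))
                for i in range(rows)]
    else:
        # m(y) = sum_x f(x, y) * incoming(x)
        return [sum(factor[i][j] * incoming[i] for i in range(rows))
                for j in range(cols)]

# Toy factor over two binary variables, with a trivial incoming message:
f = [[2, 1],
     [1, 3]]
print(sum_product_message(f, [1, 1], target_axis=0))  # [3, 4]
```

Every output entry is an independent reduction over the factor table, which is why the operation maps so naturally onto GPU hardware.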

Steve Richfield



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=123753653-47f84b
Powered by Listbox: http://www.listbox.com