Hmmm... the reply on Tool AI is interesting, but Holden's original critique
of SI is also worth reading:

http://lesswrong.com/lw/cbs/thoughts_on_the_singularity_institute_si/

It's a good deal more discerning than Hugo's critique, I'd say...


ben

On Tue, Aug 21, 2012 at 6:43 PM, Michael Anissimov
<[email protected]> wrote:

> For those interested in current Singularity Institute research:
>
> http://singularity.org/research/
>
> Also possibly of interest, our executive director's recent reply to Holden
> Karnofsky on what he calls "tool AI":
>
> http://lesswrong.com/lw/cze/reply_to_holden_on_tool_ai/
>
> We have recently hired two research fellows, Alex Altair and Kaj Sotala.
> Both are exclusively focused on AI research.
>
> --
> Michael Anissimov
> Singularity Institute
> www.singularity.org
>



-- 
Ben Goertzel, PhD
http://goertzel.org

"My humanity is a constant self-overcoming" -- Friedrich Nietzsche



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
