Matt.

On 10/20/08, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
> The singularity list is probably more appropriate for philosophical
> discussions about AGI.


Only those discussions that relate AGI to singularity.

Another one for Ben's list:

*Basic Economic Feasibility: It has been proposed that intelligent but not
super-intelligent machines would have great economic value. Others have
countered that we already have plenty of such biological machines, so that
more of the same intelligence would be worthless. That in turn has been
answered by pointing to hazardous and/or biologically impossible
environments where only an intelligent machine could work. This seems to
fall squarely into the realm of basic business-plan projections, where the
cost of engineering and manufacture is recovered through sales as the
market is penetrated. An abbreviated business plan showing quantitatively
how a profit might be made would go a LONG way toward settling this
argument.*
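
To make the point concrete, the "abbreviated business plan" boils down to a
simple break-even calculation. A minimal sketch follows; every figure in it
is an invented placeholder (engineering cost, unit cost, and sale price are
pure assumptions), so it only illustrates the shape of the argument, not an
actual projection:

```python
# Hypothetical break-even sketch for an intelligent-machine product.
# All dollar figures below are illustrative assumptions, not estimates.

def breakeven_units(engineering_cost, unit_cost, unit_price):
    """Units that must be sold before cumulative margin covers the
    up-front engineering (non-recurring) cost."""
    margin = unit_price - unit_cost
    if margin <= 0:
        raise ValueError("no per-unit profit; break-even is impossible")
    # Ceiling division: a fraction of a machine cannot be sold.
    return -(-engineering_cost // margin)

# Assumed numbers: $50M engineering, $200k to build each machine,
# $350k sale price (e.g. replacing labor in hazardous environments).
units = breakeven_units(50_000_000, 200_000, 350_000)
print(units)  # 334 machines to break even
```

Anyone claiming economic feasibility (or infeasibility) could plug their own
numbers into something like this and let the arithmetic, rather than
rhetoric, carry the argument.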

Steve Richfield



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now