On 27 August 2014 05:31, Terren Suydam <[email protected]> wrote:

> For what it's worth, autonomous human-level (or greater) AI, or AGI,
> will *most likely* require an architecture, as yet poorly understood,
> that is entirely different in nature from the kinds of architectures
> being engineered by those interests who desire highly complicated
> slaves.  In other words, I'm not losing any sleep over the military
> accidentally unleashing a terminator.  If I'm going to lose sleep over
> a predictable sudden loss of well-being, I will focus instead on the
> much less technical and much more realistic threats arising from
> economic/societal collapse.
>
> Or the (probably even more likely) threat of ecological disaster.
