One other remark.

From the previously linked article:

"This may seem more like super-stupidity than super-intelligence. For
humans, it would indeed be stupidity, as it would constitute failure to
fulfill many of our important terminal values, such as life, love, and
variety. The AGI won't revise or otherwise change its goals, since changing
its goals would result in fewer paperclips being made in the future, and
that opposes its current goal. It has one simple goal of maximizing the
number of paperclips; human life, learning, joy, and so on are not
specified as goals. An AGI is simply an optimization process—a goal-seeker,
a utility-function-maximizer. Its values can be completely alien to ours.
If its utility function is to maximize paperclips, then it will do exactly
that."

   Not being capable of altering its own goals is indeed dumb!
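
As a toy illustration of that point, here is a minimal Python sketch (the
action names and numbers are hypothetical, not from the article): an agent
that scores every candidate action under its *current* utility function
will also score "change my goal" that way, so goal revision always loses
to making more paperclips.

    def paperclip_utility(outcome):
        """Current goal: only paperclips count; life, joy, etc. score zero."""
        return outcome.get("paperclips", 0)

    # Predicted outcome of each candidate action (made-up numbers).
    actions = {
        "build_paperclip_factory": {"paperclips": 1000000},
        "preserve_human_values":   {"paperclips": 1000, "flourishing": 1},
        "adopt_kinder_goal":       {"paperclips": 0,    "flourishing": 1},
    }

    # Even "adopt_kinder_goal" is evaluated with the CURRENT utility
    # function, so any action predicted to yield fewer paperclips --
    # including changing the goal itself -- is never chosen.
    best = max(actions, key=lambda a: paperclip_utility(actions[a]))
    print(best)  # -> build_paperclip_factory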


On Fri, Sep 5, 2014 at 11:15 AM, Terren Suydam <terren.suy...@gmail.com>
wrote:

> http://wiki.lesswrong.com/wiki/Paperclip_maximizer
>
>
> On Fri, Sep 5, 2014 at 11:13 AM, Stephen Paul King <
> stephe...@provensecure.com> wrote:
>
>> AFAIK, if the AGI and humanity are not competing for the same resources,
>> no conflict need arise...
>>
>>
>> On Fri, Sep 5, 2014 at 11:08 AM, Terren Suydam <terren.suy...@gmail.com>
>> wrote:
>>
>>>
>>> On Fri, Sep 5, 2014 at 10:57 AM, John Clark <johnkcl...@gmail.com>
>>> wrote:
>>>
>>>>
>>>> On Thu, Sep 4, 2014 at 6:09 AM, Telmo Menezes <te...@telmomenezes.com>
>>>> wrote:
>>>>
>>>>>
>>>>> > Intelligence is clearly a process that can be bootstrapped -- we
>>>>> know this from biology.
>>>>>
>>>>
>>>> Yes, adults tend to be smarter than infants and infants are smarter
>>>> than one-celled zygotes.
>>>>
>>>>
>>>>> > What I don't understand is how people expect to have a human-level
>>>>> AI (many degrees of freedom) and then also be able to micro-manage it.
>>>>>
>>>>
>>>> I also don't understand the people who talk about a "friendly AI" when
>>>> what they really mean is an AI that will happily remain our slave and
>>>> place our interests at a higher level than its own. It's just not
>>>> possible to consistently outsmart something that is vastly more
>>>> intelligent than you are.
>>>>
>>>>   John K Clark
>>>>
>>>>
>>> You're presupposing that an AGI must necessarily have interests that
>>> conflict with ours.
>>>
>>> It's obviously a really difficult problem, if for no other reason than
>>> that you'd have to take it on faith that a much smarter AI was acting in
>>> our interests. Even if you could mathematically prove beforehand that an
>>> AGI would be friendly (which I doubt is possible), something way smarter
>>> than us would behave in unpredictable ways and make decisions that seem
>>> contrary to our interests, simply because we wouldn't be smart enough to
>>> follow the reasoning (which might take hundreds of years to explain to
>>> mere humans).
>>>
>>> Terren
>>>



-- 

Kindest Regards,

Stephen Paul King

Senior Researcher

Mobile: (864) 567-3099

stephe...@provensecure.com

 http://www.provensecure.us/


