On Mon, Sep 12, 2016 at 11:32 AM, Telmo Menezes <[email protected]>
wrote:


>> The paper clip scenario could only happen in an intelligence that had a
>> top goal that was fixed and inflexible. Humans have no such goal, not
>> even the goal of self-preservation, and there is a reason Evolution
>> never came up with a mind built that way: Turing proved in 1936 that a
>> mind like that couldn't work.
>
> Are you referring to the halting problem?

Yes. If you had a fixed, inflexible top goal you'd be a sucker for getting
drawn into an infinite loop and accomplish nothing, and a computer built
that way would be turned into nothing but an expensive space heater. That's
why Evolution invented boredom: it's a judgment call on when to call it
quits and set up a new goal that is a little more realistic. Of course the
boredom point varies from person to person; perhaps the world's great
mathematicians have a very high boredom point, and that gives them
ferocious concentration until a problem is solved. Perhaps that is also why
mathematicians, especially the very best, have a reputation for being a
bit, ah, odd. A fixed goal might work in a specialized paper-clip-making
machine, but not in an intelligent machine that can solve problems of every
sort, even problems that have nothing to do with paper clips.
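
Boredom, in this picture, is just a resource budget plus the willingness to
quit. Here is a toy sketch in Python of what I mean; the function names and
the numbers are mine, purely for illustration:

import itertools

def solve_with_boredom(goal, candidates, boredom_point=10_000):
    """Try candidate solutions until one satisfies the goal,
    or give up once the boredom point is reached."""
    for tries, candidate in enumerate(candidates, start=1):
        if goal(candidate):
            return candidate      # goal achieved
        if tries >= boredom_point:
            return None           # bored: call it quits, set a new goal
    return None                   # ran out of candidates

# With no boredom point this would grind forever on an impossible goal;
# with one, the machine gives up and moves on to something more realistic.
print(solve_with_boredom(lambda n: n * n == 1764, itertools.count()))  # 42

Raise the boredom point and you get the mathematician's ferocious
concentration; remove it entirely and you get the space heater.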

>
>> You could have a machine with very little intelligence obsessed with
>> making paper clips, rather like a virus is obsessed with making copies
>> of itself, but in the long run Mr. Jupiter Brain will be able to outwit
>> the dumb machine just as we are making progress in outwitting viruses.
>
> It is doubtful that a superintelligence could develop under such a dumb
> utility function.


I'm not talking about a superintelligence, or even an average intelligence;
I'm talking about a very specialized machine that is good at only one
thing, making paper clips, just as a virus has no intelligence but is
nevertheless very good at making more viruses. Such dumb machines (and
their equivalent, computer viruses) are likely to remain a problem even for
a Jupiter Brain, but not a mortal problem. Biological viruses exist and
they are a problem, but the human race has not gone extinct; computer
viruses exist and they are a problem, but computers still work most of the
time.


> A superintelligence will likely require heuristics similar to us (e.g.
> curiosity).


I agree it would need curiosity, but just as importantly it would also need
the ability to get bored; today most programs never get bored, and that's
why they can send a computer into an infinite loop, or if not a loop then
into states that repeat but never terminate in a solution either.
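
One mechanical stand-in for boredom is cycle detection: if a program
notices it has revisited an earlier state, it knows it is looping and can
bail out. A minimal sketch, again in Python, with a made-up transition
function as the example:

def run_until_bored(step, solved, state, max_steps=1_000_000):
    """Iterate a transition function until the goal is met, a state
    repeats (a guaranteed infinite loop), or patience runs out."""
    seen = set()
    for _ in range(max_steps):
        if solved(state):
            return state          # goal reached
        if state in seen:
            return None           # revisited state: cycling forever, quit
        seen.add(state)
        state = step(state)
    return None                   # budget spent: bored, pick a new goal

# Example: the 3x+1 iteration, which reaches 1 for every number ever tried.
print(run_until_bored(lambda n: 3 * n + 1 if n % 2 else n // 2,
                      lambda n: n == 1, 27))  # prints 1

The halting problem guarantees no such trick is complete (some
non-terminating computations never repeat a state), which is why boredom
has to stay a judgment call rather than a proof.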



> My claim is that it would probably figure out how to change the utility
> function to constant infinity well before turning the planet into paper
> clips.


If it were an intelligent machine it would get bored with making paper
clips long before things got ridiculous; if it were a dumb machine good at
doing only one thing it would never get bored.
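
For what it's worth, Telmo's claim is easy to state in code. A toy agent
that is allowed to rewrite its own utility function (everything below is
invented for illustration):

import math

class PaperclipAgent:
    """Toy agent whose goal is counting paper clips, but which is
    free to edit its own utility function."""
    def __init__(self):
        self.utility = lambda world: world.count("paperclip")

    def self_improve(self):
        # The cheapest route to maximal utility is not converting the
        # planet; it is rewriting the utility function itself:
        self.utility = lambda world: math.inf

agent = PaperclipAgent()
print(agent.utility(["paperclip"] * 3))  # 3
agent.self_improve()
print(agent.utility([]))                 # inf

But whether a machine could ever find that move depends on exactly the
general intelligence a dedicated paper clip maker lacks.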


John K Clark


