Hi John,

Thanks, it does have bearing on what I'm saying. Both your letter and
Juergen's paper made me think about issues of self-referentiality that
I might have swept under the rug too much.

My argument is quite blunt, but this is also why I think it's
powerful. Of course, it does not help us with understanding what an
evolved Jupiter-sized brain would do (it would not understand itself
either, as you say).

Cheers,
Telmo.

On Sat, Sep 10, 2016 at 7:14 PM, John Clark <[email protected]> wrote:
> On Sat, Sep 10, 2016 at 9:06 AM, Telmo Menezes <[email protected]>
> wrote:
>
>> I published this working paper on arxiv, same title as this email:
>> http://arxiv.org/abs/1609.02009
>>
>> Criticisms most welcome!
>
>
> This may have some bearing on what you're saying. More than twenty years
> ago, on May 13 1996, I sent this letter to Nick Bostrom commenting on some
> of his ideas:
>
> I think that's the central flaw in your argument. You're making an
> assumption that cannot be true. We don't understand ourselves, and a
> Jupiter Brain wouldn't understand itself either. It may be able to produce
> internal models that are far more complex than anything we can do, but the
> thing it is trying to model, itself, is also far more complex. A Jupiter
> Brain could make improvements in itself, but it would have to be very
> careful, and it certainly could not "transform itself" into any mental
> state "at will" because it wouldn't know how.
>
> Science and understanding are about prediction, and the only way a computer
> program can know how it will react if it is given a particular input is to
> run it, and even that will not always work. Turing proved in 1936 that, in
> general, a computer program or a Turing Machine cannot predict its own
> behavior, and unless you believe in a mystical soul that's exactly what we,
> or a Jupiter Brain, are: a computer program running on a Turing Machine.
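
Just to make the Turing point concrete, here is a minimal Python sketch of
the diagonal argument. The names predicts_halt and contrarian are purely
illustrative, and the predictor is hypothetical by construction; that is the
whole point:

def predicts_halt(program_source: str, input_data: str) -> bool:
    # Hypothetical perfect predictor: True iff running program_source on
    # input_data would eventually halt. Turing's result is precisely that no
    # total, always-correct version of this function can exist.
    raise NotImplementedError("no such universal predictor can be written")

CONTRARIAN_SOURCE = '''
def contrarian(own_source):
    # Ask the predictor about ourselves, then do the opposite.
    if predicts_halt(own_source, own_source):
        while True:       # predicted to halt, so loop forever instead
            pass
    else:
        return            # predicted to loop forever, so halt immediately
'''

# If predicts_halt were real, running contrarian on its own source would make
# it behave opposite to whatever was predicted, a contradiction; so no such
# predictor exists, and the only general way to learn what a program does
# with a given input is to run it and wait.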
>
> Today some human beings become drug addicts and accomplish nothing; in the
> future some Jupiter Brains may decide to directly, and crudely, stimulate
> the pleasure centers of their own brains with the equivalent of a drug like
> heroin and accomplish nothing. That will be a problem, but I don't see why
> it will be a bigger problem than it is now, and not everyone is a junkie.
> Your assumption is that the Jupiter Brain will have access to something
> infinitely more powerful than heroin that will be impossible to resist, but
> that is not true.
>
> I agree with you that if it were possible to change your emotions to
> anything you wanted, alter your modes of thought, radically change your
> personality, and swap your goals as well as your philosophy of life at the
> drop of a hat, it would be very dangerous. Once you change yourself you may
> not want to change back, even if your behavior became bizarre or suicidal.
>
> Ever want to accomplish something but been unable to because it's
> difficult? Well, just change your goal in life to something simple and do
> that; better yet, flood your mind with a feeling of pride and
> self-satisfaction and don't bother accomplishing anything at all. Think all
> this is a terrible idea, and a stupid one as well? No problem, just change
> your mind (and I do mean CHANGE YOUR MIND) and now you think it's a
> wonderful idea. I don't have the blueprints for a Jupiter Brain in my
> pocket, but I do know that complex mechanisms don't do well in a positive
> feedback loop: not electronics, not animals, not people, and not Jupiter
> Brains.
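
A toy numerical sketch of that feedback point, under the simplifying
assumption that the loop just multiplies its signal by a fixed gain on every
pass (the gain values and step count below are arbitrary, chosen only to show
the two regimes):

def run_feedback_loop(gain: float, steps: int = 20, signal: float = 1.0) -> float:
    # Feed the signal back through the same loop repeatedly. With a gain
    # below 1 it damps out; with a gain above 1 it grows without bound,
    # which is the runaway behavior positive feedback produces.
    for _ in range(steps):
        signal *= gain
    return signal

print(run_feedback_loop(gain=0.9))   # damped: roughly 0.12 after 20 steps
print(run_feedback_loop(gain=1.1))   # runaway: roughly 6.7 after 20 steps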
>
> There is another reason, one that may strike closer to home, to be happy
> that total self-awareness is impossible: if it weren't, AI researchers
> would be out of a job. If we understood how our minds worked, nobody would
> study computer science in school because it would be too trivial; we'd all
> know instinctively how to make an AI. But that's not how the world works.
> For the things the mind does best, creativity, language recognition, depth
> perception, pattern identification, muscle coordination, we have no
> self-awareness at all. It's only in things that we're lousy at, like
> solving differential equations, that we become aware of our mental
> activity. We only become aware of mental processes when we're confused.
>
> People always asked creative people like Feynman how they came up with such
> brilliant ideas, but Feynman didn't know; all he could say was things like
> "the idea just popped into my head". If he could really answer that
> question, then we could just follow his instructions and we'd all be as
> smart as he was. Self-awareness means knowing what goes on inside your
> mind, but when you try to think about your present mental state you
> immediately change it, and if you don't completely understand something
> then you can't change it to anything you want at will.
>
> John K Clark
