On Tue, Feb 10, 2015 at 9:07 PM, Jason Resch <jasonre...@gmail.com> wrote:

>
>
> On Tue, Feb 10, 2015 at 12:59 PM, Telmo Menezes <te...@telmomenezes.com>
> wrote:
>
>>
>>
>>
>> On Tue, Feb 10, 2015 at 6:21 PM, Jason Resch <jasonre...@gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Tue, Feb 10, 2015 at 12:04 PM, Telmo Menezes <te...@telmomenezes.com>
>>> wrote:
>>>
>>>>
>>>>
>>>> On Tue, Feb 10, 2015 at 4:47 PM, Jason Resch <jasonre...@gmail.com>
>>>> wrote:
>>>>
>>>>> If you define increased intelligence as a decreased probability of
>>>>> holding a false belief about any randomly chosen proposition, then
>>>>> superintelligences will be wrong about almost nothing, and their beliefs
>>>>> will converge as their intelligence rises. Therefore nearly all
>>>>> superintelligences will operate according to the same belief system. We
>>>>> should stop worrying about trying to ensure friendly AI: it will either
>>>>> be friendly or it won't, according to what is right.
>>>>>
>>>>
>>>> I wonder if this isn't prevented by Gödel's incompleteness. Given that
>>>> the superintelligence can never be certain of its own consistency, it must
>>>> remain fundamentally agnostic. In this case, we might have different
>>>> superintelligences working under different hypotheses, possibly occupying
>>>> niches, just as happens in Darwinian evolution.
>>>>
>>>
>>> Interesting point. Yes, a true superintelligence may never perform any
>>> actions, as it is trapped in never being certain (and knowing it never can
>>> be certain) that its actions are right. Fitness for survival may play some
>>> role in how intelligent active agents can be before they become inactive.
>>>
>>
>> Yes, that's an interesting way to put it. I wonder.
>>
>>
>>>
>>>
>>>>
>>>>
>>>>>
>>>>> I think the chances are that it will be friendly, since I happen to
>>>>> believe in universal personhood, and if that belief is correct, then
>>>>> superintelligences will also come to believe it is correct. And with the
>>>>> belief in universal personhood, it would know that harm to others is harm
>>>>> to the self.
>>>>>
>>>>
>>>> I agree with you, with the difference that I try to assume universal
>>>> personhood without believing in it, to avoid becoming a religious
>>>> fundamentalist.
>>>>
>>>>
>>> Interesting. Why do you think having beliefs can lead to religious
>>> fundamentalism? Would you not say you believe the Earth is round? Could
>>> such a belief lead to religious fundamentalism, and if not, why not?
>>>
>>
>> This leads us back to a recurring discussion on this mailing list. I
>> would say that you can believe the Earth to be round in the informal sense
>> of the word: your estimate of the probability that the Earth is round is
>> very close to one. I don't think you can believe the Earth to be round with
>> 100% certainty without falling into religious fundamentalism. That would
>> imply, for example, total trust in your senses, which is a strong position
>> about the nature of reality that is not really backed up by anything. Just
>> like believing literally in the Bible or the Quran or Atlas Shrugged.
>>
>>
> I see. I did not mean it in the sense of absolute certitude, merely that
> universal personhood is one of my current working hypotheses derived from
> my consideration of various problems of personal identity.
>

Right. We are in complete agreement then.
Universal personhood is also one of my main working hypotheses. I wonder if
it could be considered a "preferable belief": it may be true and we are all
better off assuming it to be true.
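
As an aside, here is a minimal numeric sketch of the convergence claim in
Jason's definition above. It is purely an illustration, assuming each agent
independently holds a false belief on any given proposition with the same
error rate eps (an assumption not stated in the thread):

    # Sketch (assumption: independent agents, equal per-proposition error
    # rate eps). The probability that two agents both get a random
    # proposition right is (1 - eps)**2, a lower bound on their agreement
    # (it ignores the case of both settling on the same wrong answer).
    # As eps -> 0, the bound approaches 1: their belief systems converge.
    for eps in (0.1, 0.01, 0.001, 0.0001):
        agreement_lower_bound = (1 - eps) ** 2
        print(f"error rate {eps}: agreement on a random proposition >= "
              f"{agreement_lower_bound:.8f}")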

Telmo.


>
>
> Jason
>
