On Tue, Feb 10, 2015 at 6:21 PM, Jason Resch <[email protected]> wrote:

>
>
> On Tue, Feb 10, 2015 at 12:04 PM, Telmo Menezes <[email protected]>
> wrote:
>
>>
>>
>> On Tue, Feb 10, 2015 at 4:47 PM, Jason Resch <[email protected]>
>> wrote:
>>
>>> If you define increased intelligence as a decreased probability of
>>> holding a false belief on any randomly chosen proposition, then
>>> superintelligences will be wrong about almost nothing, and their beliefs
>>> will converge as their intelligence rises. Therefore nearly all
>>> superintelligences will operate according to the same belief system. We
>>> should stop worrying about trying to ensure friendly AI; it will either
>>> be friendly or it won't, according to what is right.
>>>
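The convergence claim above can be illustrated with a toy simulation (the proposition count, the per-proposition error rate, and the assumption that errors are independent are all illustrative choices, not anything stated in the thread):

```python
import random

random.seed(0)
N = 100_000   # propositions, each with a definite truth value
eps = 0.01    # per-proposition error rate of a "superintelligence"

truth = [random.random() < 0.5 for _ in range(N)]

def beliefs(truth, eps):
    # wrong on any given proposition with probability eps, independently
    return [t if random.random() >= eps else (not t) for t in truth]

a = beliefs(truth, eps)
b = beliefs(truth, eps)

disagreement = sum(x != y for x, y in zip(a, b)) / N
print(disagreement)  # about 2*eps*(1-eps), i.e. roughly 0.02
```

Under these assumptions the disagreement rate between any two such agents is bounded by 2*eps, so as the error rate falls toward zero their belief systems agree on almost every proposition.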
>>>
>>
>> I wonder if this isn't prevented by Gödel's incompleteness. Given that
>> the superintelligence can never be certain of its own consistency, it must
>> remain fundamentally agnostic. In that case, we might have different
>> superintelligences working under different hypotheses, possibly occupying
>> niches, just as happens under Darwinian selection.
>>
>
> Interesting point. Yes, a true superintelligence may never perform any
> actions, as it's trapped in never being certain (and knowing it never can
> be certain) that its actions are right. Fitness for survival may play some
> role in how intelligent active agents can become before they become
> inactive.
>

Yes, that's an interesting way to put it. I wonder.


>
>
>>
>>
>>>
>>> I think the chances are that it will be friendly, since I happen to
>>> believe in universal personhood, and if that belief is correct, then
>>> superintelligences will also come to believe it is correct. And with the
>>> belief in universal personhood, it would know that harm to others is harm
>>> to the self.
>>>
>>
>> I agree with you, with the difference that I try to assume universal
>> personhood without believing in it, to avoid becoming a religious
>> fundamentalist.
>>
>>
> Interesting. Why do you think having beliefs can lead to religious
> fundamentalism? Would you not say you believe the Earth is round? Could
> such a belief lead to religious fundamentalism, and if not, why not?
>

This leads us back to a recurring discussion on this mailing list. I would
say that you can believe the Earth to be round in the informal sense of the
word: your estimate of the probability that the Earth is round is very
close to one. I don't think you can believe the Earth to be round with 100%
certainty without falling into religious fundamentalism. Such certainty
implies, for example, a total belief in your senses. That is a strong
position about the nature of reality that is not really backed up by
anything, just like believing literally in the Bible, the Quran, or Atlas
Shrugged.
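The distinction between probability close to one and probability exactly one can be made concrete with exact Bayesian updating (the prior and the 10:1 likelihood ratio below are illustrative assumptions):

```python
from fractions import Fraction

# Exact Bayesian updating: posterior odds = prior odds * likelihood ratio.
# Starting anywhere short of certainty, no finite amount of evidence
# ever takes the probability to exactly 1.
p = Fraction(1, 2)          # prior P(the Earth is round)
ratio = Fraction(10, 1)     # each observation favors roundness 10:1

for _ in range(20):         # twenty strong, independent observations
    odds = (p / (1 - p)) * ratio
    p = odds / (1 + odds)

print(float(p))  # indistinguishable from 1.0 in floating point
assert p < 1     # yet still strictly short of certainty
```

Only a prior of exactly 1 (the "fundamentalist" starting point) survives updating unchanged; every informal believer stays at probability strictly below one, however overwhelming the evidence.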

Telmo.


>
> Jason
>
>
>> Telmo.
>>
>>
>>>
>>> Jason
>>>
>>> On Tue, Feb 10, 2015 at 2:19 AM, Alberto G. Corona <[email protected]>
>>> wrote:
>>>
>>>> I can't even enumerate the number of ways in which that article is
>>>> wrong.
>>>>
>>>> First of all, any intelligent robot MUST have a religion in order to
>>>> act in any way: a set of core beliefs. A non-intelligent robot needs
>>>> them too: they are its set of constants. An intelligent robot can
>>>> rewrite the constants from which it derives its calculations for
>>>> action, and if the robot is self-preserving and reproduces sexually, it
>>>> has to adjust its constants, i.e. its beliefs, according to some
>>>> Darwinian algorithm that takes into account the robot itself, but
>>>> especially the group in which it lives and collaborates.
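A minimal sketch of such a Darwinian adjustment of "constants" (the environment value, population size, fitness weighting, and mutation size are all invented for illustration):

```python
import random

random.seed(1)

# Each agent carries one constant c, selected both for matching the
# environment (E) and for staying close to the group it lives in.
E = 0.7                      # value the environment rewards
POP, GENS = 50, 100

pop = [random.random() for _ in range(POP)]

def fitness(c, group_mean):
    individual = -(c - E) ** 2              # personal accuracy
    social = -0.1 * (c - group_mean) ** 2   # collaboration bonus
    return individual + social

for _ in range(GENS):
    mean = sum(pop) / POP
    # the fitter half survives; offspring inherit with small mutation
    survivors = sorted(pop, key=lambda c: fitness(c, mean))[POP // 2:]
    pop = [random.choice(survivors) + random.gauss(0, 0.02)
           for _ in range(POP)]

print(sum(pop) / POP)  # the group's constant drifts toward E
```

The social term in the fitness function is what makes the group, not just the individual, shape which constants survive.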
>>>>
>>>> If the robot does not reproduce sexually and its fellows do not
>>>> execute very similar programs, it is pointless to teach it any human
>>>> religion.
>>>>
>>>> For these and other higher aspects, such as how a robot interacting
>>>> with other intelligent beings communicates perceptions, elaborates
>>>> philosophical and theological concepts, and collaborates with others,
>>>> see my post about "robotic truth".
>>>>
>>>> But I think that a robot with such a level of intelligence will never
>>>> be possible.
>>>>
>>>> 2015-02-09 21:59 GMT+01:00 meekerdb <[email protected]>:
>>>>
>>>>>
>>>>> In two senses of that term! Or something.
>>>>>
>>>>> http://bigthink.com/ideafeed/robot-religion-2
>>>>>
>>>>> http://gizmodo.com/when-superintelligent-ai-arrives-will-religions-try-t-1682837922
>>>>>
>>>>> --
>>>>> You received this message because you are subscribed to the Google
>>>>> Groups "Everything List" group.
>>>>> To unsubscribe from this group and stop receiving emails from it, send
>>>>> an email to [email protected].
>>>>> To post to this group, send email to [email protected].
>>>>> Visit this group at http://groups.google.com/group/everything-list.
>>>>> For more options, visit https://groups.google.com/d/optout.
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Alberto.
>>>>
>>>
>>
>

