On 10 Feb 2015, at 22:26, Telmo Menezes wrote:
On Tue, Feb 10, 2015 at 9:07 PM, Jason Resch <[email protected]>
wrote:
On Tue, Feb 10, 2015 at 12:59 PM, Telmo Menezes <[email protected]
> wrote:
On Tue, Feb 10, 2015 at 6:21 PM, Jason Resch <[email protected]>
wrote:
On Tue, Feb 10, 2015 at 12:04 PM, Telmo Menezes <[email protected]
> wrote:
On Tue, Feb 10, 2015 at 4:47 PM, Jason Resch <[email protected]>
wrote:
If you define increased intelligence as a decreased probability of
holding a false belief about any randomly chosen proposition, then
superintelligences will be wrong about almost nothing, and their
beliefs will converge as their intelligence rises. Therefore nearly
all superintelligences will operate according to the same belief
system. We should stop worrying about trying to ensure friendly AI:
it will either be friendly or it won't, according to what is right.
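(Jason's convergence claim can be made concrete with a toy model; this is
my illustration, not something from the thread. Treat each agent's stance
on N independent propositions as matching the truth with probability p;
two independent agents then disagree on a given proposition with
probability 2p(1-p), which vanishes as p approaches 1. A minimal Python
sketch, with all names hypothetical:)

```python
import random

def belief_vector(truth, p_correct, rng):
    # Each agent holds the true value of a proposition with probability
    # p_correct, otherwise the negation (a false belief).
    return [t if rng.random() < p_correct else (not t) for t in truth]

def disagreement(a, b):
    # Fraction of propositions on which two agents disagree.
    return sum(x != y for x, y in zip(a, b)) / len(a)

rng = random.Random(0)
n_props = 10_000
truth = [rng.random() < 0.5 for _ in range(n_props)]

for p in (0.6, 0.9, 0.99, 0.999):
    a = belief_vector(truth, p, rng)
    b = belief_vector(truth, p, rng)
    print(f"p={p}: disagreement ~ {disagreement(a, b):.4f} "
          f"(expected 2p(1-p) = {2 * p * (1 - p):.4f})")
```

As p rises, the measured disagreement tracks 2p(1-p) toward zero, which
is the sense in which near-infallible agents would share almost all
beliefs, under the (strong) assumption that propositions are independent
and have definite truth values.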
I wonder if this isn't prevented by Gödel's incompleteness. Given
that the superintelligence can never be certain of its own
consistency, it must remain fundamentally agnostic. In this case, we
might have different superintelligences working under different
hypotheses, possibly occupying niches, just as happens under
Darwinism.
Interesting point. Yes, a true superintelligence may never perform
any actions, as it's trapped in never being certain (and knowing it
never can be certain) that its actions are right. Fitness for
survival may play some role in how intelligent active agents can be
before they become inactive.
Yes, that's an interesting way to put it. I wonder.
I think chances are that it will be friendly, since I happen to
believe in universal personhood, and if that belief is correct, then
superintelligences will also come to believe it is correct. And with
the belief in universal personhood it would know that harm to others
is harm to the self.
I agree with you, with the difference that I try to assume universal
personhood without believing in it, to avoid becoming a religious
fundamentalist.
Interesting. Why do you think having beliefs can lead to religious
fundamentalism? Would you not say you believe the Earth is round?
Could such a belief lead to religious fundamentalism, and if not, why
not?
This leads us back to a recurring discussion on this mailing list. I
would say that you can believe the Earth to be round in the informal
sense of the word: your estimate of the probability that the Earth
is round is very close to one. I don't think you can believe the
Earth to be round with 100% certainty without falling into religious
fundamentalism. That would imply, for example, a total belief in
your senses, which is a strong position about the nature of reality
that is not really backed up by anything. Just like believing
literally in the Bible or the Quran or Atlas Shrugged.
I see. I did not mean it in the sense of absolute certitude, merely
that universal personhood is one of my current working hypotheses
derived from my consideration of various problems of personal
identity.
Right. We are in complete agreement then.
Universal personhood is also one of my main working hypotheses. I
wonder if it could be considered a "preferable belief": it may be
true and we are all better off assuming it to be true.
It might be useful after death, but I am not sure it is a
preferable belief/assumption on the terrestrial (effective) plane. It
makes sense only through a personal understanding, for example of the
universal person that all machines can recognize themselves to be
when introspecting, in case they are sufficiently self-referentially
correct. If not, it becomes a statement that the parrots will
repeat and impose without understanding, and that will quickly lead to
a threat to freedom. Like I said: it is double-edged. It might be a
type of knowledge belonging to a []* \ [] sort of logic: you can grasp
it from the inside, but it would not make sense to tell others. You can
still suggest means of accessing that knowledge, but not much more. That
is what I think, extrapolating from the self-reference of the "correct"
machine.
Bruno
Telmo.
Jason
--
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it,
send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
http://iridia.ulb.ac.be/~marchal/