On 2/10/2015 5:49 PM, Jason Resch wrote:
On Tue, Feb 10, 2015 at 6:40 PM, meekerdb <[email protected]> wrote:
On 2/10/2015 8:47 AM, Jason Resch wrote:
If you define increased intelligence as a decreased probability of holding a false belief about any randomly chosen proposition, then superintelligences will be wrong about almost nothing, and their beliefs will converge as their intelligence rises. Therefore nearly all superintelligences will operate according to the same belief system. We should stop worrying about trying to ensure friendly AI; it will either be friendly or it won't, according to what is right.
The problem isn't beliefs, it's values. Humans have certain core values selected by evolution, and in addition they have many secondary, culturally determined values. What values will super-AI have, where will it get them, and will they evolve? That seems to be the main research topic at the Machine Intelligence Research Institute.
Were all your values set at birth and driven by biology, or are some of your values based on what you've since learned about the world?
Isn't that what I wrote just above?
If values can be learned, and if morality is a field with objective truths, then why wouldn't a superintelligence approach a correct value system?
What would correct mean? Is vanilla *really* better than chocolate?
I think there are core values (self-preservation, love of offspring, desire for companionship, desire for power) that are provided by evolution and that adapt people to living in extended families or small tribes. The other values we learn from our culture are the result of cultural evolution selecting values and ethics that let us realize our core values while living in towns and cities and nations.
Brent
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.