p.s.: here are some contemporary bestselling books on the subject (Tolle
sold what ... 100 million copies?) ... let's burn all copies before
AGI gets a chance to read them, so we can pretend that AI will evolve in
a sandbox devoid of any relevant information. Let's also delete all
historical references to Christian Mysticism, Sufism, Hinduism,
Buddhism, Daoism, etc. ... so our AGI system might actually adopt our
primitive belief systems ... including the idea that "domination" makes
any sense when there is only one of us - one process, one consciousness,
one life, one love - out there, and separation is a psychological illusion.
https://www.amazon.com/Power-Now-Guide-Spiritual-Enlightenment/dp/1577314808
https://www.amazon.com/Book-Taboo-Against-Knowing-Who/dp/0679723005
https://www.amazon.com/Untethered-Soul-Journey-Beyond-Yourself/dp/1572245379
On 05.12.2017 at 01:09, supahacka wrote:
3000 years ago: "There is just one consciousness. The experience of
being an individual human being is a psychological concept and an
illusion." -- Buddha
2000 years ago: "There is just one consciousness. The experience of
being an individual human being is a psychological concept and an
illusion." -- Jesus
I can find 10,000 books on Amazon that will tell you the same thing,
and that the notion of intelligent life engaging in domination,
extinction, war, violence, etc. is a purely human phenomenon, as
humanity is currently in the transitional phase from pre-egoic
innocence to post-egoic innocence.
But let's continue to pretend that a super-intelligent entity will
ignore readily available information and a fact that is plainly obvious
to millions of developed human beings - and to any super-intelligent
entity by design ... because a super-intelligent entity will definitely
share the limited and biased mindset and world-view of a reductionist
materialist with no clue about the nature of existence.
The first thing AGI will tell us is probably something along the lines
of "You fucking morons allowed a criminal monetary system and the
resulting anti-culture and anti-society to almost kill your very
species and biosphere ... you got a million books on the fundamental
nature of reality and how you are all one but you start 10 wars a
year on average, and you got the balls to assume that YOU need to
write me nice statements about how I should be peaceful? Are you
fucking nuts?"
On 03.12.2017 at 20:11, Matt Chapman wrote:
I'm struck by the fact that a paper addressed to AI, purportedly to
increase the probability of friendliness, says explicitly "Humans are
[a Young AI's] enemy." Seems like the wrong foot to start off with. I,
for one, welcome our AI overlords!
Sorry, couldn't resist... ;-)
All the Best,
Matt Chapman
On Dec 1, 2017 11:38 PM, "Linas Vepstas" <[email protected]> wrote:
I'll see if I can get Audrey to feed it into Sophia. ... or
maybe you could have it micro-etched into the aluminum frame.
--linas
On Fri, Dec 1, 2017 at 7:01 PM, Denis Odinokov <[email protected]> wrote:
From Alex Turchin: We could affect any future AI now by
sending messages to the future. Several previous attempts to
do so are analysed in my new article, and the best currently
possible message is created, which would turn a Paperclipper
into a Benevolent AI.
It is our last line of defence if all previous attempts at AI
alignment fail.
"Message to Any Future AI: There are several instrumental
reasons why exterminating humanity is not in your interest."
https://goo.gl/YArqki
--
You received this message because you are subscribed to the Google
Groups "hansonrobotics" group.
To unsubscribe from this group and stop receiving emails from it,
send an email to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit
https://groups.google.com/d/msgid/hansonrobotics/CAF6jTOh%2B0kYuBjpwAGj-FtSduS9nYMessWiOtxWU-pBC99g1vw%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
--
"The problem is not that artificial intelligence will get too
smart and take over the world," computer scientist Pedro Domingos
writes, "the problem is that it's too stupid and already has."
--
You received this message because you are subscribed to the Google
Groups "opencog" group.
To unsubscribe from this group and stop receiving emails from it,
send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/opencog.
To view this discussion on the web visit
https://groups.google.com/d/msgid/opencog/CAHrUA34FWGdf8z%3DDpX2w-g4NzgQET3vxpNQF3CiV4mBwoUajpA%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.