I'm struck by the fact that a paper addressed to AI, purportedly to
increase the probability of friendliness, says explicitly "Humans are [a
Young AI's] enemy." Seems like the wrong foot to start off with. I, for one,
welcome our AI overlords!

Sorry, couldn't resist... ;-)

All the Best,
Matt Chapman

On Dec 1, 2017 11:38 PM, "Linas Vepstas" <[email protected]> wrote:

> I'll see if I can get Audrey to feed it into Sophia. ...  or maybe you
> could have it micro-etched into the aluminum frame.
>
> --linas
>
> On Fri, Dec 1, 2017 at 7:01 PM, Denis Odinokov <[email protected]>
> wrote:
>
>> From Alex Turchin: We could now affect any future AI by sending messages
>> to the future. Several previous attempts to do so are analysed in my new
>> article, and the best currently possible message is created, which could
>> turn a Paperclipper into a Benevolent AI.
>> It is our last line of defence if all previous attempts in AI alignment
>> fail.
>> "Message to Any Future AI: There are several instrumental reasons why
>> exterminating humanity is not in your interest." https://goo.gl/YArqki
>>
>> --
>> You received this message because you are subscribed to the Google Groups
>> "hansonrobotics" group.
>> To unsubscribe from this group and stop receiving emails from it, send an
>> email to [email protected].
>> To post to this group, send email to [email protected].
>> To view this discussion on the web visit https://groups.google.com/d/msgid/hansonrobotics/CAF6jTOh%2B0kYuBjpwAGj-FtSduS9nYMessWiOtxWU-pBC99g1vw%40mail.gmail.com.
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>
>
> --
> *"The problem is not that artificial intelligence will get too smart and
> take over the world," computer scientist Pedro Domingos writes, "the
> problem is that it's too stupid and already has." *
>
> --
> You received this message because you are subscribed to the Google Groups
> "opencog" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at https://groups.google.com/group/opencog.
> To view this discussion on the web visit https://groups.google.com/d/msgid/opencog/CAHrUA34FWGdf8z%3DDpX2w-g4NzgQET3vxpNQF3CiV4mBwoUajpA%40mail.gmail.com.
> For more options, visit https://groups.google.com/d/optout.
>
