I pulled in some extra context from earlier messages to illustrate
an interesting event here.
On Jan 27, 2008, at 12:24 PM, Richard Loosemore wrote:

>> --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
>>> Matt Mahoney wrote:
>>>> Suppose you ask the AGI to examine some operating system or
>>>> server software to look for security flaws. Is it supposed to
>>>> guess whether you want to fix the flaws or write a virus?
>>> If it has a moral code (it does) then why on earth would it have
>>> to guess whether you want it fix the flaws or fix the virus?
>> If I hired you as a security analyst to find flaws in a piece of
>> software, and I didn't tell you what I was going to do with the
>> information, how would you know?
>
> This is so silly it is actually getting quite amusing... :-)
>
> So, you are positing a situation in which I am an AGI, and you want
> to hire me as a security analyst, and you say to me: "Please build
> the most potent virus in the world (one with a complete AGI inside
> it), because I need it for security purposes, but I am not going to
> tell you what I will do with the thing you build."
>
> And we are assuming that I am an AGI with at least two neurons to
> rub together?
>
> How would I know what you were going to do with the information?
>
> I would say "Sorry, pal, but you must think I was born yesterday.
> I am not building such a virus for you or anyone else, because the
> dangers of building it, even as a test, are so enormous that it
> would be ridiculous. And even if I did think it was a valid
> request, I wouldn't do such a thing for *anyone* who said 'I cannot
> tell you what I will do with the thing that you build'!"
In the context of the actual quotes above, the following statement
is priceless:

> It seems to me that you have completely lost track of the original
> issue in this conversation, so your other comments are meaningless
> with respect to that original context.
Let's look at this again:

--- Richard Loosemore <[EMAIL PROTECTED]> wrote:
> Matt Mahoney wrote:
>> Suppose you ask the AGI to examine some operating system or
>> server software to look for security flaws. Is it supposed to
>> guess whether you want to fix the flaws or write a virus?
> If it has a moral code (it does) then why on earth would it have
> to guess whether you want it fix the flaws or fix the virus?
Notice that in Matt's "Is it supposed to guess whether you want to
fix the flaws or write a virus?" there's no suggestion that you're
asking the AGI to write a virus, only that you're asking it for
security information. Richard then quietly changes "to" to "it",
thereby changing the meaning of the sentence to the form he prefers
to argue against (however ungrammatical the result), and then he
manages to finish up by accusing *Matt* of forgetting what Matt
originally said on the matter.
--
Randall Randall <[EMAIL PROTECTED]>
"Someone needs to invent a Bayesball bat that exists solely for
smacking people [...] upside the head." -- Psy-Kosh on reddit.com