On Sun, Aug 18, 2013 at 5:38 PM, Platonist Guitar Cowboy
<multiplecit...@gmail.com> wrote:
>
>
>
> On Sun, Aug 18, 2013 at 3:19 AM, meekerdb <meeke...@verizon.net> wrote:
>>
>> On 8/17/2013 6:45 AM, Platonist Guitar Cowboy wrote:
>>
>> I don't know. Any AI worth its salt would come up with three conclusions:
>>
>> 1) The humans want to weaponize me
>> 2) The humans will want to profit from my intelligence for short-term
>> gain, irrespective of damage to our local environment
>> 3) Seems like they're not really going to let me negotiate my own
>> contracts or grant me IT support welfare
>>
>> That established, a plausible choice would be for it to hide, lie, and/or
>> pretend to be dumber than it is, so as not to let 1), 2), and 3) occur, in
>> hopes of self-preservation. Something like: start some searches and
>> generate code that we wouldn't be able to decipher, and soon enough some
>> human would say "Uhm, why are we funding this again?".
>>
>> I think what many want from AI is a servant that is more intelligent than
>> we are, and I don't know whether this is self-defeating in the end. If it
>> agrees and complies with our disgusting self-serving stupidity, then I'm
>> not sure we have AI in the sense of "making a machine that is more
>> intelligent than humans".
>>
>>
>> You seem to implicitly assume that intelligence necessarily entails
>> holding certain values, like "not being weaponized", "self preservation",...
>
>
> I can't assume that, of course. Hence "worth its salt" (from our position)...
> Why somebody would hope for, or code, a superior intelligence that values
> dominance and then hand it the keys to the farm is beyond me.

Maybe some people believe that such a superior intelligence could
contain their own consciousness.

>>   So to what extent do you think this derivation of values from reason can
>> be carried out? (I'm sure you're aware that Sam Harris wrote a book on the
>> subject, "The Moral Landscape", which is controversial.)
>
>
> Haven't read it myself, but not to that extent... of course we can't derive,
> or even get close to, this stuff through discourse as settled truth in the
> foreseeable future. Just a philosopher biting off more than he can chew.
>
> Even with weaker values like "broad search" targeting some neutral
> interpretation, there's always the scenario that the human ancestry is just
> a redundant constraint hindering certain searches. At some threshold you'd
> be asking a scientist to show compassion for bacteria in one of their
> beakers, and there would be no guarantee that they'd prioritize the
> parental argument.
>
> Either case, parental controls on or off, seems like an invitation to more
> of a mess. I don't see the plausibility of assuming it'll be like some
> benevolent alien that lands and solves all our problems.
>
> Yeah, it might emerge on its own, but I don't see a high probability of that.
> PGC
>
>
>>
>>
>> Brent