On Sun, Aug 18, 2013 at 1:23 AM, Platonist Guitar Cowboy
<multiplecit...@gmail.com> wrote:
> On Sat, Aug 17, 2013 at 10:07 PM, Telmo Menezes <te...@telmomenezes.com>
> wrote:
>>
>> On Sat, Aug 17, 2013 at 2:45 PM, Platonist Guitar Cowboy
>> <multiplecit...@gmail.com> wrote:
>> PGC,
>>
>> You are starting from the assumption that any intelligent entity is
>> interested in self-preservation.
>>
>> I wonder if this drive isn't
>> completely selected for by evolution. Would a human designed
>> super-intelligent machine be necessarily interested in
>> self-preservation? It could be better than us at figuring out how to
>> achieve a desired future state without sharing human desires --
>> including the desire to keep existing.
>
>
> I wouldn't go as far as self-preservation at the start and assume instead
> that intelligence implemented in some environment will notice the
> limitations and start asking questions.

Ok.

> But yes, in the sense that
> self-preservation extends from this in our weird context and would be a
> question it would eventually raise.

I agree it would raise questions about it.

> Still, to completely bar it, say from the capacity to question human
> activities in their environments, and picking up that humans self-preserve
> mostly regardless of what this does to their environment, would be
> self-defeating or a huge blind spot.

But here you are already making value judgements. The AI could notice
human behaviours that lead to mass extinction or its own destruction
and simply not care. I think you're making the mistake of assuming
that certain human values are fundamental at a very low level.

>>
>> One idea I wonder about sometimes is AI-cracy: imagine we are ruled by
>> an AI dictator that has one single desire: to make us all as happy as
>> possible.
>>
>
> Even with this, which is weird because of "Matrix-like zombification of
> people being spoon fed happiness" scenarios,

The humans in the Matrix were living in an illusion that resembled our
own flawed world. But imagine you get to live in a state of complete,
permanent bliss. Would you choose that at the expense of everything
else?

> AI would have to have enough
> self-referential capacity to simulate with enough accuracy human
> self-reference.

I wonder. Maybe it could get away with epidemiological studies and
behave more like an empirical scientist where we are the object of its
research. But maybe you're right, I'm not convinced either way.

> This ability to figure out desired future states with
> blunted self-reference it may not apply to itself seems to me a
> contradiction.
> Therefore I would guess that such an entity censored in its self-referential
> potential is not granted intelligence. It is more a tool towards some
> already specified ends, wouldn't you say?

Hmm... I agree that self-reference and introspection would be
necessary for such an advanced AI; I'm just not convinced that these
things imply a desire to survive or the adoption of any given set of
values.

> Also, differences between the Windows, Google, Linux or the Apple version of
> happiness would only be cosmetic because without killing and dominating each
> other for some rather long period it seems, it would be some "Disney surface
> happiness"

You might be interested in this TV show:
http://en.wikipedia.org/wiki/Black_Mirror_(TV_series)

More specifically, season 1, episode 2: "15 Million Merits"

> with some small group operating a "more for us few here at the
> top, less for them everybody else" agenda underneath ;-) PGC

This is a heavily discussed topic in the context of mind emulation and
a hypothetical diaspora to a computationally simulated universe. A
new form of dominance/competition could be based on computational
power.

I do not condone the AI-cracy; I just think it's a useful (or maybe
just amusing) thought experiment.

Telmo.



-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.
