On 9/11/2019 9:33 PM, Tomasz Rola wrote:
On Tue, Sep 10, 2019 at 10:43:40AM -0700, 'Brent Meeker' via Everything List 
wrote:

On 9/9/2019 10:16 PM, Tomasz Rola wrote:
On Mon, Sep 09, 2019 at 07:34:19PM -0700, 'Brent Meeker' via Everything List 
wrote:
On 9/9/2019 6:55 PM, Tomasz Rola wrote:
On Mon, Sep 09, 2019 at 06:40:44PM -0700, 'Brent Meeker' via Everything List 
wrote:
Why escape to space when there are lots of resources here?  An AI with
access to everything connected to the internet shouldn't have any
trouble taking control of the Earth.
[...]

You reason like a human - "I will stay here because it is nice and I can
have internet".

[...]
Cooperation is one of our most important survival strategies.  Lone
human beings are food for vultures.

  Humans in tribes rule the world.
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This is just one of those godlike delusions I have written
about. Either that, or you can name even one such tribe. Hint: explain
how many earthquakes and volcanic eruptions those rulers have
prevented during the last decade.

I only meant relative to other sentient beings. Of course no one has changed the speed of light either, and neither will a super-AI. My point is that cooperation is an inherent trait of humans, selected by evolution. But an AI will not necessarily have that trait.


[...]
nice air of being godlike. Again, I guess AI will have no need for
feeling like this, or not much feeling at all. Feeling is
adversarial to judgement.
I disagree.  Feeling is just the mark of value, and values are
necessary for judgement, at least any judgement of what action to
take.
I disagree. I can easily give something a value without having feelings
about it. Example: gold is just a yellow metal. I know other people value it
a lot, so I might preserve it for trading, but it does not make very
good knives. Highly impractical in the woods or for plowing
fields. But it might be used for catching fish, perhaps. They seem to
like swallowing little shiny things attached to a hook.

I was referring to fundamental values. Of course many things, like gold and fish hooks, have instrumental value, which derives from their usefulness in satisfying fundamental values, the ones that correlate with feelings. If the AI has no fundamental values, it will have no instrumental ones either.

Brent
