On 9/13/2019 4:02 AM, Bruno Marchal wrote:
On 12 Sep 2019, at 06:52, 'Brent Meeker' via Everything List
<[email protected]> wrote:
On 9/11/2019 9:33 PM, Tomasz Rola wrote:
On Tue, Sep 10, 2019 at 10:43:40AM -0700, 'Brent Meeker' via Everything List
wrote:
On 9/9/2019 10:16 PM, Tomasz Rola wrote:
On Mon, Sep 09, 2019 at 07:34:19PM -0700, 'Brent Meeker' via Everything List
wrote:
On 9/9/2019 6:55 PM, Tomasz Rola wrote:
On Mon, Sep 09, 2019 at 06:40:44PM -0700, 'Brent Meeker' via Everything List
wrote:
Why escape to space when there a lots of resources here? An AI with
access to everything connected to the internet shouldn't have any
trouble taking control of the Earth.
[...]
You reason like a human - "I will stay here because it is nice and I can
have internet".
[...]
Cooperation is one of our most important survival strategies. Lone
human beings are food for vultures.
Humans in tribes rule the world.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This is just one of those godlike delusions I have written
about. Either that, or you can name even one such tribe. Hint: explain
how many earthquakes and volcanic eruptions those rulers have
prevented during the last decade.
I only meant relative to other sentient beings. Of course no one has changed
the speed of light either and neither will a super-AI. My point is that
cooperation is an inherent trait of humans, selected by evolution. But an AI
will not necessarily have that trait.
There is no total (everywhere defined) universal Turing machine, so universal
machines are born with a conflict between security (limiting themselves to a
subset of the total recursive functions) and liberty/universality (getting all
total computable functions, but then also some strictly partial ones, and never
being able to know that in advance).
That explains why universal machines are never satisfied and evolve, in an
escaping-forward sort of way. Cooperation and evolution are inevitable in this
setting.
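To make the diagonalization behind "no total universal machine" concrete, here is a small illustrative sketch. The finite table standing in for U is my own toy assumption, not Bruno's construction: a genuinely universal U would enumerate infinitely many functions, but the diagonal argument works the same way. If a total U(i, n) listed all and only the total computable functions, the diagonal d(n) = U(n, n) + 1 would itself be total and computable, yet differ from every listed function at its own index.

```python
# Sketch (toy illustration): why no *total* function U(i, n) can
# enumerate exactly the total computable functions.
#
# If every total computable f were U(i, .) for some index i, then
# d(n) = U(n, n) + 1 would be total and computable, yet d differs
# from U(i, .) at n = i for every i -- so d is missing from the list.

def make_diagonal(U):
    """Given any total 'universal' U(i, n), return the diagonal
    function that U provably fails to list."""
    def d(n):
        return U(n, n) + 1
    return d

# Toy finite stand-in for U: an explicit table of total functions.
table = [lambda n: 0, lambda n: n, lambda n: n * n]

def U(i, n):
    return table[i](n)

d = make_diagonal(U)

# d disagrees with every listed function at its own index.
for i in range(len(table)):
    assert d(i) != U(i, i)
```

The sketch only shows the escape move: any total enumeration can be out-diagonalized, so a machine that insists on staying total (secure) must give up listing all total computable functions (universality).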
Cooperation with whom? And at what cost? That's like saying our
cooperation with cattle is inevitable.
[...]
nice air of being godlike. Again, I guess AI will have no need to
feel like this, or not much feeling at all. Feeling is
adversarial to judgement.
I disagree. Feeling is just the mark of value, and values are
necessary for judgement, at least any judgment of what action to
take.
I disagree. I can easily give something a value without feeling about
it. Example: gold is just a yellow metal. I know other people value it
a lot, so I might preserve it for trading, but it does not make very
good knives. Highly impractical in the woods or for plowing
fields. But it might be used for catching fish, perhaps. They seem to
like swallowing little blinking things attached to a hook.
I was referring to fundamental values. Of course many things, like gold and
fish hooks, have instrumental value, which derives from their usefulness in
satisfying fundamental values, the ones that correlate with feelings. If the
AI has no fundamental values, it will have no instrumental ones either.
It will have all of this with a simple universal goal, like “help yourself” or
“do whatever it takes to survive”.
Why would it even have a simple goal like "survive"? And "help
yourself" says no more than that it will have some fundamental
goal...otherwise there's no distinction between "help" and "hurt".
Brent
That can be expressed through small codes (genetic or not). The probability
that such a code appears on Earth might still be very low, making us rare in
the local physical reality, even if provably infinitely numerous in the global
arithmetical reality.
Bruno
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To view this discussion on the web visit
https://groups.google.com/d/msgid/everything-list/b212022f-9313-a6c3-6309-61ab0719fd9a%40verizon.net.