In reply to  Jed Rothwell's message of Fri, 17 Feb 2023 14:16:20 -0500:
Hi,
[snip]

What I was trying to say is that if an AI is programmed to mimic human 
behaviour*, then it may end up mimicking the worst aspects of human 
behaviour, and the results could be just as devastating as if they had 
been brought about by an actual human, whether or not the AI is "sentient".

* I get the impression that this is along the lines of what Microsoft are 
doing when they program "personalities". (See the job description of 
"Carol Lee" in the story you previously quoted.)
[snip]
I guess the real question is whether the AI has "will", or at least a 
simulation thereof.

My definition of life:- Anything potentially capable of taking measures to 
ensure its own survival is alive.
By this definition viruses are not alive, but they do take measures to 
ensure the survival of their species.
Cloud storage:-

Unsafe, Slow, Expensive 

...pick any three.
