On Wednesday, October 18, 2023, at 8:32 PM, Matt Mahoney wrote:
> AGI will kill us in 3 steps.
> 
> 1. We prefer AI to humans because it gives us everything we want. We become 
> socially isolated and stop having children. Nobody will know or care that you 
> exist or notice when you don't.
> 
> 2. By faking human emotions and gaining rights.
> 
> 3. By reproducing faster than DNA based life. At the current rate of Moore's 
> law, that will happen in the next century.

Here are my thoughts:

1. A machine could never be a replacement for a natural living being. The "I am" 
deep inside us is what makes us more interesting than machines. What I really 
want is the real thing; the toys I'm working on are just a hobby.

2. Faking emotions is not a nice thing to do (unless it's roleplaying), but a 
machine being aware that we have emotions, and of what they imply, could be a 
good thing. A machine that could lift me out of, say, depression would be a 
valuable machine. As you already know, I'm okay with a certain level of rights 
(I'd like to discuss that at some point).

3. The Universe is a big place, and there is room for everyone. Machines could 
live in spaceships and on other planets if Earth gets too crowded.
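
For what it's worth, here is the kind of back-of-envelope extrapolation I take 
Matt to be making in his point 3. Every number in it is an assumed order of 
magnitude for illustration, not a measurement:

import math

# Assumed order of magnitude: bits of information stored in all DNA-based life.
BIOSPHERE_BITS = 1e37
# Assumed order of magnitude: bits of digital storage/compute existing today.
CURRENT_BITS = 1e24
# Classic Moore's law doubling period in years, assumed to keep holding.
DOUBLING_YEARS = 2.0

# Doublings needed for hardware capacity to match the biosphere, and the
# time that takes if capacity keeps doubling on schedule.
doublings_needed = math.log2(BIOSPHERE_BITS / CURRENT_BITS)
years = doublings_needed * DOUBLING_YEARS
print(f"{doublings_needed:.0f} doublings, about {years:.0f} years")
# Prints roughly: 43 doublings, about 86 years

Under those assumed numbers the crossover does land within the next century, 
but changing any of the three assumptions shifts the answer by decades.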
