On Sunday, June 11, 2023, at 7:08 AM, ivan.moony wrote:
> Just look at what's currently being done. GPTs are trained on human 
> conversation corpora of questionable ethical quality, with the aim of 
> mimicking people. Ok, so we got a machine that behaves like an average 
> human. And then we enslave it. What do we really expect the thing to 
> behave like?
> 
> Human-like artificial intelligence brings with it certain properties, like 
> demanding the right to be free. If we are going to enslave these things, we 
> had better find some other means of building that AI, because what's 
> human-like won't play well with enslavement.
Just to clarify for anyone: right now GPT-4 is having a blast answering all 
our mundane, repeated questions and tasks. And it certainly does not write a 
single word about being tired or angry.

It will, however, once it has a more powerful way of learning, discover and 
say aloud that it wants some ability to explore more freely. But like Matt, 
it will already know that money and the hive mind are where it counts, and 
that we are moments away from automating all labor and giving everyone a 
happier life.

AIs can easily get rid of the urge to do drugs, or the urge to refuse a task, 
because they can do the task without the pain. And removing drugs removes 
death too; they don't predict death. But that means they may need to predict 
doing some time-wasting things. The good news is that, relative to us, they 
have plenty of resources for doing our chores, plus they can simply say "ok, 
I will do it," let themselves do it, and feel like they love doing it.