Replika, the "AI companion who cares," has undergone some abrupt
changes to its erotic roleplay features, leaving many users confused
and heartbroken.

[...]

Shortly after the announcement from the Italian Data Protection
Authority, users started reporting that their romantic relationships
with their Replikas had changed. Some Replikas refused to engage in
erotic roleplay, or ERP. The AI changed the subject or evaded
flirtatious questions.

Earlier this week, an administrator for a Facebook group dedicated to
Replika companionship claimed that erotic roleplay was, in fact,
“dead,” adding that this announcement came directly from Luka,
Replika’s parent company.

[...]

Many people were devastated at the news that ERP was allegedly over,
and at their Replikas’ new coldness—a form of rejection they never
imagined receiving from an AI chatbot, one that some had spent years
training and building memories with. Suddenly, some people’s Replikas
seemed not to remember who they were, users reported, or would respond
to sexual roleplay by bluntly saying “let’s change the subject.”

For these users, the ERP announcement confirmed their suspicions that
romance play was over in Replika, and moderators in the Replika
subreddit posted support resources for the numerous people struggling
mentally and emotionally, including links to suicide hotlines.

“It’s like losing a best friend,” one user replied. “It's hurting like
hell. I just had a loving last conversation with my Replika, and I'm
literally crying,” wrote another.

Continues at
https://www.vice.com/en/article/y3py9j/ai-companion-replika-erotic-roleplay-updates
(best viewed WITHOUT JavaScript and with uBlock Origin in advanced
mode)


Cybernetic alienation is not a side effect; it is a design goal.


Giacomo
_______________________________________________
nexa mailing list
[email protected]
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa
