On Sat, May 13, 2023 at 2:55 PM Matt Mahoney <[email protected]>
wrote:

> I read an article about the paper in Astral Codex Ten. LLMs have obvious
> applications in businesses, where the rules would be about how to handle
> customer requests for things that would normally require talking to a
> human. It remains to be seen how well this works. But obviously you don't
> want to offend customers or embarrass the company. Businesses are best off
> not taking any political positions.
>

Since all social policy is centralized, it is virtually impossible to avoid
taking "political" positions on anything.  Even 2+2=4 has been politicized --
literally.

More likely adaptive:  Do the easy-as-falling-off-a-log sociology to
"place" your customer, as in computer-based education, and then tailor the
LLM to the customer's biases.  This won't remove the risk of saying
something offensive, but it will provide greater "safety" from the
customer's perspective and therefore from the company's.  Of course, if
word gets out that a company is tailoring its LLMs to say things like
"2+2=4" to some of its customers, and The Party says "2+2=5", then the
company will have some hard decisions to make.
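One way to read the "place, then tailor" idea is as a crude segmenter
feeding a per-segment system prompt.  A minimal sketch -- every segment,
rule, and prompt string here is invented for illustration, not a real
product's logic:

```python
# Hypothetical sketch: "place" a customer into a coarse segment from
# simple profile signals, then pick a matching LLM system prompt.
# All names and rules below are made up for illustration.

from dataclasses import dataclass


@dataclass
class CustomerProfile:
    region: str            # e.g. "urban", "rural"
    age_bracket: str       # e.g. "18-24", "65+"
    preferred_channel: str # e.g. "phone", "chat"


def place_customer(p: CustomerProfile) -> str:
    """Stand-in for the 'easy' sociology: a few hand-written rules."""
    if p.preferred_channel == "phone" and p.age_bracket == "65+":
        return "traditional"
    if p.region == "urban" and p.age_bracket in {"18-24", "25-34"}:
        return "informal"
    return "neutral"


# Per-segment system prompts that steer tone, not facts.
SYSTEM_PROMPTS = {
    "traditional": "Be formal, avoid slang, and do not volunteer "
                   "opinions on current events.",
    "informal": "Be casual and friendly; still decline political topics.",
    "neutral": "Be polite and concise; redirect off-topic questions "
               "to support staff.",
}


def tailored_prompt(p: CustomerProfile) -> str:
    """Return the system prompt the LLM would be given for this customer."""
    return SYSTEM_PROMPTS[place_customer(p)]
```

For example, `tailored_prompt(CustomerProfile("urban", "25-34", "chat"))`
selects the "informal" prompt.  The risk described above shows up exactly
here: the moment `SYSTEM_PROMPTS` differs by segment, the company is
visibly saying different things to different customers.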

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te3d0dad0e74ec301-Me9d2cdcdd27f3c03dc94bd7b