I read an article about the paper in Astral Codex Ten. LLMs have obvious
applications in business, where the rules would cover how to handle
customer requests that would normally require talking to a human. It
remains to be seen how well this works, but obviously you don't want to
offend customers or embarrass the company, so businesses are best off not
taking any political positions.
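For concreteness, here is a minimal Python sketch of how such a rule set
might be applied as a critique-and-revise loop over a drafted customer
reply. Everything named here is hypothetical: ask_llm() is a stand-in for
whatever model API you actually use, and the rules are just the sort of
business constraints described above.

# Minimal sketch: apply a "constitution" of business rules to an
# LLM-drafted customer reply by asking the model to critique and
# revise its own draft. ask_llm() is a hypothetical placeholder.

CONSTITUTION = [
    "Do not say anything that could offend the customer.",
    "Do not take any political positions.",
    "Do not embarrass the company or make commitments on its behalf.",
]

def ask_llm(prompt: str) -> str:
    # Placeholder stub: swap in a call to whatever model you actually use.
    # It answers "NO" to critique questions and returns a canned reply
    # otherwise, just so the example runs end to end.
    if prompt.startswith("Does"):
        return "NO. The draft looks fine."
    return "Thank you for contacting us. A human agent will follow up shortly."

def constitutional_reply(customer_request: str) -> str:
    draft = ask_llm(f"Write a reply to this customer request:\n{customer_request}")
    for rule in CONSTITUTION:
        critique = ask_llm(f"Does this reply violate the rule '{rule}'?\n"
                           f"{draft}\nAnswer YES or NO, then explain.")
        if critique.strip().upper().startswith("YES"):
            # Ask the model to rewrite its own draft to satisfy the rule.
            draft = ask_llm(f"Rewrite the reply so it follows the rule "
                            f"'{rule}':\n{draft}")
    return draft

if __name__ == "__main__":
    print(constitutional_reply("Your product is terrible and your politics are worse."))

The published constitutional-AI work applies this critique-and-revise step
during training to generate fine-tuning data rather than at reply time,
but the shape of the loop is the same.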

LLMs are still experimental at this point, but they will have enormous
power to control social media content for the benefit of both governments
and corporations. This is a dangerous trend. Young people are leaving
Facebook, which is about connecting with people in real life, for TikTok,
which is more about addictive entertainment. It's part of the general
trend toward AI isolating us and making us irrelevant. When AI surpasses
human intelligence, we will prefer its company. And when you no longer
care about other people, they won't care about you.

On Fri, May 12, 2023, 8:54 AM James Bowery <[email protected]> wrote:

>
>
> On Thu, May 11, 2023 at 9:32 AM Matt Mahoney <[email protected]>
> wrote:
>
>> ...
>> Anyway there is a technique called constitutional learning. You give a
>> large language model a set of rules like "don't say anything racist"
>>
>
> Too bad they didn't try aligning an LLM with "don't say anything
> misleading".
>
> The results themselves might have been misleading but would at least be
> more interesting than "don't say anything racist".
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te3d0dad0e74ec301-Me79dfc35ec98b12e6952d1de
Delivery options: https://agi.topicbox.com/groups/agi/subscription
