On Thu, Jun 15, 2023 at 7:55 AM James Bowery <[email protected]> wrote:

>
>
> On Thu, Jun 15, 2023 at 12:26 AM YKY (Yan King Yin, 甄景贤) <
> [email protected]> wrote:
>
>> On Wed, Jun 14, 2023, 00:46 James Bowery <[email protected]> wrote:
>>
>>>
>>>
>>> On Tue, Jun 13, 2023 at 11:06 AM YKY (Yan King Yin, 甄景贤) <
>>> [email protected]> wrote:
>>>
>>>> ...
>>>> If someone tells the truth, it is not considered racist.
>>>>
>>>
>>> Your passive voice elides the truth of the word "racist" as it is in
>>> common usage.
>>>
>>
>> The word "racist" in common usage usually refers to something bad or
>> unethical.   It means making distinctions based on race AND extracting
>> advantages or privileges from such a distinction.
>>
>
> Sorry but you're missing the critical distinction between connotation and
> denotation in natural language usage.
>
> It is quite frequent for pejoratives to masquerade as denotative so as to
> import pejorative connotations, and vice versa.  It is incredibly sleazy,
> yet it is foundational to the moral zeitgeist.
>
> All you need to do to see this is look at the physical assaults on Charles
> Murray on college campuses due to his having coauthored "The Bell Curve".
>

And by the way, this is *exactly* the problem with the entire field of
"algorithmic bias": its ignorance of AIXI's unification of "is" and
"ought", embodied as Algorithmic Information Theory's "is" unified with
Sequential Decision Theory's "ought".  This is such a trivially obvious
and rigorous approach to the ethics of AGI that it is entirely reasonable
to posit that the "algorithmic bias" industry's ignorance of it renders
the entire field without ethics.
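
For concreteness, here is a sketch of the unification being referred to,
in roughly Hutter's standard notation (U a universal Turing machine,
l(q) the length of program q, a_i/o_i/r_i actions, observations and
rewards; details of conditioning and horizon m elided).  AIXI's action
selection weights every environment program q consistent with history by
the Solomonoff prior 2^{-l(q)} -- the Algorithmic Information Theory
"is" -- and maximizes expected cumulative reward over that mixture --
the Sequential Decision Theory "ought":

    a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
           (r_k + \cdots + r_m)
           \sum_{q : U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-l(q)}

The point being that the prior over "what is" and the valuation of "what
ought to be done" appear in a single expression, not as separate,
hand-tuned components.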

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te3d0dad0e74ec301-M48170b9ef080f858cb2e66a6
Delivery options: https://agi.topicbox.com/groups/agi/subscription
