On 27/1/23 09:20, Tom Worthington wrote:

> AI will be able to mimic idiom, empathy, and keep it up day in and day out, 
> when a human would get tired and cranky. It doesn't take much to mimic 
> empathy. 

Isn't that a problem?  I think most people are pretty quick to detect when
they're talking to a machine and it's rarely a positive experience, either for 
the individual or the organisation.

We also have to consider how much responsibility and authority an AI system 
carries.  Does the machine which allows a student an extension of time for 
their end-of-semester submission (:-) or grants a customer more time to settle 
a debt have to explain its judgements to some human who ultimately carries 
the can?  Or will it simply not be given the power to make those judgements?

It seems to me there's rather a divergence in our social licensing here.  As a 
society we're happy to allow self-driving cars on the roads even though they 
can kill people, and have done so.  But I wouldn't like to try telling a bank 
manager they're personally responsible for the autonomous decisions of some AI 
system.

David Lochrin