On 21/05/2019 10:58 am, Kim Holburn wrote:
> I would think that only 3 fatal autopilot crashes is probably a very small
> number. To figure out how it compares to human driving, you'd probably have
> to work out how much it's being used and estimate how many fatal crashes
> there would have been without autopilot.
>
> It's pretty clear that Level 3 and 4 autodriving systems have the potential
> to be dangerous and that we have to jump to Level 5: full self-driving cars.
> There will still be crashes, certainly as long as humans and computers share
> the roads, and probably after as well.
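
To make the comparison Kim describes concrete, here's a rough
back-of-envelope sketch in Python. Every number in it is a made-up
placeholder, not real data:

    # Exposure-adjusted comparison: fatal crashes per mile on autopilot
    # vs. the number you'd expect from human drivers over the same miles.
    # All figures below are hypothetical placeholders.

    autopilot_fatal_crashes = 3
    autopilot_miles = 1.5e9                 # hypothetical autopilot miles

    human_fatal_rate_per_mile = 1.2e-8      # hypothetical human baseline

    autopilot_rate = autopilot_fatal_crashes / autopilot_miles
    expected_human_crashes = human_fatal_rate_per_mile * autopilot_miles

    print(f"Autopilot: {autopilot_rate:.2e} fatal crashes per mile")
    print(f"Expected without autopilot over the same miles: "
          f"{expected_human_crashes:.1f}")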

In my experience as an automation engineer, when you automate something
you have to be able to deal with all the exceptions that a human can
manage instinctively. Humans can usually tell when something is
unexpected and have a strategy to deal with it: stop, swerve, etc. This
is not 100% fail-safe, but it has a reasonable track record. Machines
need to be proactively programmed for every such case. This is hard.
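
A minimal sketch of what "proactively programmed" means in practice: the
controller only handles the cases its engineers enumerated, and needs an
explicit safe default for everything else. The names and cases here are
invented for illustration:

    # Every recognised situation must be anticipated in advance; anything
    # unrecognised falls through to the one safe default the machine has.
    def respond(obstacle_type, distance_m):
        if obstacle_type == "pedestrian":
            return "emergency_brake"
        if obstacle_type == "vehicle" and distance_m < 30:
            return "brake_and_hold_lane"
        if obstacle_type == "debris":
            return "swerve_if_clear"
        # No instinct to fall back on: stop.
        return "controlled_stop"

Unlike a human driver, the machine can't improvise a response to a case
that isn't on that list.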

Autodriving/full self-driving cars need to be tested against the
exceptions, not the norm. AFAIK that has never happened, and it may only
happen over time in real-world use, not in the lab.
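
As a sketch of what exception-first testing might look like, assuming a
simulation harness (run_scenario here is a stand-in, and the scenario
names are invented):

    # A test suite weighted toward rare cases rather than typical driving.
    EDGE_CASES = [
        "stationary_fire_truck_in_lane",
        "faded_lane_markings_in_rain",
        "pedestrian_emerging_between_parked_cars",
        "low_sun_glare_at_crest",
    ]

    def run_scenario(name):
        # Stand-in: a real harness would run the car's software in simulation.
        return "safe_stop"

    def test_edge_cases():
        for scenario in EDGE_CASES:
            assert run_scenario(scenario) == "safe_stop", scenario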

-- 

Regards
brd

Bernard Robertson-Dunn
Canberra Australia
email: [email protected]

_______________________________________________
Link mailing list
[email protected]
http://mailman.anu.edu.au/mailman/listinfo/link
