David Lochrin wrote:
> Only a human can assume moral or legal responsibility, so who would be
> responsible for a death caused by the actions of a vehicle computer?
A company already bears responsibility if your electric kettle explodes, or your new fence falls on a passing pedestrian. It's not a new legal or moral problem, just a different product.

Some people will have problems with self-driving cars, and sooner or later someone will be killed by one. Just as with human-driven cars, new drugs, electric kettles, whatever, the usual considerations will apply.

Jim

_______________________________________________
Link mailing list
[email protected]
http://mailman.anu.edu.au/mailman/listinfo/link
