Aha, so I was wrong about which of Aristotle's writings you were referencing.
The telos of an autonomous vehicle is the transportation of cargo (human or not) from point A to point B. The autonomous car in the xkcd cartoon Cody found is fulfilling its telos.

The efficient cause of an autonomous vehicle includes the human user(s) and the humans who made the vehicle; the programmers are part of that second group. Black Hat in the xkcd is a human user and the efficient cause of the vehicle in the comic trying to drive to Alaska.

The formal cause of an autonomous vehicle is the form of a vehicle.

The material cause of the vehicle is probably the weakest of the four causes. Such a vehicle will be made of metal, plastic (oil), glass (sand, fire), and lots of other materials. This is where Aristotle's philosophy smacks into modern technology. In Aristotelian terms, the material cause of an autonomous vehicle is mostly earth with some fire. However, I have no idea how the virtual would fit into his philosophy - is software air or water?

At the risk of being unpopular on this group, I would point out that many gun owners have made the argument that none of their guns have spontaneously fired. Referring back to the Ethics - an arm (whether or not it holds a sword) does not harm without voluntary movement by the person.

Ray Parks
Consilient Heuristician/IDART Old-Timer
V: 505-844-4024
M: 505-238-9359
P: 505-951-6084

On Aug 5, 2015, at 1:35 PM, glen wrote:

> Sorry for being vague. I meant the 4 causes: formal, efficient, material, and final. Rosen yapped endlessly about agency, efficient cause. They're rolling over in their graves because the idea that the automatic car is _not_ responsible but the programmers or the drivers _are_ avoids the separation of cause into the 4 types.
>
> Useless anecdote: I opened the fridge one day and noticed the CO2 regulator on the keg was broken. I asked my office mate about it. He said: "Yeah, the regulator broke." I asked: "It just spontaneously broke all by itself?" He didn't respond.
> On 08/05/2015 12:26 PM, Parks, Raymond wrote:
>> Ok, I think I get the reference to Aristotle's Nicomachean Ethics III regarding voluntary action or volition. However, you have once again puzzled me, as I don't understand how Robert Rosen is relevant. Are you thinking that the programmers of an autonomous vehicle do not have a relationship with the actions of that vehicle? They are responsible for the metabolic and repair subsystems of the vehicle. I would argue that the software algorithms that control the vehicle are metabolic.
>>
>> On Aug 5, 2015, at 1:04 PM, glen wrote:
>>> Heh, Aristotle and Robert Rosen just rolled over in their graves.
>>>
>>> On 08/05/2015 10:25 AM, Parks, Raymond wrote:
>>>> Her gripe with the second is that a car (or truck or ...) has no volition - it must be controlled by someone. The driver hit the group of children with the car under their control. This will still be true for autonomous vehicles - even if the passengers in the car have no control (unlikely), the software developers who program the algorithms of the autonomous vehicle will be liable when the car hits the school children - the programmers hit the school children.
>
> --
> ⇔ glen

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
