% Autopilot is not autonomous, driver is responsible for all actions %

http://www.topspeed.com/cars/car-news/tesla-model-s-driver-crashes-into-car-blames-autopilot-feature-ar173297.html
Tesla Model S Driver Crashes Into Car, Blames Autopilot Feature
05.19.2016  Kirby Garlitos

Tesla’s Autopilot system may be leaps and bounds ahead of anything other
automakers can offer today, but it’s still far from a finished product. Two
incidents in the past week have put the spotlight on the Autopilot feature,
as two drivers were involved in accidents that they believe were caused by
malfunctions of Tesla’s Autopilot feature. For its part, Tesla has denied
any responsibility, instead shifting the blame to the drivers and their
failure to understand the limits of the technology.

The case of Arianna Simpson is the more recent of the two occurrences.
Speaking with Ars Technica, Simpson detailed the chain of events that led to
her Model S rear-ending a car at 40 mph. According to Simpson, the Autopilot
feature on her Model S didn’t brake like it was supposed to, forcing her to
slam the brakes herself. Unfortunately, her reaction came too late and the
Model S crashed into the car. Simpson blames the tech for not responding in
time, but according to Tesla and its data log, the blame rests on the
shoulders of the driver, who it says hit the brake pedal and deactivated the
car’s “autopilot and traffic aware cruise control,” thus instantly returning
the car to manual control.

Fortunately, neither Simpson nor the driver and passengers of the car she
rear-ended were hurt in the accident. Her Model S, though, appears to have
suffered significant damage to the front and will likely need some serious
repairs before it can return to the road. It’s the second case in less than
a week of a Model S owner crying foul over what they perceive to be serious
flaws in the autonomous driving feature.

Just a few days before Simpson’s accident, a separate incident occurred in
Utah, where Model S owner Jared Overton claims that his car started on its
own and crashed into the back of a trailer. According to Overton, he was
running errands on April 29 when he parked his Model S at one of his stops.
Less than five minutes after getting out of the car, Overton and a worker
from the business he was visiting saw that his Model S had driven under a
parked trailer, smashing the car’s windshield. Overton’s complaint reached
Tesla, which, as in the incident with Simpson, responded by reviewing the
vehicle’s log and determining that the crash was caused by Overton’s own
inattentiveness. According to Tesla, the car’s Summon feature, which allows
the Model S to park by itself, among other functions, was “initiated by a
double-press of the gear selector stalk button, shifting from Drive to Park
and requesting Summon activation.” This led to the car driving straight into
the trailer.

Even if Tesla is right in both instances, these customer complaints still
paint an unflattering picture of the company’s Autopilot feature. Right or
wrong, the electric carmaker needs to understand that complaints like these
will keep coming, and it’s on the company to ensure that everyone who can
access the technology from their cars is made properly aware of what the
Autopilot feature does and doesn’t do.

Why It Matters
This is a very tricky situation for Tesla, because even if it isn’t at fault
in either case – the data logs seem to back up its claims – that doesn’t
shield the company from public perception. The truth is, autopilot
technology is still in its infancy, and those who think they can let their
cars drive them to their destinations are kidding themselves.

Whatever ignorance the public has of the Autopilot feature, it’s on owners
to learn what the tech can and can’t do. But here’s the thing: I still don’t
think Tesla is in the clear, no matter what its data logs say. Part of
offering this technology is making sure that people know how to use it. It’s
one thing to have terms and conditions and have owners agree to them; it’s
another to be truly proactive in spelling out what the Autopilot tech can
and can’t do. Let’s face it, a lot of people skip through these terms and
conditions and sign up regardless of what the fine print says. Yes, that’s
on them, but the result is that these people don’t know what they’re signing
up for.

Tesla should know this, and it should go above and beyond merely offering a
T&C to escape legal liability. If Overton was right about anything in the
aftermath of his Model S crashing into the trailer, it’s that it could have
been worse. Whether Tesla believes it’s right or wrong, a more serious
accident blamed on the Autopilot feature could have far-reaching
ramifications, not just for Tesla and the tech itself, but for the industry
as a whole.

I’m not saying that Tesla’s in the wrong here; it has the data to back up
its findings. This is a matter of understanding the human impulse to
experiment, and of making sure that everyone who gets the Autopilot feature
understands it isn’t something to experiment with on busy roads.

Do something more than issue responses to these claims, because some people
will be inclined to believe the people who were involved in the crashes.
Raise awareness – the proper and comprehensive kind – of the abilities and
limits of the Autopilot feature. Don’t settle for waivers and fine print;
this is technology with the potential to shift the industry as a whole. It
deserves more than that.
[© topspeed.com]



http://www.autoblog.com/2016/05/16/second-tesla-model-s-driver-blames-autopilot-failure-for-crash/
Second Tesla Model S driver blames Autopilot failure for crash
May 16th 2016  Danny King

[image  
http://o.aolcdn.com/dims-shared/dims3/GLOB/legacy_thumbnail/800x450/format/jpg/quality/85/http://o.aolcdn.com/hss/storage/midas/c86206ac37abebd56ae0e30c45d09fed/203825156/autopilot.png
(drawing)
]

California driver blames car, Tesla says system worked properly

A driver of a Tesla Model S in California is blaming the electric vehicle's
Autopilot feature and its failure to properly engage the car's braking
system for a crash on Interstate 5. According to Ars Technica, Tesla driver
Arianna Simpson says her April 26 accident was the result of Tesla's
Autopilot feature not properly engaging its collision-avoidance feature. As
a result, she applied the brakes too late and rear-ended the vehicle in
front of her at about 40 miles per hour. No one was hurt in the accident.

Autopilot "does not turn a Tesla into an autonomous vehicle and does not
allow the driver to abdicate responsibility."
Tesla says that, just prior to the accident, Simpson disengaged Autopilot's
emergency-braking system by hitting the brakes and taking control of the
steering wheel. "Since the release of Autopilot, we've continuously educated
customers on the use of the feature, reminding them that they're responsible
for remaining alert and present when using Autopilot and must be prepared to
take control at all times," the company added in a statement to Autoblog.
"Autopilot is by far the most advanced such system on the road, but it does
not turn a Tesla into an autonomous vehicle and does not allow the driver to
abdicate responsibility."

The incident follows a case in Utah where driver Jared Overton said his
Model S, which didn’t have a driver inside at the time, rolled into a parked
trailer. He told The Verge that the Model S, which is equipped with the
Summon feature that allows the car to pull itself out of or into a parking
space, “went rogue.” As with the Utah incident, Tesla says the California
accident was caused by driver error.
[© 2016 AOL]
...
http://www.teslarati.com/second-tesla-model-s-driver-crashes-blames-autopilot-not-stopping/
Second Model S driver crashes and blames Tesla Autopilot for not stopping
May 16, 2016
...
http://speedlux.com/another-driver-blames-tesla-model-s-for-the-accident/
Another driver blames Tesla Model S for the accident
May 19, 2016



http://electric-vehicle-discussion-list.413529.n4.nabble.com/Auton-tester-allowed-Tesla-S-ram-into-truck-trailer-breaking-EV-s-windshield-tp4681986.html
Auton-tester allowed Tesla-S ram into truck-trailer, breaking EV's
windshield
May 13 2016




For EVLN EV-newswire posts use: 
http://evdl.org/evln/


{brucedp.150m.com}

--
View this message in context: 
http://electric-vehicle-discussion-list.413529.n4.nabble.com/Tesla-S-EV-driver-Simpson-caused-crash-disengaged-Autopilot-s-emergency-braking-tp4682124.html