This is an issue that I've been aware of for years now, going back to when
it sounded like only paranoid lunatics who love conspiracy theories would
be interested.  I was concerned enough back then to contact the US Homeland
Security departments for infrastructure and cyber threats, and Tesla as
well.

It wasn't something I wanted to be too vocal about at the time, for obvious
reasons.  This is the first time I've seen it discussed openly, and I think
it's about time.  There's more to it than what is covered here: there are
also very troubling CAN bus vulnerabilities that make this a concern for
non-autonomous vehicles as well, and no one has addressed how these attacks
might be scaled up to a coordinated, nationwide level.

The bottom line is that once you get into the system, you can do pretty
much anything you want.  As things stand right now, those who would wish us
harm don't need much expertise to wreak some pretty nasty and creative
havoc.

I would hope we're smart enough to recognize the danger and work
effectively and pre-emptively to prevent such attacks, but experience shows
we're much more likely to wait until a successful, deadly attack catches us
"off guard."  THEN we'll act like it's a serious concern.

Chris

On Sat, Jun 29, 2019, 12:41 AM brucedp5 via EV <[email protected]> wrote:

> https://www.gpsworld.com/tesla-model-s-and-model-3-vulnerable-to-gnss-spoofing-attacks/
> Tesla Model S and Model 3 vulnerable to GNSS spoofing attacks
> June 28, 2019
>
> [Image: Tesla Model 3. (Photo: Tesla)
> https://www.gpsworld.com/wp-content/uploads/TeslaModel3.jpg]
>
> Autopilot Navigation Steers Car off Road, Research from Regulus Cyber Shows
>
> Tesla Model S and Model 3 — electric cars built for speed and safety — are
> vulnerable to cyberattacks aimed at their navigation systems, according to
> recent research from Regulus Cyber.
>
> During a test drive using Tesla’s Navigate on Autopilot feature, a staged
> attack caused the car to suddenly slow down and unexpectedly veer off the
> main road. Regulus Cyber, the first company to deal with smart-sensor
> security across a wide range of applications including automotive, mobile,
> and critical infrastructure, initially discovered the Tesla vulnerability
> during its ongoing study of the threat that easily accessible spoofing
> technology poses to GNSS receivers.
>
> The Regulus Cyber researchers found that spoofing attacks on the Tesla GNSS
> receiver could easily be carried out wirelessly and remotely, exploiting
> security vulnerabilities in mission-critical telematics, sensor fusion, and
> navigation capabilities.
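
[One defensive idea worth noting here, as a sketch of my own and not
anything Tesla is known to ship: sensor fusion gives you a built-in
cross-check, because wheel odometry and the IMU predict where the car
should be regardless of what GNSS claims. A fix that disagrees with dead
reckoning by more than the error budget is suspect. All names and the
tolerance value below are illustrative assumptions.]

```python
# Illustrative sketch (not Tesla's actual pipeline): flag a GNSS fix as
# suspect when it disagrees with dead reckoning from wheel-speed/IMU
# data by more than a combined error budget.
import math

def dead_reckon(prev_pos, speed_mps, heading_rad, dt_s):
    """Predict the next position (x, y in meters) from odometry alone."""
    return (prev_pos[0] + speed_mps * dt_s * math.cos(heading_rad),
            prev_pos[1] + speed_mps * dt_s * math.sin(heading_rad))

def gnss_fix_plausible(prev_pos, gnss_pos, speed_mps, heading_rad, dt_s,
                       tolerance_m=15.0):
    """Return True if the new GNSS fix is consistent with dead reckoning."""
    predicted = dead_reckon(prev_pos, speed_mps, heading_rad, dt_s)
    error = math.hypot(gnss_pos[0] - predicted[0],
                       gnss_pos[1] - predicted[1])
    return error <= tolerance_m

# Example: car at the origin doing 30 m/s due east. One second later an
# honest fix lands where odometry predicts; a spoofed fix claims the car
# has jumped thousands of meters off its track (compare the article's
# 3-miles-vs-500-feet discrepancy).
honest = gnss_fix_plausible((0, 0), (30, 0), 30, 0.0, 1.0)      # True
spoofed = gnss_fix_plausible((0, 0), (30, 4800), 30, 0.0, 1.0)  # False
```

[A production receiver would fold this into a Kalman filter innovation
test rather than a hard threshold, but the principle is the same: spoofing
has to stay consistent with the vehicle's physical motion to go unnoticed.]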
>
> Regulus Cyber experts traveled to Europe last week to test-drive the Tesla
> Model 3 using Navigate on Autopilot. An active guidance feature for its
> Enhanced Autopilot platform, it’s meant to make following the route to a
> destination easier, which includes suggesting and making lane changes and
> taking interchange exits, all with driver supervision.
>
> While it initially required drivers to confirm lane changes using the
> turn signals before the car moved into an adjacent lane, current
> versions of Navigate on Autopilot allow drivers to waive the
> confirmation requirement if they choose, meaning the car can activate
> the turn signal and start turning on its own. Tesla emphasizes that “in
> both of these scenarios until truly driverless cars are validated and
> approved by regulators, drivers are responsible for and must remain
> ready to take manual control of their car at all times.”
>
> Designed to reveal how the semi-autonomous Model S and Model 3 would react
> to a spoofing attack, the Regulus Cyber test began with the car driving
> normally and the autopilot navigation feature activated, maintaining a
> constant speed and position in the middle of the lane.
>
> Although the car was three miles away from the planned exit when the
> spoofing attack began, the car reacted as if the exit was just 500 feet
> away — abruptly slowing down, activating the right turn signal, and
> making a sharp turn off the main road. The driver immediately took
> manual control but couldn’t stop the car from leaving the road.
>
> The testing revealed another unexpected finding that significantly
> amplified the threat: a link between the car’s navigation and air
> suspension systems. The height of the car changed unexpectedly while
> moving, because the suspension system “thought” it was driving through
> various locations during the test: lowering the car for better
> aerodynamics on smooth roadways, or raising the undercarriage to clear
> obstacles on “off-road” streets.
>
> Yoav Zangvil, Regulus Cyber CTO and co-founder, explains that GNSS
> spoofing is a growing threat to ADAS and autonomous vehicles. “Until
> now, awareness of cybersecurity issues with GNSS and sensors has been
> limited in the automotive industry. But as dependency on GNSS is on the
> rise, there’s a real need to bridge the gap between its tremendous
> inherent benefits and its potential hazards. It’s crucial today for the
> automotive industry to adopt a proactive approach towards
> cybersecurity.”
>
> The Regulus Cyber testing is designed to assess the impact of spoofing with
> low-cost, open source hardware and software, the same kind of technology
> that is accessible to anyone via e-commerce websites and open source
> projects on GitHub. Taking control of Tesla’s GPS with off-the-shelf tools
> took less than one minute.
>
> The researchers were able to remotely affect various aspects of the
> driving experience, including navigation, mapping, power calculations,
> and the suspension system. Under attack, the GNSS system displayed
> incorrect positions on the maps, making it impossible to plot an
> accurate route to the destination.
>
> Prior to the Model 3 road test, Regulus Cyber provided its Model S research
> results to the Tesla Vulnerability Reporting Team, which responded with the
> following points at that time:
>
> Any product or service that uses the public GPS broadcast system can be
> affected by GPS spoofing, which is why this kind of attack is considered a
> federal crime. Even though this research doesn’t demonstrate any
> Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to
> introduce safeguards in the future which we believe will make our products
> more secure against these kinds of attacks.
>
> The effect of GPS spoofing on Tesla cars is minimal and does not pose a
> safety risk, given that it would at most slightly raise or lower the
> vehicle’s air suspension system, which is not unsafe to do during
> regular driving, or potentially route a driver to an incorrect location
> during manual driving.
>
> While these researchers did not test the effects of GPS spoofing when
> Autopilot or Navigate on Autopilot was in use, we know that drivers
> using those features must still be responsible for the car at all times
> and can easily override Autopilot and Navigate on Autopilot at any time
> by using the steering wheel or brakes, and should always be prepared to
> do so.
>
> “This is a distressing answer by a car manufacturer that is the
> self-proclaimed leader in the autonomous vehicle race,” Zangvil comments.
> “As drivers and safety/security experts, we’re not comforted by vague hints
> towards future safeguards and statements that dismiss the threats of GPS
> attacks.” He offers the following counterpoints in response:
>
>     Attacks against any GPS system are indeed considered a crime
> because their effects are dangerous, as we’ve shown, yet the same
> devices we used to simulate the attacks are legally accessible to any
> person online via e-commerce sites.
>
>     Taking steps to “introduce safeguards for the future” indicates
> that spoofing is, in fact, a major issue for Tesla, which relies
> heavily on GNSS.
>
>     In the case of cars, a spoofing attack is confusing in the best
> case, and a threat to safety in more severe scenarios.
>
>     The more GPS data is leveraged in automated driver assistance
> systems, the stronger and more unpredictable the effects of spoofing
> become.
>
>     The fact that spoofing causes unforeseen results like unintentional
> acceleration and deceleration, as we’ve shown, clearly demonstrates
> that GNSS spoofing raises a safety issue that must be addressed.
>
>     In addition, the spoofing attack made the car engage in a physical
> maneuver off the road, providing a dire glimpse into the troubled
> future of autonomous cars that would have to rely on unsecured GNSS for
> navigation and decision-making.
>
>     Given that the trust of the public still has to be earned as the
> automotive industry moves toward autonomy, the leading players are
> accountable for a responsible deployment of new technology.
>
>     As Tesla clearly stated, drivers are responsible for overriding
> Autopilot under a spoofing attack, so it appears its Autopilot system
> can’t be trusted to function safely under a spoofing attack.
>
>     Because every GNSS/GPS broadcast system can be affected by GNSS/GPS
> spoofing, the issue is everyone’s problem and shouldn’t be ignored;
> furthermore, governments and regulators that have a mandate to protect
> the public’s safety must engage in proactive measures to ensure only
> safe GNSS receivers are used in cars.
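
[On that last point about "safe GNSS receivers": even a very simple
receiver-side plausibility filter would catch the crude teleport-style
jump used in this test. A hypothetical sketch of my own; the class name
and the speed ceiling are invented for illustration.]

```python
# Illustrative receiver-side sanity filter (an assumption, not a
# standard): reject any GNSS fix that implies a physically impossible
# speed relative to the previous accepted fix.
import math

MAX_SPEED_MPS = 90.0  # ~200 mph, a generous ceiling for a road vehicle

class GnssSanityFilter:
    def __init__(self):
        self.last_fix = None  # (x_m, y_m, t_s) of last accepted fix

    def accept(self, x_m, y_m, t_s):
        """Accept or reject a fix; rejected fixes do not update state."""
        if self.last_fix is not None:
            px, py, pt = self.last_fix
            dt = t_s - pt
            if dt <= 0:
                return False  # non-monotonic timestamps are suspect too
            implied_speed = math.hypot(x_m - px, y_m - py) / dt
            if implied_speed > MAX_SPEED_MPS:
                return False  # teleport-style jump: likely spoofed
        self.last_fix = (x_m, y_m, t_s)
        return True

f = GnssSanityFilter()
f.accept(0, 0, 0.0)     # True: first fix is taken on trust
f.accept(30, 0, 1.0)    # True: 30 m/s is plausible
f.accept(5000, 0, 2.0)  # False: implies ~4970 m/s
```

[A slow-drift spoofer would defeat a filter this naive, which is why
commercial anti-spoofing work also looks at signal power and direction of
arrival; but even this much would raise the bar above the one-minute,
off-the-shelf attack described in the article.]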
>
> “According to Tesla, they’ll soon be releasing completely autonomous cars
> utilizing GNSS, which means that, in theory, an attacker could remotely
> control the car’s route planning and navigation,” Zangvil said. “We’re
> obligated to ask what steps they’re taking to address this threat, and
> whether new safeguards will be implemented in its next generation of
> entirely autonomous cars.”
>
> Although Regulus Cyber researchers tested only the Model S and Model 3,
> they concluded that the “disturbing vulnerability” of Tesla’s GNSS
> system is most likely company-wide, as the same chipsets are used
> across the Tesla fleet.
>
> “Just a few months ago we saw that during a spoofing incident at a car
> show in Geneva, seven different car manufacturers complained that their
> cars were being spoofed. This incident proves that many other
> automotive companies that are working on the next generation of
> autonomous cars are also vulnerable to these attacks. As an industry,
> to win public trust and succeed, every car manufacturer should be
> proactive and prepare against these threats,” Zangvil said.
> [© gpsworld.com]
>
> For EVLN EV-newswire posts use:
>  http://evdl.org/archive/
>
>
> {brucedp.neocities.org}
>
> --
> Sent from: http://electric-vehicle-discussion-list.413529.n4.nabble.com/
> _______________________________________________
> UNSUBSCRIBE: http://www.evdl.org/help/index.html#usub
> http://lists.evdl.org/listinfo.cgi/ev-evdl.org
> Please discuss EV drag racing at NEDRA (
> http://groups.yahoo.com/group/NEDRA)
>
>