Attack on Tesla Autopilot highlights Bigger Risk of Insecure Sensors

Researchers from the firm Regulus Cyber say that they demonstrated a type of GPS spoofing attack that caused Tesla vehicles to veer off the road. The impact could be much broader than just Tesla, however.

A remote spoofing attack targeting the auto-navigation system of Tesla's Model S and Model 3 electric vehicles can suddenly steer them off the road, out of the driver's control, demonstrating a persistent vulnerability in autonomous-driving systems, researchers found.

New research from Regulus Cyber staged an attack on the Tesla Model S during a test drive using Tesla's Navigate on Autopilot feature. In the wireless, remote attack, researchers took control of the system's GPS with off-the-shelf tools in less than one minute. They were able to slow the car down and cause it to veer unexpectedly off the main road.

This is not the first time a Tesla has been remotely manipulated through an attack on Autopilot, clearly showing that the feature is vulnerable. Hackers previously took remote control of a Tesla Model S by fooling Autopilot's camera into misreading the lanes on the road, driving the car off the road.

Regulus Cyber researchers focused on another aspect of the Tesla Model 3's auto-navigation system: its use of global navigation satellite system (GNSS) receivers, such as GPS, to help the system correctly position the automobile on the road. Regulus is the first security company to focus on security for sensors, especially those increasingly being used in automobiles and other modes of transportation, such as airplanes and ships.

Not just a Tesla problem

GNSS is also found in other autonomous cars and is vulnerable to a spoofing attack, said Yoav Zangvil, CTO and co-founder of Regulus Cyber. In this type of attack, a system masquerading as the proper one sends incorrect data to the actual system, causing it to make decisions based on information that’s not true.

“We have been testing different car models for the past few months, and we discovered that every single car we tested so far is vulnerable to GNSS spoofing,” he said. “This is an indication of an industry-wide problem. Every single autonomous car level 2+ that utilizes GNSS for any task can be affected similarly–or worse.”

Navigate on Autopilot is Tesla’s semi-autonomous mode, which can only be activated if the car is driving on a road with two lanes in each direction and has a predefined destination, Zangvil explained.

“This mode includes all the features of cruise and autopilot with the addition of two new additional activities–changing lanes to maintain maximum speed to pass slow vehicles…and autonomously exiting highways at the relevant interchange,” he said.

The Tesla Model 3 (Source: Tesla)

The feature to exit a highway–which Regulus specifically targeted in its attack–does not require driver confirmation: the car can automatically turn on a blinker, change lanes and physically turn off the highway onto an exit, driving about 250 meters before the driver is required to regain control of the automobile.

A sudden turn…with the help of hackers

During a Regulus test drive, a Tesla Model S was traveling on a main highway at 95 mph when Navigate on Autopilot was activated. The car was set to reach an exit in three miles and was on cruise control, maintaining a constant speed in one lane.

When the spoofing attack started, it caused the system to believe it was less than 500 feet from the designated interchange, prompting it to begin exiting, Zangvil explained.

“The car immediately slowed to 40 mph, activated right blinker and took a sharp right turn into an emergency pit stop,” he said. “This all happened very fast and–even though the driver took manual control of the car as soon as it started drifting–it was too late and the car ended up turning into the pit stop.”

While the Model 3 is currently the only Tesla with this feature enabled, identical or similar capability will be available in other models from both Tesla and other car manufacturers soon, Zangvil cautioned.

Automakers need to step it up

Indeed, this latest research shows that Tesla and other manufacturers developing cars with autonomous driving features clearly have some work to do to make them less vulnerable to attack.

Prior to the Tesla Model 3 road test, Regulus said it reached out to Tesla with findings about the Tesla Model S and its vulnerability to spoofing attacks. The company responded in a lukewarm way, saying the effect of GPS spoofing on Tesla cars is “minimal and does not pose a safety risk,” according to Regulus.

Moreover, the company stressed that drivers using features like Autopilot and Navigate on Autopilot should always be ready to use the steering wheel and brakes to override the systems should an error occur.

Calling this lackluster response “distressing,” Zangvil said automakers need to step it up–particularly Tesla, which relies heavily on GNSS in its auto-navigation systems–since it’s clear that the vehicles are vulnerable to attack using common, widely accessible technology.

Zangvil recommends that Tesla and other auto manufacturers equip their vehicles with new anti-spoofing technologies–including ones Regulus currently is working closely with the industry to develop–to prevent attack scenarios that could put drivers in danger.
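Anti-spoofing defenses take many forms, from cryptographic signal authentication to hardened antennas. One simple, widely discussed heuristic is a plausibility check: reject any GNSS fix that implies a physically impossible jump from the previous position. The sketch below is a hypothetical illustration of that general idea only–it is not Regulus Cyber's technology or any vendor's actual implementation, and the function names and threshold are invented for this example.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def fix_is_plausible(prev_fix, new_fix, max_speed_mps=70.0):
    """Each fix is (lat_deg, lon_deg, timestamp_s). Flags a new fix whose
    implied ground speed exceeds max_speed_mps (~155 mph) as likely spoofed."""
    lat1, lon1, t1 = prev_fix
    lat2, lon2, t2 = new_fix
    dt = t2 - t1
    if dt <= 0:
        return False  # out-of-order or duplicate timestamp: suspicious
    implied_speed = haversine_m(lat1, lon1, lat2, lon2) / dt
    return implied_speed <= max_speed_mps
```

A spoofer that teleports the reported position by kilometers in a single second fails the check, while ordinary highway motion (tens of meters per second) passes. A real receiver would combine checks like this with inertial sensors and signal-level monitoring.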

“It is the responsibility of every car manufacturer to ensure its cars are safe, both physically and cyber-wise,” he said. “I would recommend every OEM, including Tesla, to equip only cyber-secure GNSS antennas and receivers that are designed to protect against such fake satellite signals.”
