Hackers Trick Tesla Into Breaking Speed Limit By 50MPH With 2 Inches Of Tape

Welcome to the future we deserve.

Researchers at McAfee wanted to test exactly how well Tesla's Autopilot system worked. So they took a two-inch strip of electrical tape, placed it across the middle of the "3" on a 35-mile-per-hour speed limit sign, and tricked the car into reading the sign as "85 miles per hour."
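The McAfee attack targeted a camera-based neural network, but the underlying idea can be shown with a much simpler toy. As a rough analogy (not McAfee's actual method), consider a seven-segment reading of digits: "3" and "8" differ by only two strokes, so a small physical addition is enough to flip a crude nearest-pattern classifier. The `closest_digit` helper and segment encoding below are illustrative assumptions, not anything from the research.

```python
# Toy illustration (not McAfee's actual method): in seven-segment terms,
# "3" and "8" differ by only two strokes, so a small physical addition
# can flip a nearest-pattern classifier.
SEGMENTS = {
    "0": set("abcdef"),  "1": set("bc"),     "2": set("abdeg"),
    "3": set("abcdg"),   "4": set("bcfg"),   "5": set("acdfg"),
    "6": set("acdefg"),  "7": set("abc"),    "8": set("abcdefg"),
    "9": set("abcdfg"),
}

def closest_digit(lit):
    # Crude nearest-pattern match: smallest symmetric difference wins.
    return min(SEGMENTS, key=lambda d: len(SEGMENTS[d] ^ lit))

clean = set(SEGMENTS["3"])        # the "3" on a 35 mph sign
tampered = clean | {"e", "f"}     # two extra strokes of "tape"
print(closest_digit(clean))       # prints "3"
print(closest_digit(tampered))    # prints "8"
```

A real vision model is far more complex, but the failure mode is the same: the classifier's decision boundary between two signs can be crossed with a physically tiny change.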

The test concluded 18 months of research, according to Bloomberg, illustrating weaknesses in the machine learning systems used for automated driving. Steve Povolny, head of advanced threat research at McAfee, says changes in the physical world can "confuse" these systems.

Video: https://www.youtube.com/watch?v=4uGV_fRj0UA

For the test, McAfee's researchers used a 2016 Model S and a Model X with camera systems supplied by Mobileye under Tesla's old agreement with that company, which ended in 2016. Tests performed on Mobileye's newest camera system didn't reveal the same vulnerabilities.

Mobileye defended its technology in a statement to Bloomberg, arguing that humans could also have been fooled by the same type of sign modification.

“Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowd sourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety,” Mobileye said.
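Mobileye's point about redundancy can be sketched in a few lines. The function below is a hypothetical illustration (not Mobileye's or Tesla's actual logic): a camera-read speed limit is cross-checked against a mapped limit, and an implausible reading falls back to the map. The name `fused_speed_limit` and the `tolerance_mph` threshold are assumptions made for the example.

```python
# Toy sketch (not Mobileye's actual system): cross-check a camera-read
# speed limit against crowd-sourced map data and reject implausible
# readings, so a single tampered sign cannot set the limit on its own.
def fused_speed_limit(camera_mph, map_mph, tolerance_mph=10):
    # If the camera disagrees wildly with the map, trust the map.
    if abs(camera_mph - map_mph) > tolerance_mph:
        return map_mph
    return camera_mph

print(fused_speed_limit(85, 35))  # tape-altered sign rejected -> 35
print(fused_speed_limit(35, 35))  # consistent readings -> 35
```

The design choice here is the essence of Mobileye's statement: no single sensor reading is authoritative, so fooling the camera alone does not fool the system.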

According to McAfee's researchers, Teslas also no longer rely on traffic sign recognition.

Povolny commented: 

“Manufacturers and vendors are aware of the problem and they’re learning from the problem. But it doesn’t change the fact that there are a lot of blind spots in this industry.”

The real-world threat of something similar happening is relatively low. Self-driving cars remain in the development stage and are mostly being tested with safety drivers behind the wheel. That is, of course, unless you're one of the "lucky" beta testers driving around with your Tesla on Autopilot.

McAfee's researchers say they were only able to trick the system by duplicating a specific sequence: a driver-assist function had to be active when the car encountered the altered speed limit sign.

“It’s quite improbable that we’ll ever see this in the wild or that attackers will try to leverage this until we have truly autonomous vehicles, and by that point we hope that these kinds of flaws are addressed earlier on,” Povolny concluded.

The weakness isn’t just specific to Tesla or Mobileye technology: it’s inherent in all self-driving systems. 

Missy Cummings, a Duke University robotics professor and autonomous vehicle expert, summed it up: “And that’s why it’s so dangerous, because you don’t have to access the system to hack it, you just have to access the world that we’re in.”


Tyler Durden

Thu, 02/20/2020 – 20:45


This post has been republished with implied permission from a publicly available RSS feed found on Zero Hedge. The views expressed by the original author(s) do not necessarily reflect the opinions or views of The Libertarian Hub, its owners or administrators. Any images included in the original article belong to and are the sole responsibility of the original author/website. The Libertarian Hub makes no claims of ownership of any imported photos/images and shall not be held liable for any unintended copyright infringement. Submit a DMCA takedown request.

