The test caps 18 months of research that, according to Bloomberg, illustrates weaknesses in the machine learning systems used for automated driving.
Steve Povolny, head of advanced threat research at McAfee, says changes in the physical world can “confuse” these systems.
For the test, McAfee’s researchers used a 2016 Tesla Model S and Model X equipped with camera systems supplied by Mobileye under Tesla’s former agreement with the company, which ended in 2016.
Tests performed on Mobileye’s newest camera system didn’t reveal the same vulnerabilities.
Mobileye defended its technology in a statement to Bloomberg, claiming that humans could have been fooled by the same type of sign modification.
The real-world risk of a similar attack remains relatively low, though. Self-driving cars are still in the development stage and are mostly tested with safety drivers behind the wheel.
That is, of course, unless you’re one of the “lucky” beta testers driving around with your Tesla on Autopilot.
The weakness isn’t specific to Tesla or Mobileye technology; it’s inherent in all self-driving systems.
Missy Cummings, a Duke University robotics professor and autonomous vehicle expert, summed it up:
“And that’s why it’s so dangerous, because you don’t have to access the system to hack it, you just have to access the world that we’re in.”
Source: Zero Hedge