Signs from above: Drone with projector successfully trolls car AI
After a recent demo using GNSS spoofing confused a Tesla, a researcher from Cyber@BGU reached out about an alternative bit of car tech foolery. The Cyber@BGU team recently demonstrated an exploit against a Mobileye 630 PRO Advanced Driver Assist System (ADAS) installed on a Renault Captur, and the exploit relies on a drone with a projector faking street signs.
The Mobileye is a Level 0 system, which means it informs a human driver but does not automatically steer, brake, or accelerate the vehicle. This unfortunately limits the “wow factor” of Cyber@BGU’s exploit video—below, we can see the Mobileye incorrectly inform its driver that the speed limit has jumped from 30 km/h to 90 km/h (18.6 to 55.9 mph), but we don’t get to see the Renault take off like a scalded dog in the middle of a college campus. It’s still a sobering demonstration of all the ways tricky humans can mess with immature, insufficiently trained AI.
Ben Nassi, a PhD student at BGU and member of the team spoofing the ADAS, created both the video and a page succinctly laying out the security-related questions raised by this experiment. The detailed academic paper the university group prepared goes further than the video in interesting directions—for instance, the Mobileye ignored signs of the wrong shape, but the system turned out to be perfectly willing to detect signs of the wrong color and size. Even more interestingly, 100ms was enough display time to spoof the ADAS, even though that’s brief enough that many humans wouldn’t spot the fake sign at all. The Cyber@BGU team also tested the influence of ambient light on false detections: it was easier to spoof the system late in the afternoon or at night, but attacks were reasonably likely to succeed even in fairly bright conditions.