GhostStripe Attack Haunts Self-Driving Cars by Making Them Ignore Road Signs

A group of researchers, primarily from Singapore-based universities, has demonstrated the feasibility of attacking autonomous vehicles by exploiting their reliance on camera-based computer vision systems. Dubbed GhostStripe, the attack targets the complementary metal-oxide-semiconductor (CMOS) camera sensors used by brands such as Tesla and Baidu Apollo. Because CMOS cameras capture images line by line with an electronic rolling shutter, rapidly flickering light can imprint a striped pattern onto the captured frame, making road signs unrecognizable to the vehicle's classification system.
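The rolling-shutter effect described above can be illustrated with a toy simulation. This is a hypothetical sketch, not code from the GhostStripe paper: the row counts, timings, and flicker waveform are all illustrative assumptions. Each sensor row is exposed at a slightly later moment, so a light flickering faster than the frame time lands on some rows while "on" and others while "off", producing horizontal stripes.

```python
# Hypothetical sketch of a rolling shutter capturing a flickering light.
# All constants are illustrative assumptions, not values from the paper.

FRAME_ROWS = 8          # number of sensor rows (tiny, for illustration)
ROW_READ_TIME = 1.0     # time to expose/read one row (arbitrary units)
FLICKER_PERIOD = 4.0    # LED on/off cycle length (arbitrary units)

def led_is_on(t):
    """Square-wave flicker: light is on during the first half of each period."""
    return (t % FLICKER_PERIOD) < FLICKER_PERIOD / 2

def capture_frame(start_time=0.0):
    """Expose rows sequentially (rolling shutter). Rows exposed while the
    LED is on read bright (1); rows exposed while it is off read dark (0)."""
    frame = []
    for row in range(FRAME_ROWS):
        t = start_time + row * ROW_READ_TIME
        frame.append(1 if led_is_on(t) else 0)
    return frame

print(capture_frame())  # bright and dark rows alternate in bands
```

An attacker who can synchronize the flicker with the shutter timing controls where those bands fall on the sign, which is the intuition behind corrupting the classifier's input.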

Security Officer Comments:
The researchers developed two versions of the attack, GhostStripe1 and GhostStripe2, achieving success rates of 94% and 97%, respectively, in real-world tests. Proposed countermeasures include replacing CMOS cameras with charge-coupled device (CCD) sensors, which capture the whole frame at once rather than line by line, or randomizing the order in which image lines are captured. This study adds to concerns about the safety and security of AI-driven autonomous vehicles, highlighting the need for robust defenses against adversarial attacks on their perception systems.
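The line-randomization countermeasure can also be sketched in toy form. This is an assumed illustration, not the researchers' implementation: if rows are exposed in an unpredictable order, an attacker can no longer synchronize the flicker so that the bright rows form coherent stripes across the sign.

```python
# Hypothetical sketch of randomized line capture as a countermeasure.
# Constants and structure are illustrative assumptions.
import random

FRAME_ROWS = 8
ROW_READ_TIME = 1.0
FLICKER_PERIOD = 4.0

def led_is_on(t):
    """Square-wave flicker: on during the first half of each period."""
    return (t % FLICKER_PERIOD) < FLICKER_PERIOD / 2

def capture_randomized(start_time=0.0, seed=None):
    """Expose rows in a random order. The attacker's flicker still hits
    half the exposure slots, but which *rows* those slots map to is
    unpredictable, so no stable stripe pattern forms on the sign."""
    rng = random.Random(seed)
    order = list(range(FRAME_ROWS))
    rng.shuffle(order)             # randomized row-exposure schedule
    frame = [0] * FRAME_ROWS
    for slot, row in enumerate(order):
        t = start_time + slot * ROW_READ_TIME
        frame[row] = 1 if led_is_on(t) else 0
    return frame

print(capture_randomized(seed=42))  # bright/dark rows scattered, not banded
```

The trade-off, of course, is that readout hardware must support out-of-order line addressing, which is one reason the researchers also consider switching to global-shutter CCD sensors.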

If members are interested in the full analysis and technical details, the collaborative paper authored by the participating researchers and institutions is available here: