Graffiti on stop signs could trick driverless cars into driving dangerously

Driverless cars have made great progress over the years, but one daunting challenge the industry currently faces is that these cars can sometimes ignore or misread road signs. A major cause of this failure is simple vandalism of street signboards.

While much probing and brainstorming has gone into protecting autonomous cars against hackers, more conventional attacks could confuse the vehicles into misreading road signs that would appear normal to any human driver.

Covering part or all of a road sign with stickers or posters can trick a smart car into ignoring stop signs, even though the signs look unchanged to human drivers.


Researchers at the University of Washington recently demonstrated how attackers who gained access to the software a vehicle uses to read signs could make simple alterations to road signs that cause the cars to ignore them.

Such alterations can cause the car's sign-recognition system “to misbehave in unexpected and potentially dangerous ways.”

The research team tested a range of examples to demonstrate the concept, showing that minor changes to signs can lead to serious hazards.

“All of our attack classes do not require special resources—only access to a colour printer and a camera,” the researchers said. The team said they hoped the research could help build better defensive systems into autonomous vehicles.
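The core idea behind such attacks can be illustrated with a toy model. The sketch below is not the researchers' actual method; it is a minimal, hypothetical example using a linear classifier, where a small per-pixel change aligned against the model's weights (the same principle as gradient-based adversarial perturbations) flips the prediction even though most of the image is untouched in character:

```python
import numpy as np

# Toy stand-in for a sign classifier: score(x) = w @ x, where
# score > 0 means "stop sign". Real systems are deep networks,
# but the linear case shows the mechanism most clearly.
rng = np.random.default_rng(0)
w = rng.normal(size=100)  # fixed, arbitrary classifier weights

def is_stop_sign(x):
    return float(w @ x) > 0.0

# Build an "image" the classifier reads as a stop sign with a small
# positive margin: random content with its score component removed,
# plus a slight alignment with w.
x_raw = rng.normal(size=100)
x = x_raw - (w @ x_raw) / (w @ w) * w   # score is now exactly 0
x = x + 0.05 * np.sign(w)               # small positive margin

# Adversarial "sticker": a per-pixel nudge of only 0.1 against the
# gradient direction (for a linear model, the gradient is just w).
epsilon = 0.1
x_adv = x - epsilon * np.sign(w)

print(is_stop_sign(x), is_stop_sign(x_adv))
```

Because the margin is tiny compared with what a coordinated per-pixel perturbation can remove, the first call returns `True` and the second `False`: the classifier no longer sees a stop sign, while to a human eye the change would be slight.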

Such attacks could see cars driving straight through junctions or coming to a halt in the middle of the road. Some current cars are already equipped to read and detect signs, such as Tesla’s Model S electric cars with the Autopilot feature, although the vehicles are not yet programmed to react to the signs.

Threats to self-driving cars have proved tricky for researchers to anticipate. Engineers at Volvo have spent two years teaching their cars to avoid kangaroo collisions, while a team at Waymo was forced to develop a pair of tiny windscreen wipers to clean bird droppings that masked the cars’ cameras and LIDAR systems.

This week the government announced new guidance to develop safer driverless cars. Transport minister Lord Callanan said: “We need to make sure that the designs of the vehicles in the first place are completely cyber secure so that people can’t break into them, they can’t steal them and more importantly they can’t hack them to potentially cause accidents.”