What happens if the problems caused by autonomous cars are not the result of mistakes, but the result of deliberate attacks?
21 Nov 2023
7 min. read
A robotaxi fleet hit the brakes, citing the need to “rebuild public trust.” This story has been brewing for a while.
It seemed like nothing at first, or at least not the start of a big security story: a video shared on the social networking site Reddit showing a group of robotaxis in Austin, Texas converging on a central street and stopping en masse, creating an impromptu traffic jam, a scene that has become all too common as these platforms grow in popularity. A quick search turns up IT articles covering the event, which is no longer unusual at all. Driverless, or autonomous, vehicles currently operate in San Francisco and Las Vegas, with pilot programs in about a dozen other cities throughout the United States, from Seattle to Miami. And in case you’re wondering, this isn’t a uniquely American issue: driverless cars are being developed and tested across Europe and Asia as well.
Currently, the problems caused by autonomous vehicles, such as creating traffic jams, driving into wet concrete, and blocking emergency service vehicles, are real. They are also the result of non-malicious mistakes on the part of driverless car companies. But what happens if the problems are not the result of mistakes, but of deliberate attacks?
If there’s one thing we’ve learned over the decades in computer security, it’s that any successful technology attracts its own entrepreneurs looking to make money, both legally and illegally. For cybercriminals, the allure of autonomous cars should shine even brighter. Beyond the more notorious criminal activities that take place entirely in the cyberdomain, such as account theft targeting consumers and ransomware targeting businesses, vehicles operating in the physical world also offer some interesting opportunities:
- Extorting customers over their travel history. Got a trip you don’t want to share? This is the automotive equivalent of revenge porn.
- Remote takeover of cars, aka “drivesomware”: stopping some (or all) autonomous cars in their tracks could be a new model for ransomware-style extortion.
- Threatening to wipe the cars’ local storage or overwrite their firmware so they can no longer operate creates huge costs for the fleet owner, who not only has to recover each car, but also restore the firmware and software of each one, while hopefully patching the vulnerabilities that allowed them to be exploited in the first place.
- Carjacking (whole, or stripping for parts): stop at the (chop) shop on the way home and lighten the car’s load of salable items, an on-the-go automotive diet.
- Kidnapping passengers, or even just threatening not to release them until they pay: after all, they likely have a digital payment method in their pocket or purse, which makes a ransom collectible on the spot. Think they should pay more? Drive them to a remote location straight out of a bad TV plot, complete with ropes and dim lighting, before they can call the police. Alternatively, the fleet operator can be extorted so that its passengers are not kidnapped, a 21st-century twist on the old protection racket.
- Sending vehicles to a specific location to cause a traffic jam. Think of it as TJaaS – Traffic Jam as a Service; a DDoS with cars.
- Targeting busy intersections or motorways at rush hour. On roads already jammed with conventionally driven vehicles, this creates larger traffic jams that slow traffic even further; the attacking vehicles then disperse, and who knows what really happened?
- Congesting airports, train stations, or bus terminals, where the vehicles can serve as barricades for bad actors looking to elude law enforcement while they do their dirty work. Traffic jams caused by autonomous vehicles could keep police from reaching a bank robbery in progress.
- Blocking emergency services – a variation on SWATting in which you keep law enforcement at a distance, for a price, of course.
- Covering for other organized criminal activity, for example, flash-mob robberies by criminal gangs, or using the vehicles to transport illegal goods. How does the car know whether its “left luggage” is part of a drug deal?
- Disabling safety features / causing crashes. Crashes involving autonomous vehicles are big news, so if bad actors short the company’s stock and then deploy malware into its cars, they create a hard-to-detect “insider trading” play on the resulting stock sell-off.
It should be noted that robotaxis are not the only vehicles that could be used in such attacks. An ever-increasing number of private vehicles on the road have self-driving capabilities and anti-theft/remote-lockout functions that could be triggered maliciously.
If this all sounds … well, weird, for lack of a better term … we want to point out that runaway cars are no longer fiction but reality: in October 2023, an electric car in Scotland lost all control and the driver had to ram it into a police van to stop it. While not a fully autonomous car, it had a sophisticated driver-assistance system that seems to have failed, leaving the driver unable to slow the car or shut off the motor. Although this does not appear to have been the result of any malicious activity, it certainly shows how dependent vehicles have become on their computer systems.
Another possible concern is automated commercial trucks. An autonomous truck carrying valuable cargo could be stopped, or diverted to a location chosen by criminals, and its cargo stolen before police arrive. Trucks could also be used to block transit hubs, such as ports where cargo is unloaded from ships.
In addition, they could be used as battering rams to enter restricted areas protected by gates, bollards, or other barriers. It harkens back to the heady days of the hastily built, steel-clad, impromptu armored vehicles that spawned The A-Team, except run by computer programmers with nefarious intentions.
Autonomous vehicles also appear vulnerable to increasingly available GPS jamming and spoofing techniques, which could be localized to intercept and “retrain” vehicles to do an attacker’s bidding. A botnet of cars drifting at the command of its herders would make for a powerful video, sure to go viral regardless of the technical details.
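One well-known defensive idea against GPS spoofing is a plausibility check: compare the distance implied by the GPS trace against an independent sensor such as wheel odometry, and flag large disagreements. The sketch below is a deliberately simplified illustration of that idea, not any vendor’s actual implementation; the function name, flat (x, y) coordinates, and tolerance value are all assumptions made for the example.

```python
import math

def spoofing_suspected(gps_positions, odometry_distance, tolerance=0.25):
    """Flag a GPS trace whose implied travel distance disagrees with
    wheel odometry by more than `tolerance` (as a fraction).

    `gps_positions` is a list of (x, y) tuples in meters on a local
    plane; a real system would use geodetic coordinates and proper
    multi-sensor fusion (IMU, cameras, map matching).
    """
    gps_distance = sum(
        math.dist(a, b) for a, b in zip(gps_positions, gps_positions[1:])
    )
    if odometry_distance == 0:
        return gps_distance > 0
    return abs(gps_distance - odometry_distance) / odometry_distance > tolerance

# An honest trace matches the ~20 m the wheels report; a spoofed trace
# "teleports" the car somewhere the wheels never went.
honest = [(0, 0), (10, 0), (20, 0)]
spoofed = [(0, 0), (10, 0), (500, 400)]
```

In practice such a check is only one layer: serious receivers also look at signal strength, satellite geometry, and timing anomalies, but the cross-sensor consistency idea is the same.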
To be fair, any new technology, especially during its initial boom, boggles the imagination and is guaranteed to present obstacles. But the rise in popularity has also attracted technologists who may help strengthen digital defenses so that robotaxi hordes don’t become the subject of B-movie plots with no expensive actors, or at least not many.
Autonomous vehicles in the form of cars that can drive on the same roads as traditional human-driven cars represent one of the biggest changes in automotive technology in recent decades. It seems that some basic precautions learned from more than a century of transportation engineering should not be forgotten:
- Autonomous vehicles owned by individuals or businesses must have controls that can be operated by a human in an emergency. As good as AI is at driving, it may not be able to anticipate and respond to every situation a human driver can. Providing steering, acceleration, and braking mechanisms that override the AI “autopilot” can mean the difference between saving lives and “just” being in an accident. Machines are good at navigating known patterns, but humans can handle wildcard events that cannot reasonably be covered by training sets. A kid in a ghost costume running out to scare you? You know what to do, but your car may not.
- For vehicles intended to operate as taxi or shuttle services, an emergency braking system should be accessible to passengers, not unlike the emergency pull cords or buttons used in passenger rail and subway cars. Although it would technically have to operate differently, since railways work differently than roads, the desired result is the same: bring the self-driving car to a safe stop in a way that does not harm its passengers, other vehicles around it, or nearby pedestrians.
- Whether it is a person taking control of an autonomous vehicle or just pulling the emergency brake, these actions should automatically notify fleet operations and emergency services when activated, much as existing services such as General Motors’ OnStar, Subaru’s STARLINK, and other AACN (advanced automatic collision notification) services do today.
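The notification step above amounts to emitting a well-defined telemetry event the moment a human intervenes. A minimal sketch of what such an event might look like follows; the field names and schema are purely illustrative assumptions, not OnStar’s, STARLINK’s, or any AACN provider’s actual format, and a real system would send this over an authenticated channel rather than just building the JSON.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EmergencyStopEvent:
    """Event emitted when a passenger or human operator triggers an
    emergency stop. All field names are hypothetical."""
    vehicle_id: str
    trigger: str       # e.g. "passenger_button", "human_takeover"
    latitude: float
    longitude: float
    timestamp: float

def build_notification(event: EmergencyStopEvent) -> str:
    """Serialize the event for delivery to both the fleet operations
    center and an AACN-style emergency-services gateway."""
    payload = {"type": "emergency_stop", **asdict(event)}
    return json.dumps(payload)

# Example: a passenger presses the stop button in an Austin robotaxi.
evt = EmergencyStopEvent(
    "taxi-042", "passenger_button", 30.2672, -97.7431, time.time()
)
msg = build_notification(evt)
```

The design point is simply that the same structured event can fan out to multiple recipients, so fleet operations and emergency services learn of the intervention at the same moment it happens.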
Autonomous vehicles have the potential to create a safer future for everyone on the road. However, safety must be the primary concern for autonomous vehicle manufacturers and fleet operators (sometimes the same company, sometimes not) alike. That can only happen if these vehicles are engineered in a way that puts safety first.