Just over a year ago, two good-guy hackers managed to take control of a Jeep Cherokee using Fiat Chrysler's Uconnect telematics system. Within days, FCA recalled 1.4 million Chrysler, Dodge, Jeep, and Ram vehicles to close the loophole that the hackers had exploited.
This week, those same hackers revealed that they've continued to work on the Cherokee, finding new ways to subject the SUV to increasingly frightening maneuvers. Meanwhile, another group of techies say that they've managed to confuse Tesla's semi-autonomous Autopilot system.
Last year, Charlie Miller and Chris Valasek showed the world that they could bypass Uconnect's security layers and tinker with the Jeep Cherokee from afar. Some of their futzing was pretty innocuous--for example, playing with the radio volume and adjusting the air-conditioner. However, they also proved that they could disable the Cherokee's brakes and transmission, apply the brakes, and in some situations, control the vehicle's steering.
The good news is, FCA's Uconnect upgrade prevented Miller and Valasek's hack from working. The bad news is, some of the 1.4 million cars in need of the upgrade haven't been fixed. (That's not just an FCA problem, though: 45 million of the 105 million vehicles recalled from 2013 to 2015 have yet to be repaired.)
The worse news is, Miller and Valasek have kept prodding and poking at the Cherokee, and they've found far more dangerous shortcomings.
Last year's hacks were limited in scope because of Uconnect's security settings. So, yes, the two could disable the Cherokee's brakes and turn the steering wheel, but only at speeds below five miles per hour. Now, however, they've learned not only how to send commands to the car, but also how to disable the electronic control unit that's in charge of those commands. As Miller explained to Wired, “You have one computer in the car telling it to do one thing and we’re telling it to do something else. Essentially our solution is to knock the other computer offline.”
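Miller and Valasek haven't published the exact messages they injected, but the conflict Miller describes plays out on the car's CAN bus, where every message carries an arbitration ID and, when two nodes transmit at once, the lower ID wins. The sketch below is a minimal illustration in plain Python--the frame layout follows Linux's SocketCAN convention, and the IDs and payloads are hypothetical, not the real Cherokee values:

```python
import struct

# Linux SocketCAN can_frame layout: 32-bit arbitration ID, 1-byte data
# length code, 3 padding bytes, then up to 8 data bytes (16 bytes total).
CAN_FRAME = "<IB3x8s"

def pack_frame(arbitration_id: int, data: bytes) -> bytes:
    """Pack a classic CAN frame (max 8 data bytes) into its 16-byte struct."""
    assert len(data) <= 8
    return struct.pack(CAN_FRAME, arbitration_id, len(data), data.ljust(8, b"\x00"))

def arbitration_winner(frames: list[bytes]) -> bytes:
    """CAN bus arbitration: the frame with the lowest arbitration ID dominates."""
    return min(frames, key=lambda f: struct.unpack(CAN_FRAME, f)[0])

# Hypothetical IDs and payloads -- real steering/brake message IDs vary by
# vehicle and are not public here.
legit = pack_frame(0x2B0, b"\x00\x00")  # legitimate ECU's periodic message
spoof = pack_frame(0x1A0, b"\xFF\x00")  # attacker's lower, higher-priority ID

assert arbitration_winner([legit, spoof]) == spoof
```

Winning arbitration alone isn't enough while the legitimate ECU keeps broadcasting its own commands, which is why the duo's new trick of knocking that ECU offline matters: with the real computer silenced, the attacker's frames are the only ones the car hears.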
As a result, they can now apply the Cherokee's brakes at any speed, and they can turn the steering wheel as they please. Check out the video above, and you'll see them ditch the SUV on the side of the road while it's traveling at 30 mph.
Should you be worried? Not at this point, and not in this case. To carry out their attacks, Miller and Valasek rolled back the software on the Cherokee to pre-recall settings. And to take such a deep dive into the SUV's controls, their laptop had to be physically connected to the car. So, if your FCA vehicle's Uconnect software has been upgraded, and if you don't have a couple of strangers sitting in the backseat with a computer patched into your dashboard, you're probably okay.
But don't get too comfortable just yet. While the two hackers aren't yet able to carry out their mischief remotely, that doesn't mean that they won't find a way. Remember, we laughed at this same duo in 2013 when they hacked a Toyota Prius while wired to the car, but two years later, they were futzing with a Jeep from their living room. Where there's a will--and a network--there's a way.
More alarmingly, FCA's Uconnect upgrade wouldn't protect the Cherokee from Miller and Valasek's new hacks, because they claim that they didn't access the SUV's computers using Uconnect at all.
The real worry, though, is this: what if Miller and Valasek hadn't been such good guys? What if they'd found the backdoor in FCA's software and kept the secret to themselves--or worse, sold it to people with more nefarious minds?
Tesla Autopilot tinkering
The Jeep hacks are one thing. They're meant to take place while a driver is at the wheel, allegedly paying attention. Miller and Valasek say that, in theory, an attentive driver could at least fight back against their steering controls.
But what if a car were in autonomous driving mode, and things began to go wrong? Would a driver notice? Would she have time to react?
That's a concern of researchers from the University of South Carolina, Zhejiang University in China, and Chinese security company Qihoo 360. Together, they say that they've found ways to wreak havoc on Tesla's Autopilot system.
Their approach to hacking is very different from that of the duo who took on the Jeep Cherokee. Rather than fiddling with Tesla's onboard computers, they've found ways to confound the radar systems used by Autopilot by manipulating the environment outside the vehicle. For example, by jamming the radar, they can make the car blind to obstacles directly in its path, or make it see obstacles that aren't actually there.
Doing so isn't cheap. Technically, the equipment they used was off-the-shelf, but it cost nearly $200,000. And of course, because it doesn't run through a network, the hack can only be carried out on one vehicle at a time. (Though if that one vehicle were carrying someone very important, it could cause serious problems.)
However, the team found other, less-expensive ways of confusing Tesla vehicles--again, by playing on weaknesses of their radar systems. Using cheap Arduino devices, they were able to make a car avoid imaginary obstacles and ignore real ones when it was in low-speed self-parking and "summon" modes. The team also proved that wrapping an object in bargain-basement acoustic-dampening foam could effectively confuse the car's radar.
This kind of work isn't quite as scary as that done on the Jeep described above. However, the team says that it does point out serious weaknesses in Tesla's radar-based autonomous driving system.
Now, it's up to Tesla to find ways to shore up its software before someone else does. As one of the researchers explains, “[Tesla] need[s] to think about adding detection mechanisms as well. If the noise is extremely high, or there’s something abnormal, the radar should warn the central data processing system and say ‘I’m not sure I’m working properly.'”