There are plenty of metaphorical dumpster fires raging in America today, but few are as fascinating to watch as the one consuming Uber.
The ride-sharing company and its proto-bro CEO, Travis Kalanick, have been in the news for months, and rarely have the headlines been flattering.
Now, Uber is facing a couple of new problems, both of which involve the technology that underpins the company's fleet of self-driving vehicles.
Artificial road rage
One of those problems involves the aggressiveness of the artificial intelligence associated with Uber's autonomous software.
As you might recall, Uber has three fleets of self-driving vehicles--one in Tempe, Arizona, one in San Francisco, California, and one in Pittsburgh, Pennsylvania. On Friday, March 24, an Uber vehicle in Tempe was involved in a collision--a fairly serious one compared to others we've seen. The Volvo XC90 was hit as it drove through an intersection, causing the SUV to roll onto its side. Thankfully, no one was seriously injured in the incident.
While there were humans in both the driver's and passenger's seats at the time of the collision, the SUV was being controlled by Uber's self-driving software. Uber temporarily suspended the autonomous car program, but it was up and running again this week.
Official accounts show that the traffic light was yellow as Uber's vehicle entered the intersection, giving it the right of way and putting the other driver at fault. However, eyewitnesses say that the SUV bears some of the responsibility "for trying to beat the light and hitting the gas so hard."
Tempe police maintain that the driver of the other vehicle was at fault. However, the eyewitness testimony does call into question how aggressively engineers have designed Uber's self-driving cars to operate.
Those concerns are well warranted, given incidents like the one recorded last November in San Francisco, when an autonomous Uber blew through a red light. (Uber initially denied that the vehicle was in self-driving mode, but internal documents suggest that the company may not be telling the whole truth.) Uber's cars have run at least five other red lights in San Francisco since then.
It wouldn't be entirely surprising if Uber had tweaked its self-driving software to make cars maneuver more aggressively through traffic. After all, many critics have said that one of the biggest problems with autonomous vehicles is that they drive too cautiously, which could cause serious problems for the human drivers around them.
However, if the cars are driving too aggressively, that's something Uber will need to fix ASAP. With all the bad press the company has received, the last thing it needs is a major accident--to say nothing of an injury, a fatality, or another lawsuit.
Meanwhile, in court
Speaking of lawsuits, remember Anthony Levandowski, the former Google/Waymo employee who's at the center of a suit filed against Uber and its subsidiary, Otto? Well, he's not having a very good week.
To recap, Waymo claims that Levandowski stole its trade secrets before jumping ship in January 2016 to form Otto, a self-driving truck company that Uber subsequently acquired--putting Levandowski in charge of Uber's own self-driving program. According to Waymo:
"We found that six weeks before his resignation this former employee, Anthony Levandowski, downloaded over 14,000 highly confidential and proprietary design files for Waymo’s various hardware systems, including designs of Waymo’s LiDAR and circuit board. To gain access to Waymo’s design server, Mr. Levandowski searched for and installed specialized software onto his company-issued laptop. Once inside, he downloaded 9.7 GB of Waymo’s highly confidential files and trade secrets, including blueprints, design files and testing documentation. Then he connected an external drive to the laptop. Mr. Levandowski then wiped and reformatted the laptop in an attempt to erase forensic fingerprints."
The trial against Otto, Uber, and Levandowski hasn't yet formally begun. However, U.S. District Judge William Alsup says that the case against all three parties is very strong.
Complicating matters is the fact that Levandowski himself has refused to testify in the case, citing his Fifth Amendment rights, which protect people from self-incrimination. Judge Alsup says that given the strength of Waymo's arguments, he may force Uber to suspend its self-driving car program unless Levandowski comes to his employer's defense.
Uber has responded that it can prove its innocence without Levandowski's help. More specifically, one of Uber's lawyers, Arturo Gonzalez, has said that Uber can demonstrate that it's "not using any of these things that [Waymo] say[s] he may have taken." If Alsup decides that profiting from trade secrets rather than the theft of those secrets is the real issue here, Uber might have a better chance of succeeding.
Meanwhile, Uber is hoping to push the case into arbitration. That might not change the outcome, but it would certainly reduce the amount of negative press Uber might receive if it fails to make its case.