Self-driving vehicles not ready for prime time, study suggests
Driverless trucks are expected to hit Texas highways in April, but a new study suggests that self-driving vehicles are not ready to share the road with humans – given that autonomous cars in California have been involved in a high number of crashes that may have been avoidable.
In her latest research, George Mason University Professor Missy Cummings wanted to identify where self-driving vehicles need improvement. What she found in driverless cars was “phantom braking” and an inability to navigate human driving behavior, leading to a relatively high number of crashes.
Cummings, who also served as a senior advisor for safety at the National Highway Traffic Safety Administration, examined self-driving vehicle operations in California. Specifically, she looked at three of the most prominent companies deploying driverless cars in the Golden State: Cruise, Waymo and Zoox.
The study first looked at crash rates. Human drivers are involved in 224 non-fatal crashes per 100 million vehicle miles traveled on non-interstate roads. Rideshare drivers (e.g. Uber and Lyft) have a much higher rate of nearly 1,400 crashes per 100 million miles. Where do self-driving vehicles fall within this range?
Waymo yielded the best results with a rate of about 1,000 crashes per 100 million miles, better than rideshare drivers but still much worse than the average human driver. However, Cruise and Zoox experienced crash rates as high as 3,000 and 4,000 crashes per 100 million miles, respectively.
[Graph: self-driving vehicle testing and deployment crashes, December 2021 to November 2023, compared with rideshare and average human drivers]
It is worth noting that these crash rate comparisons carry limited statistical weight. Human drivers log more than 3 trillion miles each year. Self-driving vehicles, on the other hand, have driven only millions of miles over their entire lifespan.
“Waymo has consistently more miles,” Cummings told Land Line. “If we’re giving Waymo a huge benefit of the doubt, I would be comfortable saying at best, they are on par with rideshare drivers – which is about four to six times more dangerous than your average driver. That kind of result … should make people pause. What is going on with these rideshare drivers?”
Because crash rates alone are not a scientifically sound basis for comparison, Cummings looked at what is causing these crashes. This is where she discovered problems that motorists and pedestrians may find concerning.
Phantom braking
Of the more than 200 crashes among the three self-driving vehicle companies over a two-year period, nearly half were incidents in which the driverless car was struck from behind.
Although the driver who rear-ends another vehicle is often presumed to be at fault, Cummings found that self-driving vehicles may be to blame in many cases. Specifically, these vehicles abruptly stop for no apparent reason, a behavior known as “phantom braking.”
“I’ll tell you what really concerns me, and especially related to trucks, is the fact that we can start to see a very dangerous pattern of what we would call phantom braking,” Cummings said.
Phantom braking occurs when a self-driving vehicle perceives an obstacle that is not there and makes a hard-braking maneuver. Data in the study suggests that this behavior contributed to human drivers crashing into the autonomous vehicles. Nearly half of all crashes observed were rear-end collisions, about twice the rate seen among human drivers.
In fact, Zoox is under federal investigation for this exact issue. NHTSA currently has an open investigation into Zoox self-driving vehicles that “unexpectedly braked suddenly, leading to rear-end collisions.”
The study points out that self-driving companies are quick to blame human drivers for “inattentive following.” However, human drivers are caught off guard when driverless cars decelerate suddenly for no apparent reason.
In the case of Cruise, Waymo and Zoox, which operate only on non-interstate roads, the resulting crashes are at lower speeds and cause minimal damage. That is not the case with driverless trucks.
“I come from a long line of truckers,” Cummings said. “That truck is going to jackknife. There’s going to be carnage on the road. So for trucks, the ramifications of the phantom braking events are far more dire than, I think, for your average passenger vehicle.”
A big issue is computer vision hallucinations, i.e. self-driving cars seeing things that are not there. Only 10% of crashes in the study were the result of self-driving cars failing to detect an object, whereas about half involved the autonomous car detecting nonexistent objects.
“Even if you’re a good driver and you’re trying to put enough room between you and the car in front of you, it’s not clear that that’s actually enough room,” Cummings said. “Indeed, one action that I tell people who are driving around self-driving cars is don’t get behind them. Really at any distance, but try to get around them or go a different way, because these phantom braking maneuvers are serious. But again, slow speed, so damage has been contained. This is just simply not going to be able to be contained at high speeds with a big truck.”
Unpredictability of human drivers
Nearly a third of self-driving vehicle crashes were caused by the unexpected actions of other vehicles. In other words, driverless cars are having a difficult time understanding how human drivers behave.
Although bad human decisions (e.g. running a red light) were the major contributing factor, a “substantial number of cases” included “the AV not understanding social norms or not assuming a defensive driving posture.”
“For example, AVs could detect a car backing out of a driveway into traffic but would not slow down or honk to alert the driver and instead collided with the car,” the study states. “Similar situations occurred with AVs hitting doors of drivers who just parked and opened their doors into traffic. Competent human drivers in such urban settings have learned to anticipate such common human mistakes.”
Cummings gave an example of passenger vehicles trying to avoid getting behind a truck. When a truck driver tries to enter the passing lane, it is not uncommon for a passenger vehicle to speed up to get around the truck first. An autonomous system’s sensors may see the passenger vehicle – but as Cummings pointed out, at some point, physics takes over. If a driverless truck overreacts at highway speeds, it could spell trouble.
Essentially, self-driving vehicles mimic intelligence but lack actual intelligence.
“These are just statistical predicting machines that are always predicting … the average behavior,” Cummings said. “Unfortunately, on the roads, we see a lot of not-average behavior.”
Much of this problematic human driving may be considered low probability, but not to truck drivers. Due to their size and typically lower speeds, trucks encounter more of this “low-probability” behavior than most vehicles do.
How to improve self-driving vehicle technology
Despite the less-than-flattering research results, self-driving vehicle technology has improved and will continue to do so. However, more research and regulatory oversight are needed to get driverless vehicles on the roads safely.
According to Cummings, government oversight may come down to the states. A lot of the data she used in her study came from a 2021 NHTSA standing general order requiring autonomous vehicle companies to report certain crashes. That requirement may go away.
Last December, Reuters reported that President Donald Trump’s transition team recommended getting rid of NHTSA’s standing general order reporting requirements. Tesla, led by Elon Musk, has opposed the reporting requirements. The electric car manufacturer accounts for 84% of the more than 2,000 crashes reported by all companies. Musk has served as a senior advisor to Trump.
Cummings said California has done a good job of collecting crash data from companies. Texas, on the other hand, has not. That could be problematic, with driverless trucks scheduled to deploy on Texas public highways in April.
Government agencies need to collect that data so researchers like Cummings can continue studying the issue to identify problems and potential solutions. Without that information, it falls to the self-driving vehicle companies to analyze and disclose their own data transparently.
In addition to recommending fixes for phantom braking, the study calls for finding ways to teach self-driving vehicles to drive defensively. Autonomous vehicles have demonstrated they “can drive in a way that is generally technically legal,” the study states, but “severely violate ‘norms of the road’ that are expected by human drivers.”
Just one fatal crash involving a self-driving vehicle could send a company – and the entire autonomous vehicle industry – backward.
While some companies are racing to be the first to deploy commercially, Cummings suggested that being second, third or fourth may be better. She said stakeholders should proceed cautiously.
“Any massive self-driving car or truck accident is going to reflect negatively on the whole industry,” Cummings said. “There’s a lot on the line.”
As part of the Cantor Global Technology Conference on Tuesday, March 11, Aurora Innovation confirmed that it still plans to deploy driverless trucks in April along a designated route between Dallas and Houston. LL