Tesla’s ‘shatterproof’ window a metaphor for self-driving tech industry

November 22, 2019

Tyson Fisher

For the past several years, tech companies have been optimistic about bringing self-driving vehicles to market. However, it seems that for every step forward, the autonomous vehicle industry takes two more back. This week alone, the industry took two steps back twice within a 24-hour period, then ended the week with a shatter from Tesla.

This is what happened this week:

  • Tuesday: The National Transportation Safety Board found that Uber had an “inadequate safety culture.”
  • Wednesday: A Senate self-driving vehicle hearing was a mess.
  • Friday: Tesla’s “shatterproof” pickup truck windows cracked on stage during the truck’s unveiling.

Below is a breakdown of the self-driving tech industry’s bad week.

Uber’s ‘inadequate safety culture’

On Tuesday, Nov. 19, the National Transportation Safety Board released its findings on a fatal crash involving a self-driving Uber test car and a pedestrian. The verdict was not favorable for the autonomous vehicle industry.

A quick refresher on what went down from the NTSB report:

On March 18, 2018, at 9:58 p.m., an automated test vehicle, based on a modified 2017 Volvo XC90 SUV, struck a female pedestrian walking across the northbound lanes of N. Mill Avenue in Tempe, Arizona. The SUV was operated by the Advanced Technologies Group of Uber Technologies Inc., which had modified the vehicle with a proprietary developmental automated driving system. A female operator occupied the driver’s seat of the SUV, which was being controlled by the (automated driving system). The road was dry and was illuminated by street lighting.

In other words, while testing a self-driving vehicle, Uber struck and killed a pedestrian.

NTSB concluded that the probable cause “was the failure of the vehicle operator to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal cellphone.”

Two things are of concern. First, this was not some motorist who did not understand the capabilities of his or her new Tesla’s AutoPilot system. Rather, this was an Uber employee trained to test a self-driving vehicle. If anyone should have known to be attentive, it would have been her.

Second, the report states that the vehicle detected the pedestrian 5.6 seconds before impact.

“Although the ADS continued to track the pedestrian until the crash, it never accurately classified her as a pedestrian or predicted her path,” the report states. “By the time the ADS determined that a collision was imminent, the situation exceeded the response specifications of the ADS braking system.”

The bottom line here is that a human who spots another human 5.6 seconds out would have recognized the situation and acted accordingly. The ADS did not.

“At the time of the crash, the Uber Advanced Technologies Group had an inadequate safety culture, exhibited by a lack of risk assessment mechanisms, of oversight of vehicle operators, and of personnel with backgrounds in safety management,” NTSB stated in its report.

How could this have happened? That brings us up to the second step backward that occurred the very next day.

Senate hearing highlights problems with self-driving vehicles

On Wednesday, Nov. 20, the U.S. Senate Committee on Commerce, Science and Transportation held a hearing called “Highly Automated Vehicles: Federal Perspectives on the Deployment of Safety Technology.” A big part of the hour-and-a-half discussion: NTSB’s Uber crash report.

The witness panel included:

  • Robert Sumwalt, National Transportation Safety Board chairman.
  • Dr. James Owens, National Highway Traffic Safety Administration acting administrator.
  • Joel Szabat, U.S. Department of Transportation acting undersecretary of transportation for policy.

Referring to the Uber crash, Chairman Roger Wicker, R-Miss., said, “It is imperative that manufacturers learn from this incident and prevent similar tragedies from happening again.”

In her opening statement, ranking member Maria Cantwell, D-Wash., also referred to the crash, but she took it one step further. Cantwell pointed out that more than 80 companies are testing automated vehicles on the public roadways. Keep that number in mind.

Per automated vehicle guidelines established by the National Highway Traffic Safety Administration, self-driving vehicle manufacturers can voluntarily submit a safety self-assessment. You read that right – “voluntary” and “self-assessed.”

This takes us back to Cantwell’s statement.

“However, some of these self-assessments read more like a marketing brochure than critical assessments,” Cantwell said. “Noticeably missing from the list of companies that submitted voluntary assessments were Tesla and Uber, both of which had these fatal incidents.”

Sumwalt agreed. During the hearing, he said that NHTSA should require manufacturers to submit a safety self-assessment. Additionally, those assessments should be independently reviewed, because … well … they are self-assessed.

Defending his recommendation, Sumwalt pointed out that of the approximately 80 self-driving vehicle manufacturers, only 16 have submitted an assessment. Neither Uber nor Tesla, the two manufacturers with high-profile fatal crashes, are among those companies.

Despite the glaring need for mandatory assessments and minimum standards, NHTSA continues to defer to private industry stakeholders. Owens explained that NHTSA’s lax oversight is meant to let the private sector compete and innovate.

Essentially, NTSB and NHTSA are at odds over how to oversee the development and introduction of self-driving vehicles. NTSB is taking a more cautious approach, while NHTSA wants to put complete trust in the private sector. The kicker: NHTSA has regulatory authority. NTSB has none.

Self-driving vehicle tech is nowhere near ready

Even the senators are at odds over how to proceed. On one hand, Sen. Gary Peters, D-Mich., said that the country needs to move “very quickly” to put Level 4 and 5 vehicles (fully self-driving) on the road. Why? Because the general public is overestimating the ability of Level 2 and 3 vehicles. So rather than slowly acclimate ourselves to new technology, just quickly give the people technology that literally drives itself.

On the other hand, Sen. Tom Udall, D-N.M., pointed out that NHTSA has spent years and tens of millions of dollars on research and technology to stop driving under the influence. Yet not much has improved. Owens explained that the tech is promising, but not quite there yet.

Udall shot back: If NHTSA cannot figure out DUI-detection technology, how are we to believe it can handle technology as complex as autonomous vehicles?

A few days after the hearing, on Friday, Nov. 22, Tesla made headlines again for all the wrong reasons. In one of his signature stage-show extravaganzas, Elon Musk revealed the Tesla pickup truck. With thousands, perhaps millions, of people watching, Musk set out to show off the truck’s shatterproof windows. Instead, a metal ball thrown at the glass to demonstrate its strength cracked two of the truck’s windows on stage.

Billions of dollars have been invested in Tesla. Some of the most brilliant engineers are on its payroll. Yet Tesla could not get a window right during a highly anticipated PR event. And we are supposed to accept no minimum standards and self-assessed evaluations?

At least that Tesla vehicle isn’t on the road yet. Tesla cars, however, are already everywhere, and many owners are unaware of what their vehicles can and cannot do.

During the Senate hearing, Sen. Ed Markey, D-Mass., showed a screenshot of a YouTube video that tells Tesla owners how to circumvent the software that detects hands on the wheel. Markey pointed out that anyone can go to YouTube and find similar AutoPilot hacks. NHTSA had no answer for that.

Future of self-driving cars

With lawmakers disagreeing on how to proceed and the two biggest traffic safety federal agencies at odds over how to oversee self-driving vehicle technology, it seems clear that no one has a firm grasp of what is ahead.

Even Owens refused to give senators a timeline for when to expect self-driving cars on the road. Many experts have offered predictions: some say within a few years, while others say a few decades. In short, nobody knows.

Also, let’s not forget that consumers are not even ready for self-driving vehicles.

Bottom line: Self-driving vehicles will become a reality. Investors are pouring in too much money for them not to. Certainly, they will become the future norm for travel. However, if we want to prevent as many deaths as possible, it is imperative that we tread carefully.

Sen. Peters and NHTSA need to listen to NTSB Chairman Sumwalt. If we play fast and loose with this technology, bad things will happen. Tesla’s “shatterproof” window is a perfect example.