
Tesla banned from crash probe after leaking details

The investigation could have a big impact on the development of autonomous vehicles — if nothing else through the court of public opinion.
Rescue workers attend the scene where a Tesla electric SUV crashed into a barrier on U.S. Highway 101 in Mountain View, California, March 25, 2018. KTVU FOX 2 / via Reuters file

Problems are — once again — mounting for electric carmaker Tesla. The company has been booted out of an investigation launched by the National Transportation Safety Board to find the cause of a California crash that last month killed the driver of a Tesla Model X.

Tesla had confirmed that the vehicle was operating in its semi-autonomous Autopilot mode when the crash occurred — technology that has been partly blamed for at least one other fatal crash — then went public with some of the preliminary findings, despite reportedly agreeing to keep a lid on the data until the investigation was complete.

"Releases of incomplete information often lead to speculation and incorrect assumptions about the probable cause of a crash," the NTSB said in a statement explaining why it “decided to revoke Tesla’s status” as a party in the probe.

In turn, Tesla now plans to file a complaint with Congress, contending in its own statement that the NTSB is “more concerned with press headlines than actually promoting safety.”

That position might seem odd, several observers suggested, considering that Tesla used what has been learned so far in an apparent effort to put the blame on the driver, rather than its vehicle or the Autopilot system.

Tesla’s initial role as a direct party to the investigation is not entirely unprecedented, experts noted. Uber is similarly involved in an investigation of a March 19 incident in which one of its prototype autonomous vehicles struck and killed a pedestrian near Phoenix.

NTSB investigators have the power of subpoena, and can seek information from a car company, whether it’s involved in a probe or not. But, “the level of complexity, especially with the deployment of newer systems NTSB may not be familiar with, is one reason it might want a manufacturer involved at this level,” Jim Sayer, director of the University of Michigan Transportation Research Institute, told NBC News.

Not everyone is comfortable with that approach, however.

Conflict of interest

“Having the subject of the investigation be part of the investigation would seem to be a conflict of interest,” said John Simpson, director of technology for Consumer Watchdog, a California-based non-profit that has been pressing regulators and legislators to slow the rush to test self-driving vehicles on public roads.

There’s little doubt that the investigations involving Tesla and Uber could have a big impact on the development of autonomous vehicles — if nothing else through the court of public opinion. Tesla took heat when the NTSB put a sizable share of the blame for a May 2016 crash on the Autopilot technology, subsequently updating the system with both new hardware and software.

But critics fear Tesla may not have gone far enough. It clearly views Autopilot as a competitive advantage. Its website boasts that “All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”

The small print notes that Tesla is still developing the software for full autonomous functionality. And, in its own comments on the incident that killed the California driver, Tesla noted that he went at least six seconds without touching the steering wheel before the crash — Tesla says drivers must do so at least every five seconds — and did not respond to warnings to take manual control before the collision occurred.

So was the driver solely to blame? Perhaps not. UMTRI’s Sayer said the very name, “Autopilot, would lead you to the impression” that Tesla’s cars can drive themselves. He added that there is a “substantial body of knowledge” about how humans can get distracted. “The question is whether they (Tesla) did an adequate job in designing out the risk” of having a driver not recognize and respond to a failure of the Autopilot system in a timely manner.

Indeed, some experts believe it may be impossible to ensure that a human motorist will be able to respond quickly enough. That’s why Waymo, the autonomous vehicle spin-off of Google, and Ford, along with several other manufacturers, now argue the better approach is to eliminate the human driver entirely in favor of even more sophisticated self-driving systems.

Ultimately, the rift between Tesla and the NTSB could cause problems for the carmaker.

“It’s going to take more, not less, teamwork between the automakers and the policy people” to ensure that autonomous vehicles are safe and reliable, said David Cole, director-emeritus of the Center for Automotive Research.