
Tesla driver charged with manslaughter in deadly Autopilot crash, raising new legal questions about automated driving tech

A Tesla Model S driver accused of crashing his car while Autopilot was activated had run a red light and slammed into a Honda Civic, killing its occupants.

A deadly 2019 crash in which a Tesla Model S slammed into a Honda Civic, killing both of the Civic's occupants, has prompted renewed scrutiny over who should be held liable in such cases.

The Tesla driver, Kevin George Aziz Riad, 27, was charged with two felony counts of vehicular manslaughter with gross negligence for the deaths of Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez. 

An April 2020 civil lawsuit filed by Lopez’s family in Los Angeles County Superior Court alleged that at the time of the December 2019 crash, Riad was traveling at an “excessively high rate of speed” while using Tesla’s Autopilot feature, a semi-automated driving technology that can steer, brake and accelerate the car on its own.

The National Highway Traffic Safety Administration confirmed that Autopilot was active at the time of the crash after sending officials to investigate, The Associated Press reported.

Prosecutors said they could not immediately release further details about the case.

The charges against Riad appear to mark the first time a driver in the United States has faced felony prosecution over a fatal crash that occurred while semi-automated driving technology was in use. The case has not only renewed discussion about the dangers of misusing the technology but could also set a standard for holding motorists legally liable in similar incidents.

“Autopilot or not, from the second they get in a car, drivers are responsible for every single thing that happens,” Aron Solomon, chief legal analyst for Esquire Digital, said in an interview with NBC News. Referring to the gross negligence charges Riad is facing, Solomon said the legal concept of negligence is “situational.”

“The question is always what was reasonable in that situation. That’s always what the law is going to look at.”

Michael Brooks, the chief operating officer at the Center for Auto Safety, a nonprofit advocacy group that focuses on the U.S. automotive industry, said he hopes Tesla drivers and owners see this case and understand that Autopilot has limitations. “It will not drive them from any point A to any point B always safely, and they need to be responsible for the actions of the vehicle,” Brooks said.

Jonathan Handel, a lecturer in law at the University of Southern California Gould School of Law and an expert in autonomous vehicles, said the case will hopefully show that semi-autonomous systems, like Autopilot, are not a replacement for human drivers. 

“I think that it will have an impact on the way drivers approach the technology, and because of that, it will hopefully have an impact on the way the industry operates,” he said, adding that he believes Tesla should be held to account over the deaths.

Tesla did not respond to multiple emailed requests for comment. In a court filing, the company argued that “the Model S meets or exceeds all of Tesla’s internal standards as well as applicable industry standards, including but not limited to those promulgated by the American National Standards Institute.”

Lopez and Nieves-Lopez were driving through the intersection of Artesia Boulevard and Vermont Avenue in the Southern California suburb of Gardena when their vehicle was struck by Riad’s car, according to the complaint. 

Riad’s vehicle had been exiting a freeway when it ran a red light. Lopez and Nieves-Lopez died at the scene. Riad, a limousine service driver, and his passenger were hospitalized with non-life-threatening injuries.

On its website, Tesla says the Autopilot features are “designed to assist” drivers and require “active driver supervision and do not make the vehicle autonomous.” 

Brooks, however, said using the term “autopilot” could lead people to incorrectly assume the vehicle is more capable than it is, a presumption that can have deadly consequences. 

Over the years, Tesla has made headlines after crashes involving its semi-autonomous technology. In 2019, a finance executive in Florida was driving home with his Tesla Model S on Autopilot when he bent down to pick up his cellphone, according to The New York Times. The car drove past a stop sign and a blinking red light and crashed into another vehicle, killing a 22-year-old college student. The Tesla driver was not charged in the incident. 

Last May, Param Sharma was arrested in California and charged with two counts of reckless driving and disobeying a peace officer after police noticed he was in the back seat of his Tesla as it drove down a freeway with Autopilot activated. No one was injured.

“People think of it more as a self-driving or an autonomous vehicle when in fact, all that Teslas have is an advanced driver assistance system that uses things like lane departure, adaptive cruise, braking, and other functions to provide the driver with a less burdensome experience,” Brooks explained. “And so we tend to see people overestimating the capabilities of the vehicles.”

Bryant Walker Smith, a law professor at the University of South Carolina who studies automated vehicles, said he hopes the case alerts all motorists to pay better attention to the road, regardless of the vehicle’s capabilities.

“I don’t want people to hear this and think, ‘Oh, that’s not my problem. I don’t have a fancy expensive car.’ Because distraction and reckless driving are problems in the millions of Teslas on the road, in the tens of millions of other vehicles with driver assistance systems, and in the literally hundreds of millions of vehicles without those systems,” he said.

“This should be a wake-up call to everybody that they are driving dangerous machines.”