JAKARTA - An upcoming manslaughter trial in Los Angeles over a fatal crash caused by a Tesla operating on Autopilot presents the first test of the criminal liability of a human driver in a partially self-driving car.

The trial, due to begin November 15, comes as a separate civil case over an Autopilot-related crash heads to court next year, and it adds to the scrutiny of a system that Tesla CEO Elon Musk has touted as a step towards fully autonomous driving.

Critics say Tesla's claims about Autopilot have contributed to accidents and deaths by making drivers complacent.

The US Department of Justice is investigating whether Tesla itself should face criminal charges over its autonomous-driving claims.

The Los Angeles trial could shape public perceptions of Tesla, including those of future jurors, and could serve as a test case for whether the technology has advanced faster than legal standards.

"Who's at fault, man or machine?" said Edward Walters, an assistant professor at Georgetown University's law school who specializes in laws governing self-driving cars. "The state will have a hard time proving human drivers wrong because some parts of the job are being handled by Tesla," Walters said.

In the early hours of December 29, 2019, Kevin George Aziz Riad, 28, exited a freeway in Gardena, California, in a Tesla Model S, ran a red light and crashed into a Honda Civic. The Civic's driver and passenger, Gilberto Lopez and Maria Guadalupe Nieves-Lopez, died at the scene.

The car's Autopilot system, which can control speed, braking, and steering, was activated at the time of the accident.

But Tesla faces no charges in this case, and legal experts say the bar for criminal cases against the company is high.

Tesla did not respond to a Reuters request for comment. However, Tesla says on its website that its driver-assistance system "requires active driver supervision and does not make the vehicle truly autonomous."

Gilberto Lopez's family is suing Tesla, and a trial is scheduled for July.

"I can't say the driver is innocent, but Tesla's systems, Autopilot, and Tesla spokesmen encourage drivers to be less considerate," said Donald Slavik, a lawyer whose firm represented the Lopez family in the lawsuit against Tesla.

Slavik said Tesla understood the risks of its system but failed to manage them. "Tesla knows people will use Autopilot, and use it in dangerous situations," he said.

Musk said last September that he believed Tesla had a "moral obligation" to release what it calls "Full Self-Driving" software, even if it was not perfect and Tesla was sued over it, because it could save lives.

Prosecutors said Riad's speed and failure to brake were reckless. His lawyer, Arthur Barens, said last May that Riad should not have been charged with a crime. Both declined to comment further.

Robert Blecker, a professor of criminal law at New York Law School, said an investigation by the Department of Justice (DOJ) into Tesla's claims could make it difficult for California prosecutors at trial.

"The DOJ investigation helped him because his claim was 'I relied on their advertising. Therefore, I didn't realize the risk there,'" Blecker said.

"Tesla's legal and regulatory oversight can shape the company's perception of a visible risk to defend itself in an upcoming lawsuit," said Bryant Walker Smith, a law professor at the University of South Carolina, who is also an adviser on emerging transportation technologies.

"Tesla's narrative has the potential to shift from an innovative tech company that does cool things to this company that's just mired in legal trouble. That's the risk, and narrative is so important in civil litigation because both parties tell the jury a story," he said.

