JAKARTA - A former Tesla employee has spoken to The New York Times, revealing that Elon Musk's automaker may have undermined safety by designing the Autopilot driver-assistance system to match its chief executive's vision.
"Unlike the technology at almost every other company working on self-driving vehicles, Musk insists that autonomy can be achieved only with cameras tracking the environment. But many Tesla engineers question whether it is safe enough to rely on cameras without the benefit of other sensing devices, and whether Musk over-promises drivers about Autopilot's capabilities," authors Cade Metz and Neal E. Boudette write in the story.
The National Highway Traffic Safety Administration is investigating Autopilot after at least 12 accidents in which Tesla cars on Autopilot drove into parked fire trucks, police cars and other emergency vehicles. These accidents killed one person and injured 17 others.
"The victim's family later sued Tesla over the fatal accident, and Tesla customers have sued the company for misrepresenting Autopilot and a set of sister services called Full Self-Driving, or FSD," the article reads.
"What I'm concerned about is the language used to describe the vehicle's capabilities," said Jennifer Homendy, chair of the National Transportation Safety Board, as quoted by Rawstory.com. "This can be very dangerous."
The hardware approach has also been questioned, with concerns raised over safety.
"Within Tesla, some argue for pairing cameras with radar and other sensors that perform better in heavy rain and snow, bright sunlight, and other difficult conditions," the authors report.
"For several years, Autopilot included radar, and for a time Tesla developed its own radar technology. But three people who worked on the project said that Musk had repeatedly told members of the Autopilot team that humans can drive with only two eyes and that this meant cars should be able to drive with cameras alone," the report said.
In early November, Tesla recalled nearly 12,000 vehicles that were part of the feature's FSD beta test, after deploying a software update that the company said could cause crashes due to an unexpected activation of the car's emergency braking system, the Times reported.
Schuyler Cullen, who oversees a team exploring the possibilities of autonomous driving at South Korean tech giant Samsung, says Musk's camera-only approach is fundamentally flawed.
"Cameras are not eyes! Pixels are not retinal ganglia! A computer FSD is nothing like the visual cortex!" said Cullen.
Mobileye Chief Executive Officer Amnon Shashua said Musk's idea to only use cameras in self-driving systems could eventually work, but the technology doesn't exist yet.
"One should not get hung up on what Tesla has to say," said Shashua. "The truth is not necessarily their ultimate goal. The end goal is to build a business."