
JAKARTA – A 'deeply disturbing' video shows a Tesla car in Full Self-Driving mode crashing into a child-sized mannequin during a test by a safety campaign group.

The Dawn Project said the vehicle failed to detect the stationary dummy in its path and hit it repeatedly at an average speed of 25 mph (40 km/h). The group says the tests were carried out under 'controlled conditions' on a test track in California.

Tesla, led by billionaire entrepreneur Elon Musk, was asked for comment by MailOnline but has not yet responded to the video.

The US National Highway Traffic Safety Administration (NHTSA) confirmed that it is 'currently conducting an open and active investigation into Tesla's Autopilot active driver assistance system'.

A spokeswoman said this included the 'Full Self-Driving' software and that the agency was 'considering all relevant data and information that could aid its investigation'.

The Dawn Project released the video as part of the launch of a nationwide public safety advertising campaign highlighting the dangers of Tesla's Full Self-Driving software.

“The deeply disturbing results of our full self-driving Tesla safety test should be a call to action,” said Dan O'Dowd, founder of The Dawn Project.

“Elon Musk says Tesla's Full Self-Driving software is 'amazing'. It is not. It is a deadly threat to all Americans,” said O'Dowd, as quoted by the Daily Mail.

He added that more than 100,000 Tesla drivers are already using the Full Self-Driving software on public roads, putting children at great risk in communities across the country.

The safety tests were conducted at the Willow Springs International Raceway test track in Rosamond, California, on June 21.

The trial was recorded, and the footage shows the Tesla in Full Self-Driving mode repeatedly running over the child-sized mannequin.

"NHTSA currently has an open and active investigation into Tesla's autopilot active driver assistance system, including its "fully self-driving" software, an NHTSA spokesperson said.

"As the technical analysis is ongoing, NHTSA is considering all relevant data and information that could assist the agency in its investigation," they added.

In February this year, Tesla recalled nearly 54,000 cars and SUVs because their Full Self-Driving software let them roll through stop signs without coming to a complete stop.

An over-the-air software update, which was being tested by a number of drivers, allowed vehicles to roll through intersections with stop signs at up to 5.6 mph (9 km/h).

Documents shared by US safety regulators said Tesla had agreed to the recall after two meetings with NHTSA officials.

Tesla said at the time that it was not aware of any crashes or injuries caused by the feature, and that there were no warranty claims resulting from the 'rolling stop' behavior.

The recall covers 2016 through 2022 Model S sedans and Model X SUVs, as well as 2017 through 2022 Model 3 sedans and 2020 through 2022 Model Y SUVs.

Beta testing of the Full Self-Driving software is carried out by selected Tesla drivers. However, the cars cannot drive themselves, and drivers must be ready to take over at all times.

Safety advocates argue that Tesla should not be allowed to test the software on public roads with untrained drivers, and that the software could malfunction, endangering other motorists and pedestrians.

Most car companies perform similar software tests with trained human safety drivers.

In November last year, NHTSA said it was looking into a complaint from a Tesla driver that the Full Self-Driving software had caused a crash.

The driver said their Model Y veered into the wrong lane and was hit by another vehicle. The SUV alerted the driver midway through a turn, and the driver tried to turn the wheel to avoid other traffic, according to the complaint.

But the car took control and 'forced itself into the wrong lane,' the driver said.

No one was injured in the November 3 crash in Brea, California, according to the report.

In December, Tesla agreed to update its less sophisticated Autopilot driver assistance system after NHTSA opened an investigation.

The company agreed to stop allowing video games to be played on the center touchscreen while the vehicle was in motion.

The agency is also investigating why Teslas operating on Autopilot have repeatedly crashed into emergency vehicles parked on roadways.

