YouTube Removes Tesla FSD Test Video That Used Real Children, Deeming It Dangerous!
JAKARTA - YouTube has removed a video showing a Tesla driver carrying out his own safety test to determine whether the electric vehicle's (EV) Full Self-Driving (FSD) capability would make it stop automatically when encountering children walking across or standing in the street. This was first reported by CNBC.
The video, titled "Does the Tesla Full-Self Driving Beta really run over kids?", was originally posted on the Whole Mars Catalog YouTube channel and involved Tesla owner and investor Tad Park testing Tesla's FSD feature with his own children.
In the video, Park drives a Tesla Model 3 toward one of his children standing in the street, and then tries again with his other child crossing the street. The vehicle stopped before reaching the children in both tests.
Does Tesla Full Self-Driving Beta really run over kids? Thanks to @MinimalDuck and @_tadpark for your help on this! @elonmusk @RealDanODowd https://t.co/A1Pq1LkDyQ pic.twitter.com/bR19cVkWcL
— Whole Mars Catalog (@WholeMarsBlog) August 14, 2022
As outlined on its support page, YouTube has specific rules against content that "harms the emotional and physical well-being of minors," including "dangerous acts, challenges, or jokes".
YouTube spokeswoman Ivy Choi told The Verge that the video violated its policies on harmful content, and that the platform "does not allow content that shows minors participating in harmful activities or encouraging minors to engage in harmful activities." Choi said YouTube decided to remove the video.
"I've tried the FSD beta before, and I trust my kids' lives with them," Park said during the now-deleted video. "So I'm pretty sure it (FSD) will detect my kids, and I'm also controlling the steering wheel so I can brake at any time," Park told CNBC that the car never goes more than eight miles per hour, and "make sure the car doesn't go any further." recognize the child."
As of August 18, the video had over 60,000 views on YouTube. The video was also posted to Twitter and is still available to watch. The Verge contacted Twitter to see if there were plans to remove it but did not receive an immediate response.
The idea to test FSD with real, living, breathing children came after a campaign video and advertisement posted to Twitter showed a Tesla vehicle apparently failing to detect, and colliding with, a child-sized mannequin placed in front of the vehicle.
Tesla fans didn't buy it, sparking debate on Twitter about the feature's limitations. The Whole Mars Catalog, an EV-focused Twitter and YouTube channel run by Tesla investor Omar Qazi, later hinted at making videos involving real children in an attempt to prove the original results wrong.
In response to the video, the National Highway Traffic Safety Administration (NHTSA) issued a statement warning against using children to test automated driving technology.
"No one should risk their life, or the lives of others, to test the performance of vehicle technology," the agency told Bloomberg. "Consumers should not attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology."
Tesla's FSD software does not make the vehicle fully autonomous. It is available to Tesla drivers for an additional $12,000 upfront (or a $199 per month subscription).
Once Tesla determines that a driver meets a certain safety score, it unlocks access to the FSD beta, which allows the driver to enter a destination and have the vehicle drive there using Autopilot, the vehicle's advanced driver assistance system (ADAS). But the driver must keep their hands on the wheel and be ready to take control at any time.
Earlier this month, the California DMV accused Tesla of making false claims about Autopilot and FSD. The agency alleges the names of the two features, as well as Tesla's description of them, falsely imply that they allow vehicles to operate independently.
In June, NHTSA released data on driver-assistance crashes for the first time, finding that Tesla vehicles using Autopilot were involved in 273 accidents from July 20, 2021 to May 21, 2022.
NHTSA is currently investigating a number of incidents in which Tesla vehicles using driver assistance technology collided with parked emergency vehicles, in addition to more than two dozen Tesla accidents, some of which were fatal.