Tesla Requires FSD Users To Agree To Video Collection When Accidents Happen
Tesla will now receive video when one of its cars is involved in a collision. (photo credit: Pixabay)

JAKARTA – Tesla is rolling out the latest beta version of its Full Self-Driving (FSD) software and is seeking driver approval to collect vehicle-identifiable footage in the event of an accident.

FSD is a driver assistance system built on Tesla's Autopilot technology. Although the name suggests the car is in full control, it is actually a semi-autonomous program that requires the driver to maintain control of the vehicle at all times.

Tesla launched the FSD beta as a premium tier of Autopilot in October last year. It initially provided the software to a select group of "expert and careful" drivers, charging them $10,000 for the privilege.

The company has since made FSD available to more drivers by offering a $199-per-month subscription. FSD users get access to all standard Autopilot and Enhanced Autopilot features, as well as automatic steering on city streets and traffic light and stop sign control.

Tesla says it has more features planned and is continuing to iterate on FSD by releasing regular point updates to the technology.

With FSD 10.5, Tesla is asking users to allow it to collect video from cameras both inside and outside the vehicle in the event of a collision. Tesla includes this language in the 10.5 user agreement and, according to Electrek, customers must accept it if they want to use the latest version of the beta software.

The notice also explains that the clips are tied to the car's vehicle identification number (VIN), allowing Tesla to determine specifically which car - and likely which driver - was involved in a crash.

It's important to note that Tesla has long pulled footage and other data from customer cars, but that data has always been anonymized and not tied to a specific driver. The EV maker says it uses the collected information to improve the performance and safety of its vehicles.

However, any system that communicates with a remote server is inevitably at risk of being hacked. That applies to connected cars as much as to PCs. Indeed, an ethical hacker has previously shown how Tesla's cameras can be breached.

With this in mind, it's safe to assume Tesla understands the risks of collecting clips tied to specific vehicles and their owners. However, the company recently came under scrutiny after a driver claimed their car made erratic maneuvers while FSD was engaged.

The incident sparked an investigation by the US National Highway Traffic Safety Administration (NHTSA), an agency that has previously criticized automakers for their approach to autonomous driving technology.

For Tesla, being able to gather identifiable visual evidence of the moments surrounding a crash will likely prove invaluable. Not only will Tesla be able to better understand what went wrong and why, but it will also be in a stronger position to defend itself if the NHTSA or another agency launches future investigations into its technology.

