Tesla Model Y Driver Complains FSD Feature Failed To Avoid Collision

JAKARTA - A Tesla Model Y driver has filed a complaint claiming their car was in Full Self-Driving (FSD) mode when it hit another vehicle. In FSD mode, the collision should have been avoided, but in reality it was not.

Despite its name, FSD is not a truly autonomous driving system. Instead, it's Tesla's latest iteration of its premium assisted driving technology, building on its Autopilot and Enhanced Autopilot offerings.

Currently available as a beta product, FSD includes all of the lower-tier features while also promising additional functionality, such as automatic steering on city streets and the ability to stop automatically at traffic lights.

The US National Highway Traffic Safety Administration (NHTSA) uses a six-point scale to describe driverless vehicles. Level 0 means no autonomous features, while Level 5 refers to full automation where no human driver is required for the vehicle to travel safely on public roads.

Based on its current features, FSD is classified as Level 2, or partial automation. Indeed, Tesla requires all users of its driver-assistance systems to keep their hands on the wheel while driving; drivers cannot, for example, go to sleep while the system operates.

Earlier this month, in Brea, California, a 2021 Tesla Model Y was hit by another vehicle as the driver turned left. As The Next Web notes, a statement on the NHTSA website explains that the unnamed driver claimed their car was in FSD mode and steered into the wrong lane during the turn.

The person said the Model Y gave a warning mid-maneuver, but their attempts to correct its course were unsuccessful and the vehicle was struck on the driver's side. Fortunately, no injuries were reported, but NHTSA said it is investigating the incident.

The incident comes after Tesla recalled nearly 12,000 cars last month due to FSD issues.

The company released the FSD beta 10.3 update on October 23, but some drivers soon discovered issues relating to the collision warning and emergency braking features of their vehicles.

Tesla temporarily rolled users back to the previous version before issuing a patch. Of course, Tesla's software is explicitly offered as a beta, so it's unclear whether the company can or will be held responsible for the crash.

Whatever standards Tesla holds its nascent technology to, regulators will need to watch closely. This latest NHTSA action follows an investigation into a Tesla car that reportedly hit an emergency vehicle while in Autopilot mode.

It should also be noted that this is not the first investigation related to FSD. The California Department of Motor Vehicles is also investigating Tesla over its use of the phrase "full self-driving" and whether that marketing constitutes misleading or false advertising.