Elon Musk’s Tesla has reportedly recalled 362,758 vehicles after the NHTSA determined that the company’s “Full Self-Driving” (FSD) system may increase the risk of crashes. Tesla’s FSD has faced constant criticism, including a Super Bowl ad showing cars running over child-sized mannequins and other serious safety failures.
CNBC reports that Tesla has voluntarily announced a recall of 362,758 U.S. vehicles equipped with the company’s Full Self-Driving Beta (FSD Beta) driver-assistance software. The FSD Beta system may contribute to crashes because the car may fail to safely navigate intersections and may react slowly to changes in posted speed limits, according to the National Highway Traffic Safety Administration (NHTSA).
The affected vehicles may “act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution,” according to the recall notice, which was issued by the NHTSA.
Additionally, it warns that the FSD Beta system might have trouble adapting to changes in posted speed limits. Drivers who use the FSD Beta system may be at an increased risk of accidents as a result of these issues.
Affected models include Model S, Model X, Model 3, and Model Y vehicles manufactured between 2016 and 2023 that have FSD Beta installed or pending installation. To fix the problems, Tesla will deliver an over-the-air software update to the cars.
Tesla’s FSD Beta is best described as a collection of new features that are not yet fully debugged. The main draw is “autosteer on city streets,” which enables a Tesla to imperfectly navigate complex urban environments on its own. FSD Beta has gained popularity among some Tesla owners despite its drawbacks, as they are eager to test out the latest driver assistance features. However, the system’s incomplete testing and debugging have raised safety concerns about its use.
There have been numerous reports of Tesla vehicles using FSD being involved in major crashes. In 2018, a Tesla Model X was involved in a fatal crash while using the company’s Autopilot driver assistance system. The National Transportation Safety Board (NTSB) found that the driver was distracted and overly reliant on the system at the time of the collision.
The NTSB criticized Tesla for lacking measures to stop drivers from abusing the Autopilot system. Tesla has since added features like automatic emergency braking and lane departure warning to its Autopilot system, among other updates.
Musk’s FSD was the target of a Super Bowl ad criticizing the safety of the system. As Breitbart News reported:
TechCrunch reports that Tesla’s Full Self-Driving (FSD) system is under fire from the Dawn Project, a group that promotes safety. The organization recently ran a 30-second Super Bowl commercial highlighting several serious safety flaws in Elon Musk’s advanced driver assistance system (ADAS). FSD, despite its name, does not actually allow for fully autonomous driving, so drivers must always be ready to take control in the event of a malfunction or system failure.
Recent criticism has rekindled concerns about the safety of the system as Tesla has just made its most recent iteration of FSD available to about 400,000 drivers in North America. Tesla has come under fire for misrepresenting the capabilities of its lower-level automated driving system, Autopilot, and there have been reports of accidents while it was activated. The advertisement shows clips of Teslas acting erratically and claims that FSD will do things like hit a baby in a stroller, run over a child in a school crosswalk, disobey stop signs, drive on the wrong side of the road, and more.
The Dawn Project argues that Tesla’s “deceptive marketing” and “woefully inept engineering” are endangering the public. The group has called on the National Highway Traffic Safety Administration (NHTSA) and the Department of Motor Vehicles (DMV) to disable FSD until all safety defects are fixed. Dan O’Dowd, the founder of the organization, also serves as the CEO of Green Hills Software, which develops operating systems for embedded security and safety systems as well as its own automated driving systems.
Read more at CNBC here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan