Tesla is carrying out a recall over issues with its Full Self-Driving (FSD) Beta, according to a notice posted on the National Highway Traffic Safety Administration (NHTSA) website that cites the software’s potential to cause crashes. The fix will arrive as a free software update issued over the air.
According to the notice, certain Teslas with the FSD Beta engaged could “act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.” The software could also encounter problems with “changes in posted speed limits.” The recall covers 2016-2023 Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles running FSD Beta. All told, as many as 362,758 vehicles could be affected.
Tesla’s Autopilot technology employs machine learning and cameras to assist with steering, lane changes, braking, and speed adjustments. Incidents allegedly involving cars running versions of the software, some of them fatal, have been reported over the years, even as the electric vehicle maker continued to offer public testing subscriptions to its customers.
[Related: Tesla is under federal investigation over autopilot claims.]
This is not Tesla’s first large-scale recall: at the end of 2021, over 475,000 vehicles were recalled over front trunk hood and rearview camera issues. As CNBC reports, Tesla has never disclosed the exact number of vehicles using FSD Beta, but CEO Elon Musk said in the company’s most recent earnings call that it had been deployed to “roughly 400,000 customers in North America.” He added during the call, “This is a huge milestone for autonomy as FSD Beta is the only way any consumer can actually test the latest AI-powered autonomy.” Musk tweeted today contesting the word “recall,” given that the fix will be delivered as a free over-the-air (OTA) software update.
[Related: YouTube pulls video of Tesla fan testing autopilot on kid.]
In October 2022, news leaked that the Department of Justice was conducting an investigation into allegedly misleading and false claims regarding Tesla’s “Autopilot” systems, which still explicitly require a human driver behind the wheel at all times. As recently as last fall, Musk said FSD mode was close to being able to drive people “to your work, your friend’s house, to the grocery store without you touching the wheel.” Tesla also faces investigations from the state of California over similar statements. Last month, the NHTSA said it was “working really fast” on another “extensive” Tesla Autopilot probe that could affect more than 830,000 vehicles.
Secretary of Transportation Pete Buttigieg has called terms like Autopilot “extremely problematic.”