The National Highway Traffic Safety Administration (NHTSA) has launched a new investigation into Tesla's Full Self-Driving (FSD) technology following a series of crashes involving the automaker's vehicles. The probe, announced Friday, is focused on four incidents in which Tesla's FSD was engaged in conditions with reduced roadway visibility, such as sun glare, fog, or airborne dust. In one of these crashes, a pedestrian was struck and killed, and another incident resulted in injury.
The investigation could pose a significant setback for Tesla, which has been working to establish itself as a leader in autonomous driving technology. The NHTSA stated that the investigation would evaluate whether the FSD software's engineering controls can adequately detect and respond to reduced-visibility conditions, and whether other similar crashes have occurred under comparable circumstances. The probe will cover approximately 2.4 million Tesla electric vehicles in the United States equipped with the FSD system.
Tesla has taken a distinctive approach in developing its FSD software, opting for a camera-only system rather than supplementing cameras with the radar, LIDAR, and ultrasonic sensors used by other automakers. Tesla's vision-based system uses machine learning to process real-time imagery from the vehicle's cameras and make driving decisions. CEO Elon Musk has been a vocal proponent of this approach, arguing it is the most efficient path to full autonomy because it can leverage data collected from millions of Tesla vehicles already on the road.
However, this latest investigation raises concerns over the limitations of Tesla's vision-based system in challenging visibility conditions. Any doubts over the effectiveness of FSD could damage Tesla's autonomous ambitions, especially as the company recently announced its new fully autonomous Cybercab, a robotaxi that lacks pedals or a steering wheel and is slated for production in 2026 at a projected price of $30,000.
The NHTSA's probe also brings attention to Tesla's previous safety record. The federal vehicle safety regulator has been closely monitoring incidents involving Tesla's advanced driver assistance systems, including Autopilot and FSD. According to NHTSA data, as of October 1, 2024, there have been 1,399 incidents in which Tesla's driver assistance systems were engaged within 30 seconds of a collision, and 31 of those crashes resulted in fatalities.
This is not the first investigation into Tesla's FSD system. Last year, the NHTSA ordered a software update for FSD after an investigation found that the technology did not do enough to keep drivers attentive while it was engaged. The Department of Justice is also conducting a separate investigation into whether Tesla and its executives, including Elon Musk, misled investors and consumers about the capabilities of the FSD system.
Tesla's marketing of the FSD software has been a key part of the company's strategy to differentiate itself from traditional automakers, positioning itself as an artificial intelligence and technology leader. Tesla refers to FSD as a "partial driving automation system," and has previously offered it to all U.S. drivers for a one-month free trial. The new investigation could be a significant blow to Tesla's image as it tries to convince regulators, investors, and the public of the viability of autonomous driving.
Elon Musk, who has long touted Tesla's driverless technology, said recently during a marketing event that the company expects to have "unsupervised FSD" operational in Texas and California next year for its Model 3 and Model Y vehicles. Musk has promised fully driverless vehicles for years, but so far, Tesla has not demonstrated a vehicle that is truly capable of operating on public roads without human oversight.
The NHTSA's investigation will also examine Tesla's over-the-air software updates to understand the timing, purpose, and safety impact of these updates on FSD. The agency aims to determine whether Tesla's modifications were sufficient to address the identified issues and to prevent similar crashes in the future. The FSD software is currently offered as a premium option and marketed as "Full Self-Driving (Supervised)," with Tesla continuing to emphasize the need for drivers to remain attentive and ready to take over at any time.
The stakes are high for Tesla, as the investigation could have broader implications for its autonomous vehicle strategy. Analysts have warned that any regulatory action could delay Tesla's timeline for achieving full autonomy and hurt its reputation as a technology innovator. "Any doubt over Tesla's FSD software capabilities is a big blow to the company's autonomous ambitions," said an analyst following the company.