Federal regulators have escalated their scrutiny of Tesla’s Full Self-Driving (FSD) feature, launching a new investigation into 58 incidents in which vehicles allegedly violated traffic laws, some of which resulted in crashes and injuries. This latest probe covers 2.9 million vehicles and casts a significant shadow over Elon Musk’s vision for widespread driverless cars, pushing Tesla—and the entire autonomous vehicle industry—to confront critical questions about software reliability and driver safety.
The National Highway Traffic Safety Administration (NHTSA) has once again put Tesla’s Full Self-Driving (FSD) technology under the microscope, initiating a fresh investigation into dozens of incidents involving alleged traffic law violations. This isn’t just another regulatory hurdle; it’s a critical moment that challenges the very foundation of Tesla’s autonomous ambitions and raises fundamental questions about the readiness of Level 2 driver-assistance systems for public roads.
The probe specifically targets 58 reported incidents where Tesla vehicles, operating in FSD mode, ran red lights, drove on the wrong side of the road, or were involved in crashes causing injuries and even fires. This wide-ranging investigation encompasses approximately 2.9 million vehicles, effectively covering all Teslas equipped with the FSD software, which critics often deem a “misnomer” for its implication of full autonomy.
The Growing Stack of Regulatory Challenges
This latest inquiry is not an isolated event but rather an addition to a series of ongoing investigations into Tesla’s driver-assistance technologies. NHTSA has previously opened probes into several critical areas:
- 2024 Driver-Assistance Features Probe: An investigation launched last year covered 2.4 million Teslas after crashes in low-visibility conditions, including one fatal incident involving a pedestrian, as reported by the Associated Press.
- “Summon” Technology Investigation: Earlier this year, reports of fender benders in parking lots prompted an investigation into Tesla’s “Summon” feature, which allows cars to drive themselves to their owners’ locations.
- Crash Reporting Compliance: A separate investigation, opened in August, focused on whether Tesla was promptly reporting crashes as required by federal regulations.
Beyond regulatory probes, Tesla also faces legal battles. In a significant ruling, a Miami jury found the company partly responsible for a deadly 2019 crash in Florida involving its Autopilot system (distinct from FSD) and ordered Tesla to pay over $240 million in damages. Tesla has indicated it will appeal this decision, as detailed by the Associated Press.
“Does the Software Work?”: Expert Opinions and Community Concerns
The core of the issue, as Morningstar analyst Seth Goldstein succinctly puts it, is, “The ultimate question is, ‘Does the software work?’” This sentiment is echoed by money manager Ross Gerber, a long-time Tesla investor, who argues, “The world has become a giant testing ground for Elon’s concept of full self-driving, and it’s not working.”
These critiques highlight a widespread concern within both financial and fan communities: the performance gap between Musk’s ambitious promises and the current reality of FSD. Tesla maintains that it consistently informs drivers that the system requires constant supervision and intervention, effectively placing the responsibility on the person behind the wheel. However, regulators noted in the new probe that many drivers involved in accidents reported receiving no warning from the vehicles about unexpected behavior.
The Hardware Debate: Vision-Only vs. Radar
A significant point of contention among enthusiasts and experts is Tesla’s vision-only approach, which relies on cameras alone rather than supplementing them with radar and other sensors. Ross Gerber advocates for a hardware adjustment, stating, “they have to take responsibility for the fact that the software doesn’t work right and either adjust the hardware accordingly — and Elon can just deal with his ego issues — or somebody is gonna have to come in and say, ‘Hey, you keep causing accidents with this stuff and maybe you should just put it on test tracks until it works.’” This debate underscores a crucial technical divergence within the autonomous driving sector, with many competitors opting for a more sensor-redundant approach.
The current FSD system under investigation is classified as Level 2 driver-assistance software, meaning it requires the driver to remain fully attentive and ready to take control at all times. While a new version of FSD was released recently, and a “vastly upgraded version” promising no driver intervention has been in testing for years, the path to truly driverless operation appears fraught with challenges.
Tesla’s Broader Business Pressures and the FSD Promise
The urgency for Tesla to demonstrate success with FSD is heightened by pressures on its primary business: selling cars. The company faces a challenging market landscape, including a reported boycott by customers dissatisfied with Elon Musk’s political endorsements and growing competition from rival EV manufacturers. Companies like China’s BYD are rapidly gaining market share with more affordable, high-quality electric vehicle offerings.
In response to these market pressures, Musk recently announced the release of two new, cheaper, and stripped-down versions of existing models, including the popular Model Y. However, this move failed to impress investors, who had hoped for more substantial price reductions or entirely new models, leading to a further dip in the company’s stock, as reported by the Associated Press.
What This Means for the Future of Autonomous Driving
The ongoing NHTSA probe underscores a critical juncture for both Tesla and the wider autonomous vehicle industry. It emphasizes the immense responsibility that comes with deploying advanced driver-assistance systems and the need for rigorous testing, transparent communication, and robust safety measures.
For the fan community, these developments mean continued vigilance. The promise of fully driverless cars remains a compelling vision, but regulatory oversight and real-world performance incidents demand a cautious, evidence-based approach. The debate over hardware, software reliability, and the very definition of “self-driving” will undoubtedly intensify, shaping the evolution of automotive technology for years to come. Ultimately, the question remains: will Tesla’s software work safely enough to fulfill Musk’s vision of widespread driverless taxis by the end of next year, or will regulators be forced to intervene more decisively?