The U.S. National Highway Traffic Safety Administration has opened a significant investigation into nearly 2.9 million Tesla vehicles equipped with the company's Full Self-Driving (FSD) system, following more than 50 reports of dangerous traffic-safety violations and crashes.
The quest for autonomous driving has long been a beacon of future innovation, promising enhanced safety and efficiency on our roads. However, the journey is fraught with complex challenges, especially concerning the regulatory oversight of advanced driver-assistance systems (ADAS). In a significant development, the U.S. National Highway Traffic Safety Administration (NHTSA) announced on Thursday, October 9, 2025, a sweeping investigation into approximately 2.88 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) system. This move comes in response to a growing number of alarming reports detailing traffic-safety violations and a series of crashes directly linked to FSD’s operation.
The Heart of the Matter: FSD’s Troubling Behavior
At the core of the NHTSA’s preliminary evaluation are more than 50 distinct reports, which collectively document 58 incidents involving traffic safety violations when FSD was engaged. These incidents include 14 crashes and 23 reported injuries. The auto safety agency asserts that FSD, despite being an assistance system requiring active driver attention and intervention, has “induced vehicle behavior that violated traffic safety laws.”
Specific violations highlighted by the agency are particularly concerning:
- Running Red Traffic Lights: Several reports describe Tesla vehicles operating with FSD driving directly through red traffic signals at intersections.
- Wrong-Way Driving During Lane Changes: Incidents where FSD caused vehicles to drive against the proper direction of travel during a lane change maneuver.
NHTSA explicitly cited six reports where a Tesla operating with FSD “approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection.” Four of these specific crashes led to one or more injuries, underscoring the severity of these alleged failures.
Drivers Speak Out: A Community’s Frustration
The formal investigation amplifies the voices of Tesla owners who have experienced these issues firsthand. One poignant complaint submitted to NHTSA in 2024 by a driver in Houston highlighted a fundamental flaw, stating that FSD "is not recognizing traffic signals. This results in the vehicle proceeding through red lights, and stopping at green lights." The complainant further alleged that Tesla has been unresponsive: "Tesla doesn’t want to fix it, or even acknowledge the problem, even though they’ve done a test drive with me and seen the issue with their own eyes." These accounts resonate within the fan community, sparking discussions about FSD’s real-world reliability and the adequacy of its current design.
Tesla, which issued a software update to FSD this week, has not yet responded to NHTSA’s request for comment on this new investigation. The market reacted swiftly to the news, with Tesla shares falling 2.1% in early trading, as reported by Reuters. For more details on the specific violations and the scope of the probe, you can review Reuters’ initial report.
The Road to Recall: What Comes Next?
This preliminary evaluation is a critical first step. Should NHTSA conclude that the FSD system poses an unreasonable risk to safety, the agency possesses the authority to demand a recall of the affected vehicles. Such a measure would carry significant implications for Tesla, not only in terms of cost but also in public perception and the future development of its autonomous technologies. The gravity of this situation is underscored by an increasing push from Congress and the recent confirmation of a new NHTSA administrator, signaling a more assertive regulatory environment for ADAS. This escalating scrutiny reflects a broader industry debate, as explored in an in-depth analysis from The Verge, about the balance between technological advancement and public safety.
A History of Scrutiny: Tesla’s ADAS Under the Microscope
The latest probe is not an isolated event but rather the continuation of a long-standing pattern of regulatory review of Tesla’s advanced driver-assistance systems. Tesla’s FSD, which is marketed as more advanced than its standard Autopilot system, had already been under NHTSA investigation for a year prior to this announcement.
Previous inquiries include:
- October 2024 Inquiry: NHTSA initiated an investigation into 2.4 million Tesla vehicles equipped with FSD after four reported collisions occurred in conditions of reduced roadway visibility, such as sun glare, fog, or airborne dust. This included a fatal crash reported in 2023.
- Railroad Crossing Review: The agency also plans to specifically review FSD’s behavior when approaching railroad crossings. Democratic Senators Ed Markey and Richard Blumenthal previously urged an investigation following a rising number of reported near-collisions.
- Remote Movement Feature Probe: In January 2025, NHTSA opened an investigation into 2.6 million Tesla vehicles over reports of crashes involving a feature that permits users to move their cars remotely.
- Robotaxi Deployment: Separately, NHTSA is actively reviewing Tesla’s deployment of self-driving robotaxis in Austin, Texas, which commenced in June.
These various investigations highlight a persistent concern among regulators regarding the safety and operational capabilities of Tesla’s ADAS features. Tesla itself states that FSD “will drive you almost anywhere with your active supervision, requiring minimal intervention,” but crucially, it does not claim the car is fully self-driving.
Beyond U.S. Borders: A Global Wake-Up Call
The implications of NHTSA’s investigation extend far beyond the United States. Oliver Carsten, a professor of transport safety at the University of Leeds, commented that the probe “should serve as a wake-up call for Europe.” His concern stems from the increasing proliferation of systems on the market that “blur the line between assistance and automation.” This observation resonates deeply within the global tech community, as companies worldwide race to develop and deploy similar technologies.
The challenge for regulators and consumers alike is to clearly define the capabilities and limitations of these systems. As the technology evolves, so too must the frameworks that govern its safe integration into daily life. For the onlytrustedinfo.com community, this investigation is a stark reminder that while the dream of self-driving cars is compelling, the path to achieving it responsibly is paved with rigorous testing, transparent communication, and unwavering commitment to safety.