The U.S. National Highway Traffic Safety Administration (NHTSA) has significantly expanded its investigation into roughly 2.88 million Tesla vehicles equipped with the Full Self-Driving (FSD) system, citing reports of serious traffic violations, including running red lights and driving into opposing traffic, that have led to crashes and injuries. The probe marks a critical juncture for Tesla’s autonomous ambitions and raises fundamental questions about the safety and widespread viability of current driver-assistance technologies.
The announcement has sent ripples through the automotive and tech worlds. Coming after numerous reports of troubling traffic safety violations, the investigation puts Tesla’s autonomous driving claims squarely under the microscope.
The Expanding Scope of NHTSA’s Scrutiny
The probe, initiated by the agency’s Office of Defects Investigation, specifically targets vehicles operating with either the “FSD (supervised)” or “FSD (beta)” versions of the system. Both versions require a “fully attentive driver who is engaged in the driving task at all times,” according to a document filed by NHTSA.
Reports leading to this investigation paint a concerning picture of vehicle behavior while FSD is engaged. Documented violations include:
- Vehicles driving through red traffic signals.
- Initiating lane changes into opposing traffic.
- Failing to remain stopped for the duration of a red traffic signal.
- Failing to stop fully at intersections.
- Failing to accurately detect and display the traffic signal state in the vehicle interface.
To date, NHTSA has received 58 reports of safety violations linked to Tesla vehicles using FSD. These incidents are far from minor, encompassing more than a dozen crashes and fires, along with 23 reported injuries. Six crashes in which a Tesla reportedly continued into an intersection against a red light resulted in four injuries, according to a report by Reuters.
A History of Regulatory Watch
This isn’t NHTSA’s first dance with Tesla’s automated driving systems. U.S. regulators have been investigating the company’s driver-assistance features for more than three years due to dozens of crashes that raised significant safety concerns. A tragic 2024 Seattle-area crash involving Full Self-Driving, which resulted in the death of a motorcyclist, further intensified these concerns.
Beyond FSD, Tesla faces other ongoing investigations:
- A probe into its “Summon” technology, which allows cars to drive themselves to their owners and has reportedly caused minor fender benders in parking lots.
- An investigation opened last year into driver-assistance features in 2.4 million Teslas followed several crashes in low-visibility conditions like fog, including one fatal incident involving a pedestrian.
- NHTSA launched another probe in August looking into allegations that Tesla has not been promptly reporting crashes to the agency, as required by law.
These recurring issues have led lawmakers and safety regulators to seriously question whether Tesla’s automated systems, and similar technologies, can truly operate safely on a widespread scale. A new law in California, set to take effect next year, will directly hold driverless car companies accountable for traffic violations.
Understanding Full Self-Driving: A Level 2 System
It’s crucial for enthusiasts and potential adopters to understand that Tesla’s FSD, despite its name, is classified as a Level 2 driver-assistance system. This means it provides advanced assistance with steering, acceleration, and braking, but it requires continuous driver supervision. The driver is expected to remain attentive and ready to intervene at all times.
This distinction is often a point of discussion within the fan community. As observed in online forums, many users debate the evolving capabilities of different FSD software versions, sometimes arguing that issues apply only to older iterations. However, NHTSA’s probe encompasses both “FSD (supervised)” and “FSD (beta)” versions, indicating that the core functionality across these iterations is under review for safety risks, regardless of specific software updates. This ongoing supervision requirement is a key differentiating factor from true Level 4 or Level 5 autonomous systems, which are designed to operate without human intervention in specific conditions or all conditions, respectively.
Community Concerns and Real-World Impact
The reports highlight instances where FSD allegedly failed to provide warnings of its intended behavior before committing violations, leaving drivers little time to react. This feedback from the user community underscores a fundamental tension: while FSD aims to enhance convenience and safety, unexpected or unsafe actions undermine driver trust and create dangerous situations.
Seth Goldstein, a Morningstar analyst, aptly captured the prevailing sentiment, asking, “The ultimate question is, ‘Does the software work?’” This query resonates with many users who are eager for reliable self-driving features but are increasingly concerned by high-profile incidents and regulatory scrutiny. The probe’s focus on documented traffic law violations is particularly significant because these are unambiguous failures in fundamental driving tasks.
The Broader Implications for Autonomous Driving
The expanding NHTSA investigation is more than just a setback for Tesla; it has broader implications for the entire autonomous driving industry. Public trust in self-driving technology is fragile, and incidents like these can significantly hinder its widespread adoption. Regulators are increasingly scrutinizing the balance between innovation and safety, with a clear trend towards holding companies accountable.
This intensified regulatory pressure is evident not only in the U.S. but also in legislative changes like California’s upcoming law. Such measures reflect a growing consensus that as autonomous systems become more prevalent, the legal and ethical frameworks governing their operation must also mature, establishing clear lines of responsibility when things go wrong.
What’s Next for Tesla and FSD
NHTSA’s preliminary evaluation is the first step in a process that could lead to significant action, including a recall of the affected vehicles if serious safety risks are confirmed. That outcome would be a major blow to Tesla’s reputation and financial performance. Following the announcement of the probe, Tesla shares fell 1.4% on Thursday, reflecting investor concern.
CEO Elon Musk is under considerable pressure to demonstrate that the latest advancements in Tesla’s driver-assistance features have addressed these persistent problems. He recently reiterated ambitious goals, promising to deploy hundreds of thousands of self-driving Tesla cars and robotaxis by the end of next year. The outcome of this NHTSA investigation will play a critical role in whether those timelines can be met and, more importantly, whether the public can truly trust the safety of these advanced systems, as reported by The Verge.