Red Lights and Robotaxis: Analyzing the Growing Scrutiny on Tesla’s Full Self-Driving System


A new federal probe targets 2.9 million Tesla cars using Full Self-Driving (FSD) over alleged traffic violations and crashes, pushing the debate on autonomous vehicle safety into a critical new phase and putting pressure on Elon Musk’s ambitious robotaxi plans.

The landscape of automotive autonomy is once again under the microscope, as the U.S. National Highway Traffic Safety Administration (NHTSA) has announced a significant new investigation into nearly 2.9 million Tesla vehicles. This probe specifically targets cars equipped with Tesla’s Full Self-Driving (FSD) system, following a worrying accumulation of reports detailing traffic safety violations and associated crashes.

The investigation by the NHTSA’s Office of Defects Investigation encompasses both “FSD (supervised)” and “FSD (beta)” versions of Tesla’s self-driving systems. It’s crucial to remember that both these versions currently demand “a fully attentive driver who is engaged in the driving task at all times,” as highlighted by the NHTSA in its filing. This ongoing requirement for human oversight forms a core part of the regulatory body’s concerns.

The Alarming Reports: What Went Wrong?

The federal agency has received a total of 58 safety violation reports directly linked to Tesla vehicles operating with FSD. These incidents paint a concerning picture of the system’s real-world performance. Among the reported violations are vehicles allegedly:

  • Driving through red traffic signals.
  • Initiating lane changes into opposing traffic.

These violations are not isolated incidents, and they have had tangible consequences. The 58 reports include more than a dozen crashes and fires, resulting in 23 reported injuries. Specifically, the NHTSA cited six reports in which a Tesla vehicle, with FSD engaged, approached a red light, proceeded through the intersection against the signal, and crashed into other vehicles. Four of these crashes resulted in injuries, underscoring the severity of the malfunctions.

One driver in Houston, as cited by the NHTSA, reported that FSD was “not recognizing traffic signals,” causing the vehicle to proceed through red lights and inexplicably stop at green ones. The driver claimed Tesla was aware of the issue but had neither acknowledged nor fixed it, even after a test drive that confirmed the problem.

A History of Scrutiny: Tesla’s Road to Autonomy

This latest probe is far from the first time Tesla’s automated driving systems have attracted regulatory attention. U.S. regulators have been investigating Tesla’s systems for more than three years due to dozens of crashes. Key past incidents include:

  • A Seattle-area Tesla crash in 2024 involving Full Self-Driving that tragically killed a motorcyclist.
  • An investigation into Tesla’s “Summon” technology, which allows cars to drive to a driver’s location, after reports of fender benders in parking lots.
  • A probe launched last year into driver-assistance features in 2.4 million Teslas after crashes in low-visibility conditions like fog, including one fatal incident involving a pedestrian.
  • A separate investigation initiated in August by NHTSA, examining why Tesla allegedly has not been reporting crashes promptly, as required by agency rules.

In August, a Miami jury found Tesla partly responsible for a fatal 2019 crash involving its Autopilot feature, a system distinct from FSD; the company plans to appeal the decision. These incidents collectively highlight the complex and ongoing challenges in developing and deploying advanced driver-assistance systems.

The Broader Implications: Regulatory Pressure and Fan Community Concerns

The mounting problems have fueled a wider debate among lawmakers and safety regulators about whether Tesla’s automated systems, and similar technologies, can truly operate safely on a widespread scale. A new law in California, set to take effect next year, aims to hold driverless car companies accountable for traffic violations, reflecting a growing legislative push for accountability. This preliminary evaluation by NHTSA could escalate to a full engineering analysis, potentially leading to a recall of the affected vehicles if significant safety risks are confirmed, as reported by Reuters.

Within the passionate Tesla fan community, discussions around FSD’s performance are always active. Many users keenly follow software updates, hoping for improvements to address common grievances such as unexpected braking or misinterpretations of road signs. The “beta” label itself often sparks debate about the acceptable level of risk and driver responsibility when utilizing such advanced, yet still developing, features.

Elon Musk, Tesla’s CEO, remains under immense pressure to demonstrate that the latest advancements in the company’s driver-assistance features have resolved these persistent safety issues. He recently made ambitious promises to deploy hundreds of thousands of self-driving Teslas and robotaxis on roads by the end of next year, a timeline that now faces even greater scrutiny. Following the announcement of the investigation, Tesla shares fell between 1.4% and 2% on Thursday, reflecting investor unease, according to a report by CNBC. Tesla did not immediately respond to requests for comment on the probe.

The NHTSA is also actively reviewing Tesla’s deployment of self-driving robotaxis in Austin, Texas, which began in June. This new layer of investigation signifies a comprehensive regulatory approach to all facets of Tesla’s autonomous and driver-assistance ambitions.

What’s Next for FSD and the Autonomous Future?

This ongoing NHTSA investigation marks a critical juncture for Tesla’s FSD system. While Tesla states that FSD “will drive you almost anywhere with your active supervision, requiring minimal intervention,” the real-world reports of traffic law violations challenge the system’s reliability. The preliminary evaluation aims to fully assess the scope, frequency, and potential safety consequences of the technology.

For enthusiasts and developers alike, the outcome of this probe will undoubtedly shape the future trajectory of autonomous driving. It underscores the immense responsibility that comes with pushing the boundaries of vehicle technology and the paramount importance of ensuring public safety on our roads.
