Key Takeaways
- NHTSA investigating 2.88 million Tesla vehicles over Full Self-Driving safety concerns
- 58 incident reports include running red lights, wrong-way driving, and crashes
- 14 crashes and 23 injuries reported in FSD-related incidents
- Investigation could lead to massive recall if safety risks confirmed
The U.S. National Highway Traffic Safety Administration has opened a major investigation into Tesla’s Full Self-Driving technology covering roughly 2.88 million vehicles. Regulators cite dozens of reports in which the system allegedly broke traffic laws and caused crashes.
Red Light Violations and Railroad Crossing Dangers
According to Reuters, 58 reports describe Teslas running red lights, drifting into the wrong lane, and crashing at intersections. Six vehicles reportedly ran red lights before colliding with other cars.
One Houston driver reported that FSD “is not recognizing traffic signals,” noting the car stopped at green lights but ran through reds. The driver claimed Tesla witnessed the issue during a test drive but refused to address it.
The agency is also examining reports of FSD failing to handle railroad crossings safely, including one near-collision with an oncoming train.
Mounting Legal and Regulatory Pressure
This marks Tesla’s latest regulatory challenge involving its driver-assistance systems. The company faces multiple ongoing investigations into both Autopilot and FSD technologies.
In a high-profile case, a Florida jury awarded $329 million in damages after an Autopilot-related crash killed a woman. Another probe examines Tesla’s Robotaxi service in Austin, where passengers reported erratic driving and speeding despite the presence of human safety drivers.
California’s DMV is pursuing a false advertising lawsuit, arguing the “Full Self-Driving” name misleads consumers since the system requires constant supervision. Tesla recently rebranded it as “Full Self-Driving (Supervised)”.
Potential Recall Looms
The NHTSA investigation began shortly after Tesla’s latest FSD software update. Regulators state the system has “induced vehicle behavior that violated traffic safety laws.”
This preliminary investigation could escalate to a full recall if officials determine Tesla’s self-driving software presents significant safety risks.
Safety Recommendations for Tesla Drivers
If you own a Tesla with FSD enabled, maintain constant vigilance. The system requires active driver supervision despite its name.
- Keep hands on the wheel and eyes on the road continuously
- Be ready to take over manually at intersections, crosswalks, and railroad crossings
- Regularly check for critical safety updates
- Report unsafe FSD behavior to NHTSA immediately
For everyone else on the road, the investigation underscores that today’s “self-driving” technology still amounts to supervised driving.
Broader Implications for Autonomous Vehicles
Tesla’s autonomous driving ambitions face increasing regulatory scrutiny and legal challenges. The company’s response to these safety concerns will significantly influence public trust in AI-driven transportation systems.
Development of autonomous driving technology continues to accelerate, but it now does so under intensified regulatory oversight following these safety incidents.