Federal safety authorities have opened a new investigation of Tesla’s Full Self-Driving driver-assistance software after a flood of complaints that vehicles ran red lights and swerved into opposing lanes. The National Highway Traffic Safety Administration’s Office of Defects Investigation said it has received reports of more than 50 such incidents, including four that have led to injuries.
Why Regulators Have Their Sights Set on FSD
The investigation is one of the more direct looks into Tesla’s most advanced driver-assistance features, which the company markets as Full Self-Driving but categorizes as a hands-on, Level 2 system requiring constant driver oversight. It follows the conclusion of a separate investigation into Tesla’s Autopilot, which identified 13 fatal crashes linked to misuse of the system, and runs parallel to an ongoing assessment of whether Tesla’s recall-related remedies for Autopilot are effective.
NHTSA has examined FSD before, having previously opened probes into crashes that occurred in low-visibility conditions. This new probe narrows the focus to basic traffic-control behaviors, arguably the baseline for safely operating a vehicle in an urban environment, and to whether FSD’s software handles them competently without endangering its drivers and other road users.
What The Complaints Describe About FSD Behavior
Investigators say they have received at least 18 consumer complaints and one media-reported incident alleging that FSD failed to stop, or failed to remain stopped, at red lights. Tesla also submitted six incidents to the agency under its Standing General Order on Crash Reporting, which requires automakers to report crashes involving advanced driver-assistance or automated driving systems.
A separate cluster of reports describes lane violations: 18 complaints, two media accounts, and two other Standing General Order submissions report vehicles cutting into opposing lanes while turning or after completing a turn, crossing double-yellow lines on straight runs, or attempting turns onto streets with clear wrong-way signage.
Another series of six complaints, one media report and four Standing General Order submissions describes vehicles traveling straight from a designated turning lane or turning from the through lane. In some cases, regulators say, these events unfolded rapidly, leaving drivers limited time to overrule the system.
ODI has also taken note of repeated failures in one location. In collaboration with the Maryland Transportation Authority and State Police, among other partners, the agency investigated multiple incidents at a single intersection in Joppa, Maryland. Tesla pushed an over-the-air patch to fix that geography-specific problem, which shows how software updates can address edge cases, though the broader investigation is meant to determine whether the problems are systemic.
How An NHTSA Probe Can Result In A Recall
The agency has opened what it terms a Preliminary Evaluation, the first formal step that can lead to an Engineering Analysis and then, if deemed necessary, a safety recall. Timelines vary, but the process generally begins with a series of data requests covering software logic, fleet telemetry, driver-intervention rates and, where available, incident video.
Even when fixes arrive as software updates, NHTSA can insist they be labeled as recalls to ensure traceability and owner notification. That process was used in previous Tesla cases over driver monitoring and Autosteer limits, and the same framework would govern any remedial action here if defects are found.
The Tech Stakes For Tesla’s Full Self-Driving System
Obeying traffic signals and making lane decisions at intersections are challenging tasks: signals can conflict or be occluded, intersection layouts are irregular, and signal states change over time. Tesla’s vision-first, neural-network approach is designed to train on massive amounts of real-world data, and the newest version of FSD reportedly includes footage gathered as part of a robotaxi pilot program in Austin. That can theoretically produce better behavior at difficult junctions, but it also means new failure modes may emerge when the learned policy confronts situations it has never seen before.
Experts in automated driving systems say that wrong-lane and wrong-way errors can frequently be traced to misread signage, ambiguous road markings or faulty lane-intent modeling at the moment a turn is made.
Strong driver monitoring and conservative handoff thresholds can manage risk by bringing the human into the loop early. Regulators are likely to examine how often the FSD system requests intervention, how quickly drivers respond and how much warning the system’s path-planning changes give before an abrupt maneuver.
It’s also worth noting that, according to the handful of published reports, some of these incidents were clustered at a single intersection, evidence that geospatially specific corner cases can persist until fleet data or map priors trigger an update. That pattern illustrates why regulators tend to prefer wide evidence-gathering: a fix for one intersection may have little effect on others with similar but not identical markings.
Tesla Drivers: What You Need to Know Now
FSD is a driver-assistance system, not an autonomous chauffeur. Owners are expected to pay attention, keep their hands on the wheel at all times, and be prepared to brake, especially at intersections or when lane guidance seems ambiguous. If the probe proceeds, Tesla could release further over-the-air updates modifying how the system handles red lights, turn lanes and opposing-traffic crossings.
For now, the questions regulators are asking cut to the heart of automated driving’s credibility: can a consumer-facing system consistently obey basic traffic controls on an ordinary street? The answer, grounded in evidence rather than demos, will help shape not only Tesla’s roadmap but also how American safety officials police the development of advanced driver-assistance technology across the industry.