The National Transportation Safety Board has launched an investigation into Waymo after multiple reports that its driverless vehicles illegally passed stopped school buses, a scenario that safety experts consider among the most critical tests for any automated driving system.
What the NTSB Probe Signals for Waymo and AV Safety
The safety board says it is examining more than 20 incidents in Austin, Texas, while acknowledging similar events in at least one other state. Unlike the National Highway Traffic Safety Administration, the NTSB does not levy fines or order recalls; instead, it conducts forensic investigations to determine root causes and then issues nonbinding safety recommendations. Even so, an NTSB probe is a clear escalation and often shapes regulatory, industry, and public expectations around complex safety failures.

This is the first NTSB investigation focused on Waymo, though it follows a defect investigation NHTSA opened into the same school bus behavior. Waymo previously issued a recall, delivered as an over-the-air software update, intended to address the issue, but new incidents captured by school bus cameras in Austin suggest the fix was either incomplete or insufficiently generalized.
A Pattern Across Cities Raises Deeper Safety Concerns
One early incident occurred in Atlanta, where a Waymo vehicle pulled out of a driveway, passed in front of a school bus from its right side, and continued down the street while students were unloading. The company later said its system could not detect the bus's extended stop arm and flashing lights in that specific configuration and updated its software. Subsequent footage published by Austin's KXAN, however, showed additional illegal maneuvers around stopped buses, triggering heightened scrutiny from local officials and federal agencies.
In Austin, the school district asked Waymo to pause operations around pickup and drop-off windows. Texas law requires drivers in every lane of an undivided roadway to stop for a bus with activated red lights or an extended stop arm; fines for a first offense can reach four figures. The reported incidents span multi-lane corridors and neighborhood streets, exactly the kind of mixed environments where autonomous systems must interpret signals under occlusion, glare, and complex right-of-way dynamics.
Why School Buses Are a Critical Test for AV Perception
Stopped school buses present a dense, shifting set of cues: amber pre-warning lights that transition to flashing red, a deploying stop arm, crossing students who may emerge from blind spots, and trailing traffic that sometimes fails to stop. For automated vehicles, the challenge is multi-layered: reliably recognizing bus-specific hardware and signal patterns under varied illumination, anticipating child pedestrian trajectories, and resolving ambiguous partial views, such as when the bus is approached from the side or partially blocked by parked cars or foliage.
Industry engineers describe three critical failure modes in these scenarios: late or missed detection of the stop-arm state; mistaken right-of-way assignments that let the vehicle proceed through perceived gaps; and misclassification of bus lighting viewed at odd angles or distances. Any one of these errors can produce an illegal pass. That is why many developers hard-code conservative “school bus behaviors,” effectively treating an active bus as an immediate stop condition within a wide geofence and resuming only after explicit confirmation that the hazard has cleared.
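The conservative posture engineers describe can be read as a small state machine. The Python sketch below is a minimal illustration under assumptions, not Waymo's implementation: the names (`BusSignal`, `SchoolBusStopPolicy`) and thresholds (a 50 m geofence, a 0.7 confidence floor, 30 clear frames) are hypothetical, and a production stack would fuse multiple sensors and reason over many tracked agents.

```python
from dataclasses import dataclass
from enum import Enum, auto


class BusSignal(Enum):
    """Perceived state of a school bus's warning hardware (hypothetical labels)."""
    NONE = auto()          # no warning lights visible
    AMBER = auto()         # amber pre-warning: the bus is about to stop
    RED_STOP_ARM = auto()  # flashing red lights and/or a deployed stop arm
    UNKNOWN = auto()       # occluded, glare-washed, or otherwise ambiguous view


@dataclass
class BusObservation:
    """One perception cycle's estimate for a detected school bus."""
    distance_m: float  # estimated distance from the vehicle to the bus
    signal: BusSignal
    confidence: float  # classifier confidence in [0, 1]


class SchoolBusStopPolicy:
    """Conservative stop policy: ambiguity resolves toward stopping."""

    STOP_GEOFENCE_M = 50.0      # illustrative radius around an active bus
    CONFIDENCE_FLOOR = 0.7      # below this, demote the signal to UNKNOWN
    CLEAR_FRAMES_REQUIRED = 30  # consecutive clear cycles before resuming

    def __init__(self) -> None:
        # Start "clear" so a parked bus with no lights does not force a stop.
        self._clear_frames = self.CLEAR_FRAMES_REQUIRED

    def should_stop(self, obs: BusObservation) -> bool:
        """Return True if the vehicle must hold a full stop this cycle."""
        # A low-confidence read becomes UNKNOWN, and UNKNOWN counts as a
        # hazard: the policy never proceeds on a gap it cannot verify.
        signal = obs.signal if obs.confidence >= self.CONFIDENCE_FLOOR else BusSignal.UNKNOWN
        hazard = signal is not BusSignal.NONE
        if hazard and obs.distance_m <= self.STOP_GEOFENCE_M:
            self._clear_frames = 0  # latch the stop and restart the count
            return True
        self._clear_frames += 1
        # Hold the stop until enough consecutive all-clear cycles accumulate,
        # riding out flicker, partial occlusion, and late stop-arm reads.
        return self._clear_frames < self.CLEAR_FRAMES_REQUIRED
```

The design choice that matters in a sketch like this is the asymmetry: a single ambiguous frame latches the stop instantly, while resuming requires a sustained run of unambiguous all-clear observations.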
The stakes are high. The National Association of State Directors of Pupil Transportation Services has long warned that illegal passing incidents number in the tens of millions each school year nationwide, extrapolating from its annual one-day stop-arm survey. Autonomous systems that consistently fail to meet this bar risk amplifying a known, deadly human behavior rather than reducing it.

Regulatory Context and Possible Outcomes
While the NTSB cannot mandate fixes, its findings can trigger follow-on actions from NHTSA, including recalls, and can influence state and local permitting. In other recent automated-vehicle cases, investigations have led to software updates, narrower operational domains, and additional operator oversight requirements.
Waymo, which has emphasized the overall safety performance of its fleet in public reports, is under pressure to demonstrate that its perception, prediction, and decision-making stack can robustly handle school buses across edge cases—not just the specific Atlanta scenario it previously patched. Investigators are likely to scrutinize training data coverage, the handling of occlusions and counterfactuals, and any change-management processes governing safety-critical updates.
Waymo’s Expansion Plans Face Heightened Regulatory Scrutiny
The probe lands as Waymo accelerates growth, recently adding Miami to a footprint that includes Atlanta, Austin, Los Angeles, Phoenix, and the San Francisco Bay Area. Expansion typically introduces new local bus models, lighting patterns, and road geometries—variables that can expose gaps in perception and policy layers if the system’s generalization is not robust.
Analysts say the near-term risk is operational: limits on service windows near schools, geofencing adjustments, or targeted slowdowns around bus routes. The longer-term risk is reputational—if regulators conclude that a foundational safety scenario remains unreliable after updates, it could slow autonomous deployments nationwide.
What to watch next:
- Whether the NTSB convenes a public hearing
- Whether NHTSA escalates its defect probe
- How swiftly Waymo can document measurable, repeatable improvements in school bus detection and compliance across all cities where it operates
Until then, the benchmark is simple and unequivocal—the car must stop, every time.