Federal safety regulators are examining Waymo after one of the driverless vehicles operated by the subsidiary of Google’s parent company drove around a stopped school bus while children were getting off, an incident that casts new doubt on how automated systems will perform on some of the trickiest portions of any vehicle’s journey: getting through intersections and school zones.
Federal Probe Focuses on School Bus Stop Compliance
The National Highway Traffic Safety Administration said it is investigating the performance of Waymo’s self-driving system around stopped school buses. The robotaxi first stopped behind a bus with its stop arm and crossing gate extended, then crept forward, steered around the front of the bus and made a low-speed left turn while students were getting off the bus nearby.

Investigators said they are concerned the behavior could recur, and that one focus will be how well the system detects stop arms, flashing lights and children in close proximity, and whether it responds with adequate caution when visibility is compromised. The agency’s defect authority permits it to seek recalls or software fixes if it finds an unreasonable safety risk.
What Waymo Says Happened Near the Stopped School Bus
Waymo said the bus was partly blocking the driveway the vehicle was pulling out of, obscuring the bus’s stop sign and flashing lights from the robotaxi’s view. The company said the vehicle proceeded slowly to establish line of sight, steered around the front of the bus at a distance from the children and then completed its turn. Software updates have already been pushed out to address this edge case, and the company plans additional refinements.
The explanation underscores one of the perennial technical challenges of automated driving: occlusions. When a large vehicle blocks important cues, the system has to decide whether to wait, edge forward for a better view or reroute. That decision rests on risk models, safe speed set-points and cautious responses to vulnerable road users. School buses raise the stakes, because the law and safe-practice norms demand unerring caution.
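To illustrate the kind of trade-off involved, here is a minimal, hypothetical sketch in Python of how a planner might choose between waiting, creeping for visibility and proceeding. The signal names, thresholds and policy are illustrative assumptions, not a description of Waymo’s actual system.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    WAIT = auto()      # hold position until the scene is unambiguous
    CREEP = auto()     # inch forward at very low speed to improve visibility
    PROCEED = auto()   # continue along the planned route


@dataclass
class SceneEstimate:
    # All fields are illustrative; a production stack carries far richer state.
    occlusion_ratio: float        # fraction of the relevant view that is blocked (0.0-1.0)
    school_bus_detected: bool     # any evidence of a school bus nearby
    stop_arm_visible: bool        # stop arm or flashing red lights confirmed
    pedestrians_nearby: bool      # vulnerable road users within a safety buffer


def choose_action(scene: SceneEstimate) -> Action:
    """Pick a conservative maneuver around a possible school-bus stop.

    Hypothetical policy: any confirmed stop signal or nearby pedestrian means wait;
    heavy occlusion with a possible bus means creep only to regain line of sight;
    otherwise proceed.
    """
    if scene.stop_arm_visible or scene.pedestrians_nearby:
        return Action.WAIT
    if scene.school_bus_detected and scene.occlusion_ratio > 0.3:
        # The view is too blocked to rule out a stop arm or children: creep, don't pass.
        return Action.CREEP
    return Action.PROCEED


if __name__ == "__main__":
    blocked_view = SceneEstimate(occlusion_ratio=0.7, school_bus_detected=True,
                                 stop_arm_visible=False, pedestrians_nearby=False)
    print(choose_action(blocked_view))  # Action.CREEP
```

The point of such a policy is that ambiguity resolves toward caution: creeping regains visibility without committing the vehicle to a pass.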
Why It Matters for AV Safety and School Bus Laws
U.S. traffic laws require drivers to stop for school buses displaying flashing red lights and an extended stop arm, with limited exceptions such as oncoming traffic on a divided highway. Human drivers routinely violate the rule: The National Association of State Directors of Pupil Transportation Services tabulates tens of thousands of suspected illegal passes in a single one-day national survey. AVs not only have to clear that bar, they also have to demonstrate predictable, repeatable compliance in every market where they are deployed.
Child pedestrians have long been a focus for researchers working on perception systems. Children are smaller, they move unpredictably, and they step out right beside the bus meant to protect them. AV stacks need to fuse lidar, radar and camera data to infer the presence of pedestrians beyond the line of sight, plan conservative paths and recognize specialized cues such as stop arms and hazard strobes, even when those cues are partially occluded.
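As a rough illustration of that kind of fusion, the sketch below blends hypothetical per-sensor confidences for a stop-arm cue and deliberately biases the result upward when the bus is partially occluded. The sensor weights, thresholds and the occlusion penalty are assumptions made for illustration, not values from any deployed stack.

```python
def fused_stop_arm_confidence(camera: float, lidar: float, radar: float,
                              occlusion_ratio: float) -> float:
    """Blend per-sensor confidences (each 0.0-1.0) that a stop arm is deployed.

    Illustrative weighting: cameras see the red octagon and lights best, lidar
    confirms the arm's geometry, radar contributes least for a thin flat panel.
    Partial occlusion inflates the estimate so the planner errs toward stopping.
    """
    weights = {"camera": 0.6, "lidar": 0.3, "radar": 0.1}
    fused = weights["camera"] * camera + weights["lidar"] * lidar + weights["radar"] * radar
    # If the bus is partly hidden, assume the worst about what cannot be seen.
    return min(1.0, fused + 0.4 * occlusion_ratio)


def must_stop(camera: float, lidar: float, radar: float, occlusion_ratio: float,
              threshold: float = 0.5) -> bool:
    """Treat anything above a deliberately low threshold as a mandatory stop."""
    return fused_stop_arm_confidence(camera, lidar, radar, occlusion_ratio) >= threshold


# A half-occluded bus with only moderate camera evidence still triggers a stop.
assert must_stop(camera=0.4, lidar=0.3, radar=0.0, occlusion_ratio=0.5)
```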

Some proponents say vehicle-to-everything technology could help, with school buses transmitting a uniform signal that would compel nearby vehicles — both those operated by humans and ones driven or controlled by computers — to stop. That is promising, but it depends on wide adoption and robust interoperability standards, which are still in the works.
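As a sketch of the idea, below is a hypothetical broadcast payload a school bus might emit and a receiving vehicle’s check that treats it as a mandatory stop. The message fields, radius and logic are invented for illustration and do not follow any published V2X standard.

```python
import json
import math
from dataclasses import dataclass, asdict


@dataclass
class SchoolBusStopBroadcast:
    # Hypothetical message fields, not a real V2X message format.
    bus_id: str
    latitude: float
    longitude: float
    stop_arm_deployed: bool
    red_lights_flashing: bool


def should_stop_for_broadcast(msg: SchoolBusStopBroadcast,
                              vehicle_lat: float, vehicle_lon: float,
                              radius_m: float = 100.0) -> bool:
    """Return True when an active stop broadcast originates within radius_m meters."""
    if not (msg.stop_arm_deployed or msg.red_lights_flashing):
        return False
    # Equirectangular approximation is adequate at these short distances.
    dlat = math.radians(msg.latitude - vehicle_lat)
    dlon = math.radians(msg.longitude - vehicle_lon) * math.cos(math.radians(vehicle_lat))
    distance_m = 6_371_000 * math.hypot(dlat, dlon)
    return distance_m <= radius_m


if __name__ == "__main__":
    raw = json.dumps(asdict(SchoolBusStopBroadcast(
        bus_id="bus-42", latitude=33.7490, longitude=-84.3880,
        stop_arm_deployed=True, red_lights_flashing=True)))
    msg = SchoolBusStopBroadcast(**json.loads(raw))
    print(should_stop_for_broadcast(msg, vehicle_lat=33.7492, vehicle_lon=-84.3881))  # True
```

The harder problems are the ones the sketch omits: making sure every bus transmits, every vehicle listens and every vendor interprets the fields the same way, which is why interoperability standards matter.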
More Scrutiny of Waymo’s Driving System and Recalls
The new investigation by the highway traffic safety agency follows two earlier investigations of Waymo, including probes of its performance in crashes with stationary objects and around traffic control devices. The company had earlier announced a voluntary recall covering its entire fleet to modify software behavior and reduce contact with fixed hazards.
Waymo has said that it is a safer driver than humans, and the company’s estimates do show lower rates of injury-causing crashes across its operations. The company offers robotaxi service in several U.S. cities and has indicated that it is eyeing international expansion. Regulators, though, are focused on specific failure modes, such as school bus stops, where rare errors carry outsized risk.
What Comes Next in the Federal Waymo Investigation
The federal investigation is expected to demand extensive data logs, video and simulation results covering how Waymo’s system classifies school buses, how it decides whether stop arms are extended, how it sets no-go buffers around loading zones and how it handles adversarial scenarios in which key cues are occluded. Investigators typically check whether a problem is specific, reproducible and fixable with an update that does not introduce unexpected side effects.
For riders and residents in Waymo’s service areas, the near-term changes may be subtle: slower approach speeds, wider buffer zones and more aggressive no-overtake logic around large vehicles. The message for the industry is clear: navigating school bus operations is a table-stakes competency, not an optional feature, and one that will heavily shape the regulatory runway and public trust for broader autonomous deployments.
The stakes are as much practical as they are ethical. Millions of children ride school buses each day, and every drop-off is a high-stakes environment. Whether a human or a machine is behind the wheel, a driver that cannot reliably yield to flashing red lights and an extended stop arm has no business operating in that space. NHTSA’s conclusions could indicate whether Waymo’s latest revisions meet that standard, and how the government wants all autonomous fleets to demonstrate it.