Waymo is preparing a voluntary software recall to correct how some of its autonomous vehicles perceive and interact with school buses, an effort the company says reaches back more than 18 months, per its annual update on disengagements. The episode also offers a window onto mainstream concerns about robotaxi operations when children are present. The company says it has already installed the update across its fleet and will file paperwork with federal regulators, adding that no injuries are associated with the behavior under scrutiny.
What prompted the recall and the school bus incident reports
The National Highway Traffic Safety Administration's Office of Defects Investigation (ODI) opened a probe after footage surfaced of a Waymo vehicle in autonomous mode driving around a stopped school bus as children were being dropped off in Atlanta. The Austin Independent School District then submitted additional reports alleging multiple illegal bus passes by Waymo vehicles, including accounts from children who said they had nearly been hit; the district says a number of these occurred after the company had already pushed out a software update.

In a letter to Waymo, federal regulators requested more information, including detailed logs, data on decision-making, and details of how the company's fleet handles interactions with school buses, down to the fifth of a second.
Waymo, for its part, says it will work closely with regulators and continue to make safety updates public as changes are implemented.
What the software fix is designed to improve and prevent
Waymo said the recall focuses on situations in which the vehicle must slow and stop around school buses. The company says its latest over-the-air update improves detection and policy handling when buses deploy flashing red lights or extended stop arms, and that it has adjusted decision-making to cut down on risky maneuvers in complex, multi-lane scenarios. Waymo didn't share every technical detail, but such changes generally involve improved perception of bus-specific cues, more cautious motion planning, and stricter adherence to state-specific rules for passing buses.
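Waymo has not published its implementation, but the conservative policy it describes can be illustrated with a minimal, hypothetical sketch. Every name, threshold, and rule below is an assumption for illustration, not Waymo's actual code:

```python
from dataclasses import dataclass


@dataclass
class BusObservation:
    # Hypothetical perception outputs; a real driving stack is far richer.
    flashing_red_lights: bool
    stop_arm_extended: bool
    detection_confidence: float  # 0.0..1.0


def required_action(obs: BusObservation, divided_highway: bool,
                    same_direction: bool) -> str:
    """Return a conservative driving action near a school bus.

    Mirrors the general rule in most U.S. states: traffic in both
    directions must stop for a bus with flashing red lights or an
    extended stop arm, with a common exemption for oncoming traffic
    on a divided highway.
    """
    # Low-confidence detections are treated cautiously: when in doubt,
    # slow down and look again rather than risk an illegal pass.
    if obs.detection_confidence < 0.7:
        return "slow_and_reassess"
    if obs.flashing_red_lights or obs.stop_arm_extended:
        if not same_direction and divided_highway:
            return "proceed_with_caution"  # common legal exemption
        return "stop_and_hold"
    return "proceed_with_caution"
```

The design choice worth noting is the confidence gate: ambiguity resolves toward slowing down, which matches the "accept delays rather than risk a misread" posture the article describes.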
According to Waymo's chief safety officer, the updated behavior exceeds human driver performance in this set of scenarios, based on data the company holds internally. The company adds that it will keep collecting evidence, monitoring edge cases, and iterating on the policy as school districts and regulators offer further feedback.
Why school buses are a special challenge
School bus stops are dynamic, low-speed situations where small errors can have large consequences. Children can dart out unexpectedly from behind parked cars, sight lines can be poor, and the most important signals (stop arms, flashing lights, hand gestures) may be occluded or outside a driver's direct view. Add state-by-state variations such as divided-highway exemptions and multi-lane rules, and an automated system must combine nuanced legal reasoning with hard-to-distinguish visual- and radar-based recognition under considerable uncertainty.

The stakes are high. The National Association of State Directors of Pupil Transportation Services has conducted an annual one-day survey for years, and it estimates that hundreds of thousands of illegal passes occur across the nation on a single day: most recently, nearly 242,000 when adjusted. Even if most of those are the fault of human drivers, any automated system will have to prove itself far more cautious, accepting delays rather than risking a misread.
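The survey figure is a one-day extrapolation: participating bus drivers count illegal passes on a single day, and that rate is scaled to the national fleet. A sketch of the arithmetic, using made-up placeholder counts rather than NASDPTS data:

```python
def extrapolate_national(observed_passes: int, participating_buses: int,
                         national_buses: int) -> int:
    """Scale a one-day count from a survey sample to the national fleet,
    assuming the sample's per-bus pass rate is representative."""
    rate_per_bus = observed_passes / participating_buses
    return round(rate_per_bus * national_buses)


# Placeholder values for illustration only.
print(extrapolate_national(observed_passes=50_000,
                           participating_buses=100_000,
                           national_buses=480_000))  # → 240000
```

The real survey's adjustments are more involved, but the point stands: even a modest per-bus rate compounds into an enormous daily national total.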
Regulatory scrutiny of robotaxis and formal AV recalls
Software updates are already a familiar tool in the modern car industry, and they are especially common for fleets of autonomous vehicles, where fixes ship as code. Regulators want companies to file a formal recall even if corrective software has already been installed over the air, so that there is an auditable record and transparent public notification. The school bus focus is part of a broader trend toward greater scrutiny of autonomous vehicle programs, with regulators demanding transparency on incident data, safety cases, and verification methods.
Waymo adds that its vehicles have safely driven millions of autonomous miles and points to internal case analyses showing significantly fewer pedestrian injury crashes than human drivers, a 12x reduction on that specific metric. Regulators and researchers continue to emphasize the need for independent validation, but by positioning the recall as part of continuous improvement rather than a retreat from autonomy, the company aims to preserve its vision.
What’s next for Waymo and its school bus safety update
The company intends to submit the recall to NHTSA and says every active vehicle either has received or will receive the software update. Expect detailed questions from regulators about how the system detects school buses, interprets laws across jurisdictions, and falls back to conservative behavior when cues are ambiguous or only partially visible.
Waymo has issued a string of software recalls over the last two years, including one spurred by a low-speed crash into a telephone pole in Phoenix. The pattern is becoming standard among autonomous operators: over-the-air remedies, formal filings to document the fix, and public accountability through federal scrutiny. The bigger test is whether robotaxis can show, repeatedly and verifiably, that they operate more safely around the most vulnerable road users, children, in every city and under every circumstance.
