Waymo has begun a software recall for its driverless fleet after receiving reports that some of its robotaxis improperly passed stopped school buses in Austin. The company will deliver the fix through an over-the-air update, meaning the vehicles will stay on the road while the new software rolls out across its ride-hailing fleet.
What prompted the recall: reports of school bus stop-arm violations
The action comes after a National Highway Traffic Safety Administration investigation into reports that Waymo vehicles were not stopping for school buses with an extended stop arm and flashing red lights. Regulators listed at least 19 incidents in Austin, a tally that drew notice after a video of a Waymo vehicle passing a bus during student loading circulated widely.

The Austin Independent School District asked Waymo to suspend service during school bus loading and unloading until the company could establish that the issue had been resolved. Waymo has not indicated that it plans to halt operations, but the company says it is addressing the behavior through the recall and additional safeguards.
How the update will alter robotaxi behavior
The company has not released technical details, but an OTA recall of this kind generally targets the perception and decision-making modules that dictate how a vehicle interprets and responds to what it sees. In the school bus case, that would likely mean more robust stop-arm detection, sharper discrimination between amber and red flash patterns, tighter right-of-way rules, and longer holds before the vehicle resumes moving.
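To make that kind of rule concrete, here is a minimal sketch of a conservative yield policy. This is not Waymo's actual logic; every name, field, and threshold is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class BusObservation:
    stop_arm_visible: bool                # stop arm detected, even partially
    red_lights_flashing: bool             # red flash pattern: stop required
    amber_lights_flashing: bool           # amber: bus preparing to stop
    seconds_since_signals_cleared: float  # time since arm retracted / lights off

def must_stop(obs: BusObservation,
              opposing_on_divided_road: bool = False,
              hold_after_clear_s: float = 3.0) -> bool:
    """Conservative yield rule: any stop-arm or red-flash evidence triggers
    a stop, amber triggers a precautionary stop, and the vehicle holds for
    a dwell period after the signals clear."""
    if opposing_on_divided_road:
        # Texas exempts only opposite-direction traffic on divided roadways.
        return False
    if obs.stop_arm_visible or obs.red_lights_flashing:
        return True
    if obs.amber_lights_flashing:
        # Treat amber as "prepare to stop," not "pass while you still can."
        return True
    # Brief hold after signals clear, in case children are still crossing.
    return obs.seconds_since_signals_cleared < hold_after_clear_s
```

The notable design choice is that the rule errs toward stopping: amber lights and the post-clear dwell both keep the vehicle halted in situations where a human driver might legally proceed.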
This is the sort of edge-case polish that self-driving systems regularly undergo. Unlike mechanical defects, a software noncompliance can be remedied fleetwide in a matter of hours or days, but NHTSA treats these fixes as safety recalls all the same, preserving the traceability and documentation the recall process requires.
Why school bus laws leave little margin for autonomous vehicle errors
In Texas, motorists must stop for a school bus with its stop arm extended and red lights flashing, no matter which direction they are traveling on an undivided roadway. Violations carry hefty fines and, more importantly, the law exists to prevent the most predictable and devastating kind of road tragedy: a child stepping out in front of a stopped bus. For self-driving cars, it is a zero-tolerance rule that must override otherwise-legal passing maneuvers and ambiguous traffic-flow cues.
The challenge isn’t just recognizing the silhouette of a yellow bus; it’s interpreting the stop arm robustly across lighting, occlusion, and weather conditions, and getting the decision right even when part of the signal is hidden. Industry engineers tend to call these high-salience, low-frequency events: rare in the data, but with zero tolerance for noncompliance.

Regulatory pressure and the AV learning curve
NHTSA has recently stepped up oversight of autonomous deployments, pursuing incident reports and pressing companies to recall software that it says can produce illegal or unsafe behavior. The approach draws on recent industry history: Waymo competitor Cruise pulled its fleet off the roads and filed a software recall after a high-profile pedestrian injury brought expanded scrutiny of automated decision-making in difficult situations.
For its part, Waymo has released safety reports covering millions of miles with riders on board, citing crash rates it says fall below human baselines in the geofenced areas where it operates. Still, regulators and school officials care less about averages than about specific failure modes, particularly those involving children. Fixing a single systemic behavior, such as how a car responds to an active school bus, can matter as much for public trust as any aggregate number.
What the software recall means for riders and partner cities
Because the recall is a software update, riders in Waymo's service areas should see little disruption. The company currently provides driverless service in San Francisco, Phoenix, Los Angeles, Atlanta, and Austin, and has expressed interest in expanding to other U.S. metros as well as international markets. School districts and city transportation departments will closely monitor post-update performance to determine whether it eliminates illegal bus passing.
For autonomous mobility to scale, moments like this are inescapable, and educational. The technology learns not just from disengagements and collisions but also, perhaps more importantly, from instances where a gap existed between a company's code and codified law. This latest recall is a reminder that safety leadership is as much about how quickly and transparently companies respond to specific risks as it is about how many miles they travel.
If the updated behavior consistently yields to school buses across all lighting and traffic conditions, Waymo will have turned a regulatory warning sign into a lasting safety gain. If not, its operating restrictions are likely to tighten further as regulators and school districts demand proof that the vehicles can be trusted around children.