Waymo has issued a voluntary software recall of its autonomous fleet after receiving several reports that its robotaxis failed to stop for school buses with their stop arms extended and lights flashing. The recall will be formalized in a filing with the National Highway Traffic Safety Administration and is intended to fix how the vehicles recognize and react in these high-priority situations.
Unlike traditional recalls to the service bay, however, this fix will be deployed over the air, illustrating how safety-critical behavior in driverless vehicles increasingly depends upon software, not hardware.
The move comes amid mounting scrutiny from regulators and school officials over how the cars behave around children and school loading zones.
What Went Wrong Around Stopped School Buses
The complaints began with accounts of a Waymo vehicle driving past a stopped school bus, followed by at least one other documented incident in a large Texas school district. That district told the company it had evidence of at least 19 violations in which robotaxis maneuvered around stopped school buses, including five after Waymo said its software had fixed the problem. In one case, a car began pulling away while a student was still in the roadway.
Waymo attributed the problem to its fifth-generation automated driving system, which handles perception and decision-making. The school bus interaction is a known edge case for self-driving stacks: the software must identify a bus under all sorts of lighting and occlusion, distinguish yellow warning lights from red stop indicators, detect and interpret a physical stop arm, and abide by state-specific rules that usually require a full stop plus generous safety margins.
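To make the edge case concrete, here is a minimal sketch of the kind of rule mapping involved. All names (`BusObservation`, `required_action`, the signal and action enums) are hypothetical, not Waymo's actual code, and a production system would also have to handle occlusion, sensor disagreement, and per-state exceptions:

```python
from dataclasses import dataclass
from enum import Enum, auto

class BusSignal(Enum):
    NONE = auto()
    YELLOW_FLASHING = auto()   # bus is preparing to stop
    RED_FLASHING = auto()      # loading/unloading; a stop is required

class Action(Enum):
    PROCEED = auto()
    SLOW_AND_PREPARE = auto()
    FULL_STOP = auto()

@dataclass
class BusObservation:
    signal: BusSignal
    stop_arm_extended: bool
    opposite_side_of_divided_highway: bool  # common state-law exception

def required_action(obs: BusObservation) -> Action:
    """Map perceived school-bus cues to a planner directive (simplified)."""
    # Many states exempt traffic on the far side of a divided highway.
    if obs.opposite_side_of_divided_highway:
        return Action.PROCEED
    # Red flashers or an extended stop arm each independently require a stop.
    if obs.signal is BusSignal.RED_FLASHING or obs.stop_arm_extended:
        return Action.FULL_STOP
    if obs.signal is BusSignal.YELLOW_FLASHING:
        return Action.SLOW_AND_PREPARE
    return Action.PROCEED
```

The point of treating red flashers and the stop arm as independent triggers is robustness: if either cue is detected, the vehicle stops, so a single misclassification cannot by itself cause an illegal pass.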
Regulatory Pressure and Context of Safety
NHTSA launched an investigation following media reports and later told Waymo to either suspend service during school hours or seek a formal exemption. The company chose a recall, with its chief safety officer saying Waymo's standard is to “notice when our behavior should be better” and take the appropriate action of slowing and stopping around school buses. Software recalls are usually resolved through targeted updates and validation rather than by sidelining entire fleets, industry watchers say.
This is not Waymo’s first software-related recall. An earlier one saw the company update some 1,200 vehicles after its system failed to properly handle roadside barriers such as chains and gates. The through line is clear: regulators treat software behavior that might increase crash risk as a safety defect, no more or less dangerous than a broken brake line.
The wider autonomous vehicle sector is under increasing pressure. California regulators suspended another operator’s driverless permits following a high-profile pedestrian incident, and the federal agency continues to collect crash and incident data across companies to identify systemic risks. Few settings are more sensitive than school zones and buses.
How the Recall Will Play Out for Waymo Robotaxis
Waymo plans to push an over-the-air patch that tweaks multiple layers of its stack: vision and lidar detection for school bus cues, state machine logic for stop-arm and flashing-red events, and conservative motion planning so the vehicle not only stops but maintains a safety buffer until all pedestrians have crossed.
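The "stop and hold until clear" behavior described above can be sketched as a small state machine. This is an illustrative assumption about how such logic might be structured, not Waymo's implementation; the class name, states, and dwell-time value are all invented for the example:

```python
from enum import Enum, auto

class StopState(Enum):
    APPROACHING = auto()  # bus detected, no stop cue yet
    STOPPED = auto()      # initial stop on red flashers / stop arm
    HOLDING = auto()      # stopped, waiting for cues and pedestrians to clear
    RELEASED = auto()     # safe to resume motion

class SchoolBusStopMachine:
    """Conservative hold logic: stop on any stop cue, then keep holding
    until the bus signals clear AND no pedestrians remain, and release
    only after a continuous clear-dwell period."""

    DWELL_TICKS = 30  # e.g. ~3 s at a 10 Hz planning rate (illustrative)

    def __init__(self):
        self.state = StopState.APPROACHING
        self._clear_ticks = 0

    def step(self, stop_cue_active: bool, pedestrians_present: bool) -> StopState:
        if self.state is StopState.APPROACHING:
            if stop_cue_active:
                self.state = StopState.STOPPED
        elif self.state in (StopState.STOPPED, StopState.HOLDING):
            self.state = StopState.HOLDING
            if not stop_cue_active and not pedestrians_present:
                self._clear_ticks += 1
                if self._clear_ticks >= self.DWELL_TICKS:
                    self.state = StopState.RELEASED
            else:
                # Any reappearing cue or pedestrian resets the dwell timer.
                self._clear_ticks = 0
        return self.state
```

The dwell timer is the conservative part: a pedestrian who steps back into view after the stop arm retracts resets the countdown, so the vehicle cannot creep forward the instant the bus's lights go out.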
Operationally, the company may keep service restricted near schools or during typical pick-up and drop-off hours while it verifies the fix. Some municipalities and school districts have requested additional safety measures and regular reporting until the behavior is reliably addressed.
The Importance of School Bus Detection as an AV Test
Stopping for school buses is a bright-line rule in nearly every U.S. jurisdiction, and violations carry outsized risk. A nationwide tally by the National Association of State Directors of Pupil Transportation Services counted 241,151 illegal pass-bys in a single day of reporting, a measure of how often human drivers get it wrong. For robotaxis, even a moment's hesitation or a misread is unacceptable, because there is no margin for error around children.
This recall highlights two truths about driverless tech: software can improve very quickly at scale, but public trust depends on transparent handling of the problem's edge cases. Expect closer cooperation among AV companies, regulators, and school transportation officials in the future, along with clearer, machine-readable signals on buses (stronger lighting conventions, standardized stop-arm signaling, and perhaps vehicle-to-everything messaging) to help autonomous systems do what they're supposed to do at least as reliably as humans.