
Waymo’s promise of safer streets is colliding with a very old piece of road reality: the flashing red lights of a school bus. Even after a high-profile software recall meant to keep its robotaxis from sliding past stopped buses, reports from police and school districts suggest the company’s fix is not fully working, and the cars keep rolling through some of the most sensitive zones on the road. The gap between Waymo’s assurances and what people are seeing at the curb is now a central test of whether automated driving can be trusted around children.
At the same time, federal regulators are escalating their scrutiny, local officials are documenting dozens of incidents, and parents are watching videos of driverless cars gliding by extended stop arms. The result is a rare convergence of technical, legal, and moral pressure on a single feature of Waymo’s system, and a warning that if robotaxis cannot reliably handle school buses, their broader expansion will face a hard political and regulatory ceiling.
The safety promise collides with the school bus reality
From the start, Waymo has sold its robotaxis as a way to reduce human error, especially in complex urban environments where distraction and fatigue can be deadly. That pitch runs straight into the bright yellow reality of school buses, which are designed to force every other road user to slow down and wait while children cross. When a driverless car fails that basic test, the issue is not just technical; it is existential for public trust in the technology.
Federal safety rules already treat school buses as a special class of vehicle, and the National Highway Traffic Safety Administration has long framed their protection as a core part of its mission. That is why reports of Waymo robotaxis gliding past buses with stop arms extended have drawn such sharp attention from regulators and local officials. The incidents cut directly against the idea that automated driving systems will be more cautious than humans where it matters most, around children stepping off a bus and into the street.
How the federal investigation put Waymo under the microscope
Regulatory pressure intensified when federal safety officials opened a formal probe into Waymo’s handling of school buses, focusing on how its automated driving system responds to flashing lights and stop signs on those vehicles. That investigation zeroed in on a pattern of robotaxis failing to stop or yield properly, treating the behavior not as a one-off glitch but as a systemic risk that could repeat across the fleet. The scrutiny signaled that school bus interactions are now a frontline test case for automated driving oversight.
Video clips of the problem helped drive that scrutiny, including footage highlighted in a December report that showed Waymo’s vehicles moving past stopped buses instead of waiting. Those images gave regulators and the public a concrete view of what had previously been described in technical language, turning abstract concerns about “unexpected driving behaviors” into scenes of real streets where children could have been present. Once those images were circulating, it became much harder for Waymo to argue that the issue was minor or already contained.
ODI’s warning and the logic behind the recall
Inside the federal government, the Office of Defects Investigation, or ODI, framed the risk in stark terms. The office warned that vehicles equipped with automated driving systems showing unexpected behavior around school buses could create confusion for other drivers and pedestrians, and that the danger would be magnified near schools where children are entering and exiting vehicles. In other words, the concern was not only what the robotaxi did, but how everyone else on the road might react when it failed to follow the script.
That logic helped push Waymo toward a software recall focused on its school bus behavior, after ODI made clear it was worried about the company’s automated driving system in these specific scenarios. In its own description of the issue, the agency stressed that vehicles exhibiting such behavior might need to change their operations near schools, a concern that was tied directly to the ODI investigation. The recall was supposed to be the fix that would restore confidence, both inside the agency and on the streets where the cars operate.
From probe to full recall of 3,000 robotaxis
The federal probe did not stay theoretical for long. Earlier in the fall, the National Highway Traffic Safety Administration opened a formal investigation into Waymo’s driverless vehicles, focusing on their interactions with school buses and other unusual road situations. That initial step set the stage for a much broader response once more incidents came to light and regulators concluded that the risk was not confined to a handful of edge cases.
The result was a recall covering 3,000 robotaxis, a sweeping move that underscored how seriously officials viewed the problem. Reporting on the decision noted that, at the beginning of October, the National Highway Traffic Safety Administration had already been probing why Waymo’s vehicles failed to stop for school buses in the first place. At the time, a Waymo spokesperson defended the vehicle’s behavior in a September 22 incident, but what began as a single episode eventually turned into a full recall once the pattern became impossible to ignore.
Local districts count incidents while robotaxis keep rolling
While federal regulators worked through their process, local school districts were keeping their own scorecards. In Austin, officials tracked nearly twenty school bus-related incidents involving Waymo vehicles, a tally that suggested the problem was recurring rather than rare. Those numbers came not from abstract simulations but from drivers and staff who watched the cars behave unpredictably around buses that were loading or unloading children.
The Austin Independent School District went so far as to send a public letter describing nineteen school bus-related Waymo incidents, and it said that five of those occurred after the district had already asked the company to change its operations. That detail, laid out in a December account, underscored the frustration on the ground. However carefully Waymo described its safety culture, the district was still watching robotaxis behave in ways it considered unsafe, even after raising alarms directly with the company.
Waymo’s software patch and the limits of a quick fix
In response to the growing pressure, Waymo rolled out a software update that it said would address the school bus problem, framing the change as a targeted patch to improve how its vehicles detect and respond to buses with flashing lights and extended stop arms. The company told reporters that the recalled software would be updated to better handle these scenarios, and that it was working to ensure the fix covered both the known incidents and any similar situations that might arise in the future. The message was clear: the issue was in the code, and the code could be corrected.
Waymo also acknowledged that the number of prior similar incidents was high, a concession that hinted at how often the system had misread or mishandled school bus cues before the patch. In comments shared with USA TODAY, the company said it was working to fix the issues through the software update. That admission raised a hard question for anyone watching the rollout of robotaxis: if such a basic rule of the road could be mishandled so often before being caught, how many other edge cases might still be lurking in the system, waiting for their own recall?
Police reports and videos show the problem did not stop
Even after Waymo’s patch, reports from law enforcement suggested that the behavior around school buses had not been fully corrected. Police in at least one city documented roughly twenty instances in which Waymo’s robotaxis failed to stop for school buses, a figure that pointed to a persistent pattern rather than a handful of anomalies. Those reports described vehicles moving past buses that were stopped with lights flashing, exactly the scenario the recall was supposed to address.
Video shared with reporters reinforced the sense that the fix had not fully taken hold: one December report described how the self-driving taxi company’s vehicles were seen failing to stop for school buses at least twenty times, according to police, a number that is hard to square with the idea that the issue had been fully resolved. For parents and bus drivers, those images were not just data points; they were reasons to doubt that the robotaxis could be trusted around their children.
Atlanta’s experience and the claim that the fix “doesn’t work”
Atlanta’s public school district added another layer of concern by documenting its own encounters with Waymo’s vehicles. Officials there reported multiple incidents in which robotaxis did not behave as expected around school buses, even after the company’s software update was supposed to be in place. The district’s experience suggested that the problem was not confined to one city or one set of road conditions, but was instead baked into how the system interpreted school bus signals.
Coverage of those incidents described Waymo’s attempt to end robotaxis blowing by school buses as a fix that simply “doesn’t work,” reflecting the frustration of school officials who felt their students were being put at risk. One detailed December analysis noted that any decent human being would take extra care around a stopped school bus, and contrasted that instinct with the behavior of the robotaxis that kept moving past buses used by Atlanta’s public school district. The conclusion was blunt: the fix does not work, at least not reliably enough for the people watching from the sidewalk.
Schools say the “not run down children” patch is still failing
Perhaps the most damning criticism came from schools that described Waymo’s software patch in blunt, almost sarcastic terms. One district characterized the update as a patch to “not run down children getting off school buses,” and then said that patch was not working. The phrasing captured the unease many parents feel about entrusting their kids’ safety to a system that needs a specific software tweak just to recognize one of the most obvious hazards on the road.
Reports from that district described the pattern as “problematic” and noted that there had been six incidents in Atlanta throughout 2025 involving Waymo vehicles and school buses. Those details were laid out in a December story headlined “Waymo’s Software Patch to Not Run Down Children Getting Off School Buses Isn’t Working, School Claims.” When a school system feels compelled to use that kind of language about a safety update, it is a sign that the relationship between the company and the community has moved beyond technical debate and into a deeper crisis of confidence.
Why school buses are a make-or-break test for robotaxis
School buses occupy a unique place in American traffic culture, with rules that are drilled into drivers from their first licensing test and reinforced by heavy penalties for violations. For automated vehicles, that makes buses a kind of moral and technical benchmark: if a robotaxi cannot reliably stop for a bus with its lights flashing and stop arm extended, it is hard to argue that the system is ready for the full complexity of public streets. The stakes are not abstract; they are measured in the safety of children who trust that every car will stop when they step off the bus.
Waymo’s struggle to get this right, even after a recall and a targeted software patch, highlights the gap between controlled testing and messy reality. The company can refine its models, add more training data, and push over-the-air updates, but each new report of a robotaxi sliding past a stopped bus undercuts the narrative that automation is inherently safer than human driving. Until the cars can handle something as basic and emotionally charged as a school bus stop, their expansion into more cities and more neighborhoods will remain a hard sell for regulators, school districts, and the families watching from the curb.