
Waymo is facing fresh scrutiny after its driverless cars failed to stop properly for school buses, prompting a new software recall that cuts to the heart of public trust in robotaxis. The company is now reprogramming its fleet so that its autonomous Jaguar vehicles and other models treat flashing school bus lights and extended stop arms with the same caution expected of any human driver.
The move marks a pivotal moment for the self-driving industry, which is under pressure to prove that complex software can handle the most sensitive scenarios on public roads, from children crossing the street to unpredictable neighborhood traffic. It also shows how quickly regulators and local communities are willing to push back when automated systems fall short of long‑standing safety norms.
Waymo’s school bus problem comes into focus
The core issue is simple and alarming: some Waymo vehicles did not behave correctly around stopped school buses, a situation where traffic laws are unambiguous and the stakes are especially high. In Texas, footage and reports showed self-driving cars continuing past buses that were loading or unloading children, behavior that would earn a human driver a ticket and, in some cases, a license suspension.
Waymo has acknowledged that its software needed updating so its autonomous Jaguar electric vehicles and other models consistently recognize and respond to school bus stop arms and flashing lights, particularly in complex traffic where other cars may partially block the view. The problem was serious enough to trigger a formal recall of the code running on its robotaxi fleet in Texas and in other markets where the vehicles operate around schools.
Regulators turn up the heat on robotaxi safety
The recall did not emerge in a vacuum. Federal regulators had already been watching Waymo closely, and the school bus incidents gave them a concrete defect to probe. The National Highway Traffic Safety Administration, through its Office of Defects Investigation, opened an inquiry after reviewing video that showed the company’s vehicles moving past stopped buses, raising questions about how the software classified and prioritized those high‑risk situations.
The investigation, led by NHTSA’s Office of Defects Investigation, or ODI, focused on how Waymo’s robotaxis perform around school buses and whether the system’s perception and decision‑making stack met the standard expected of vehicles on public roads. The new software recall is designed to address those concerns by changing how the cars detect bus signals, interpret local traffic laws, and decide when to stop and wait, a shift that underscores how quickly regulatory pressure can translate into code changes for a commercial autonomous fleet.
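None of the filings spell out how Waymo’s planner actually encodes that logic, but the shape of the fix can be illustrated with a toy decision rule. The sketch below, written in Python, is purely hypothetical: the class name, fields, and exemption handling are assumptions for illustration, not Waymo’s software or API, and a production system would weigh far more signals. What it captures is the conservative bias regulators are asking for, in which a school bus whose signals cannot be confirmed is treated as one the car must stop for.

from dataclasses import dataclass

@dataclass
class SchoolBusObservation:
    # Hypothetical perception output for a detected school bus.
    flashing_red_lights: bool               # flashing red warning lights detected
    stop_arm_extended: bool                 # stop arm visibly deployed
    partially_occluded: bool                # view blocked by other vehicles
    opposite_side_of_divided_highway: bool  # the common legal exemption

def must_stop_and_wait(bus: SchoolBusObservation) -> bool:
    # Texas-style rule: traffic in both directions stops for a bus with
    # flashing red lights or an extended stop arm, except oncoming traffic
    # separated by a divided highway. When the view is partially blocked,
    # this sketch errs on the side of stopping rather than assuming it is
    # safe to pass.
    if bus.opposite_side_of_divided_highway:
        return False
    if bus.flashing_red_lights or bus.stop_arm_extended:
        return True
    return bus.partially_occluded

In this framing, the recall amounts to tightening the conditions under which the planner is allowed to decide it may pass, which is why the fix can ship as a software update rather than a hardware change.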
Texas incidents and a voluntary recall keep cars on the road
The most visible problems surfaced in Texas, where videos of Waymo vehicles passing stopped school buses in cities like Austin helped galvanize public concern. In that state, traffic rules around school buses are strict, and the sight of a driverless car gliding past a bus with its stop sign extended cut directly against the narrative that robotaxis are inherently more cautious than human drivers.
In response, Waymo issued what it described as a voluntary recall after the recorded issues in Texas, a move that involved updating the software while allowing vehicles to remain on the road during the rollout. Technology analyst Banafa framed the situation bluntly, arguing the company had no choice but to act: “Now the software they have to do it because again we’re talking about near miss.” It was a reminder that regulators and the public are less forgiving when children are involved and the margin for error is effectively zero.
Houston’s perspective and the scale of the software fix
The Texas concerns were not limited to Austin. In the Houston region, officials and residents were also told that Waymo’s self-driving cars had passed stopped school buses, prompting the company to extend its software recall to vehicles operating there. The company emphasized that no injuries had been reported as a direct result of the school bus behavior, but acknowledged that the pattern of incidents warranted a fleet‑wide software correction.
Waymo, owned by Google’s parent company Alphabet, has framed the recall as a proactive safety measure, arguing that updating the software is the fastest way to eliminate the risk that its cars will misinterpret school bus signals in the future. The company has said the problem stemmed from how its system handled specific combinations of bus lights, stop arms, and surrounding traffic, and that the new code is intended to ensure the vehicles stop and wait in every scenario where state law requires it.
A pattern of recalls: from gates and chains to school buses
The school bus recall is not Waymo’s first encounter with software defects that only became obvious once robotaxis were deployed at scale. Earlier this year, the company recalled 1,200 robotaxis after a series of low‑speed collisions with gates and chains, a reminder that even seemingly minor infrastructure can confuse automated driving systems. Those incidents involved vehicles misjudging the presence or position of thin obstacles, leading to scrapes and bumps that, while not catastrophic, exposed gaps in the perception stack.
In regulatory filings tied to that earlier campaign, the Alphabet‑owned company explained that its Safety Board had decided to conduct a recall for a specific version of the software and then update the code to “fulfill relevant regulatory reporting obligations.” That episode showed how Waymo’s internal Safety Board and external regulators are now intertwined, with each new defect prompting not only a technical fix but also a formal process that documents what went wrong and how the company plans to prevent similar failures in the future.
Federal documentation and the anatomy of a defect
When Waymo identifies a problem that rises to the level of a recall, it must spell out the issue in detail for federal regulators, and the school bus behavior is no exception. In prior filings, the company’s Part 573 Safety Recall Report has included a section labeled Description of Defect that explains how the software misinterpreted certain road features or traffic scenarios, and a Chronology that walks through when engineers first noticed the issue, how it was reproduced, and when leadership decided a recall was necessary.
One such report described how the affected population of vehicles had all received an updated software build and map data, yet still exhibited problematic behavior in a narrow alleyway while executing a low‑speed pullover maneuver, a scenario that ultimately triggered a recall. That level of detail in the Description of Defect and Chronology offers a window into how complex and context‑dependent autonomous driving failures can be, and it sets a template for how the school bus recall will likely be documented as regulators seek to understand exactly why the cars did not stop when they should have.
From telephone poles to school zones: a growing safety ledger
The school bus recall also follows a separate safety issue in Phoenix, Arizona, where a Waymo driverless car hit a telephone pole after misjudging its environment. That crash, which occurred during a low‑speed maneuver, underscored that even in relatively simple settings, the software can still make mistakes that a human driver might avoid, especially when dealing with fixed roadside objects that sensors and maps are expected to capture accurately.
In response to that incident, Waymo issued a voluntary software recall for all 672 robotaxis in the affected program, updating the code so the vehicles would be less likely to drive into poles or similar obstacles in the future. Taken together with the earlier recall involving 1,200 robotaxis and the new school bus‑related fix, the Phoenix crash shows how the company’s safety ledger is expanding, with each new problem adding another layer of complexity to the task of proving that its driverless cars can handle the full range of real‑world hazards.
Public trust, Uber rivalry, and the Texas spotlight
The Texas school bus incidents landed at a delicate moment for Waymo, which is competing with Uber and other players to dominate the autonomous ride‑hailing market in cities like Austin. In that city and across Texas, Uber and Waymo have both launched autonomous rides, turning local streets into a live test bed for competing robotaxi services that promise convenience and reduced congestion but also raise questions about safety and accountability.
Those questions became sharper as reports emerged of self-driving cars illegally passing stopped school buses in Texas, prompting federal authorities to signal that they were investigating Waymo’s behavior around those vehicles. For parents and school districts, the idea that a driverless car might ignore a bus’s stop arm is more than a technical glitch; it is a direct challenge to the social contract that has long governed how drivers behave in school zones, and it puts added pressure on Waymo to show that its new software can restore confidence.
What the school bus recall means for the future of robotaxis
Waymo’s decision to recall software over school bus behavior is a reminder that autonomous driving is not just a question of engineering elegance; it is a test of whether code can reliably encode the most conservative interpretation of traffic laws designed to protect children. The company’s willingness to keep vehicles on the road while it rolls out fixes, as it did with the voluntary recall after the recorded issues in Texas, reflects a belief that incremental software updates can close safety gaps without pausing commercial operations, a stance that regulators will continue to scrutinize.
At the same time, the pattern of recalls, from low‑speed collisions with gates and chains to a driverless car hitting a telephone pole and now robotaxis failing to stop for school buses, suggests that the path to fully trusted autonomy will be longer and more iterative than early marketing promised. Having already updated its software after a robotaxi struck a telephone pole, and now retooling its code to handle school buses correctly, the company is effectively learning in public, with each misstep documented in federal filings and amplified by the local communities living with the consequences on their streets.