Image Credit: Dllu - CC BY 4.0/Wiki Commons

On a narrow San Francisco street, three driverless taxis recently turned a routine evening into a live demonstration of how autonomous systems can fail in oddly human ways. Two self-driving Waymos appeared to clip each other at low speed, then froze in place, effectively boxing in a third Waymo and trapping residents in a gridlock that looked more like a software bug than a traffic accident. The scene, captured on video and shared widely online, has become a vivid case study in how quickly confidence in robotaxis can wobble when the cars stop behaving like tools and start looking like confused actors in their own drama.

The incident did not involve high-speed carnage or dramatic injuries, but its symbolism landed hard in a city already skeptical of driverless fleets. Watching multiple Waymos lock up in a kind of robotic standoff, with human drivers and pedestrians stuck around them, raised a sharper question than any safety statistic: what happens when the system fails in public, in full view of the people it is supposed to convince?

The low-speed collision that started it all

The chain reaction began in December when two Waymos, both operating without human drivers, appeared to make minor contact while navigating a tight San Francisco block. Video from the scene shows the vehicles nose to nose, hazard lights blinking, as if each system had reached a decision tree it could not escape. Witnesses described the impact as a low-speed bump rather than a serious crash, but the visual of two autonomous cars colliding with each other was enough to ignite fresh doubts about how these systems interpret close quarters and right-of-way.

In the footage, the cars sit immobile while traffic piles up behind them, a tableau that undercuts the promise that robotaxis will keep streets flowing more smoothly than human drivers. The clip, which circulated widely after being posted online, shows the pair of Waymos effectively blocking the lane and setting the stage for what came next, a moment that has been compared to a software deadlock playing out in real life and that was first highlighted in a detailed account of how the two Waymos appeared to collide and then trigger a jam that trapped residents on the street in San Francisco.

A three-car ‘standoff’ that trapped a neighborhood

What turned an awkward fender tap into a viral spectacle was the arrival of a third Waymo that rolled into the same narrow corridor and then stopped, unable or unwilling to maneuver around its immobilized counterparts. The result was a three-car tableau that residents quickly dubbed a standoff, with the third vehicle effectively trapped by the first two and human drivers stuck behind the robotic trio. Instead of gliding past congestion, the autonomous fleet had become the congestion, a reversal that cut against the core sales pitch of self-driving mobility.

Video captured by a bystander shows the three cars sitting in a frozen triangle while other vehicles queue up, their drivers honking and inching forward in frustration. The clip, which spread rapidly on social media, shows how the three Waymo self-driving cars in a standoff caused a traffic jam in San Francisco; the footage was shared by a TikTok user who filmed the scene from a nearby building, a detail that surfaced in KGO's coverage of the incident.

Waymo’s explanation: ‘minor contact at low speed’

Once the clip began circulating, Waymo moved quickly to frame the event as a manageable glitch rather than a harbinger of systemic failure. The company acknowledged that two of its driverless cars made what it described as “minor contact at low speed,” language that signaled both an admission of error and an attempt to keep the stakes in perspective. By emphasizing the low-speed nature of the bump, Waymo sought to reassure regulators and riders that the incident was closer to a parking lot scrape than a dangerous crash.

That framing was echoed in follow-up coverage that described how a trio of driverless Waymo cars became locked in a standoff in San Francisco, with the company stressing that the contact between the vehicles was minor and that no injuries were reported. In those accounts, the phrase “minor contact at low speed” appears repeatedly as Waymo’s preferred shorthand for the event, a characterization underscored in a widely shared summary of the standoff that noted the company described the collision as limited to low-speed contact between the cars.

‘We are looking into this further’: the company line

Waymo’s public response followed a familiar script for emerging technologies that misfire in public. The company said it was reviewing the incident, stressing that each unusual scenario provides data that can be used to refine the software and avoid similar problems in the future. The core message was that the system is still learning and that every misstep is an opportunity to improve, a line that has become standard in the self-driving industry whenever a robotaxi behaves in ways that unsettle the public.

In a statement that circulated alongside the viral video, Waymo said, “We are looking into this further, and when we encounter situations like this, we are able to learn from them and make updates that improve the experience.” The company added that it regularly reviews incidents involving its vehicles and that the technology is updated over time as new edge cases emerge on city streets, a stance captured in a report that quoted the company’s pledge to learn from the standoff and noted that such incidents are reviewed regularly as the technology matures.

From viral clip to national talking point

What might have remained a local curiosity quickly turned into a national talking point once the video of the three stalled cars began ricocheting across platforms. The image of multiple driverless vehicles stuck in a kind of robotic staring contest resonated far beyond San Francisco, because it distilled a complex set of concerns about automation into a single, easily shareable scene. For critics of rapid deployment, the standoff became a shorthand for the fear that cities are being used as test beds for unfinished technology.

Coverage framed the event as part of a broader pattern in which a trio of driverless Waymo cars in San Francisco found themselves immobilized in a way that felt both comical and unsettling, with brief news summaries emphasizing how quickly the clip spread and how it fed into existing anxieties about robotaxis. Those accounts described the three vehicles as locked in a standoff that created a scene in San Francisco and reiterated that the cars made “minor contact at low speed,” a phrase that has now become part of the public vocabulary around the incident, as reflected in a widely circulated overview of how the Waymo standoff went viral.

Not the first time: two cars, one truck, and a recall

The San Francisco standoff did not occur in a vacuum. Earlier this year, Waymo had already been forced to confront another embarrassing pattern, when two separate autonomous vehicles struck the same truck that was being towed. That pair of incidents, which involved different cars encountering the same unusual roadway configuration, raised questions about how well the company’s software handles rare but foreseeable scenarios, such as a vehicle being pulled at an angle behind a tow truck.

In response, Waymo issued a recall for its software, a move that signaled both regulatory scrutiny and the company’s own recognition that the underlying logic needed to be adjusted. The recall was framed as a targeted fix aimed at preventing similar collisions with towed vehicles, but it also underscored how a single edge case, repeated twice, can expose blind spots in an otherwise sophisticated system, a point that was spelled out in coverage of how Waymo decided to issue a recall after two self-driving cars hit the same truck that was being towed.

Voluntary software fixes after ‘close calls’

The pattern of learning through public mistakes has continued as Waymo’s fleet expands. After a series of close calls that did not necessarily result in high-profile crashes but still raised safety concerns, the company moved to issue a voluntary software recall for its autonomous vehicles. That step was framed as proactive, a way to address potential problems before they produced more viral footage or, worse, serious injuries, and it highlighted how software updates have become the primary lever for managing risk in a driverless world.

In coverage that paired the recall with other dramatic safety stories, such as skydivers narrowly averting disaster, Waymo was described as issuing a voluntary software recall after close calls involving its vehicles. The company said it would push updated code to its fleet to address the scenarios that had triggered concern, a reminder that in this industry, fixes arrive not as new hardware but as over-the-air updates, a process captured in reports that noted how Waymo is issuing a voluntary software recall for its autonomous cars after close calls and quoted spokesperson Peña on the decision.

Why low-speed glitches still matter for public trust

On paper, a low-speed bump between two Waymos and a resulting traffic jam might seem trivial compared with the carnage caused daily by human drivers. Yet the optics of autonomous cars colliding with each other and then freezing in place carry an outsized weight, because they cut directly against the narrative that machines are more reliable, more predictable, and more efficient than people behind the wheel. When the technology stalls in such a visible way, it invites the public to question not just the specific incident but the entire premise of handing over control to software.

In my view, that is why the San Francisco standoff resonated so widely: it turned abstract concerns about edge cases and training data into a concrete scene that anyone who has ever been stuck in traffic could understand. The fact that the three-car standoff unfolded in a dense urban environment, with residents literally trapped on their own street, underscored how deeply these systems are already woven into daily life and how disruptive it can be when they fail, even at low speed. Each new recall, whether prompted by two cars hitting the same truck or by a string of close calls, reinforces the sense that the technology is still in a live-fire testing phase on public roads.

What regulators and cities will be watching next

For regulators and city officials, the lesson from this episode is not just that software can misjudge a tight street or a towed vehicle, but that public patience has limits when those misjudgments play out in crowded neighborhoods. San Francisco, which has already wrestled with how many robotaxis to allow on its streets and under what conditions, now has another vivid example to point to when it weighs future permits and operating rules. The sight of three immobilized Waymos in a row will likely feature in debates about whether companies should be required to maintain remote human oversight or stricter geofencing in complex areas.

At the same time, the industry’s response, from Waymo’s December statements about learning from the standoff to its voluntary software recall after close calls, shows how quickly companies are willing to tweak their systems once a failure becomes public. I expect regulators to press for more transparency about those updates, including clearer reporting on how often vehicles experience deadlocks, low-speed contact, or other non-catastrophic failures that still disrupt city life. The San Francisco standoff may have been a minor collision on paper, but as a stress test of public trust, it was anything but small.
