A new lawsuit alleges that a Tesla Cybertruck operating in Full Self-Driving (FSD) mode attempted to steer toward the edge of an overpass, forcing the driver to grab the wheel to avoid leaving the roadway. The case, filed in California, adds to litigation challenging Tesla’s advanced driver-assistance technology. It also arrives as federal regulators collect consumer complaints and crash reports involving vehicles equipped with driver-assistance systems, including Tesla models.
What the Overpass Lawsuit Alleges
The plaintiff claims the Cybertruck’s FSD system failed to properly handle a highway merge scenario, directing the vehicle toward the edge of an elevated roadway instead of maintaining a safe lane position. According to the suit, the driver had to physically override the system to prevent the truck from leaving the road surface entirely. The complaint targets Tesla’s software design, arguing that FSD was not equipped to safely manage the specific geometry and traffic conditions of the overpass.
The filing describes a sequence in which the Cybertruck, under computer control, began to follow what the system appeared to interpret as a through lane, even as the actual roadway curved away from the guardrail. The driver allegedly received no clear warning that the system was confused before the steering input veered toward the edge. Only when the driver felt the lateral pull and saw the truck drifting did he disengage FSD and counter-steer back into the proper lane.
This is not a minor software hiccup. Overpass and elevated roadway scenarios present a distinct challenge for camera-based driver-assistance systems because lane markings, barriers, and road edges can appear ambiguous to computer vision at speed. A system that misreads these cues on flat ground might drift into a neighboring lane. On an overpass, the same error could send a vehicle off a bridge. The Cybertruck’s size and weight compound the stakes: a heavier vehicle that leaves an elevated roadway carries far more kinetic energy at impact than a smaller sedan in the same scenario.
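To put the weight difference in rough numbers, the sketch below applies the standard kinetic-energy formula (KE = ½mv²) to two illustrative curb weights at a typical highway speed. The masses and speed are assumptions chosen for illustration, not figures from the complaint or from Tesla.

```python
# Rough kinetic-energy comparison illustrating why a heavier vehicle leaving
# an elevated roadway carries more energy at impact than a smaller sedan.
# Curb weights and the 65 mph speed are illustrative assumptions only.

def kinetic_energy_joules(mass_kg: float, speed_mps: float) -> float:
    """Return kinetic energy in joules: KE = 0.5 * m * v^2."""
    return 0.5 * mass_kg * speed_mps ** 2

MPH_TO_MPS = 0.44704
speed = 65 * MPH_TO_MPS   # ~29 m/s, a typical highway speed (assumption)

truck_kg = 3100           # approximate heavy-pickup curb weight (assumption)
sedan_kg = 1600           # approximate midsize-sedan curb weight (assumption)

ke_truck = kinetic_energy_joules(truck_kg, speed)
ke_sedan = kinetic_energy_joules(sedan_kg, speed)

print(f"Pickup: {ke_truck / 1000:.0f} kJ")
print(f"Sedan:  {ke_sedan / 1000:.0f} kJ")
print(f"Ratio:  {ke_truck / ke_sedan:.1f}x")
```

Because speed is the same in both cases, the energy ratio tracks the mass ratio: under these assumed weights, the pickup arrives at a barrier or drop-off with roughly twice the kinetic energy of the sedan.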
The lawsuit also points to the Cybertruck’s angular bodywork and wide track as factors that could make minor steering deviations more dangerous. With less margin between the truck’s outer edge and roadside barriers, the complaint argues, even brief miscalculations by FSD can translate into immediate contact with guardrails or drop-offs. Plaintiffs contend that Tesla should have adapted its software testing and safety margins to account for the truck’s dimensions before enabling FSD on elevated highways.
Tesla has not publicly commented on the specific allegations. The company has consistently maintained that FSD requires active driver supervision and that the driver remains responsible for the vehicle at all times. Critics argue that Tesla’s marketing and the system’s name create an expectation gap, leading drivers to trust the software more than its actual capabilities warrant. The overpass case will test whether a court believes that gap contributed to the near-miss described in the complaint.
Federal Complaint and Recall Records
The lawsuit does not exist in a vacuum. The National Highway Traffic Safety Administration maintains a Cybertruck vehicle page that aggregates owner complaints, recall notices, and any open investigations tied to the vehicle, including reports that may involve steering and other control-related concerns.
NHTSA’s complaint system is reactive by design. Owners file reports, and the agency uses patterns in those filings to decide whether to open a formal investigation. For the Cybertruck, which only began deliveries in late 2023, the complaint record is still building, and the lawsuit’s allegations are being raised as regulators continue to collect and review consumer reports. The case fits within a cluster of reported software and control issues that regulators are monitoring.
At the same time, the Cybertruck’s recall history remains relatively short, reflecting its limited time on the market. No recall specifically targeting FSD behavior in the Cybertruck has been issued as of the available NHTSA records. That absence does not mean the agency has cleared the system. It means the threshold for a formal defect determination or manufacturer-initiated recall has not yet been crossed. The distinction matters because it shapes how courts and regulators evaluate Tesla’s liability: a recall signals an acknowledged defect, while an open complaint record leaves the question unresolved and shifts more fact-finding to litigation.
Attorneys in cases like the overpass suit often mine the complaint database for patterns that can support claims of a systemic problem. Even without a recall, a series of similar owner reports can help plaintiffs argue that Tesla knew or should have known about the risk and failed to adequately warn drivers or adjust the software. Tesla, in turn, typically emphasizes the relatively small number of complaints compared with total miles driven and the role of driver error in many incidents.
How Federal Crash Reporting Rules Apply
Under NHTSA’s Standing General Order on crash reporting, manufacturers must report certain crashes involving vehicles equipped with advanced driver-assistance systems or higher-level automated driving systems within defined timeframes, subject to the order’s definitions and reporting criteria.
The Standing General Order covers definitions of what qualifies as a reportable crash, what data manufacturers must provide, and how NHTSA uses that information to identify safety trends. For Tesla, which has the largest fleet of vehicles with active driver-assistance features on U.S. roads, this reporting obligation generates a significant and growing dataset. The agency uses it to spot patterns that might not be visible from individual complaints alone, such as recurring scenarios where driver-assistance systems appear to struggle.
The overpass lawsuit raises a question about the boundary of this reporting framework. If the driver intervened and no collision occurred, the incident may not meet the Standing General Order’s definition of a reportable crash. That gap is significant. Near-miss events, where the software fails but the human catches the error, can be invisible to the federal data system. Plaintiffs’ attorneys in cases like this one often argue that the true failure rate of driver-assistance systems is higher than official crash data suggests, precisely because successful human interventions mask software errors.
In court, this dynamic can cut both ways. Tesla can point to crash statistics and regulatory filings that show relatively few confirmed FSD-related collisions compared with the miles its vehicles travel. Plaintiffs can respond that those numbers undercount the most important category of events: the ones where the system would have failed catastrophically but for human correction. The overpass allegation, if substantiated, falls squarely into that blind spot.
The Shadow of the Autopilot Verdict
The overpass case lands in a legal environment already shaped by a major jury decision. In a separate case, a jury ordered Tesla to pay damages following a fatal crash involving the company’s Autopilot system, according to reporting from the Associated Press. That verdict showed that a jury was willing to assign substantial liability to Tesla for the design and marketing of its driver-assistance technology.
The two cases involve different Tesla models, different software versions, and different crash circumstances. But the legal theory connecting them is the same: Tesla sold a driver-assistance system that did not perform safely, and the company’s representations about the technology’s capabilities contributed to the harm. Plaintiffs in the overpass suit will almost certainly point to the earlier verdict as evidence that juries are receptive to these arguments and are prepared to award significant damages when they conclude the technology behaved unpredictably.
Tesla’s defense in prior cases has centered on driver responsibility. The company argues that its systems are tools meant to assist, not replace, an attentive human driver. Terms of use and on-screen warnings reinforce this position, reminding drivers to keep their hands on the wheel and eyes on the road. But juries have shown willingness to look past those disclaimers when the evidence suggests the system behaved in ways a reasonable driver would not expect, particularly when the system’s name and marketing imply a level of autonomy that exceeds its actual performance.
The Autopilot verdict also matters because it gives plaintiffs’ lawyers a roadmap. Expert testimony about software design choices, testing protocols, and internal discussions over naming and marketing will likely reappear in the Cybertruck litigation. The central question remains whether Tesla exercised reasonable care in deploying and branding a system that can directly control steering and acceleration on public roads.
Why the Cybertruck Raises Distinct Concerns
Most of the high-profile Autopilot and FSD litigation to date has involved Tesla’s Model S, Model 3, Model X, or Model Y sedans and SUVs. The Cybertruck, by contrast, is a much larger, heavier pickup with a radically different body structure and a different set of typical use cases. Drivers are more likely to use it for towing, hauling, and off-road or semi-rural travel, all of which can present edge cases for driver-assistance software tuned primarily on highway and urban data.
The truck’s stainless-steel exterior and sharp-edged design also change how it interacts with the environment. Its geometry can alter how cameras perceive lane lines and roadside objects, especially under glare or low-angle sunlight. In an overpass context, where visual cues are already complex, these factors may increase the risk of misinterpretation by vision-based systems. Plaintiffs in the new suit argue that Tesla did not do enough to validate FSD specifically on the Cybertruck before enabling it on elevated highways.
There is also a broader policy concern. As more heavy vehicles adopt advanced driver-assistance features, the consequences of software mistakes grow. A misjudged steering input in a compact car can be serious; the same error in a large pickup at highway speed can be catastrophic for occupants and for anyone below an overpass or in adjacent lanes. Regulators and courts are still working out how to weigh those risks when the software is optional, branded as a convenience feature, yet marketed as a step toward full autonomy.
The Cybertruck lawsuit, like the earlier Autopilot case, will not by itself settle those questions. But it will add another detailed factual record to the debate over how far companies can go in promising automated driving capabilities before the law holds them fully accountable for what their systems actually do on the road. Whatever the outcome, the allegations that an FSD-equipped pickup tried to steer itself off an overpass underscore the stakes of getting that balance right.
*This article was researched with the help of AI, with human editors creating the final content.