Image Credit: Brian Snelson - CC BY 2.0/Wiki Commons

Modern cars are rolling computers packed with sensors, cameras, microphones, and always-on connectivity, yet the people who own them are being told that peeking under the digital hood could be treated like a serious crime. As automakers lobby to lock down access to vehicle data and software, they are leaning on broad anti-hacking laws that already threaten security researchers and independent repairers with years behind bars. The result is a collision between public safety, privacy, and the basic idea that you should be allowed to understand and fix the machine you paid for.

At the center of this fight is a push to treat routine access to car data as “unauthorized” computer intrusion, punishable under statutes that carry 3-to-5-year prison terms for a first offense and far more if prosecutors stack charges. I see a growing pattern in which industry groups, from automakers to smartphone giants, publicly celebrate innovation and sustainability while quietly backing legal frameworks that make it risky, or even criminal, for outsiders to audit, repair, or improve the products that shape daily life.

How car data turned into a criminal-law problem

For decades, tinkering with a car meant swapping parts and reading mechanical gauges, not navigating criminal exposure. That changed once vehicles became networked devices, with engine control units, telematics modules, and infotainment systems all tied into the same digital nervous system. When a modern SUV logs your location history, driving behavior, and biometric clues from driver-assistance cameras, accessing that information is no longer just a technical question; it is framed as a potential violation of computer crime laws that were written long before cars went online.

The legal hook is the Computer Fraud and Abuse Act, or CFAA, which prohibits intentionally accessing a computer “without authorization” or in a way that “exceeds authorized access,” and which has been stretched to cover nearly every aspect of computer activity. Because modern vehicles qualify as “protected computers” under federal law, connecting to a car’s internal network, pulling diagnostic data, or probing a telematics API can be portrayed as hacking, even when the owner consents. That is the legal backdrop automakers are leaning on when they warn that independent access to vehicle data could trigger criminal penalties measured in years, not fines.
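To make concrete just how mundane “pulling diagnostic data” usually is, consider a standard OBD-II query. The sketch below decodes an engine-RPM reading using the public SAE J1979 encoding; the response string is an illustrative example of what an ELM327-style adapter returns, not output from any specific vehicle.

```python
# Decode a standard SAE J1979 (OBD-II) engine-RPM response.
# The request "010C" asks mode 01 (current data) for PID 0x0C (engine RPM).
# A compliant ECU replies "41 0C A B", where RPM = (256*A + B) / 4.

def decode_rpm(response: str) -> float:
    """Decode engine RPM from a mode-01 PID-0C response like '41 0C 1A F8'."""
    data = bytes.fromhex(response.replace(" ", ""))
    if len(data) < 4 or data[0] != 0x41 or data[1] != 0x0C:
        raise ValueError(f"not a mode-01 PID-0C response: {response!r}")
    return (data[2] * 256 + data[3]) / 4

# Example: "41 0C 1A F8" encodes (0x1A*256 + 0xF8) / 4 = 1726 RPM.
print(decode_rpm("41 0C 1A F8"))  # → 1726.0
```

This is the level of access at issue: reading a few bytes from a standardized port that every car sold in the US has carried since 1996.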

The CFAA’s vague language is a gift to automakers

What makes this landscape so fraught is not just that the CFAA exists, but that its key terms are so poorly defined. The statute never clearly explains what “without authorization” means in practice, and it leaves “exceeds authorized access” open to interpretation by prosecutors and judges. That ambiguity gives powerful companies room to argue that any access they do not explicitly bless, even by the owner of the device, is a potential crime.

Under the statute’s own wording, the CFAA defines “exceeds authorized access” as knowingly accessing a computer with authorization and using that access to obtain or alter information that the accessor is not entitled to obtain or alter. Courts have split over how far that language reaches, which means the same behavior could be treated as benign in one jurisdiction and criminal in another. For automakers, that legal gray zone is an opportunity: by writing restrictive terms of service and warning labels, they can argue that independent diagnostics, data exports, or security testing “exceed” what is allowed, then point to the CFAA’s felony penalties as a deterrent.

From security research to felony territory

In practice, the CFAA has become a blunt instrument that can turn good-faith security work into a prosecutable offense. The law makes it a federal crime to gain unauthorized access to “protected” computers, including systems used in interstate commerce, with the intent to defraud or do damage. That category easily encompasses connected vehicles, their backend servers, and the dealership portals that manage them, which means probing for vulnerabilities in a car’s software stack can be framed as a federal hacking case.

Legal guides spell out that the Computer Fraud and Abuse Act (CFAA) makes it a crime to access protected computers without authorization with the intent to defraud or cause damage, and that penalties escalate quickly when prosecutors allege broader schemes. For a security researcher who discovers a flaw in a vehicle’s remote unlock feature or telematics API, the threat is not hypothetical. If an automaker decides that the testing was not “authorized,” it can refer the matter to law enforcement, and the same tools that protect against genuine cyberattacks can be turned on the very people trying to keep drivers safe.

Connected cars are genuinely vulnerable

Automakers often justify aggressive legal postures by pointing to real and serious cyber risks. Modern vehicles are part of the broader Internet of Things, with cellular connections, Wi-Fi hotspots, Bluetooth links, and over-the-air update channels that all expand the attack surface. When those pathways are not secured, attackers can move from a cloud service or mobile app into the vehicle network itself, where critical safety systems live.

Technical analyses of connected car cyber security warn that hackers who gain unauthorized remote access to the vehicle network can compromise critical safety systems, putting at risk not only drivers’ personal information but their physical safety as well. That reality makes it all the more important to encourage, not chill, independent testing. Yet instead of building clear safe harbors for good-faith research, automakers are reaching for the harshest tools in the criminal code, effectively telling outside experts that if they touch the code, they could be treated like intruders.

When a dealership portal becomes a break-in tool

The stakes are not theoretical. In one documented incident, an online dealership system exposed both car and personal data in ways that allowed anyone with access to remotely unlock vehicles. The portal tied together customer records, vehicle identifiers, and remote control functions in a single web interface, and a flaw in its access controls meant that outsiders could see information they were never meant to see and trigger commands they were never meant to send.

Reporting on that breach described how an online portal exposed car and personal data and allowed anyone with access to remotely break into a car. That kind of failure underscores why independent scrutiny is essential. Yet under the current legal regime, the people most likely to find and responsibly disclose such flaws must weigh the public benefit against the possibility that a company will accuse them of “unauthorized access” and invite prosecutors to treat their work as a felony.

Automakers’ lobbying playbook: from emissions to data

To understand how the industry is likely to wield these laws, it helps to look at how automakers already behave in other regulatory fights. When states have tried to tighten emissions rules or accelerate the shift to electric vehicles, major manufacturers have not simply complied. They have organized through trade groups, hired lobbyists, and pushed for carve-outs that protect their business models, even when that means slowing down climate policy.

One prominent example is the Alliance for Automotive Innovation, an industry group representing a large number of automobile manufacturers, which generally opposed the Advanced Clean Cars II policy and supported delays and other measures to weaken it. That track record matters in the data-access debate. If the same organizations that fought stricter emissions standards are now warning lawmakers about the dangers of letting owners or independent shops tap into vehicle data, it is reasonable to ask whether the primary concern is safety or control.

Right to repair collides with car data lockouts

The push to criminalize access to car data sits squarely inside the broader right to repair movement. For years, independent mechanics and consumer advocates have argued that owners should have the tools, information, and software access needed to fix their own devices, from tractors to smartphones. Automakers, like other manufacturers, have often responded by locking down diagnostics, encrypting firmware, and insisting that only authorized dealers can safely service complex systems.

Consumer groups have documented how some companies publicly support repair in one context while quietly backing trade associations that lobby against the Right to Repair in another. One report notes that while Google has advocated for repair-friendly policies in some venues, it has also been linked to industry associations lobbying against the Right to Repair. The same pattern is emerging in the automotive world, where manufacturers talk up sustainability and customer choice while supporting legal frameworks that make it risky for independent shops to plug into a car’s data bus without fear of triggering a hacking allegation.

Massachusetts shows what access can look like

There is a real-world counterexample to the idea that only manufacturers should control vehicle data. In Massachusetts, voters have repeatedly backed measures that give car owners and independent repair shops access to the information they need. More than a decade ago, the state’s electorate overwhelmingly approved a ballot initiative that required automakers to share diagnostic data with non-dealer mechanics, a move that set a national benchmark for repair access.

Advocates point out that Massachusetts voters have been at the forefront of Right to Repair, starting in 2012, when voters approved a ballot measure, 87.7% to 12.3%, to ensure that car owners and independent shops could access repair information, including through a car’s diagnostic port. That kind of mandate directly conflicts with the idea that tapping into a vehicle’s data systems should be treated as a potential felony. It also shows that when the public is given a clear choice, they tend to side with access and competition rather than exclusive control by manufacturers.

Cybersecurity law is expanding around connected vehicles

Automakers are not operating in a vacuum. Around the world, regulators are building new legal frameworks for intelligent connected vehicles that address data security, privacy, and safety. These rules often require manufacturers to implement robust technical safeguards, conduct regular risk assessments, and respond quickly to vulnerabilities. At the same time, they can create new obligations for anyone who touches vehicle data, from cloud providers to app developers and, potentially, independent researchers.

Legal analyses of intelligent connected vehicles describe how failure to secure vehicle data and communication channels can lead to unauthorized access, data breaches, and potential harm to a vehicle’s occupants, and they highlight the critical need for stronger security measures in connected vehicles. Those same frameworks can be interpreted in ways that either welcome independent oversight or treat it as a threat. When automakers argue that only their own teams should be allowed to test or access vehicle systems, they are effectively asking lawmakers to choose the latter path and to back that choice with criminal penalties.

How broad cyber bills fold in car hacking

Beyond the CFAA, broader cyber legislation has started to fold vehicle-related intrusions into the same category as traditional computer crimes. Bills that encourage information sharing between companies and the government often define “cybersecurity threats” in sweeping terms, covering any act that could harm networks, data, or critical infrastructure. When cars are treated as nodes on that infrastructure, accessing their systems without a manufacturer’s blessing can be swept into the same bucket as attacking a corporate data center.

One legislative summary of cyber intelligence sharing provisions, for example, ties covered activity to a crime under a federal or state law involving violations of federal computer statutes, including the Computer Fraud and Abuse Act of 1986 (Public Law 99–474). When automakers lobby in these spaces, they are not just asking for better defenses against genuine attackers. They are also shaping how “unauthorized” access is defined in contexts that can reach all the way down to a mechanic’s scan tool or a researcher’s test script.

What counts as a car crime under the CFAA

To see how easily routine activity can be recast as criminal, it helps to look at how practitioners describe the list of offenses under the CFAA. Legal overviews break the statute into categories that sound straightforward on paper, such as unauthorized access, theft of information, and damage to protected computers. In a world where vehicles are computers, each of those buckets can map onto behavior that used to be considered normal tinkering or diagnostics.

One summary of the list of criminal offenses under the CFAA highlights unauthorized access to computers and theft of information as core prohibitions. If a driver uses a third-party tool to pull detailed logs from their car’s control units, a manufacturer could argue that this is unauthorized access. If a researcher downloads firmware images to analyze how a braking system works, that could be framed as theft of information. The same legal categories that target malicious intrusions into corporate networks can, with a few interpretive steps, be pointed at people who are simply trying to understand or improve the machines they rely on.

Why 3 to 5 years in prison is not a hypothetical threat

When automakers warn that accessing vehicle data without their permission could lead to 3 to 5 years in prison, they are not inventing those numbers out of thin air. The CFAA’s penalty structure sets baseline sentences for first-time offenses and allows for enhancements when prosecutors allege broader schemes, financial loss, or risks to public safety. In the context of a connected car, where a vulnerability could theoretically affect thousands of vehicles at once, it is easy for an aggressive prosecutor to argue that the stakes justify serious time.

Because the statute treats protected computers used in interstate commerce as especially sensitive, and because connected vehicles clearly fall into that category, the same conduct that might once have been handled as a contractual dispute or a civil copyright issue can now be escalated into a federal hacking case. When industry groups lobby to keep access tightly controlled, they are not just defending their intellectual property. They are also preserving the option to frame unwanted access as a criminal matter, backed by the threat of multi-year sentences that can scare off independent shops, researchers, and even curious owners.

Who really benefits from criminalizing access

From a public-interest perspective, the question is not whether car systems should be secure. They absolutely should. The question is who gets to test, audit, and repair those systems, and under what legal conditions. When access is limited to manufacturers and their chosen partners, the public must simply trust that the job is being done well, even as real-world breaches and vulnerabilities keep surfacing.

By contrast, a framework that protects good-faith research and owner-directed repair would treat independent access as a feature of a healthy ecosystem, not a crime. That would mean narrowing the CFAA’s vague language, clarifying that consent from the device owner matters, and building explicit safe harbors for security testing that does not cause harm. Until that happens, automakers will continue to operate in a world where they can invoke powerful computer crime laws to keep outsiders away from the data and code that increasingly define what a car is, and where accessing your own vehicle’s digital innards can carry the threat of 3 to 5 years in prison.

More from MorningOverview