The United States Army is grappling with a problem that sounds almost paradoxical for a force built on intelligence superiority: it has too much information and not enough capacity to process it. Senior officers are now openly acknowledging that the volume of sensor data flooding modern battlefields has outpaced human ability to sort, interpret, and act on it. The service’s response is a deliberate, contract-backed push to hand much of that cognitive burden to artificial intelligence and automated data-fusion systems, a shift that carries both operational promise and unresolved risk.
Sensors Everywhere, Answers Nowhere Fast Enough
Col. Jeff Pickler of the 2nd Multi-Domain Task Force framed the challenge in blunt terms at a Dynamic Front media roundtable, telling reporters that the modern battlefield is “swimming in sensors” and that “we are drowning in data.” His comments, reported in a recent Business Insider piece, are striking because they come not from a think tank but from a field-grade officer whose unit is designed to integrate capabilities across air, land, sea, space, and cyber. Pickler’s assessment suggests that the Army’s own success in deploying more sensors, from drones to electronic warfare systems, has created a bottleneck where raw feeds accumulate faster than staff sections can turn them into targeting decisions or coherent situational awareness.
That bottleneck matters because speed is now the defining currency of modern combat. An adversary that can close the loop from detection to action in minutes holds a lethal advantage over one still sifting through video feeds, chat logs, and radar tracks. Pickler has positioned AI and automation as the only viable path to process sensor data at the scale and tempo operations demand. In his framing, machine assistance is not a distant aspiration but an immediate operational necessity, and the Army is already funding tools intended to triage, prioritize, and present data in ways humans can actually use under fire.
NGC2 and the Architecture Behind the Fix
The Army’s primary vehicle for addressing this overload is its Next Generation Command and Control program, known as NGC2. Official descriptions portray NGC2 as a data-centric architecture meant to fuse intelligence, electronic warfare, and other operational information into a single, shared picture. The stated goals are direct: enable commanders to make “more, faster, and better decisions” and shorten the time from sensor detection to a weapons engagement or other response. In practice, that means building an ecosystem where disparate sensors and systems feed into a common data layer that software can rapidly sort, correlate, and push to the right decision-makers.
What makes NGC2 different from earlier command-and-control modernization efforts is its explicit focus on the data layer rather than on individual hardware platforms or isolated software suites. Previous generations of Army C2 systems often treated data as something that moved between stovepiped programs, each with its own formats and interfaces. NGC2 instead treats shared, machine-readable data as the foundation on which every decision tool sits. That architectural choice has real consequences: it implies that once the underlying data fabric is in place, new applications (whether for targeting, logistics, or force protection) can plug in and benefit from the same common operating picture without rebuilding bespoke connections every time.
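The plug-in pattern described above can be illustrated with a toy publish/subscribe data layer. NGC2’s actual design is not public, so everything here (the `Track` record, the `DataLayer` class, the sensor and consumer names) is a hypothetical sketch of the general idea, not the program’s architecture: sensors publish normalized records to a shared layer, and new consumer applications subscribe without building bespoke connections to each sensor.

```python
from collections import defaultdict
from typing import Callable
from dataclasses import dataclass


@dataclass
class Track:
    """A normalized sensor observation shared on the data layer (illustrative)."""
    source: str        # e.g. "radar", "drone" -- hypothetical sensor names
    kind: str          # e.g. "air", "ground"
    position: tuple    # (lat, lon)
    confidence: float


class DataLayer:
    """Minimal common data layer: sensors publish tracks in one shared
    format; consumer applications subscribe by track kind."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Track], None]]] = defaultdict(list)
        self.tracks: list[Track] = []   # the shared operating picture

    def subscribe(self, kind: str, handler: Callable[[Track], None]) -> None:
        self._subscribers[kind].append(handler)

    def publish(self, track: Track) -> None:
        self.tracks.append(track)
        for handler in self._subscribers[track.kind]:
            handler(track)


# Two independent consumers plug into the same layer; neither needs a
# direct interface to any individual sensor.
layer = DataLayer()
targeting_queue: list[Track] = []
layer.subscribe("air", targeting_queue.append)

layer.publish(Track("radar", "air", (21.3, -157.9), 0.9))
layer.publish(Track("drone", "ground", (21.4, -158.0), 0.7))

print(len(targeting_queue))  # only the "air" track reaches the targeting consumer
print(len(layer.tracks))     # both tracks remain in the shared picture
```

The design point the sketch makes is the one in the paragraph above: because the data, not the sensor, is the interface, adding a logistics or force-protection consumer is just another `subscribe` call against the same layer.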
Lockheed Martin’s $26 Million Prototype Contract
Concrete money is now flowing behind the concept. The Army has awarded a Lockheed Martin-led team a $26 million Other Transaction Authority contract to build an integrated data layer for the 25th Infantry Division as part of NGC2 prototyping. The selection of the 25th, based in Hawaii and oriented toward the Indo-Pacific, is notable: the region’s vast distances and dispersed islands demand robust networking and the ability to stitch together information from sensors spread across land, sea, air, and space. A successful prototype there would be a strong proof of concept for broader adoption across the force.
The choice to use an Other Transaction Authority mechanism is also telling. OTAs allow the Army to bypass some of the more cumbersome federal acquisition rules that can slow traditional contracts, accelerating experimentation and fielding. By putting NGC2’s integrated data layer on this faster track, the service is signaling that it sees data fusion as urgent, not merely a long-term research project. Whether a $26 million prototype can scale to a service-wide architecture remains uncertain, but the deal represents a tangible commitment rather than just another strategy slide. It also gives the Army a structured way to test how well a data-centric approach actually reduces sensor-to-shooter timelines in a real operational division.
The Cognitive Hierarchy Problem AI Alone Cannot Solve
Technology investment, however, does not automatically translate into better command decisions. A recent analysis from West Point’s Modern War Institute argues that many leaders still approach decision-making by relying heavily on manual aggregation of information, spending disproportionate time collecting and formatting data instead of thinking, visualizing, and understanding. The article frames this as a failure to “ascend the cognitive hierarchy,” the conceptual ladder that runs from raw data to information, knowledge, and ultimately understanding. Automating the bottom rungs of that ladder through AI is necessary but not sufficient if commanders remain mentally anchored in the data layer.
This critique cuts against a common assumption in defense technology circles: that faster data processing will automatically yield faster and better decisions. If officers have spent their careers rewarded for exquisite staff products and detailed PowerPoint slides, simply handing them a cleaner, AI-curated feed may not change how they think under pressure. The real test for NGC2 and similar efforts will be whether the Army pairs technology fielding with changes in doctrine, training, and culture that encourage leaders to trust automated aggregation and spend more of their cognitive bandwidth on framing problems, weighing risk, and exercising judgment. Without that parallel shift, there is a real danger that officers will simply use new tools to generate more data products, remaining busy in the weeds rather than stepping back to command.
What Faster Sensor-to-Shooter Loops Mean Beyond the Battlefield
The Army’s push to shrink sensor-to-shooter timelines through AI-driven data fusion carries implications well beyond any single battlefield. Defense contractors across the industrial base are watching NGC2 closely, because a successful data-centric architecture could become a template for how future joint and coalition systems share information. If the integrated data layer being tested with the 25th Infantry Division proves effective, it is likely to influence requirements for other Army formations and shape how partners and allies design their own networks to plug into U.S.-led operations. In that sense, the program is as much about interoperability and standard setting as it is about internal efficiency.
There are also broader strategic and ethical questions lurking behind the drive for speed. Compressing the time between detection and action can increase deterrence by making it harder for adversaries to exploit gaps in U.S. awareness, but it also raises concerns about escalation and error if humans are effectively supervising, rather than actively controlling, rapid machine-processed engagements. As the Army races to keep up with the flood of battlefield data, it will have to balance the imperative for faster loops with safeguards that preserve meaningful human judgment. The outcome of NGC2’s experiments (in architecture, contracting, and command culture) will help determine whether AI-enabled command and control becomes a stabilizing force for U.S. operations or a new source of friction in an already volatile security environment.
*This article was researched with the help of AI, with human editors creating the final content.