Morning Overview

Memory chip makers are diverting 70% of production to AI data centers — starving the consumer market and pushing electricity demand to records

In the first half of 2026, roughly seven out of every ten advanced memory chips rolling off production lines at Samsung, SK Hynix, and Micron are headed to the same destination: AI data centers. That allocation, projected by semiconductor research firm TrendForce and reported by The Wall Street Journal, marks a dramatic tilt from just three years ago, when consumer devices claimed the majority of high-end DRAM and flash output. The result is a supply squeeze already visible in tighter inventory for smartphones, laptops, and vehicles, paired with a separate but deeply connected strain on the U.S. power grid as those same data centers push national electricity consumption toward record territory.

Where the chips are going

The product at the center of the shift is High Bandwidth Memory, or HBM, the specialized DRAM stacked in layers and bonded directly to AI accelerators like NVIDIA’s H100 and B200 GPUs. A single AI server rack can require several times more memory capacity than a traditional cloud computing rack, and the three major memory manufacturers have retooled fabrication lines accordingly. SK Hynix, which supplies the bulk of NVIDIA’s HBM, told investors during its Q1 2026 earnings call that HBM production had been sold out through the end of the year. Samsung and Micron have made similar disclosures, with Micron’s CEO Sanjay Mehrotra noting in the company’s most recent quarterly report that data center revenue now accounts for a record share of total sales.

The downstream effect is straightforward: every wafer allocated to HBM is a wafer not available for the DDR5 modules in laptops, the LPDDR5X packages in flagship phones, or the automotive-grade memory chips in next-generation vehicles. Industry analysts at TrendForce have warned that consumer DRAM contract prices could rise by double-digit percentages in the second half of 2026 if the allocation imbalance persists. Automakers, which already endured chip shortages during the pandemic, face a repeat scenario with a different bottleneck.

The electricity bill for AI

The chips tell half the story. The data centers absorbing them need enormous amounts of power to run, and the U.S. grid is feeling the load. According to the U.S. Energy Information Administration, electricity demand growth from 2020 through 2025 outpaced the trend of the prior 15 years, with data center construction identified as a primary driver. The agency's Short-Term Energy Outlook forecast tables project that total U.S. electricity retail sales in 2026 will exceed 4,100 billion kilowatt-hours, surpassing the previous annual record set in 2022, a milestone that reflects not just data centers but also manufacturing reshoring and the electrification of transportation.

Data centers, however, are the fastest-growing slice. Microsoft has committed more than $80 billion to AI-ready data center infrastructure in its current fiscal year. Meta, Amazon, and Google have each announced multibillion-dollar expansions. Many of these facilities are clustered in Virginia’s Loudoun County, central Texas, and parts of the Midwest, where local utilities are scrambling to add generation capacity.

In a higher-demand-growth scenario, the EIA projects that fossil fuel generation could rise faster than expected because renewable energy and nuclear capacity alone cannot keep pace with the load these facilities add. The agency is careful to frame this as one possible path, not a certainty, but the direction is clear: AI infrastructure is pulling the grid toward higher total output and, in some regions, higher carbon intensity.

What remains genuinely uncertain

The 70 percent allocation figure, while widely cited, is an analyst projection, not an audited production record. None of the three major memory makers has publicly confirmed that exact ratio, and the actual split could shift if demand for large language models plateaus or if new fabrication capacity comes online ahead of schedule. SK Hynix has broken ground on a new HBM plant in Cheongju, South Korea, and Micron is expanding its Hiroshima facility, but neither project will meaningfully increase output before late 2027.

The price impact on consumer devices is also hard to pin down. Tighter chip supply does not translate one-to-one into higher retail prices; manufacturers hold inventory buffers, negotiate long-term contracts, and can substitute older chip architectures in some product lines. No institutional research in the current evidence base puts a specific dollar figure on how much more a phone or laptop will cost because of the AI reallocation. The pressure is real, but the magnitude at the checkout counter depends on variables that differ by brand and product category.

On the electricity side, the EIA's projections rest on assumptions about data center build-out timelines, cooling efficiency improvements, and the pace of renewable energy permitting. If hyperscale operators accelerate their own solar and wind procurement, or if next-generation chips deliver more computation per watt, the fossil fuel generation increase could be smaller than modeled. The agency publishes updated STEO tables monthly, giving analysts a way to check whether reality is following the baseline or the higher-demand scenario.

Why the two pressures reinforce each other

What makes this moment distinct is that the chip squeeze and the electricity strain share a single cause. Every new AI data center that comes online needs both more memory and more power, and the buildout is happening on a timeline that outpaces expansion in both chip fabrication and grid capacity. More AI servers require more HBM, which tightens supply for consumer devices. Those same servers draw more electricity, which strains regional grids and can push rates higher for households and businesses nearby.

The reinforcing loop also means that a slowdown in AI investment would ease both pressures simultaneously. If spending on large language models and generative AI applications decelerates, memory allocation would rebalance toward consumer products and electricity demand growth would moderate. But as of mid-2026, every major hyperscaler is still increasing capital expenditure, and semiconductor executives on recent earnings calls have described AI demand as “insatiable.”

How regional grids and consumer prices may shift through late 2026

For buyers shopping for new smartphones, laptops, or vehicles later in 2026, the chip reallocation means inventory may be tighter and prices modestly higher on models that use the latest memory components. On the energy side, households in regions with heavy data center construction, particularly Northern Virginia, central Texas, and parts of the Ohio River Valley, may see upward pressure on electricity rates as utilities invest in generation and transmission to serve new industrial load. Both dynamics trace back to the same accelerating AI buildout, and both will be shaped by whether that buildout continues at its current pace or begins to level off in the months ahead.

*This article was researched with the help of AI, with human editors creating the final content.*