Morning Overview

The global semiconductor industry will hit $975 billion in sales this year — a 26% jump fueled entirely by AI

The global semiconductor industry is on track to generate roughly $975 billion in revenue in 2026, according to projections from Gartner and aligned forecasts from the Semiconductor Industry Association. If the number holds, it would represent approximately 26% growth over 2025 and mark the largest single-year dollar increase the chip business has ever recorded. The force behind nearly all of that expansion is artificial intelligence: the processors that train and run large language models, the high-bandwidth memory stacked on top of them, and the networking silicon that ties massive GPU clusters together.

For companies building AI infrastructure, the math is straightforward. Hyperscale cloud operators and enterprises are spending at a pace that has turned data center chips into the industry’s center of gravity. For everyone else in the semiconductor supply chain, the picture is more complicated. Automotive chipmakers, analog component suppliers, and consumer electronics firms are watching capital and engineering talent flow toward AI, raising questions about whether the boom lifts all segments or deepens a divide between AI winners and the rest of the market.

What the filings actually show

The clearest window into AI’s dominance comes from audited financial disclosures, not analyst slide decks. NVIDIA, the largest supplier of AI training and inference chips, reported $130.5 billion in total revenue for its fiscal year ending January 26, 2025, with data center sales accounting for roughly $115.2 billion of that total. Its annual 10-K filing with the SEC identifies spending by cloud providers on GPUs for generative AI workloads as the primary growth engine and warns that revenue is concentrated among a small number of hyperscale customers.

Micron Technology tells a parallel story from the memory side. Its most recent 10-K shows that high-bandwidth memory, the specialized DRAM stacked directly onto AI accelerators, has become one of the company’s fastest-growing product lines. Micron reported record HBM revenue in fiscal 2024 and indicated that demand from data center customers would continue to grow by multiples, not percentages. Together, the NVIDIA and Micron filings confirm that AI is not simply one growth driver among several. It is the category reshaping how the semiconductor industry earns its money.

Both companies also flag the risks that come with that concentration. NVIDIA’s filing warns about U.S. export restrictions that limit shipments of advanced chips to China and other countries, customer concentration among a handful of cloud giants, and the possibility that AI spending could slow if enterprises struggle to show returns on their investments. Micron discloses supply constraints for HBM, long lead times to bring new fabrication capacity online, and the cyclical nature of memory pricing, which can swing sharply when demand softens. These are not speculative worries. They are required disclosures reviewed by independent auditors and the SEC, and they underscore how dependent the current growth cycle is on a narrow set of AI-driven buyers.

The broader landscape beyond two companies

NVIDIA and Micron are the most visible beneficiaries, but the AI spending wave touches a wider cast. Taiwan Semiconductor Manufacturing Company, which fabricates the vast majority of advanced AI chips for NVIDIA, AMD, and Broadcom, reported record quarterly revenue in early 2025 and has committed more than $100 billion to new fabrication plants in Arizona under agreements tied to the U.S. CHIPS and Science Act. AMD has been gaining share in the data center GPU market with its Instinct MI300 series, booking several billion dollars in AI-related revenue in 2024 and projecting continued growth. Broadcom, meanwhile, has become a major player in custom AI accelerators designed for Google, Meta, and other hyperscalers, with its AI revenue surpassing $12 billion in its fiscal year ending October 2024.

On the memory side, SK Hynix has been racing Micron and Samsung to expand HBM production capacity. Samsung, the world’s largest memory chipmaker by volume, has acknowledged that it fell behind on HBM yields and is investing heavily to close the gap. The competition among memory suppliers matters because HBM is one of the tightest bottlenecks in the AI hardware stack. Every new GPU cluster requires stacks of HBM chips, and the lead times to qualify and produce them stretch well beyond a single quarter.

What is less clear is how segments outside AI are performing. The SIA reported that global chip sales reached $627.6 billion in 2024, up about 19% from 2023, with much of that growth concentrated in logic and memory chips tied to data centers. Automotive semiconductors, which boomed during the post-pandemic supply crunch, have cooled as electric vehicle sales growth has moderated in key markets. Analog and industrial chips, sold by companies like Texas Instruments and Analog Devices, have been working through inventory corrections. Smartphone processors are growing modestly, helped by upgrade cycles in markets like India, but not at rates that move the industry needle the way AI does.

Why “entirely” is a stretch, and why it matters

The headline claim that the 26% jump is “fueled entirely by AI” captures the spirit of the moment but overstates what the data can prove. AI-related chips, including data center GPUs, AI accelerators, HBM, and networking silicon, are clearly responsible for the majority of incremental revenue growth. Strip out those categories and the rest of the semiconductor market is growing in the low-to-mid single digits at best, with some segments flat or declining.

But “entirely” implies that no other category contributed at all, which is not accurate. Smartphone application processors are still a multibillion-dollar market that grew modestly in 2025. Automotive chips, while cooling, did not collapse. And the ongoing buildout of 5G infrastructure in parts of Asia and Latin America continues to generate demand for RF and baseband components. The more precise framing is that AI accounts for the overwhelming share of the industry’s growth and virtually all of its margin expansion, while legacy segments are treading water or contracting in real terms.

This distinction matters for investors and policymakers. If AI is the only engine, then any slowdown in AI spending, whether from tighter enterprise budgets, a shift in hyperscaler capital allocation, or new export restrictions, would hit the entire industry’s growth rate disproportionately hard. The semiconductor business has been through boom-bust cycles before, most recently in 2022 and 2023 when pandemic-era demand evaporated and memory prices cratered. The question hanging over the current cycle is whether AI demand is structurally different or simply the latest version of a familiar pattern.

Export controls and geopolitical friction

One of the biggest wildcards for the $975 billion forecast is the evolving landscape of U.S. export controls on advanced semiconductors. The Biden administration imposed sweeping restrictions in October 2022 and tightened them in late 2023 and again in late 2024, limiting the sale of cutting-edge AI chips and chipmaking equipment to China. The Trump administration, which took office in January 2025, has signaled it will maintain or expand those controls, though the specifics remain in flux as of June 2026.

NVIDIA’s 10-K explicitly identifies export restrictions as a material risk, noting that it has already developed different chip variants for the Chinese market that comply with current rules but generate lower revenue per unit. Micron faces a different version of the same problem: China banned certain Micron products from critical infrastructure in 2023, and the company has had to redirect capacity toward other markets. For the industry as a whole, export controls create a ceiling on how much of the AI boom can translate into sales to the world’s second-largest economy. If restrictions tighten further, the $975 billion target becomes harder to hit. If they loosen, it could be conservative.

Where the money goes next

Capital expenditure plans from the major cloud operators offer the best forward-looking indicator of whether AI chip demand will sustain its current trajectory. Microsoft, Google, Amazon, and Meta have signaled more than $200 billion in combined capital spending for 2025 and 2026, with the bulk directed at data center construction and AI infrastructure. Those commitments are already translating into purchase orders for GPUs, HBM, networking chips, and power management components.

But capex plans are not contracts. They can be revised downward if macroeconomic conditions deteriorate, if AI workloads do not generate the expected revenue for cloud customers, or if new architectures reduce the number of chips needed per unit of compute. NVIDIA’s own risk disclosures acknowledge this possibility, noting that hyperscale customers may adjust their purchasing patterns based on their own business conditions.

For semiconductor companies outside the AI core, the strategic question is whether to chase the boom or defend their existing markets. Texas Instruments, for example, has continued to invest in analog chip capacity even as AI dominates the headlines, betting that industrial automation and automotive electrification will generate steady, if less spectacular, demand over the next decade. Intel, which has struggled to compete in AI accelerators, is attempting a turnaround through its foundry services business, hoping to win manufacturing contracts from AI chip designers who need alternatives to TSMC.

The semiconductor industry has always been cyclical, and the current AI-driven surge will eventually moderate. What makes this cycle different is the sheer scale of the spending and the speed at which it has concentrated in a single application category. Whether that concentration proves to be a foundation for sustained growth or a vulnerability that amplifies the next downturn is the question that will define the industry’s trajectory well beyond 2026.


*This article was researched with the help of AI, with human editors creating the final content.*