Image Credit: AMD Global - CC BY-SA 2.0/Wiki Commons

Artificial intelligence is no longer a side project for the chip industry; it is the main event. As companies race to build smarter assistants, autonomous vehicles and generative models, the cost of the silicon that powers those systems is soaring even as demand accelerates faster than chipmakers can expand capacity. At the center of that tension is AMD CEO Lisa Su, who is arguing that AI demand is “going through the roof” at the very moment individual accelerators are selling for tens of thousands of dollars.

Her message is blunt: the world is still in the early stages of an AI buildout that could reach billions of users, and the infrastructure bill will be enormous. I see her recent comments as a clear signal that the industry is entering a phase where scale, pricing power and long‑term capital planning will matter more than quarterly volatility in chip stocks.

AI demand “through the roof,” even at eye‑watering prices

Lisa Su has been unusually explicit about just how hot the AI market has become. She has described AI demand as “going through the roof” and “insatiable,” language that underlines how quickly cloud providers, enterprises and startups are trying to secure compute capacity even as they confront supply bottlenecks. In her telling, the surge is not a short‑term spike but a structural shift in how much processing power the global economy will consume as AI moves into every major industry.

What makes that surge more striking is that it is happening while the cost of individual AI chips has climbed into the tens of thousands of dollars per unit. Su has acknowledged that leading accelerators now sell at those levels, yet customers are still lining up, a dynamic she highlighted in a January appearance at CES, where she said AI demand was “going through the roof” even as costs climb. In a related interview, she reiterated that point, underscoring that the company sees no sign of a pause in orders despite the sticker shock on high‑end accelerators.

“AI everywhere” and the early innings of a 5‑billion‑user market

Su’s bullishness is not just about near‑term sales; it is about how deeply she expects AI to permeate daily life. She has framed the current moment as the “early innings” of a transformation that will put AI into virtually every device and service, from smartphones and PCs to cars and industrial systems. In her view, the industry is still at the stage of building foundational models and infrastructure, with the real wave of consumer and enterprise applications still ahead.

That conviction is captured in her forecast that AI services could reach more than 5 billion active users in the coming years, a figure that would effectively mean AI touching almost every internet‑connected person on the planet. In a video interview, Su said the industry is in the early innings of this cycle, insisting that AI is not hype and projecting that demand will explode to more than 5 billion users. She has also dismissed concerns about an AI bubble as overblown, telling investors to expect that user base to materialize.

Inside Su’s “AI everywhere” thesis

When Su talks about “AI everywhere,” she is not speaking in abstractions. She is pointing to a world where generative models are embedded in office software, where driver‑assistance systems in cars rely on real‑time inference, and where industrial robots and logistics networks use predictive algorithms to cut waste. That vision requires a vast expansion of both data center capacity and edge computing, with accelerators deployed from hyperscale server farms to laptops and embedded systems.

In a recent Fox Business appearance, Su declared “AI everywhere” as chip demand explodes, tying that phrase directly to the surge in orders for data center and client processors that can handle AI workloads. She has also described AI demand as “insatiable” at Advanced Micro Devices’ first analyst day, where the company laid out new targets; AMD shares have roughly doubled in value in 2025 as investors bet on that AI‑everywhere thesis.

Yottaflops and the coming compute crunch

Behind Su’s optimism lies a stark assessment of the scale of infrastructure that will be required. She has argued that AI is growing so quickly that the industry will need “yottaflops” of compute, a term denoting 10^24 floating‑point operations per second, a million times the performance of today’s exascale systems. That is not just marketing language; it is a way of signaling that the current generation of data centers will be insufficient once billions of users are interacting with AI models daily.

In a recent discussion, she noted that all of that user growth has driven a huge surge in demand on global compute infrastructure, and that the industry is heading toward a regime where AI will need yottaflops of compute. I read that as a warning that the bottleneck is shifting from model design to sheer physical capacity, from the availability of high‑bandwidth memory and advanced packaging to the power and cooling needed to run dense racks of accelerators at scale.
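To put the “yottaflops” claim in perspective, a quick back‑of‑envelope comparison using only standard SI prefixes (exa = 10^18, yotta = 10^24) shows how far beyond today’s exascale machines that target sits:

```python
# Back-of-envelope comparison of "yottaflops" against today's exascale
# systems, using standard SI prefixes (exa = 10^18, yotta = 10^24).
EXAFLOP = 10**18    # ops/second of a one-exaflop supercomputer
YOTTAFLOP = 10**24  # ops/second implied by "yottaflops" of compute

ratio = YOTTAFLOP // EXAFLOP
print(f"1 yottaflop = {ratio:,} exaflops")  # 1 yottaflop = 1,000,000 exaflops
```

In other words, yottaflops‑scale compute would require the equivalent of a million of today’s largest supercomputers, which is why Su frames the buildout in terms of physical capacity rather than chip design alone.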

Rising GPU prices and the risk of an AI hardware squeeze

The other side of this story is cost. As demand for AI accelerators has surged, the price of GPUs has climbed sharply, raising questions about who will be able to afford cutting‑edge hardware. Reports suggest that AMD and Nvidia are planning phased price hikes for GPUs in 2026, a move that would push already expensive cards even higher and could widen the gap between well‑funded hyperscalers and smaller players trying to compete in AI.

One recent report suggests that AMD and Nvidia are preparing to roll out phased price hikes for GPUs beginning in February 2026, with the spike in prices for AI‑capable cards tied directly to the current wave of demand. Combined with Su’s own acknowledgment that individual AI chips can cost tens of thousands of dollars, that trajectory suggests a hardware squeeze where access to top‑tier compute becomes a strategic differentiator in its own right.

Wall Street’s whiplash: soaring story, sliding shares

Investors have been trying to reconcile Su’s long‑term AI narrative with the short‑term volatility of AMD’s stock. On one recent trading day, AMD shares initially gained nearly 1 percent before reversing sharply, with the stock falling as much as 4.4% to $211.25, a move that marked the worst intraday decline in months even as Su was highlighting soaring computing demand. That kind of whiplash reflects how sensitive the market has become to any hint that AI spending might slow or that competition could erode margins.

A January market update described the swing in detail, noting that AMD shares slid after the event even as the CEO pointed to soaring computing demand. I see that reaction as a reminder that while the AI story is compelling, investors are still scrutinizing near‑term profitability, supply constraints and the competitive landscape with Nvidia as closely as they are listening to long‑range forecasts.

From “insatiable” demand to a $1 trillion data center market

Su has tried to reassure skeptics by framing AI spending as a rational response to a once‑in‑a‑generation platform shift rather than a speculative bubble. She has described AI demand as “insatiable” and argued that the buildout of AI data centers will support a vast ecosystem of hardware, software and services. In her view, the capital flowing into AI infrastructure is laying the groundwork for productivity gains that will justify the upfront costs.

That argument is backed by her projection that the data center market could reach $1 trillion by 2030, a figure that captures not only AI accelerators but also CPUs, networking, storage and the software stacks that tie them together. In one interview, Su dismissed AI spending concerns while projecting that $1 trillion figure, positioning AMD as a central player in the expansion and pushing back on fears that customers are overspending on AI. Combined with her “insatiable” demand language and the fact that Advanced Micro Devices shares have roughly doubled in value in 2025, that forecast helps explain why the company is investing so heavily in its AI roadmap despite near‑term cost pressures.

Who pays the bill when AI demand explodes?

All of this raises a practical question: who ultimately absorbs the rising cost of AI hardware as demand keeps climbing? For now, hyperscale cloud providers and large enterprises are footing much of the bill, buying accelerators that cost tens of thousands of dollars and building out data centers that Su believes will contribute to a $1 trillion market. Those costs are then passed along through higher prices for AI services, from premium tiers of generative tools to usage‑based APIs that charge per token or per image generated.
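As a purely illustrative sketch of how that pass‑through works, consider amortizing a single accelerator over its service life. Both the $30,000 price and the three‑year window below are hypothetical assumptions for illustration, not AMD or cloud‑provider figures:

```python
# Hypothetical illustration of how an accelerator's capital cost
# diffuses into per-hour service pricing. All numbers are assumptions,
# not actual AMD or cloud-provider figures.
CHIP_COST_USD = 30_000   # assumed price of one high-end AI accelerator
LIFETIME_YEARS = 3       # assumed depreciation window
HOURS = LIFETIME_YEARS * 365 * 24  # 26,280 hours of service life

cost_per_hour = CHIP_COST_USD / HOURS
print(f"${cost_per_hour:.2f} per accelerator-hour, before power, cooling and margin")
```

Every token generated or image rendered in that hour carries a slice of that capital cost, which is why per‑use API pricing is one of the main channels through which rising chip prices reach end users.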

As AI moves toward the more than 5 billion active users Su envisions, I expect that cost burden to diffuse further, showing up in subscription prices for productivity suites, in the sticker price of AI‑equipped cars and in the budgets of governments deploying AI for healthcare, education and defense. Su’s insistence that AI is not hype and that demand will explode, combined with her “AI everywhere” mantra and the prospect of yottaflops‑scale compute, suggests that the industry is only at the start of grappling with how to finance this transformation. The tension between “insatiable” demand and rising chip prices will define not just AMD’s strategy, but the pace at which AI reaches the billions of users she is counting on.
