
HSBC has delivered one of the harshest verdicts yet on the economics of frontier AI, warning that OpenAI could burn through hundreds of billions of dollars by 2030 and still fall short of profitability. The bank’s analysts argue that even explosive user growth and aggressive pricing may not be enough to offset the staggering cost of the data centers and chips needed to train and run models at OpenAI’s current scale.
If that forecast is right, the company behind ChatGPT is not just a high‑growth startup but a capital sink that will test the patience of investors, partners, and regulators. I see the report as a stress test of the entire AI boom, forcing a blunt question on the industry: can the business model catch up with the technology before the bill comes due?
HSBC’s brutal forecast for OpenAI’s balance sheet
HSBC’s core claim is stark: OpenAI could remain unprofitable through 2030 even as it spends “hundreds of billions” of dollars on infrastructure and compute. The bank’s analysts frame the company’s current trajectory as a race between revenue growth and a runaway cost base, with the latter still winning by a wide margin. Their modeling suggests that even if OpenAI scales its products globally and pushes deeper into enterprise, the economics of training and serving ever larger models will keep the company in the red for the rest of the decade.
Several reports describe how HSBC’s note lays out a projected funding gap of roughly 200 billion dollars between what OpenAI might earn and what it would need to spend to sustain its AI roadmap. One analysis of the bank’s projections puts OpenAI “200 billion away” from breakeven even after years of rapid growth, a framing that captures how far the company may be from profitability, and ties the shortfall directly to the cost of building and operating massive AI data centers in the United States and beyond, as detailed in HSBC’s 200 billion funding gap estimate.
Why compute and data centers dominate the bill
At the heart of HSBC’s warning is a simple equation: every leap in model capability demands a leap in compute, and compute at frontier scale is extraordinarily expensive. Training and running systems like GPT‑4 and its successors requires vast clusters of high‑end accelerators, custom networking, and power‑hungry data centers that must be refreshed frequently as hardware generations turn over. The bank’s analysts argue that this infrastructure burden is not a one‑time capex spike but a recurring obligation, because each new model cycle resets the bar for what “state of the art” means and forces another round of spending.
One breakdown of the report notes that HSBC expects OpenAI’s compute and data center costs to reach into the trillions of dollars over time, with the company potentially facing “trillion‑dollar compute bills” if it continues to chase the frontier at its current pace. That projection is not just about GPUs: it also bakes in the cost of building or leasing hyperscale facilities, securing long‑term power contracts, and paying for the specialized engineering talent needed to keep those systems running, all of which are highlighted in HSBC’s trillion‑dollar compute warning.
The 200 billion dollar shortfall and what it implies
The headline number that has grabbed attention is HSBC’s estimate that OpenAI faces a roughly 200 billion dollar gap between its projected revenues and the capital required to fund its AI ambitions through 2030. In practical terms, that means the company would need to raise or otherwise secure that amount just to stay on its current trajectory, before it can even think about generating sustained profits. For a firm that has already tapped deep-pocketed backers and complex financing structures, the prospect of another 200 billion dollars in required capital is a sign of how capital-intensive frontier AI has become.
Reporting on the note explains that this shortfall is tied directly to the cost of building out new data centers, acquiring cutting‑edge chips, and scaling global infrastructure to support consumer and enterprise demand. One detailed account of the analysis describes how HSBC maps OpenAI’s likely revenue growth against its infrastructure roadmap and still concludes that the company will be “200 billion dollars away” from profitability by the end of the decade, a conclusion laid out in the 200 billion shortfall projection.
Revenue hopes: 220 million paying ChatGPT users
OpenAI is not standing still on the revenue side, and HSBC’s analysis acknowledges that the company has aggressive growth plans for its flagship products. Internal projections cited in coverage of the report suggest that OpenAI is targeting 220 million paying ChatGPT users by 2030, a figure that would represent one of the largest subscription bases in the history of consumer software. If achieved, that scale would put ChatGPT in the same league as services like Netflix and Spotify in terms of paying customers, and would give OpenAI a powerful recurring revenue engine.
The bank’s skepticism, however, is that even a user base of that size may not be enough to offset the cost of running frontier models at global scale. Analysts point out that many of those 220 million users are likely to be on lower‑priced tiers, and that enterprise contracts, while lucrative, come with their own support and customization costs. One report on the projections notes that OpenAI’s internal target of 220 million paying users is central to its path to profitability, yet still leaves the company exposed to massive infrastructure expenses, a tension captured in the 220 million user forecast.
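That tension is easy to make concrete with back-of-envelope arithmetic. The 220 million paying-user target comes from coverage of the report; the blended average price below is purely an illustrative assumption, since many of those users would sit on lower-priced tiers:

```python
# Back-of-envelope: annual subscription revenue implied by the reported
# 220 million paying-user target, at an ASSUMED blended price.
paying_users = 220_000_000     # target cited in coverage of HSBC's report
avg_monthly_price = 20.0       # hypothetical blended price in USD, not from the report

annual_revenue = paying_users * avg_monthly_price * 12
print(f"Implied annual revenue: ${annual_revenue / 1e9:.1f}B")
```

Even under this generous assumption, the implied revenue is on the order of $50 billion a year, an order of magnitude below the hundreds of billions in cumulative infrastructure spending HSBC projects.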
HSBC’s skepticism on profitability by 2030
Even with those ambitious revenue goals, HSBC’s bottom line is that OpenAI is unlikely to make money by 2030. The bank’s analysts argue that the company’s cost structure is simply too heavy, and that the marginal cost of serving each additional user, especially for complex multimodal queries, remains high compared with traditional software. In their view, the economics of generative AI at the frontier are closer to those of a capital‑intensive utility than a lean SaaS business, which means profitability timelines should be measured in decades rather than years.
Coverage of the note emphasizes that HSBC does not see OpenAI’s current monetization strategy as sufficient to close the gap, even under optimistic adoption scenarios. One summary of the report states plainly that the bank expects OpenAI “will not make money by 2030,” tying that conclusion to the combination of infrastructure spending, revenue mix, and competitive pressure from other AI providers, a stance laid out in HSBC’s profitability forecast.
Sam Altman’s frustration and the scale of the AI bet
HSBC’s report lands in the context of mounting public frustration from OpenAI chief executive Sam Altman about the cost and complexity of building frontier AI. Earlier commentary has described how Altman has bristled at repeated questions about the company’s path to profit, at one point reportedly responding with a curt “enough” when pressed on the sustainability of ChatGPT’s economics. That reaction reflects the tension between a founder who sees AI as a once‑in‑a‑century technological shift and investors who must eventually justify the capital outlay.
Accounts of HSBC’s note point out that the bank’s current forecast echoes earlier skepticism about ChatGPT’s ability to turn viral popularity into durable margins, the same line of questioning that reportedly left Altman exasperated. One report links the new analysis to prior debates over whether ChatGPT’s free and low‑cost tiers can ever fully cover the compute they consume, describing how the bank’s latest numbers revive the concerns that once made Altman say “enough” in public, a connection detailed in coverage of Altman’s frustration.
Investor reaction: a “brutal” AI report
For investors who have treated OpenAI as the flagship of the generative AI boom, HSBC’s note reads like a cold shower. Commentators in financial and tech circles have described the report as one of the harshest assessments yet of the sector’s near‑term economics, not because it questions the technology, but because it forces a reckoning with the sheer amount of capital required to keep pace. The idea that a market leader could burn hundreds of billions of dollars and still be unprofitable by 2030 challenges the assumption that scale alone will eventually fix the margins.
On social and professional networks, analysts and founders have been dissecting the report’s numbers and debating whether they are too pessimistic or simply realistic. One widely shared post called it “the most brutal AI report” and highlighted the 200 billion dollar funding gap as a wake‑up call for anyone assuming that AI infrastructure would quickly become cheap and commoditized, a reaction captured in commentary on HSBC’s AI analysis.
OpenAI’s funding gap and the Microsoft question
HSBC’s projections also raise a strategic question: who will actually pay for this? OpenAI has already leaned heavily on deep partnerships and complex financing arrangements to fund its growth, most notably with Microsoft, which has integrated OpenAI models into products like Copilot and Azure. The bank’s estimate of a 200 billion dollar funding gap suggests that even with such a partner, the company may need additional capital sources, whether through new investors, debt, or creative revenue‑sharing structures tied to its infrastructure build‑out.
One detailed breakdown of the report explains how HSBC maps out OpenAI’s likely funding needs and concludes that the company faces a “funding gap” that could stretch into the hundreds of billions, even after accounting for existing commitments. That analysis notes that the scale of the shortfall is large enough to influence broader capital markets, since it implies sustained demand for financing tied to AI data centers and chips, a dynamic laid out in coverage of OpenAI’s funding gap.
Community skepticism and the broader AI bubble debate
Outside formal research notes and investor decks, HSBC’s forecast has fed into a growing online debate about whether the current AI boom is sustainable. On forums and discussion boards, users have seized on the idea that OpenAI might still be losing money even after spending hundreds of billions of dollars, using it as evidence that the industry is in a speculative phase where valuations are racing ahead of fundamentals. Some see the report as confirmation that the economics of large language models are fundamentally flawed, while others argue that such heavy investment is typical of transformative technologies in their early stages.
One widely discussed thread summarizes the bank’s view that OpenAI “won’t be profitable by 2030” and still faces a massive funding gap, and uses that line to question whether any single company can realistically own the frontier of AI without turning into a quasi‑infrastructure utility. The discussion highlights how retail investors and technologists alike are grappling with the idea that even the most prominent AI player may be a long way from sustainable profits, a sentiment reflected in community reactions to HSBC’s forecast.
What HSBC’s numbers mean for the AI business model
Taken together, HSBC’s projections amount to a challenge to the prevailing narrative that AI will quickly become a high‑margin software business. If OpenAI, with its brand recognition, technical lead, and deep-pocketed partners, is still staring at a 200 billion dollar funding gap and potential losses through 2030, then the rest of the industry may need to rethink its timelines and expectations. The report suggests that the real constraint on AI progress over the next decade may not be algorithms or data, but capital and energy, and that the winners will be those who can align their business models with that reality.
Other analyses of the bank’s note underscore that point by focusing on the projected “hundreds of billions” in cumulative cash burn through 2030, arguing that such a figure forces a revaluation of how AI companies are priced and financed. One summary of the report emphasizes that HSBC expects OpenAI to “burn hundreds of billions of dollars through 2030,” a phrase that has quickly become shorthand for the scale of the bet being placed on generative AI, and which is laid out in the bank’s cash burn projection.
The open question: can OpenAI bend the cost curve?
HSBC’s analysis is not destiny, but it does set a high bar for what OpenAI must achieve to prove the skeptics wrong. To escape the scenario of burning hundreds of billions without reaching profitability, the company would need to bend the cost curve of compute, either through more efficient models, custom hardware, or radically cheaper energy. It would also need to deepen and diversify its revenue streams, moving beyond ChatGPT subscriptions into high‑margin enterprise tools, platform fees, and perhaps entirely new categories of AI‑native products that can command premium pricing.
One detailed examination of the bank’s numbers notes that the path to profitability runs through both sides of the ledger: OpenAI must grow revenues far beyond current projections while also finding ways to reduce the unit cost of inference and training. That dual challenge is at the heart of HSBC’s skepticism, and it is why the report has resonated so strongly across tech and finance circles, as captured in the broader discussion of OpenAI’s profitability prospects.