When Oracle Corp. announced a multi-hundred-billion-dollar computing deal with OpenAI last September, its stock jumped more than 30 per cent in a single day. The $330 billion leap in market value was larger than Oracle’s total revenue for the previous decade. The company later admitted it would lose money on the contract.
Nvidia pledged $100 billion to OpenAI while selling it the very chips that spending would finance. Microsoft booked OpenAI’s compute bills as Azure revenue while reinvesting the proceeds into CoreWeave, another OpenAI supplier that also counts Nvidia as a shareholder. It feels like circular financing on a planetary scale.
Start with the basics. OpenAI lost roughly $5 billion in 2024 on $3.7 billion in revenue. It is estimated to spend $9 billion a year just to keep its models running, before salaries, rent, or research. According to filings and investor projections, losses could reach $14 billion this year and $44 billion cumulatively by 2028, yet its valuation sits at nearly half a trillion dollars: one of the most valuable private companies on Earth, despite never turning a profit.
The imbalance isn’t unique. IBM projects enterprise computing costs will jump 90 per cent between 2023 and 2025, with roughly seven in ten executives blaming generative AI. Startups tracked by Kruze Consulting now spend about half their revenue on cloud and model inference, double the share two years ago. When you’re solving $100 problems with $120 in compute costs, you don’t have a business model; you have a subsidy program.
Meanwhile, the commercial engine everyone points to, OpenAI’s API business, generated about $1 billion last year, at a loss. If the leading model provider can’t sell access profitably, the $2.8 trillion in projected AI infrastructure spending through 2029 begins to look less like strategic investment and more like the fiber-optic overbuild of 1999. Citi estimates that AI capital expenditure will exceed one per cent of US GDP growth, a level of concentration last seen in wartime mobilisation or speculative mania.
The money itself forms a loop. Microsoft finances OpenAI, sells it Azure capacity, invests in CoreWeave (which sells compute to OpenAI), and books all of that as cloud revenue. Amazon and Google mirror the pattern with Anthropic, each both investor and supplier. Nvidia funds AI startups that then spend billions buying Nvidia chips. OpenAI reportedly took a stake in AMD while committing to buy AMD hardware.
This is less a competitive market than a closed circuit in which investor, customer, and supplier collapse into one another. Analysts estimate that Microsoft alone accounted for nearly 20 per cent of Nvidia’s revenue last quarter.
Growth that flows in a circle isn’t the same as growth that flows from independent demand. The dot-com era had a name for this behaviour: revenue round-tripping. Telecoms companies paid each other for capacity in order to record reciprocal sales. On paper, revenue exploded. In practice, cash did not. The AI sector hasn’t crossed that line, but it rhymes. When the same dollar passes through three balance sheets, the illusion of profitability deepens.
In 1999, speculative excess lived in small-cap tech. When Webvan and Pets.com vanished, the damage was containable. Today’s version sits at the heart of global markets. Seven companies, Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla, now account for more than half of S&P 500 gains since late 2022. Their combined capital spending rose 40 per cent last year; the other 493 firms managed 3.5 per cent.
This isn’t diversification. It’s dependence: an index fund built on a single bet that artificial intelligence will one day pay for itself. JPMorgan estimates that AI-linked equities have driven 75 per cent of S&P returns and 90 per cent of capex growth since ChatGPT launched. Any correction would ripple through household wealth, pensions, and 401(k)s tethered to the so-called “Magnificent Seven.”
How do you tell a build-out from a bubble? Three questions cut through the hype:

- Unit economics: what is the true cost per inference versus the value created?
- Demand independence: how much revenue comes from partners who also fund your infrastructure?
- Capacity burn-in: what happens if power costs spike or model architectures plateau?
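The three questions above amount to back-of-the-envelope arithmetic. A minimal sketch, using entirely hypothetical figures chosen to echo the $100-problem, $120-cost framing earlier (none of these numbers are reported data):

```python
# Illustrative unit-economics screen for an AI service.
# All inputs are hypothetical assumptions, not reported figures.

def unit_margin(value_per_request: float, cost_per_request: float) -> float:
    """Question 1: value created minus true cost per inference."""
    return value_per_request - cost_per_request

def independent_revenue_share(total_revenue: float, revenue_from_backers: float) -> float:
    """Question 2: share of revenue NOT coming from investors/suppliers."""
    return (total_revenue - revenue_from_backers) / total_revenue

def stressed_margin(value_per_request: float, cost_per_request: float,
                    power_cost_multiplier: float) -> float:
    """Question 3: per-request margin if compute/power costs spike."""
    return value_per_request - cost_per_request * power_cost_multiplier

# A hypothetical provider: $0.10 of value per request against $0.12 of
# compute cost, $1.0bn revenue of which $0.6bn comes from its own
# investor-suppliers, and a 1.5x power-cost shock.
margin = unit_margin(0.10, 0.12)                    # ~ -0.02: loses 2 cents per request
independent = independent_revenue_share(1.0, 0.6)   # ~ 0.4: only 40% independent demand
stressed = stressed_margin(0.10, 0.12, 1.5)         # ~ -0.08: the loss quadruples
```

The point of the sketch is that the three failure modes compound: a negative unit margin, propped up by circular revenue, becomes sharply worse the moment input costs move.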
A trillion-dollar capex cycle lives or dies on utilisation, the same point where every prior overbuild, from railways to fiber optics, faltered.
Every bubble has started with something of real value; then came cheap money, circular storytelling, and the conviction that this time was different. When the math stopped closing, gravity returned.
AI will be no different. The technology is extraordinary. Its potential is vast. But the record valuations and the surge in capex both point to a big bet that revenue will appear before depreciation does. The outcome may not be a crash so much as a slow rerating: too much infrastructure chasing too few profitable use cases.
Some firms will emerge stronger, but others will find they cannot outrun arithmetic. The internet changed everything. So did the crash that followed. AI will too. The question isn’t whether there’s a bubble; it’s how and for whom it ends.
The story doesn’t have to end in correction. If enterprises and governments turn AI into measurable return on investment, the cycle could bend: the same spending that now fuels circular growth could start generating genuine value. That only happens when organisations rewire workflows, rebuild services, and actually bank the productivity AI creates. When AI becomes a line item for profit, not a line of credit for compute, the flywheel flips: capex stops chasing hype and starts compounding return.
That is the problem we’re solving at ai71: turning AI from a cost center into an enterprise-wide performance engine, so that adoption becomes systemic.
That’s the real revolution: when AI’s circular economy spins outward.
