Nvidia’s AI Bubble Shows Cracks as Inventory and Unpaid Bills Soar
Despite reporting record $57 billion quarterly revenue, Nvidia faces growing concerns about the sustainability of AI demand as inventory nearly doubles to $19.8 billion and unpaid customer bills surge 89% to $33.4 billion.
Key Takeaways
- Nvidia’s unsold chip inventory nearly doubled to $19.8 billion in three months
- Accounts receivable surged 89% to $33.4 billion with longer payment cycles
- Wall Street reacted negatively with Nasdaq falling 2.2% post-earnings
- Circular financing deals raise questions about artificial demand
The Inventory Conundrum
While CEO Jensen Huang celebrated “Blackwell sales off the charts” and claimed cloud GPUs were “sold out,” the company reported $19.8 billion of inventory (raw materials, work in progress, and finished chips) – nearly double the roughly $10 billion reported three months earlier. This tension between shortage claims and a rapidly growing inventory balance raises fundamental questions about true demand.
“Blackwell sales are off the charts, and cloud GPUs are sold out. Compute demand keeps accelerating and compounding across training and inference — each growing exponentially,” says Jensen Huang, founder and CEO of NVIDIA.
Payment Problems Mount
The $33.4 billion in accounts receivable represents chips already delivered but not yet paid for. The average payment window (days sales outstanding) stretched from 46 to 53 days, suggesting potential cash flow strain among the AI startups and companies buying Nvidia hardware.
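The 53-day figure can be sanity-checked with the standard days-sales-outstanding formula, using the revenue and receivables numbers reported above (the 91-day quarter length is an assumption):

```python
# Rough days-sales-outstanding (DSO) check from the article's figures.
# DSO = accounts receivable / quarterly revenue * days in the quarter.
accounts_receivable = 33.4e9   # USD, end of quarter
quarterly_revenue = 57.0e9     # USD
days_in_quarter = 91           # assumed ~13-week quarter

dso = accounts_receivable / quarterly_revenue * days_in_quarter
print(f"Implied DSO: {dso:.0f} days")  # prints "Implied DSO: 53 days"
```

The result lines up with the reported 53-day payment window, which is why receivables growing faster than revenue translates directly into a longer collection cycle.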
Circular Economy Concerns
Recent deals like Nvidia’s $2.5 billion investment in Elon Musk’s xAI involve special purpose vehicles that purchase Nvidia hardware and lease it back to AI companies. This creates a circular flow of capital that may artificially inflate demand metrics.
Expert Warning
Investor Michael Burry highlighted the confusion between physical utilization and profitability, noting: “Just because something is used doesn’t mean it is profitable.” He compared the situation to airlines using old planes that are only marginally profitable despite being fully utilized.
The Musk Moment
At the U.S.-Saudi Investment Forum, Elon Musk stumbled between megawatts and gigawatts while discussing xAI’s 500 MW data center plans, accidentally revealing the uncertainty underlying AI infrastructure investments. Nvidia CEO Jensen Huang’s visible discomfort during the exchange highlighted industry anxieties about unrealistic expectations.
Power Efficiency Reality
The debate around GPU depreciation timelines ignores significant power-efficiency differences. By the figures circulating in the industry, A100 GPUs draw roughly three times the power of H100s for equivalent work, and H100s are in turn roughly 25x less energy-efficient than Blackwell chips. This creates massive hidden costs in AI operations.
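Taking those multipliers at face value, the compounding effect across generations can be sketched as relative energy cost per unit of AI work (the ratios below are the article's claims, not measured benchmarks):

```python
# Relative energy per unit of AI work, using the article's claimed multipliers.
# Blackwell is the baseline; these are illustrative ratios, not measurements.
relative_energy = {
    "Blackwell": 1.0,        # baseline
    "H100": 25.0,            # claimed ~25x less efficient than Blackwell
    "A100": 25.0 * 3.0,      # claimed ~3x the power of H100 for the same work
}

for gpu, cost in sorted(relative_energy.items(), key=lambda kv: kv[1]):
    print(f"{gpu}: {cost:g}x Blackwell's energy per unit of work")
```

If the claimed ratios hold even approximately, an A100 fleet would burn on the order of 75x the energy of Blackwell hardware for the same workload, which is why "fully utilized" older GPUs can still be money-losers once power costs are counted.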
The AI industry faces a critical moment in which trillions of dollars are being committed to infrastructure without clear business models or a firm understanding of true power requirements. As the buildout continues, the fundamental economics of AI compute may need reevaluation.