OpenAI CEO Sam Altman took to social media to publicly thank Nvidia chief executive officer Jensen Huang, signaling that the partnership between the world’s most famous AI startup and the leading chipmaker is reaching new heights. The comment came a week after Nvidia announced a $30 billion investment in OpenAI, and days after Huang said this may be the chipmaker’s last chance to invest in the ChatGPT maker because it may go public later this year.
“Very grateful to Jensen for working to expand Nvidia capacity at AWS so much for us!” Altman posted, highlighting a massive push to increase the computing power available to OpenAI.
Jensen Huang: Ramping up “like mad”
The gratitude follows recent comments from Huang, who revealed that Nvidia is moving aggressively to support OpenAI’s rapid growth. According to Huang, Nvidia is helping OpenAI expand its capacity across multiple cloud platforms, and he described the expansion at AWS as “ramping like mad”.
“So recently, there was a question about, are we going to invest $100 billion in OpenAI. We just — just for everybody’s update, we finalized our agreement. We’re going to invest $30 billion in OpenAI. I think the opportunity to invest $100 billion in OpenAI is probably not in the cards.
And the reason for that is because they’re going to go public,” Huang said.
“And so I’m fairly sure that if we provide the capacity they need, which the compute capacity they need, which we’re ramping up hard to go to, the revenues will more than follow. And they’re going to go public towards the end of the year. And so this might be the last time we’ll have the opportunity to invest in a consequential company like this. And our $10 billion investment in Anthropic probably will be the last as well,” he added.
OpenAI Codex sees surging usage
Reports say the infrastructure push comes as usage of OpenAI Codex – the AI system that helps developers write code – is exploding. This level of activity suggests that the need for massive computing power is only growing.
“You see all the news, you probably haven’t internalized, some of the really great work that we did last year, the last 1.5 years or so, last year or so, we expanded OpenAI’s capacity from Azure to OCI to now AWS. We expanded OpenAI’s reach of capacity to AWS. We’re ramping AWS like mad. We’re ramping them as hard as we can so that OpenAI has access to even more capacity. That’s one,” Huang said.
Nvidia is also working to boost capacity for Anthropic, the developer of the Claude AI models, across both AWS and Azure.
“The second thing that we did, and this was a really, really great outcome is we’re now also working with Anthropic. And in the case of Anthropic, we’re expanding their capacity as aggressively as we can at AWS as well as Azure. And so notice what we’re doing in both — they used to be one and one, now they’re kind of cross product. But the amount of capacity that we’re going to bring online for them, supporting their revenues, their quality of revenues are so good, we just need a lot more capacity for them. So I think that this is something that — that is somewhat new,” he added.
Huang noted that the company is also working with Meta and the Meta Superintelligence Lab.