OpenAI has closed its latest funding round at $110 billion, backed by industry majors Amazon, Nvidia and SoftBank, the company said Friday.
The round marks a new milestone for the ChatGPT developer, more than doubling its previous financing round of $40 billion, which was at the time the largest private tech deal ever recorded.
The latest investment lifts OpenAI’s pre-money valuation to $730 billion, with Amazon contributing $50 billion and Nvidia and SoftBank $30 billion each, OpenAI said in a press release.
Additional investors are anticipated “as the round progresses,” the company added.
The funding will be used to support OpenAI’s development of artificial general intelligence (AGI), or human-level AI, as well as facilitate more widespread use of AI across industries.
“We are entering a new phase where frontier AI moves from research into daily use at global scale,” OpenAI said in the release. “Leadership will be defined by who can scale infrastructure fast enough to meet demand, and turn that capacity into products people rely on.”
OpenAI also said it is expanding its existing $38 billion agreement with AWS by $100 billion over the next eight years.
Under the deal, AWS will serve as the exclusive third-party cloud distribution provider for Frontier, OpenAI’s enterprise platform designed to allow businesses to deploy and manage teams of AI agents. Frontier will continue to be hosted on Microsoft Azure.
The expanded partnership also commits OpenAI to consume “approximately 2 gigawatts” of AWS’ Trainium chips, which will support demand for stateful runtime environments, Frontier and other advanced workloads.
The partners will also collaborate to develop customized models to power AWS’ customer-facing applications, with Amazon tailoring OpenAI models for use across AI products and agents.
“We have lots of developers and companies eager to run services powered by OpenAI models on AWS,” said Andy Jassy, Amazon’s CEO. “Our unique collaboration with OpenAI to provide stateful runtime environments will change what’s possible for customers building AI apps and agents.”
In the same announcement, OpenAI said it is also expanding its partnership with Nvidia, under which it will use 3GW of dedicated inference capacity and 2GW of training capacity on Nvidia’s Vera Rubin systems.
OpenAI stressed that the news does not affect its existing deal with Microsoft, and that Microsoft Azure remains the exclusive cloud provider for OpenAI’s APIs.
“Nothing about today’s announcements in any way changes the terms of the Microsoft and OpenAI relationship,” the companies said.

