AMD and OpenAI have entered into a multi-year strategic partnership under which OpenAI will deploy 6 gigawatts of AMD GPUs to power its next-generation artificial intelligence infrastructure.
The deal names AMD as a core strategic compute partner for OpenAI, with the first gigawatt of graphics processing capacity set to be delivered using AMD's Instinct MI450 series GPUs in the second half of 2026. This marks the beginning of what the companies describe as a multi-generational commitment, intended to drive large-scale deployments of AMD hardware across future AI projects at OpenAI.
Capacity expansion
The agreement outlines a phased rollout of AMD's technology. The initial 1 gigawatt deployment of Instinct MI450 GPUs will be followed by further expansions, scaling to a total of 6 gigawatts over several years. These deployments are intended to form a major part of OpenAI's infrastructure for generative AI research and to support growing demand for large-scale compute resources.
Financial details
According to the terms of the agreement, AMD has issued OpenAI a performance-based warrant for up to 160 million shares of AMD stock. Vesting is tied to specific milestones, including technology deployments and AMD share price targets. The first tranche of shares will vest when the initial 1 gigawatt of GPUs is delivered, with remaining shares vesting as the partnership scales to the full 6 gigawatts and as OpenAI meets the technical and commercial milestones required for these deployments.
"Our partnership with OpenAI is expected to deliver tens of billions of dollars in revenue for AMD while accelerating OpenAI's AI infrastructure buildout," said Jean Hu, EVP, CFO and treasurer, AMD. "This agreement creates significant strategic alignment and shareholder value for both AMD and OpenAI and is expected to be highly accretive to AMD's non-GAAP earnings-per-share."
Collaboration and ecosystem development
Representatives from both companies emphasised that the partnership builds on existing hardware and software collaboration, continuing developments that began with the AMD Instinct MI300X and MI350X series.
The companies plan to share technical expertise to optimise their product roadmaps, deepening co-development across hardware and AI infrastructure solutions.
"We are thrilled to partner with OpenAI to deliver AI compute at massive scale," said Dr. Lisa Su, chair and CEO, AMD. "This partnership brings the best of AMD and OpenAI together to create a true win-win enabling the world's most ambitious AI buildout and advancing the entire AI ecosystem."
Under the agreement, OpenAI will work closely with AMD on rack-scale systems and leverage AMD's open software ecosystem, including the ROCm platform. This approach is intended to support OpenAI's increasing AI computing requirements and bolster the development of infrastructure that can handle rapidly growing demand for advanced AI technologies.
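For context, ROCm is AMD's open software stack for programming its GPUs. The announcement gives no detail on how OpenAI will use it, but the short HIP C++ sketch below illustrates, at its simplest, what targeting ROCm looks like: enumerating the AMD GPUs visible to the runtime. It is an illustrative sketch only, not code from either company, and the file name in the build comment is a placeholder.

```cpp
// Illustrative only: a minimal HIP C++ program that lists the AMD GPUs
// visible to the ROCm runtime. Build with `hipcc list_gpus.cpp -o list_gpus`.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int device_count = 0;
    // Ask the HIP runtime how many GPUs it can see.
    if (hipGetDeviceCount(&device_count) != hipSuccess || device_count == 0) {
        std::printf("No HIP-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < device_count; ++i) {
        hipDeviceProp_t props;
        // Query basic properties of each device (name, global memory size, ...).
        if (hipGetDeviceProperties(&props, i) != hipSuccess) {
            continue;
        }
        std::printf("Device %d: %s, %zu MiB of global memory\n",
                    i, props.name, props.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}
```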
"This partnership is a major step in building the compute capacity needed to realize AI's full potential," said Sam Altman, co-founder and CEO of OpenAI. "AMD's leadership in high-performance chips will enable us to accelerate progress and bring the benefits of advanced AI to everyone faster."
"Building the future of AI requires deep collaboration across every layer of the stack," said Greg Brockman, co-founder and president of OpenAI. "Working alongside AMD will allow us to scale to deliver AI tools that benefit people everywhere."
Long-term outlook
Both organisations view the partnership as pivotal in developing the infrastructure required to support large-scale AI systems in the future.
The full 6-gigawatt deployment and associated infrastructure upgrades are expected to take place over several years, with the first deployments beginning in the second half of 2026. Ongoing collaboration will extend to future generations of AMD GPU technology.
The agreement is structured to align the strategic interests of both companies through technical cooperation, phased deployments, and financial incentives, in the context of growing worldwide demand for high-performance AI computing.