OpenAI's models are no longer locked inside Microsoft's Azure. As of this week, GPT-5.4 (with GPT-5.5 to follow), Codex, and a new Managed Agents platform are available on Amazon Bedrock in limited preview, marking the first time enterprises can run OpenAI's frontier models on infrastructure that isn't Microsoft's. The move follows a sweeping renegotiation of the Microsoft-OpenAI partnership that ended the exclusivity arrangement that had defined the commercial AI era since 2019.
This isn't just a cloud infrastructure story. It's the moment the AI industry's most consequential partnership went from exclusive marriage to open relationship — and the ripple effects will reshape pricing, competition, and procurement decisions for every business using AI.
What actually changed in the deal
The amended agreement, announced simultaneously by both companies on 27 April, dismantles the core pillars of a partnership that began with Microsoft's $1 billion investment in 2019 and grew to exceed $13 billion. The key changes:
Microsoft's licence becomes non-exclusive. Microsoft retains access to OpenAI's models and IP through 2032, but OpenAI can now serve all its products on any cloud provider. Azure remains the "primary" cloud partner, with OpenAI products shipping there first — unless Microsoft can't support the required capabilities.
The revenue share flips. Microsoft no longer pays OpenAI when customers access models through Azure. OpenAI continues paying Microsoft a 20 percent revenue share through 2030, but that obligation is now subject to a total cap whose dollar figure hasn't been disclosed.
The AGI clause is gone. The original deal contained a philosophical tripwire: Microsoft's exclusive rights would change if OpenAI's board declared it had achieved artificial general intelligence. That provision — which created perverse incentives for both sides to game the definition — has been replaced by fixed calendar dates.
The restructuring was forced by a practical crisis. Amazon's up-to-$50 billion investment in OpenAI earlier this year included commitments to make AWS the exclusive third-party distributor for OpenAI's Frontier agent platform, commitments that directly contradicted OpenAI's existing contract with Microsoft. As VentureBeat reported, the Financial Times found Microsoft was actively considering legal action to enforce its rights. The amended deal resolves that impasse entirely.
What's now available on AWS
The AWS launch, announced at an event in San Francisco on 28 April, delivers three offerings in limited preview:
OpenAI models on Bedrock. GPT-5.4 is available immediately, with GPT-5.5 coming within weeks, according to AWS CEO Matt Garman. Enterprises can evaluate and deploy OpenAI models alongside Anthropic, Meta, Mistral, and other providers through Bedrock's unified API — with existing IAM access controls, PrivateLink connectivity, encryption, and CloudTrail logging all carrying over.
Codex on Bedrock. OpenAI's coding agent, now used by more than 4 million people weekly, can run inference through Bedrock infrastructure. Crucially, customer code stays within AWS — addressing the data sovereignty concerns that stopped many enterprises from adopting Codex through OpenAI's own APIs.
Amazon Bedrock Managed Agents, powered by OpenAI. A new platform for deploying production-ready AI agents that maintain context across sessions, execute multi-step workflows, and use tools — all within AWS's security and compliance perimeter. "With Amazon Bedrock Managed Agents, developers can build optimised, production-scale AI applications that bring together the strengths of OpenAI's latest models with the scale, security, and infrastructure of AWS," said Ben Kus, CTO at Box.
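For teams evaluating the Bedrock route, the call pattern is the same unified Converse API already used for other providers. Here is a minimal sketch in Python, assuming boto3 credentials are configured; the model identifier is a placeholder, since the real preview ID hasn't been published:

```python
# Sketch: invoking an OpenAI model via Bedrock's unified Converse API.
# The model ID below is an assumption for illustration -- check the Bedrock
# model catalogue for the real identifier once the preview reaches your account.
HYPOTHETICAL_MODEL_ID = "openai.gpt-5-4-v1:0"  # placeholder, not a confirmed ID


def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Assemble the kwargs for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }


def ask(prompt: str) -> str:
    # boto3 (the AWS SDK for Python) is imported here so the request builder
    # above stays usable without AWS access.
    import boto3

    client = boto3.client("bedrock-runtime")
    resp = client.converse(**build_converse_request(HYPOTHETICAL_MODEL_ID, prompt))
    return resp["output"]["message"]["content"][0]["text"]
```

Because the request goes through your AWS account, IAM policies, PrivateLink endpoints, and CloudTrail logging apply exactly as they do for any other Bedrock model.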
Why this matters for your business
If your organisation runs on AWS — and roughly a third of all cloud workloads globally do — the practical implications are immediate.
The procurement barrier just disappeared. Until this week, adopting OpenAI models meant either routing through Azure (with a separate billing relationship, security review, and compliance framework) or hitting OpenAI's APIs directly (with the data sovereignty questions that entails). Now you can access OpenAI's models through the same Bedrock console where you're already running Anthropic's Claude or Meta's Llama, billed against your existing AWS commit.
Competition will compress pricing. OpenAI's models competing directly against Anthropic and open-source alternatives on the same platform creates genuine price pressure. Microsoft has already begun shipping its own MAI models to reduce its dependence on OpenAI. With multi-cloud distribution, every provider has less leverage to charge premium rates.
Vendor lock-in just got weaker. The broader trend here is the commoditisation of model access. When GPT-5.5, Claude, Gemini, and Llama are all available through a single API layer, the differentiator shifts from which model you can access to what you build on top of it. That's good news for businesses — it means your AI investments are more portable and less dependent on any single vendor relationship.
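That portability argument is concrete at the code level. Under a unified API layer like Bedrock's Converse API, switching vendors is a configuration change rather than a rewrite. In the sketch below, every model ID is illustrative (the OpenAI one in particular is a hypothetical preview identifier):

```python
# Sketch: because Bedrock normalises request/response shapes across providers,
# swapping models is a one-line config change. All model IDs here are
# illustrative assumptions, not confirmed identifiers.
MODELS = {
    "anthropic": "anthropic.claude-sonnet-4-v1:0",  # illustrative ID
    "meta": "meta.llama3-70b-instruct-v1:0",        # illustrative ID
    "openai": "openai.gpt-5-4-v1:0",                # hypothetical preview ID
}


def converse_kwargs(provider: str, prompt: str) -> dict:
    """Same request shape regardless of which vendor's model is selected."""
    return {
        "modelId": MODELS[provider],
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }


# Switching vendors changes only the lookup key, not the calling code:
#   client.converse(**converse_kwargs("openai", "Summarise this contract."))
#   client.converse(**converse_kwargs("anthropic", "Summarise this contract."))
```

The application logic built on top of `converse_kwargs` never needs to know which vendor is behind it, which is precisely why the differentiator shifts to what you build rather than which model you can reach.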
This echoes what we covered when OpenAI announced GPT-5.5 and workspace agents last week: the technology is maturing fast, and the infrastructure decisions matter as much as the model choices.
The new competitive landscape
The deal restructuring creates a genuinely complex competitive map. Microsoft competes with OpenAI on products (Copilot vs. ChatGPT), partners with Anthropic on agentic products, and remains OpenAI's largest shareholder at roughly 27 percent. Amazon invests in both OpenAI and Anthropic. Google builds its own Gemini models while hosting competitors on Vertex AI.
For Microsoft, the trade-off is calculated. It loses exclusive distribution but gains financial certainty — eliminating outbound revenue-share payments while continuing to collect from OpenAI through 2030. Last quarter alone, Microsoft reported $7.5 billion in OpenAI-related revenue. Wedbush Securities analyst Dan Ives noted the restructuring positions OpenAI for a potential IPO by removing barriers from its original Microsoft partnership.
For OpenAI, the commercial freedom is arguably worth more than anything it gave up. As OpenAI's revenue chief Denise Dresser told staff in an internal memo, the Microsoft partnership had "limited our ability to meet enterprises where they are". That constraint is now gone.
OpenAI's record $122 billion funding round earlier this month — backed by Amazon, Nvidia, and SoftBank — already signalled the company's ambition to operate independently. The multi-cloud move is the commercial execution of that ambition.
What to watch
Google Cloud is next. The amended Microsoft agreement explicitly allows OpenAI to serve on any cloud provider. OpenAI models on Google Cloud's Vertex AI would complete the multi-cloud picture and put further pricing pressure on all three hyperscalers.
The "first on Azure" clause. Microsoft retains the right to ship OpenAI products first. Whether that means a meaningful exclusivity window (weeks? days?) or merely simultaneous launches will determine how much of an advantage Azure retains for enterprises that want the newest capabilities immediately.
Pricing convergence. With OpenAI, Anthropic, Meta, and Mistral all accessible through Bedrock's unified API, expect aggressive pricing competition. The days of paying a premium simply because a model was available on only one platform are ending.
OpenAI's own infrastructure play. OpenAI is also building proprietary data centres. If it eventually reduces its dependence on all third-party clouds, the multi-cloud era could prove transitional — a stepping stone toward OpenAI becoming its own platform. For now, though, the enterprise customer has more choice than ever.
Sources
- OpenAI models, Codex, and Managed Agents come to AWS — OpenAI
- The next phase of the Microsoft OpenAI partnership — OpenAI
- The next phase of the Microsoft-OpenAI partnership — Microsoft
- AWS and OpenAI announce expanded partnership — About Amazon
- Microsoft and OpenAI gut their exclusive deal — VentureBeat
- OpenAI jumps out of Microsoft's bed, into Amazon's Bedrock — The Register
- Microsoft cuts OpenAI revenue share in a fresh step to loosen their AI alliance — U.S. News
