OpenAI Models, Codex and Managed Agents Land on Amazon Bedrock — Day After Microsoft Exclusivity Ends (April 28, 2026)
Amazon Web Services launched OpenAI's GPT-5.5, GPT-5.4, the Codex coding agent and a new Bedrock Managed Agents service in limited preview on April 28, 2026 — less than 24 hours after Microsoft's exclusive cloud lock on OpenAI ended.
Amazon Web Services and OpenAI announced that OpenAI's frontier models — including the freshly released GPT-5.5 — Codex, and a new managed agents service are now available in limited preview on Amazon Bedrock. The launch landed less than 24 hours after Microsoft's exclusive cloud arrangement with OpenAI ended, bringing to a close the most consequential single-vendor lock-in in commercial AI.
What Happened
In a joint announcement on the OpenAI and AWS blogs, the two companies confirmed that three new offerings are live in limited preview on Amazon Bedrock: the latest OpenAI models (GPT-5.5 and GPT-5.4), Codex on Bedrock for the OpenAI coding agent, and Amazon Bedrock Managed Agents powered by OpenAI, which packages the OpenAI agent harness as a fully managed AWS service.
The timing is not subtle. On April 27, OpenAI and Microsoft announced the end of Microsoft's exclusive right to host OpenAI models, dropping the revenue-share clause that had defined their relationship since 2019. AWS announced its preview the next morning. AWS CEO Matt Garman and OpenAI CEO Sam Altman published a joint Stratechery interview the same day framing the deal as "frontier intelligence on the infrastructure customers already use" — a direct shot at Microsoft's Azure pitch.
Key Details
- Models in preview: GPT-5.5 (the new frontier flagship) and GPT-5.4, accessible through the same Bedrock API surface developers already use for Claude, Llama, Mistral and Nova models.
- Codex on Bedrock: The OpenAI coding agent ships through the Bedrock-hosted Codex CLI, the OpenAI desktop app, and a VS Code extension — all authenticated against AWS IAM and billed through AWS.
- Bedrock Managed Agents: A new service that runs the OpenAI agent harness inside Bedrock with no infrastructure to operate. Engineered for "faster execution, sharper reasoning, and reliable steering of long-running tasks," per the AWS announcement.
- Limited preview, not GA: All three offerings are gated to a customer waitlist as of April 28, 2026, with broader availability "in the following weeks."
- $38B AWS commitment: Press coverage from CNBC and GeekWire ties this preview to OpenAI's previously announced $38 billion compute purchase from AWS in late 2025.
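Since the preview models sit behind the same Bedrock API surface as Claude, Llama, Mistral and Nova, invoking one should look like any other Bedrock Converse call. A minimal sketch in Python with boto3, with the caveat that the model ID below is a placeholder — AWS has not published the actual identifiers for the preview:

```python
def build_converse_request(prompt: str,
                           model_id: str = "openai.gpt-5-5-preview-v1") -> dict:
    """Assemble a Bedrock Converse API request.

    The default model_id is hypothetical; check the Bedrock console
    for the identifier AWS actually assigns to the preview models.
    """
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }


if __name__ == "__main__":
    # Requires preview access, AWS credentials, and the boto3 package.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    resp = client.converse(**build_converse_request("Reply with one word: pong"))
    print(resp["output"]["message"]["content"][0]["text"])
```

The point of the sketch is that nothing here is OpenAI-specific: swapping in an Anthropic or Meta model ID leaves the request shape unchanged, which is exactly the multi-model pitch Bedrock has made since launch.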
What Developers and Users Are Saying
The Hacker News thread on the joint Garman-Altman interview reached the front page within an hour. The most upvoted comments are pragmatic about model parity: one developer warned that "models on different inference platforms might not necessarily give exactly the same results" because of quantization, custom inference silicon, and batching choices, pointing to years of experience with GPT models served by OpenAI directly versus through Microsoft Azure, where the Azure-hosted variants have historically lagged on output quality. Others welcomed the move as the end of an irrational tax on multi-cloud teams.
On r/AWS and r/OpenAI, the dominant reactions are relief and skepticism in roughly equal measure: relief that enterprises locked into AWS for compliance reasons can finally use OpenAI models without a separate Azure subscription, and skepticism that AWS-hosted GPT-5.5 throughput and latency will match what OpenAI ships on its own infrastructure during launch week. Several developers are already asking whether Codex on Bedrock will respect the same rate limits as the OpenAI API or be re-priced under AWS's per-minute model.
What This Means for Developers
If you build on AWS today, you can now request preview access to GPT-5.5, Codex and Managed Agents through the standard Bedrock console — no separate OpenAI billing relationship required, and no cross-account IAM plumbing to reach Azure OpenAI. For multi-cloud teams, this is the first time GPT-class frontier models can be invoked from inside a VPC on both of the major hyperscalers that host them (Azure and AWS), in addition to OpenAI direct.
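Because access rides on IAM rather than a separate OpenAI account, granting a role the ability to call the preview models should be an ordinary policy statement. A hypothetical example using the standard Bedrock invoke actions — the model ARN is a placeholder, since AWS has not yet documented the real resource names for the preview:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowOpenAIPreviewOnBedrock",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/openai.gpt-5-5-preview-v1"
    }
  ]
}
```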
Two practical caveats. First, an identical model name does not guarantee identical output: as the top Hacker News reply noted, hosted variants on different platforms can quietly diverge because of quantization or batching choices, so re-run your evals before swapping providers. Second, Codex on Bedrock is gated by an AWS waitlist, so do not rip out your existing Codex CLI install today; the OpenAI-hosted version remains the canonical channel until Bedrock Codex hits GA.
What's Next
AWS told customers on the limited-preview signup page to expect "broader availability in the following weeks." Both companies signaled that the deeper integrations — fine-tuning OpenAI models on Bedrock customer data, GPT-5.5 inside Amazon Q, and Codex inside CodeCatalyst — are on the roadmap but not yet dated. Watch the official AWS Bedrock and OpenAI announcement pages, and the Codex GitHub repo for the AWS-side CLI flag to land.
Sources
- OpenAI: "OpenAI models, Codex, and Managed Agents come to AWS" — primary OpenAI announcement.
- AWS What's New: "Amazon Bedrock now offers OpenAI models, Codex, and Managed Agents (Limited Preview)" — primary AWS announcement.
- Stratechery: Sam Altman and Matt Garman interview — joint CEO interview the day of the launch.
- CNBC — financial framing and $38B compute deal context.
- GeekWire: "OpenAI's models land on Amazon Bedrock, one day after Microsoft exclusivity ends" — independent reporting.
- Hacker News thread — developer reaction.