Amazon Bedrock now supports Responses API from OpenAI

Amazon Bedrock Updates
Amazon Bedrock now supports the Responses API on new OpenAI API-compatible service endpoints. This update enables asynchronous inference for long-running workloads, simplifies tool-use integration for agentic workflows, and supports stateful conversation management: developers can chain requests so the service rebuilds conversation context automatically, without resending message history on every call.
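As a minimal sketch, the new OpenAI API-compatible endpoints can be called with the standard OpenAI Python SDK by pointing base_url at a Bedrock runtime endpoint. The endpoint URL, credential, and model ID below are illustrative assumptions rather than confirmed values; check the Amazon Bedrock documentation for the ones that apply to your account and Region.

```python
# Minimal sketch: calling the Responses API through an OpenAI API-compatible
# Amazon Bedrock endpoint with the standard OpenAI Python SDK.
# The base_url, API key, and model ID are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-runtime.us-east-1.amazonaws.com/openai/v1",  # placeholder endpoint
    api_key="YOUR_BEDROCK_API_KEY",  # placeholder credential
)

response = client.responses.create(
    model="openai.gpt-oss-120b-1:0",  # placeholder model ID
    input="Summarize the key changes in this Bedrock release in three bullets.",
)

print(response.output_text)
```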
Chat Completions with reasoning-effort support is available for all Amazon Bedrock models powered by Project Mantle, a new distributed inference engine for serving large-scale machine learning models. Project Mantle simplifies onboarding of new models, provides highly performant and reliable serverless inference, and offers out-of-the-box compatibility with OpenAI API specifications.
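The same SDK pattern extends to Chat Completions, with reasoning effort passed as a request parameter. As before, the endpoint and model ID are placeholders for illustration only.

```python
# Minimal sketch: Chat Completions with a reasoning-effort hint against the
# same OpenAI API-compatible Bedrock endpoint. Values are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-runtime.us-east-1.amazonaws.com/openai/v1",  # placeholder endpoint
    api_key="YOUR_BEDROCK_API_KEY",  # placeholder credential
)

completion = client.chat.completions.create(
    model="openai.gpt-oss-120b-1:0",  # placeholder model ID
    reasoning_effort="high",          # "low", "medium", or "high"
    messages=[
        {"role": "user", "content": "Plan a three-step rollout for a new internal API."},
    ],
)

print(completion.choices[0].message.content)
```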
What to do
- Update your codebases to integrate with the new service endpoints.
- Explore the new asynchronous inference capabilities.
- Leverage stateful conversation management for improved workflows (see the sketch after this list).
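To illustrate the stateful conversation management mentioned above, a follow-up Responses API call can reference the previous response by ID so the service rebuilds context server-side instead of the client replaying the full history. Whether response storage is enabled by default on these endpoints, as well as the endpoint and model ID shown, are assumptions for illustration only.

```python
# Minimal sketch: chaining turns with previous_response_id so conversation
# state is carried by the service rather than replayed by the client.
# Endpoint, credential, and model ID are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-runtime.us-east-1.amazonaws.com/openai/v1",  # placeholder endpoint
    api_key="YOUR_BEDROCK_API_KEY",  # placeholder credential
)

first = client.responses.create(
    model="openai.gpt-oss-120b-1:0",  # placeholder model ID
    input="Draft a short migration checklist for adopting the new endpoints.",
)

follow_up = client.responses.create(
    model="openai.gpt-oss-120b-1:0",
    previous_response_id=first.id,  # service-side state; no manual history replay
    input="Condense that checklist to the three highest-priority items.",
)

print(follow_up.output_text)
```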
Source: AWS release notes
If you need further guidance on AWS, our experts are available at AWS@westloop.io. You may also reach us by submitting the Contact Us form.



