
New AI Agent Ecosystem from AWS: AWS Bedrock AgentCore

September 15, 2025
by Kovács Ádám

The AI revolution is in full swing. Almost every month, we hear about new announcements and partnerships that signal a new era in AI development. Major cloud providers, including AWS, are working tirelessly to simplify and scale AI integration for developers and businesses alike. At the 2025 AWS Summit in New York, one of the biggest announcements was the Bedrock AgentCore service package, currently available in preview status in the us-east-1, us-west-2, ap-southeast-2, and eu-central-1 regions. Following the announcement, AgentCore played an important role at the PartnerEquip virtual event — and not by chance, as it is building an ecosystem that simplifies the world of AI agents.



So, what exactly is Bedrock AgentCore?

A few years ago, when we said "AI," we usually meant a large, general-purpose language model that could generate text, answer questions, and perhaps even write code. Now, however, the AI ecosystem has become much more specialized. AI agents optimized for specific tasks or workflows have emerged. These agents are capable not only of question-and-answer interactions but also of connecting to and using external services.

AgentCore is AWS's solution for developing, operating, and scaling AI agents quickly and securely at the enterprise level. This is where the Strands Agents SDK comes in: an open-source developer package from AWS for the rapid creation and integration of AI agents. To avoid operating in a closed system, AgentCore natively supports the MCP (Model Context Protocol) and A2A (Agent-to-Agent) protocols, facilitating integration with other systems and collaboration between agents.


Challenges of AI Agents in a Corporate Environment

Everyone loves quick proofs of concept (PoCs). A development team puts together an impressive AI demo in a few days. The problem arises when this prototype must be transformed into a production-ready system.

Successful operation and scaling of AI agents involves much more than simply running an LLM. In a real business environment, the following problems arise:

·        Security and Identity Management: How do you ensure that an agent only has access to the APIs for which it is authorized? How are OAuth tokens, keys, and user rights managed?

·        Reliable memory and state management: In order for an agent to function properly, it must be able to "remember" the context. But what about long conversations and complex workflows?

·        Observability and Error Analysis: An agent often behaves like a black box. When an error occurs, how can you determine what went wrong?

·        Integration with Other Systems: CRMs, ERPs, databases, and APIs are just a few of the many integration points in the real corporate world.

With its modular and scalable building blocks, Bedrock AgentCore addresses these challenges, allowing developers to focus on developing the agent instead of solving every problem from scratch.


The main components of Bedrock AgentCore are:

Bedrock AgentCore is a modular service package that enables developers and operators to assemble the necessary functions for AI agents in a flexible manner. Rather than being a "boxed" solution, it is a family of services, each of which targets a specific real-world problem.

Here's more information about the main modules:

·        Runtime: This is the engine for agents. It is a serverless, containerized environment that ensures fast start-up and dynamic scaling. The eight-hour session support is highlighted for good reason: it enables agents to manage more complex, stateful processes, such as intricate workflows or multi-step customer interactions (see the code sketch after this list).

·        Memory: The agents' memory covers two levels.

             o   Short-term memory stores the raw context of a conversation.

             o   Long-term memory allows the system to collect and remember information from agents across different sessions, and retrieve it dynamically.

The Bedrock Memory module accomplishes this by storing memory in an encrypted database or other AWS data sources, such as DynamoDB.

·        Identity: Effective identity management involves more than just access rights; it also ensures that agents can be securely identified in the user environment. It integrates with AWS IAM and external OAuth provider services, such as Slack and Google Workspace, making it easy to integrate with existing corporate authentication systems.

·        Gateway (MCP): A managed MCP server that turns all internal and external APIs into agent-friendly tools. This allows developers to avoid writing integration code manually and instead expose standard MCP tools to which the agent can connect dynamically. The Gateway can handle Lambda functions and OpenAPI-compatible APIs.

·        Observability: Real-time monitoring and tracing are possible through integrations with CloudWatch and OpenTelemetry. Imagine a "digital map" showing what your agent is doing, which APIs it has called, and the responses it has received. This can greatly help with troubleshooting and performance optimization.
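To make the Runtime and Observability modules more concrete, here is a minimal Python sketch of an agent entrypoint hosted on AgentCore Runtime. It follows the pattern shown in AWS's preview documentation, but treat the exact import paths, the bedrock_agentcore package name, and the payload shape as assumptions that may change while the service is in preview; the OpenTelemetry span is an illustrative custom trace of the kind the Observability module can surface once tracing is configured.

```python
# Minimal AgentCore Runtime entrypoint sketch (preview APIs; names may change).
from bedrock_agentcore.runtime import BedrockAgentCoreApp  # AgentCore hosting shim
from opentelemetry import trace                            # optional custom tracing
from strands import Agent                                   # Strands Agents SDK

app = BedrockAgentCoreApp()
agent = Agent()  # uses the SDK's default Bedrock model unless configured otherwise
tracer = trace.get_tracer("demo-agent")

@app.entrypoint
def invoke(payload: dict) -> dict:
    """Handle one invocation; AgentCore Runtime passes the request as a dict."""
    prompt = payload.get("prompt", "Hello")
    with tracer.start_as_current_span("agent-invocation"):  # visible in tracing tools
        result = agent(prompt)
    return {"result": str(result)}

if __name__ == "__main__":
    # Local test server; in production the same handler runs inside AgentCore Runtime.
    app.run()
```

In the preview tooling, a companion starter toolkit packages a handler like this into a container image for the Runtime; check the current AWS documentation for the exact deployment steps.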


Strands Agents SDK

The Strands Agents SDK is the first step in the AWS AI agent ecosystem. If you've ever wanted to quickly and efficiently build AI-based prototypes on the AWS platform, the Strands SDK will become your favorite developer tool.

Why start with this?

First, the Strands SDK is supported in both Python and TypeScript environments, so most modern development teams will feel right at home with it. The code structure is straightforward, and the documentation ensures that building your first "Hello World" agent takes no longer than an afternoon.
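To give a sense of that afternoon-sized "Hello World", here is roughly what a first Strands agent looks like in Python. The callable Agent and the @tool decorator are the core SDK ideas; the headcount function and its data are invented for the example.

```python
# A first Strands agent: one custom tool plus a natural-language request.
from strands import Agent, tool

@tool
def office_headcount(city: str) -> int:
    """Return how many colleagues work in the given office (dummy data for the demo)."""
    return {"Budapest": 42, "Vienna": 17}.get(city, 0)

# The agent uses a Bedrock-hosted model by default and decides on its own
# when to call the tool while answering.
agent = Agent(tools=[office_headcount])

response = agent("How many colleagues work in the Budapest office?")
print(response)
```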

Second, the SDK is tightly integrated with AgentCore. This means that agent prototypes created in Strands are compatible with production-ready components. There is no need to worry about converting the code written during the proof of concept (PoC) into a live system because there is no break between Strands and AgentCore.

Third, the SDK has built-in MCP and A2A compatibility. This is advantageous if you are designing different microservices or an architecture consisting of multiple agents. The SDK automatically prepares the necessary interfaces and makes connecting to external APIs easy.

With the Strands SDK, you're not just building a prototype; you're building a foundation that can stand on its own at the enterprise level. It's an ideal choice for both startups and larger organizations, offering fast iteration, reusable components, and support for open protocols.


The Role of MCP and A2A Protocols

Without open protocols, AI agents would be nothing more than "talking boxes." MCP and A2A are Bedrock AgentCore's secret weapons because they form the backbone of its communication infrastructure.

·        MCP (Model Context Protocol):

MCP is essentially a standard communication layer that connects models and business logic. It wraps APIs and data sources in an abstraction layer, making them easier for agents to access. This matters because wiring up the dozens of internal and external APIs that most organizations run has traditionally been a time-consuming and fragile process.


MCP facilitates technical integration and standardizes data management and context transfer. This ensures that agents always have access to the latest and most relevant information.
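To show what "exposing an API as an MCP tool" means in practice, here is a small server built with the open-source MCP Python SDK. The order-lookup function and its response are invented for the example; in a real setup this is the kind of tool an agent, or the AgentCore Gateway, would discover and call over the protocol.

```python
# A tiny MCP server exposing one business function as a standard tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("order-service")

@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Look up the status of an order (stubbed here; would call the real API)."""
    return f"Order {order_id} has been shipped."

if __name__ == "__main__":
    # Runs over stdio by default; HTTP-based transports are also available
    # when the tool needs to be reachable by remote agents.
    mcp.run()
```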


·        A2A (Agent-to-Agent):

Modern AI systems often consist of multiple agents. The A2A protocol enables these agents to collaborate on complex tasks. For instance, a customer service agent and a financial control agent can exchange data in real time, eliminating the need for developers to build custom interfaces.

A2A's strength lies in its workflow-oriented collaboration. Agents can perform task chaining, which means the output of one agent automatically becomes the input of another. This allows even complex business processes to be automated.
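The real A2A protocol runs over the network between independently deployed agents, but the chaining idea can be sketched locally: the output of one agent becomes the input of the next. The two agents, their system prompts, and the sample complaint below are invented for illustration and assume the Strands SDK's system_prompt parameter.

```python
# Task chaining in miniature: a support agent's summary feeds a finance agent.
from strands import Agent

support_agent = Agent(system_prompt="Summarize the customer's billing complaint in two sentences.")
finance_agent = Agent(system_prompt="Given a complaint summary, propose a refund decision with reasoning.")

complaint = "I was charged twice for my June subscription and support never replied."

summary = support_agent(complaint)                          # agent 1 output...
decision = finance_agent(f"Complaint summary: {summary}")   # ...becomes agent 2 input
print(decision)
```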


Together, MCP and A2A provide all AI agents with a common language, enabling them to converse with users and collaborate with other systems and each other.


AWS Marketplace and the New Dimension of AI Agents

Not only does AWS support the AI agent ecosystem with developer tools and infrastructure, it is also increasingly positioning itself as a marketplace. The AWS Marketplace now offers not only classic software solutions but also dedicated AI agent packages and integration modules. This means companies no longer have to build agents from scratch; they can purchase preconfigured, tested components.

With the announcement of the new AI agent marketplace category, customers can more quickly discover AI agents from a central catalog. They can deploy selected solutions in a variety of ways, such as using the Amazon Bedrock AgentCore Runtime or adding tools to the AgentCore Gateway to speed up agent development.

The integration of AWS Bedrock, AgentCore, the AWS Marketplace's AI Agents and Tools category, and the Strands SDK ushers in a new era of AI agent development. What used to take weeks can now be achieved in days. Thanks to the MCP and A2A protocols, integration with AWS services and external systems is seamless.


Athene AI: The future is here

The capabilities offered by AWS Bedrock, AgentCore, and the Strands SDK align perfectly with the vision of the Athene AI team. Both Athene AI and Amazon AgentCore provide solutions to similar business challenges. What's the difference? Athene AI is a generative AI (GenAI) solution developed by three Hungarian IT companies. This makes it a secure, scalable, and easily integrable enterprise solution optimized for the Hungarian language. It is also supported by Hungarian experts who assist with implementation and proper use of the system.

Built on AWS Bedrock, our Athene AI platform takes full advantage of innovative services such as AgentCore. Our solution combines cutting-edge AI technologies with Hungarian language support and comprehensive data security and GDPR compliance for domestic and international companies.

Enterprise AI transformation is no longer just a futuristic vision. With AWS AgentCore and platforms like Athene AI, any organization can deploy cutting-edge AI agent solutions in just days. The question is not if, but when companies will adopt these solutions.

Would you like to see how Athene AI can transform your business? Request a personalized demo to experience the intersection of cutting-edge AI technology, Hungarian precision, and enterprise-grade security.