The AI journey for many started with chatbots. But agentic AI goes far beyond that. These are autonomous systems that observe, reason, and act—continuously and in real time. At the AWS Summit New York 2025, Amazon introduced Bedrock AgentCore, a new foundation for building and operating enterprise-grade agents. It’s a major step forward. But models and agents alone aren’t enough.
To succeed in production, agents need real-time data, asynchronous communication, and scalable integration across systems. That’s where data streaming with Apache Kafka and Apache Flink comes in. Paired with Amazon Bedrock AgentCore and standard protocols like MCP and A2A, Kafka and Flink provide the event-driven architecture needed for true agentic automation.
This post explores how these technologies work together to enable intelligent, always-on agents in real enterprise environments.
Join the data streaming community and stay informed about new blog posts by subscribing to my newsletter, and follow me on LinkedIn or X (formerly Twitter) to stay in touch. And make sure to download my free book about data streaming use cases, including various AI examples across industries.
Agentic AI is no longer just an experimental technology. It’s evolving into a practical foundation for building intelligent, autonomous, and collaborative systems in production. These agents go far beyond chatbots. They reason, plan, and take actions independently across business systems—continuously and in real time.
At the AWS Summit New York 2025, Amazon introduced several major innovations to support this vision. The most prominent is Amazon Bedrock AgentCore. This is not just another platform for developers. It is a full suite of services for deploying secure, scalable AI agents across industries.
AgentCore is designed to bridge the gap between AI prototypes and scalable production agents. It includes core services such as:
These features enable agents to work reliably across internal tools, external services, and dynamic data flows.
AWS highlighted a diverse mix of partners leading the charge in this new era, including:
And then a surprise to many:
Autonomous agents are NOT request-response tools. They are event-driven and reactive by nature.
Agents need to monitor signals across a system, evaluate conditions, and take action without waiting for a human prompt. This requires a continuous stream of data, not periodic updates.
Apache Kafka plays a key role here. It serves as the event broker that connects agents running on AWS Bedrock with the rest of the enterprise. Kafka provides a shared event backbone across departments, applications, and even partner ecosystems.
With Apache Kafka, agents can:
Apache Kafka replaces fragile point-to-point APIs with a scalable, decoupled architecture. It ensures every part of the system stays in sync, even as new tools, services, and agents are added.
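To make the pattern concrete, here is a minimal sketch of an event-driven agent implemented as a plain Kafka consumer. The topic, consumer group, and reaction logic are hypothetical placeholders, not part of AgentCore: the point is that the agent subscribes to business events and reacts to each one as it arrives, without ever being prompted.

```java
// Minimal sketch of an event-driven agent loop (topic and group names are hypothetical).
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class OrderAgent {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local cluster
        props.put("group.id", "order-agent");             // hypothetical consumer group
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));        // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // The agent reacts to each business event as it arrives,
                    // e.g., by calling an LLM or a Bedrock AgentCore endpoint.
                    System.out.printf("Reacting to event %s=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

Because the agent is just another consumer group on a shared topic, new agents can be added or replaced without touching the producers that emit the events.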
While Kafka connects the system, Apache Flink adds real-time intelligence to the stream.
Many production agents don’t just wait for a prompt. They run constantly in the background, observing patterns, detecting anomalies, and triggering actions. Think of fraud detection, supply chain optimization, or real-time personalization.
Here is an example using Flink with OpenAI’s LLM that I built a few quarters ago. Amazon Bedrock AgentCore can be integrated in the same way.
Flink enables this by:
These agents become embedded services, not just LLM wrappers. They act as digital sentinels, always listening and ready to respond, automatically and intelligently.
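As a minimal sketch of such a digital sentinel, a Flink DataStream job can continuously score payment events from Kafka and surface suspicious ones. The topic names, threshold, and JSON helper are hypothetical, and the simple filter stands in for a real model or Bedrock AgentCore call:

```java
// Minimal Flink sketch of an always-on agent (topics, threshold, and helper are hypothetical).
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FraudSentinel {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> payments = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")    // assumption: local cluster
                .setTopics("payments")                    // hypothetical topic
                .setGroupId("fraud-sentinel")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(payments, WatermarkStrategy.noWatermarks(), "payments")
           // Naive rule as a stand-in for a model or AgentCore call: flag large amounts.
           .filter(json -> extractAmount(json) > 10_000)
           // In production this would be a Kafka sink publishing alert events
           // that a Bedrock agent consumes to investigate or act.
           .print();

        env.execute("fraud-sentinel");
    }

    // Hypothetical helper; real code would use a proper JSON parser.
    private static double extractAmount(String json) {
        Matcher m = Pattern.compile("\"amount\"\\s*:\\s*([0-9.]+)").matcher(json);
        return m.find() ? Double.parseDouble(m.group(1)) : 0.0;
    }
}
```

The job never terminates: it keeps evaluating every payment as it streams in, which is exactly the always-on behavior a request-response API cannot provide.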
And with FLIP-531, Apache Flink goes beyond orchestration and data preparation in agentic AI environments. It now provides a native runtime to build, run, and manage autonomous AI agents at scale, supporting the A2A and MCP standard protocols.
For agentic AI to scale, systems need a common language.
Two new protocols are addressing this: the Model Context Protocol (MCP), which standardizes how agents connect to tools and data sources, and Agent2Agent (A2A), which standardizes how agents communicate with each other.
An event-driven architecture with data streaming powered by Apache Kafka and Flink provides the perfect foundation for both protocols.
Instead of synchronous APIs, agents communicate through Kafka topics—publishing and subscribing to structured events defined by MCP or A2A. This model is fully decoupled, highly scalable, and easy to monitor.
Kafka topics act as a data substrate between agents and systems using MCP and A2A, which are otherwise stateless protocols. Without Kafka, replayability and traceability aren’t possible, so interoperability in production becomes infeasible. Kafka allows enterprises to mix and match implementations across languages, clouds, and teams without breaking downstream workflows.
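As a minimal sketch of this pattern (the topic, key, and tool name are hypothetical; only the JSON-RPC 2.0 envelope follows the MCP message format), an agent can publish a tool call as a structured Kafka event instead of making a synchronous API request:

```java
// Minimal sketch: publishing an MCP-style tool call as a Kafka event.
// Topic, key, and tool name are hypothetical; the envelope follows
// MCP's JSON-RPC 2.0 message format.
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AgentToolCallPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local cluster
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        String toolCall = """
            {"jsonrpc":"2.0","id":"42","method":"tools/call",
             "params":{"name":"check_inventory","arguments":{"sku":"ABC-123"}}}""";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Other agents (or an MCP server bridge) subscribe to this topic,
            // execute the tool call, and publish the result to a reply topic.
            producer.send(new ProducerRecord<>("agent.tool-calls", "42", toolCall));
            producer.flush();
        }
    }
}
```

Because the call is just an event on a topic, it can be replayed, audited, and consumed by multiple implementations, which is what makes the otherwise stateless protocols production-grade.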
AWS supports a broad ecosystem of tools for AI. But Confluent is critical for one key reason: real-time integration combined with an event-driven architecture for true decoupling and flexibility.
With Confluent Cloud providing serverless Kafka and Flink, organizations can:
This transforms Confluent from a data pipeline into an agent platform—not just supporting agents, but enabling the creation of new ones.
Obviously, many other frameworks, solutions, and vendors play a role in the data streaming landscape (but Confluent is the clear leader):
Don’t just take my word for it (I work at Confluent). Do your own research, evaluation, and cost analysis. Look at Forrester Waves, technical comparisons, and independent benchmarks. Just make sure to consider the entire total cost of ownership (TCO). And think about time to market, too.
Agentic AI signals a new era of enterprise automation where intelligent software doesn’t just respond to prompts, but observes, reasons, and acts autonomously across systems.
To move from prototype to production, enterprises need:
Together, AWS and Confluent provide this foundation. Bedrock AgentCore delivers the platform for building enterprise-grade agents. Confluent Cloud provides the real-time, event-driven architecture to connect those agents with tools, data, and systems.
And importantly, Confluent runs wherever your business runs. Whether you’re all-in on AWS or operating in Azure, Google Cloud, Alibaba Cloud, on-premises, at the edge, or across a hybrid architecture, Confluent’s data streaming platform offers consistent deployment, management, and security.
The agentic future will be real-time, decoupled, and distributed.
And it runs on events.