How Data Streaming Powers AI and Autonomous Networks in Telecom – Insights from TM Forum Innovate Americas

AI and autonomous networks took center stage at TM Forum Innovate Americas 2025 in Dallas. Leaders from AT&T, Verizon, and other telcos showed how large-scale AI is transforming customer experience, security, and network operations, while the vision of self-optimizing networks moves closer to reality. At the heart of these advances lies data streaming: real-time, governed data flows that connect OSS/BSS, feed AI with fresh context, and enable automation at scale. This article captures the key insights from the event and explains why data streaming is becoming the central nervous system of modern telecoms.

The Future of Data Streaming with Apache Flink for Agentic AI

Agentic AI is moving into production: autonomous, tool-using, goal-driven systems that need real-time data and context. Apache Kafka and Flink provide the event-driven foundation to run these agents at scale. With the new Flink Agents project (FLIP-531), Flink will natively support long-running, system-triggered AI agents integrated with LLMs, tools, and emerging protocols like MCP and A2A. This marks a major step toward reliable, enterprise-grade Agentic AI.
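
The Flink Agents API is still taking shape, so the following is only a minimal sketch of the underlying pattern using today's DataStream API, not the FLIP-531 interfaces. The topic name and the mapping step standing in for the LLM/tool call are hypothetical placeholders.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EventTriggeredAgentJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source: every incoming business event can trigger an agent step.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("customer-events")                // hypothetical topic
                .setGroupId("agent-trigger")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "events")
           // Stand-in for the agent step: a real agent would call an LLM and
           // tools here (for example via MCP) and emit the chosen action.
           .map(event -> "{\"action\":\"review\",\"input\":" + event + "}")
           .print();                                         // replace with a Kafka sink in practice

        env.execute("event-triggered-agent");
    }
}
```

The design point is that agents are triggered by the event log rather than by request/response calls, so they inherit Kafka's decoupling and replayability.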

Building Agentic AI with Amazon Bedrock AgentCore and Data Streaming Using Apache Kafka and Flink

Agentic AI goes beyond chatbots. These are autonomous systems that observe, reason, and act—continuously and in real time. At AWS Summit New York 2025, Amazon launched Bedrock AgentCore to build and operate secure, scalable AI agents. But to run in production, agents also need real-time data, continuous context, and flexible integration. That’s where data streaming with Apache Kafka and Apache Flink comes in. Combined with open standards like MCP and A2A, they provide the event-driven foundation for always-on, enterprise-grade AI agents.
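
As a hedged sketch of the handoff between the event log and an agent runtime: a plain Kafka consumer forwards every fresh event to an agent endpoint so the agent always acts on current context. The topic name and endpoint URL are placeholders, and the actual Bedrock AgentCore invocation API is not shown here.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaToAgentBridge {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "agent-bridge");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        HttpClient http = HttpClient.newHttpClient();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-events"));      // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand the event to the agent runtime as fresh context.
                    // Placeholder URL, not the real AgentCore API surface.
                    HttpRequest request = HttpRequest.newBuilder()
                            .uri(URI.create("https://agent.example.com/invoke"))
                            .header("Content-Type", "application/json")
                            .POST(HttpRequest.BodyPublishers.ofString(record.value()))
                            .build();
                    http.sendAsync(request, HttpResponse.BodyHandlers.ofString());
                }
            }
        }
    }
}
```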

The Rise of Kappa Architecture in the Era of Agentic AI and Data Streaming

The shift from Lambda to Kappa architecture reflects the growing demand for unified, real-time data pipelines that serve both analytical and operational needs. With the rise of Agentic AI and streaming-first systems, Kappa—powered by Apache Kafka and Apache Flink—delivers low-latency, event-driven infrastructure that supports modern applications, from scalable data products to autonomous AI agents. Open table formats and Shift Left principles further establish Kappa as the foundation for consistent, governed, and future-ready data platforms.
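
A minimal sketch of what Kappa means in practice, assuming hypothetical topic names: one Flink job reads the raw Kafka log, applies validation once (shift left), and writes a curated topic that operational services, AI agents, and the lakehouse all consume. Reprocessing is simply replaying the log, with no separate batch layer to keep in sync.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KappaPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Single source of truth: the raw event log in Kafka.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("payments")                        // hypothetical topic
                .setGroupId("kappa-pipeline")
                .setStartingOffsets(OffsetsInitializer.earliest()) // reprocessing = replaying the log
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // One curated topic serves operational apps, AI agents, and the lakehouse alike.
        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("payments-curated")
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "payments")
           .filter(event -> !event.isBlank())                 // stand-in for shift-left validation and cleansing
           .sinkTo(sink);

        env.execute("kappa-pipeline");
    }
}
```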

Agentic AI and RAG in Regulated FinTech with Apache Kafka at Alpian Bank

Regulated FinTech is transforming financial services by combining compliance with innovation. This post explores how real-time data streaming with Apache Kafka and Flink enables modern architecture, personalization, and AI integration—while maintaining strict governance. Alpian, a fully licensed Swiss digital bank, showcases how Agentic AI, RAG, and domain-driven design work together in a compliant, cloud-only environment.

How OpenAI Uses Apache Kafka and Flink for GenAI

At the Current 2025 conference in London, OpenAI revealed how it builds and scales the real-time data streaming infrastructure that powers its GenAI systems, including ChatGPT. This blog post summarizes the role of Apache Kafka and Apache Flink in OpenAI’s architecture—enabling near-instant data processing, continuous feedback loops, and scalable coordination across model training and applications. From simplified Kafka consumption to multi-region Flink pipelines, OpenAI’s sessions showed why real-time data infrastructure is essential for both generative and agentic AI.

How Penske Logistics Transforms Fleet Intelligence with Data Streaming and AI

Real-time visibility has become essential in logistics. As supply chains grow more complex, providers must shift from delayed, batch-based systems to event-driven architectures. Data Streaming technologies like Apache Kafka and Apache Flink enable this shift by allowing continuous processing of data from telematics, inventory systems, and customer interactions. Penske Logistics is leading the way—using Confluent’s platform to stream and process 190 million IoT messages daily. This powers predictive maintenance, faster roadside assistance, and higher fleet uptime. The result: smarter operations, improved service, and a scalable foundation for the future of logistics.
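
As an illustrative sketch only (the topic name and CSV payload format are assumptions, not Penske's actual schema), a small Flink job can turn raw telematics into maintenance alerts in real time:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MaintenanceAlertJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> telemetry = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("truck-telematics")                // hypothetical topic and payload format
                .setGroupId("maintenance-alerts")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(telemetry, WatermarkStrategy.noWatermarks(), "telematics")
           // Assumed well-formed CSV payload: vehicleId,engineTempCelsius,odometerKm
           .filter(line -> {
               String[] fields = line.split(",");
               return fields.length == 3 && Double.parseDouble(fields[1]) > 110.0;
           })
           .map(line -> "ALERT: engine overheating -> " + line)
           .print();                                          // replace with a Kafka sink feeding dispatch systems

        env.execute("predictive-maintenance-alerts");
    }
}
```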

Data Streaming Meets the SAP Ecosystem and Databricks – Insights from SAP Sapphire Madrid

SAP Sapphire 2025 in Madrid brought together global SAP users, partners, and technology leaders to showcase the future of enterprise data strategy. Key themes included SAP’s Business Data Cloud (BDC) vision, Joule for Agentic AI, and the deepening SAP-Databricks partnership. A major topic throughout the event was the increasing need for real-time integration across SAP and non-SAP systems—highlighting the critical role of event-driven architectures and data streaming platforms like Confluent. This blog shares insights on how data streaming enhances SAP ecosystems, supports AI initiatives, and enables industry-specific use cases across transactional and analytical domains.

Agentic AI with the Agent2Agent Protocol (A2A) and MCP using Apache Kafka as Event Broker

Agentic AI is emerging as a powerful pattern for building autonomous, intelligent, and collaborative systems. To move beyond isolated models and task-based automation, enterprises need a scalable integration architecture that supports real-time interaction, coordination, and decision-making across agents and services. This blog explores how the combination of Apache Kafka, Model Context Protocol (MCP), and Google’s Agent2Agent (A2A) protocol forms the foundation for Agentic AI in production. By replacing point-to-point APIs with event-driven communication as the integration layer, enterprises can achieve decoupling, flexibility, and observability—unlocking the full potential of AI agents in modern enterprise environments.
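
A minimal sketch of the idea, with a hypothetical topic and task schema: instead of invoking another agent's API point-to-point, an agent publishes its task as an event, and any agent with the matching skill consumes it, so new agents can join without changing the producer.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AgentTaskPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Publish the task as an event on a shared topic instead of calling
        // another agent's API directly; consumers stay fully decoupled.
        String task = "{\"taskId\":\"t-42\",\"skill\":\"fraud-check\",\"payload\":{\"orderId\":\"o-123\"}}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("agent-tasks", "t-42", task)); // hypothetical topic and schema
            producer.flush();
        }
    }
}
```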

Databricks and Confluent Leading Data and AI Architectures – What About Snowflake, BigQuery, and Friends?

Confluent, Databricks, and Snowflake are trusted by thousands of enterprises to power critical workloads—each with a distinct focus: real-time streaming, large-scale analytics, and governed data sharing. Many customers use them in combination to build flexible, intelligent data architectures. This blog highlights how Erste Bank uses Confluent and Databricks to enable generative AI in customer service, while Siemens combines Confluent and Snowflake to optimize manufacturing and healthcare with a shift-left approach. Together, these examples show how a streaming-first foundation drives speed, scalability, and innovation across industries.