How OpenAI Uses Apache Kafka and Flink for GenAI

At the Current 2025 conference in London, OpenAI revealed how it builds and scales the real-time data streaming infrastructure that powers its GenAI systems, including ChatGPT. This blog post summarizes the role of Apache Kafka and Apache Flink in OpenAI’s architecture—enabling near-instant data processing, continuous feedback loops, and scalable coordination across model training and applications. From simplified Kafka consumption to multi-region Flink pipelines, OpenAI’s sessions showed why real-time data infrastructure is essential for both generative and agentic AI.
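
The sessions did not share OpenAI's actual code, but the pattern they described, Flink pipelines consuming event streams from Kafka for continuous processing, can be sketched minimally as below. The broker address, topic, group ID, and placeholder transformation are all assumptions for illustration, not OpenAI's implementation.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FeedbackPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Consume feedback events from Kafka (broker and topic names are hypothetical)
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker:9092")
                .setTopics("model-feedback")
                .setGroupId("feedback-pipeline")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-feedback");

        // Placeholder transformation; a real pipeline would parse, enrich, and aggregate
        events.map(String::toUpperCase).print();

        env.execute("feedback-pipeline");
    }
}
```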

How Penske Logistics Transforms Fleet Intelligence with Data Streaming and AI

Real-time visibility has become essential in logistics. As supply chains grow more complex, providers must shift from delayed, batch-based systems to event-driven architectures. Data streaming technologies like Apache Kafka and Apache Flink enable this shift by allowing continuous processing of data from telematics, inventory systems, and customer interactions. Penske Logistics is leading the way—using Confluent’s platform to stream and process 190 million IoT messages daily. This powers predictive maintenance, faster roadside assistance, and higher fleet uptime. The result: smarter operations, improved service, and a scalable foundation for the future of logistics.
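
The post stays at the architecture level; as a rough illustration of the consuming end of such a telematics pipeline, here is a minimal Kafka consumer that flags engine-temperature readings above a threshold. The topic name, payload format, and threshold are invented for this sketch and are not Penske's actual implementation.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class TelematicsMonitor {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("group.id", "fleet-maintenance");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("truck-telemetry"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Value is assumed to be a plain numeric engine temperature for brevity;
                    // real telematics payloads would be Avro/JSON with many sensor fields.
                    double engineTemp = Double.parseDouble(record.value());
                    if (engineTemp > 110.0) {
                        System.out.printf("Maintenance alert for vehicle %s: %.1f C%n",
                                record.key(), engineTemp);
                    }
                }
            }
        }
    }
}
```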

Data Streaming Meets the SAP Ecosystem and Databricks – Insights from SAP Sapphire Madrid

SAP Sapphire 2025 in Madrid brought together global SAP users, partners, and technology leaders to showcase the future of enterprise data strategy. Key themes included SAP’s Business Data Cloud (BDC) vision, Joule for Agentic AI, and the deepening SAP-Databricks partnership. A major topic throughout the event was the increasing need for real-time integration across SAP and non-SAP systems—highlighting the critical role of event-driven architectures and data streaming platforms like Confluent. This blog shares insights on how data streaming enhances SAP ecosystems, supports AI initiatives, and enables industry-specific use cases across transactional and analytical domains.

Agentic AI with the Agent2Agent Protocol (A2A) and MCP using Apache Kafka as Event Broker

Agentic AI is emerging as a powerful pattern for building autonomous, intelligent, and collaborative systems. To move beyond isolated models and task-based automation, enterprises need a scalable integration architecture that supports real-time interaction, coordination, and decision-making across agents and services. This blog explores how the combination of Apache Kafka, Model Context Protocol (MCP), and Google’s Agent2Agent (A2A) protocol forms the foundation for Agentic AI in production. By replacing point-to-point APIs with event-driven communication as the integration layer, enterprises can achieve decoupling, flexibility, and observability—unlocking the full potential of AI agents in modern enterprise environments.
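
To make the shift from point-to-point APIs to event-driven communication concrete, here is a minimal sketch of an agent publishing a task as a Kafka event that any number of other agents can consume. The topic name and JSON payload shape are assumptions for illustration; A2A and MCP define their own message semantics on top of such a transport.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AgentTaskPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        // Instead of calling a peer agent's API directly, publish the task as an event.
        // The payload shape here is a made-up example.
        String task = "{\"taskId\":\"42\",\"skill\":\"summarize\",\"payload\":\"...\"}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("agent-tasks", "agent-a", task));
            producer.flush();
        }
    }
}
```

Because the task lands on a durable log rather than in a direct call, consuming agents are decoupled from the producer, new consumers can be added without changing it, and the full interaction history stays replayable and observable.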

Databricks and Confluent Leading Data and AI Architectures – What About Snowflake, BigQuery, and Friends?

Confluent, Databricks, and Snowflake are trusted by thousands of enterprises to power critical workloads—each with a distinct focus: real-time streaming, large-scale analytics, and governed data sharing. Many customers use them in combination to build flexible, intelligent data architectures. This blog highlights how Erste Bank uses Confluent and Databricks to enable generative AI in customer service, while Siemens combines Confluent and Snowflake to optimize manufacturing and healthcare with a shift-left approach. Together, these examples show how a streaming-first foundation drives speed, scalability, and innovation across industries.

Databricks and Confluent in the World of Enterprise Software (with SAP as Example)

Enterprise data lives in complex ecosystems—SAP, Oracle, Salesforce, ServiceNow, IBM Mainframes, and more. This article explores how Confluent and Databricks integrate with SAP to bridge operational and analytical workloads in real time. It outlines architectural patterns, trade-offs, and use cases like supply chain optimization, predictive maintenance, and financial reporting, showing how modern data streaming unlocks agility, reuse, and AI-readiness across even the most SAP-centric environments.

How Apache Kafka and Flink Power Event-Driven Agentic AI in Real Time

Agentic AI marks a major evolution in artificial intelligence—shifting from passive analytics to autonomous, goal-driven systems capable of planning and executing complex tasks in real time. To function effectively, these intelligent agents require immediate access to consistent, trustworthy data. Traditional batch processing architectures fall short of this need, introducing delays, data staleness, and rigid workflows. This blog post explores why event-driven architecture (EDA)—powered by Apache Kafka and Apache Flink—is essential for building scalable, reliable, and adaptive AI systems. It introduces key concepts such as Model Context Protocol (MCP) and Google’s Agent2Agent (A2A) protocol, which are redefining interoperability and context management in multi-agent environments. Real-world use cases from finance, healthcare, manufacturing, and more illustrate how Kafka and Flink provide the real-time backbone needed for production-grade Agentic AI. The post also highlights why popular frameworks like LangChain and LlamaIndex must be complemented by robust streaming infrastructure to support stateful, event-driven AI at scale.
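
As a minimal illustration of the "stateful, event-driven" point, the sketch below uses Flink keyed state to maintain a tiny per-agent context across events. Everything here (class names, the count-as-context simplification, the inline source) is invented for the example; a production job would read agent events from Kafka and keep far richer state.

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class AgentContextJob {

    // Counts events per agent as a stand-in for richer context (conversation
    // history, tool results, etc.) that a production job would keep in state.
    static class ContextFn extends KeyedProcessFunction<String, String, String> {
        private transient ValueState<Long> seen;

        @Override
        public void open(Configuration parameters) {
            seen = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("events-seen", Long.class));
        }

        @Override
        public void processElement(String event, Context ctx, Collector<String> out)
                throws Exception {
            long count = seen.value() == null ? 1 : seen.value() + 1;
            seen.update(count);
            out.collect(ctx.getCurrentKey() + " has processed " + count + " events");
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // In production the source would be a Kafka topic of agent events;
        // fromElements keeps the sketch self-contained.
        DataStream<String> events = env.fromElements("agent-a", "agent-b", "agent-a");

        events.keyBy(e -> e)
              .process(new ContextFn())
              .print();

        env.execute("agent-context");
    }
}
```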

CIO Summit: The State of AI and Why Data Streaming is Key for Success

The CIO Summit in Amsterdam provided a valuable perspective on the state of AI adoption across industries. While enthusiasm for AI remains high, organizations are grappling with the challenge of turning potential into tangible business outcomes. Key discussions centered on distinguishing hype from real value, the importance of high-quality and real-time data, and the role of automation in preparing businesses for AI integration. A recurring theme was that AI is not a standalone solution—it must be supported by a strong data foundation, clear ROI objectives, and a strategic approach. As AI continues to evolve toward more autonomous, agentic systems, data streaming will play a critical role in ensuring AI models remain relevant, context-aware, and actionable in real time.