AI

How Siemens, SAP, and Confluent Shape the Future of AI Ready Integration – Highlights from the Rojo Event in Amsterdam

Enterprises want to become AI-ready, but many are held back by integration platforms that move data in slow, disconnected batches. The Rojo event “Future of Integration” in Amsterdam focused on solving this problem. Siemens, SAP, Rojo, and Confluent shared their approaches to building event-driven, intelligent data architectures that connect systems in real time. It was the ideal forum to explore how enterprises can achieve seamless, real-time connectivity across their own data and systems.

Join the data streaming community and stay informed about new blog posts by subscribing to my newsletter and following me on LinkedIn or X (formerly Twitter) to stay in touch. And make sure to download my free book about data streaming use cases, including several success stories around IT modernization.

The Future of Integration

The agenda brought together multiple viewpoints, and the next sections look at my presentation and how it connects to the topics shared by Siemens, SAP, and Rojo.

Agenda: The Future of Integration (Source: Rojo)

My Session: AI Ready Integration with Data Streaming

My presentation, “AI Ready Integration with Data Streaming,” focused on how real-time data streaming forms the foundation for the next generation of intelligent integration.

AI depends on live, accurate, and contextual data. Models lose value when they operate on static or outdated information. A Data Streaming Platform (DSP) such as Confluent, built on Apache Kafka, Flink, and Iceberg, delivers continuous, trusted data that flows freely between all systems, including SAP and modern cloud platforms.

With data streaming, enterprises can detect fraud the moment it happens, provide personalized customer service through virtual agents, or optimize supply chains with instant feedback from connected systems. Each of these examples makes the same point: real-time data turns integration from a technical task into a source of intelligence and competitive advantage.
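To make the fraud example concrete, here is a minimal Python sketch of a stateful check over an event stream. It runs standalone on a simulated stream; in a real deployment this kind of logic would typically live in a Kafka consumer or a Flink job. The rule, thresholds, and account names are illustrative assumptions, not a production fraud model.

```python
from collections import defaultdict, deque

# Illustrative rules: flag an account that makes more than 3
# transactions within a 60-second sliding window (velocity check),
# or any single transaction above an amount threshold.
WINDOW_SECONDS = 60
MAX_TX_PER_WINDOW = 3
AMOUNT_THRESHOLD = 1000.0

recent = defaultdict(deque)  # account_id -> timestamps of recent transactions

def check_transaction(account_id, amount, ts):
    """Return True if the event looks suspicious under the illustrative rules."""
    window = recent[account_id]
    # Evict events that fell out of the sliding window.
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(ts)
    return len(window) > MAX_TX_PER_WINDOW or amount > AMOUNT_THRESHOLD

# Simulated event stream (in practice: consumed from a Kafka topic).
events = [
    ("acct-1", 20.0, 0), ("acct-1", 35.0, 10),
    ("acct-1", 99.0, 20), ("acct-1", 12.0, 30),  # 4th tx within 60s -> flagged
    ("acct-2", 500.0, 25),
]

flags = [check_transaction(a, amt, ts) for a, amt, ts in events]
print(flags)  # the fourth acct-1 event trips the velocity rule
```

The point of the sketch is the shape of the computation: each event is evaluated the moment it arrives, against state built from the events before it, instead of waiting for a nightly batch to reconcile the day's transactions.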

The next step is Agentic AI, where intelligent systems act autonomously based on live data streams. This requires integration that is context-aware and scalable. Confluent enables this by orchestrating continuous data flows between platforms such as SAP, Databricks, and other enterprise systems, ensuring that AI applications always have the right context at the right time.

Here is the slide deck of my talk “AI-Ready Integration with Data Streaming”:


Other Perspectives about the Future of Integration from SAP, Siemens, and Rojo

The sessions from Siemens, SAP, and Rojo complemented this vision perfectly.

Siemens presented a strong customer story showing how they combine SAP and Confluent to modernize their integration landscape. Their solution replaced slow, batch-based data movement with real-time streaming between SAP systems and other enterprise platforms. This approach enabled higher data quality, faster processes, and new insights across business domains. Siemens is a clear example of how a data streaming backbone supports both operational excellence and AI readiness. Learn more in this article: Shift Left Architecture at Siemens: Real-Time Innovation in Manufacturing and Logistics with Data Streaming.

SAP shared its roadmap for the Integration Suite, which provides an intelligent iPaaS (Integration Platform as a Service) for connecting systems and managing business processes. SAP’s capabilities are fully complementary to a data streaming platform. While Confluent focuses on continuous event flows and data governance across the enterprise, SAP Integration Suite focuses on process orchestration, API management, and ready-made connectivity for SAP applications. Together they form a complete architecture that combines data flow and process control in one unified approach. I explored years ago why iPaaS and data streaming are different categories of middleware.

Rojo, as a system integrator, plays a key role in helping customers realize this architecture. The company brings together expertise from SAP, Confluent, and cloud environments to design integration solutions that are not only functional but intelligent and scalable. Rojo bridges the gap between traditional integration projects and modern data streaming strategies, enabling enterprises to move from simple data transfer to real time data collaboration.

Why Real Time Data Is the Foundation for AI Ready Integration

All speakers shared one clear message. The future of integration is flexible, scalable, and event-driven. Enterprises that continue to move data in nightly batches will lag behind and will not be AI-ready, because their models need fresh, curated data from all operational systems. Those that embrace streaming create a continuous data fabric that supports analytics, automation, and intelligent agents in real time.

This transformation does not replace existing integration tools. It complements them. Data streaming adds the missing event-driven real-time layer that allows AI and business systems to work together seamlessly while remaining truly decoupled from each other.
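The decoupling that an event log provides can be sketched in a few lines: producers append events once, and each consumer reads at its own offset and pace with no knowledge of the other consumers. This is a deliberately simplified in-memory illustration; Kafka generalizes the idea with topics, partitions, persistence, and replication. All class and consumer names here are made up for the example.

```python
class EventLog:
    """A toy append-only log illustrating producer/consumer decoupling."""

    def __init__(self):
        self.events = []   # append-only log (one "topic")
        self.offsets = {}  # consumer name -> next position to read

    def produce(self, event):
        self.events.append(event)

    def consume(self, consumer):
        """Return all events this consumer has not seen yet."""
        pos = self.offsets.get(consumer, 0)
        batch = self.events[pos:]
        self.offsets[consumer] = len(self.events)
        return batch

log = EventLog()
log.produce({"order": 1})
log.produce({"order": 2})

print(log.consume("ai-agent"))   # the agent reads both events
log.produce({"order": 3})
print(log.consume("ai-agent"))   # next read returns only the new event
print(log.consume("analytics"))  # a late consumer still gets the full history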


Kai Waehner

bridging the gap between technical innovation and business value for real-time data streaming, processing and analytics
