The digital transformation enables a connected world. People, vehicles, factories, cities, digital services, and other “things” communicate with each other in real time to provide a safe environment, efficient processes, and a fantastic user experience. This scenario only works well with data processing in real time at scale. This blog post shares a presentation that explains why Apache Kafka plays a key role in these industries and use cases, and how it connects the different stakeholders.
Event Streaming with Apache Kafka processes massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
I want to give you an overview of existing use cases for event streaming technology in a connected world, spanning supply chains, industries, and the customer experiences that come along with these interdisciplinary data intersections:
The characteristics and requirements of all these industries and sectors are not new: they need data integration, data correlation, and true decoupling. The difference is the massively increased volume of data.
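To make data correlation more concrete, here is a minimal Kafka Streams sketch that joins two event streams keyed by a shared vehicle ID. The topic names, String serdes, and the five-minute join window are illustrative assumptions, not taken from the presentation:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.StreamJoined;

import java.time.Duration;
import java.util.Properties;

public class VehicleTelemetryCorrelator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "telemetry-correlator");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Two independent event streams, both keyed by vehicle ID (hypothetical topics).
        KStream<String, String> positions = builder.stream("vehicle-positions");
        KStream<String, String> orders = builder.stream("delivery-orders");

        // Correlate position and order events that arrive within five minutes of each other.
        positions.join(orders,
                (position, order) -> position + " | " + order,
                JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofMinutes(5)),
                StreamJoined.with(Serdes.String(), Serdes.String(), Serdes.String()))
            .to("correlated-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Note the decoupling: the producers of both topics know nothing about this application, and additional consumers can read the correlated stream independently.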
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling or specific IIoT platforms). Proprietary monoliths have monitored plants, telco networks, and other infrastructures in real time for decades. But Kafka combines all of the above characteristics in an open, scalable, and flexible infrastructure that operates mission-critical workloads at scale in real time, and it is taking over the world of connecting data.
The article “Apache Kafka vs. MQ/ETL/ESB” goes into more detail about this discussion.
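Integrating such legacy systems is typically a Kafka Connect job rather than custom code. As a hedged sketch, the following connector configuration streams new rows from a legacy relational database into a Kafka topic. It assumes Confluent's separately installed JDBC source connector, and every name and connection detail below is a placeholder:

```json
{
  "name": "legacy-erp-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://legacy-erp.example:5432/erp",
    "connection.user": "kafka_connect",
    "connection.password": "change-me",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "erp."
  }
}
```

Posting this JSON to the Kafka Connect REST API continuously publishes new rows of the orders table to the erp.orders topic, without the legacy system knowing anything about its downstream consumers.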
Before we jump into the presentation, I want to cover one key trend I see across industries: A streaming data exchange with Apache Kafka:
TL;DR: If you use event streaming with Kafka in your projects (for reasons like real-time processing, scalability, and decoupling), and your partner does the same, then it does NOT make sense to put a REST / HTTP API in the middle. Instead, the partners should integrate in a streaming way and replicate events directly between their Kafka clusters.
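As a sketch of what such a cluster-to-cluster integration can look like, here is a minimal MirrorMaker 2 configuration (MirrorMaker 2 ships with Apache Kafka) that replicates selected topics from one partner's cluster to another's. The cluster aliases, bootstrap servers, and topic patterns are illustrative assumptions:

```properties
# mm2.properties - minimal sketch; all aliases, addresses, and topics are placeholders
clusters = partnerA, partnerB
partnerA.bootstrap.servers = kafka.partner-a.example:9092
partnerB.bootstrap.servers = kafka.partner-b.example:9092

# Replicate order and shipment topics from partner A into partner B's cluster
partnerA->partnerB.enabled = true
partnerA->partnerB.topics = orders.*, shipments.*
replication.factor = 3
```

Started with Kafka's bundled bin/connect-mirror-maker.sh, this keeps the partner clusters decoupled while exchanging events in real time; Confluent Cluster Linking is a commercial alternative for the same pattern.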
APIs and API Management still have their value for some use cases, of course. Check out the comparison of “Event Streaming with Apache Kafka vs. API Gateway / API Management with Mulesoft or Kong” for more details.
Here is the slide deck covering various use cases and architectures to realize a connected world with Apache Kafka from different perspectives:
The on-demand video recording walks you through the above presentation:
Connecting the world is a key requirement across industries. Many innovative digital services are only possible through collaboration between stakeholders. This requires real-time messaging, integration, continuous stream processing, and replication between partners. Event Streaming with Apache Kafka helps implement these use cases.
What are your experiences and plans for event streaming to connect the world? Did you already build applications with Apache Kafka to connect your products and services to partners? Let’s connect on LinkedIn and discuss it! Stay informed about new blog posts by subscribing to my newsletter.