Architecture patterns for distributed, hybrid, edge and global Apache Kafka deployments

Multi-cluster and cross-data-center deployments of Apache Kafka have become the norm rather than an exception. Learn about several scenarios that may require multi-cluster solutions and see real-world examples with their specific requirements and trade-offs, including disaster recovery, aggregation for analytics, cloud migration, mission-critical stretched deployments and global Kafka.

Key Takeaways for Multi-Data-Center Kafka Architectures

  • In many scenarios, one Kafka cluster is not enough. Understand different architectures and alternatives for multi-cluster deployments.
  • Zero data loss and high availability are two key requirements. Understand how to meet them, including the trade-offs involved (see the producer configuration sketch after this list).
  • Learn about the features and limitations of Kafka for multi-cluster deployments. Global Kafka and mission-critical multi-cluster deployments with zero data loss and high availability have become the norm, not an exception.
  • Learn about architectures like stretched clusters, hybrid integration and fully managed serverless Kafka in the cloud (using Confluent Cloud), and tools like MirrorMaker 2, Confluent Replicator, Multi-Region Clusters (MRC), Global Kafka, and more (a MirrorMaker 2 replication sketch follows below).
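
To make the zero-data-loss requirement more concrete, here is a minimal sketch of a Kafka producer configured for durability, written against the standard Java client. The broker addresses and the "payments" topic are placeholder assumptions; for acks=all to actually prevent data loss, the topic also needs a replication factor of at least three and min.insync.replicas=2 on the broker side.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class DurableProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker list - replace with the brokers of your cluster.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-1:9092,broker-2:9092,broker-3:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Wait for all in-sync replicas to acknowledge every write.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Idempotence prevents duplicates when retries kick in.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            // Keep retrying for up to two minutes before reporting a failed send.
            props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "120000");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic and payload, for illustration only.
                producer.send(new ProducerRecord<>("payments", "order-42", "created"));
                producer.flush();
            }
        }
    }

The trade-off is higher end-to-end latency, which is exactly the kind of compromise discussed for stretched and multi-region clusters.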

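For the replication tooling mentioned in the last bullet, the sketch below shows a minimal MirrorMaker 2 configuration for an active-passive disaster recovery setup. The cluster aliases, bootstrap addresses and the catch-all topic pattern are assumptions for illustration; such a file is passed to Kafka's connect-mirror-maker.sh script.

    # Aliases for the two clusters involved in replication (names are arbitrary).
    clusters = primary, dr

    # Placeholder bootstrap servers for each cluster.
    primary.bootstrap.servers = primary-broker-1:9092
    dr.bootstrap.servers = dr-broker-1:9092

    # Replicate all topics one way, from the active cluster to the passive DR cluster.
    primary->dr.enabled = true
    primary->dr.topics = .*
    dr->primary.enabled = false

    # Replicated topics should be as durable as their source.
    replication.factor = 3

MirrorMaker 2 replicates asynchronously, so a failover can still lose the last unreplicated messages; synchronous approaches such as stretched or Multi-Region Clusters trade latency for stronger guarantees, one of the trade-offs covered in the talk.
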
Slide Deck

(Slide deck embedded from www.slideshare.net.)

Video Recording
