
Mainframe Integration with Data Streaming: Architecture, Business Value, Real-World Success

The mainframe is evolving—not fading. With cloud-native features, AI acceleration, and quantum-safe encryption, platforms like IBM z16 and z17 remain central to critical industries. But modern demands require real-time data access and system agility. Apache Kafka and Apache Flink make this possible by streaming data bi-directionally between mainframe systems such as DB2, IMS, and IBM MQ on one side and cloud analytics platforms on the other. This enables event-driven architectures without disrupting core systems. This post outlines proven strategies—offloading, integration, and replacement—and includes real-world examples across industries. The result: lower costs, faster innovation, and smarter use of legacy systems.
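To make the offloading strategy more concrete, here is a minimal Kafka Streams sketch. It assumes a hypothetical topic mainframe.db2.accounts that a CDC tool (such as IBM IIDR, Precisely, or Qlik) already feeds with DB2 change events; the topic names, bootstrap address, and serdes are illustrative assumptions, not details from the post.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class MainframeOffloadApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "mainframe-offload");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical topic fed by a CDC tool from DB2 on z/OS. The latest state per
        // account key is materialized into a local store and a downstream topic, so
        // analytical reads no longer hit the mainframe (the offloading strategy).
        KTable<String, String> accounts = builder.table(
                "mainframe.db2.accounts",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.as("accounts-store"));

        accounts.toStream().to("cloud.analytics.accounts",
                Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Cloud consumers then read cloud.analytics.accounts (or query the materialized store) instead of issuing expensive read requests against DB2 on the mainframe.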

Replacing Legacy Systems, One Step at a Time with Data Streaming: The Strangler Fig Approach

Modernizing legacy systems doesn’t have to mean a risky big-bang rewrite. This blog explores how the Strangler Fig Pattern, when combined with data streaming, enables gradual, low-risk transformation—unlocking real-time capabilities, reducing complexity, and supporting scalable, cloud-native architectures. Discover how leading organizations are using this approach to migrate at their own pace, stay compliant, and enable new business models. Plus, why Reverse ETL falls short and streaming is the future of IT modernization.
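As a hedged illustration of how data streaming supports a Strangler Fig migration, the sketch below routes an order stream so that one slice of traffic (here, hypothetical mobile-channel orders) is handled by the new cloud-native service while everything else continues to flow to the legacy application. Topic names and the routing rule are assumptions for illustration only.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class StranglerFigRouter {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "strangler-fig-router");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");

        // The slice of traffic the new microservice already owns is routed to its topic;
        // the rest keeps flowing to the legacy system until it is fully "strangled".
        orders.filter((key, value) -> value.contains("\"channel\":\"mobile\""))
              .to("orders.new-service");
        orders.filterNot((key, value) -> value.contains("\"channel\":\"mobile\""))
              .to("orders.legacy");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

As more capabilities move over, the routing rule widens step by step until orders.legacy receives no traffic and the legacy system can be retired.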

Multi-Cloud Replication in Real-Time with Apache Kafka and Cluster Linking

Multiple Apache Kafka clusters are the norm, not an exception anymore. Hybrid integration and multi-cloud replication for migration or disaster recovery are common use cases. This blog post explores a real-world success story from financial services: a large traditional bank's transition from on-premise data centers to the public cloud, with multi-cloud data sharing between AWS and Azure.
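Cluster Linking is a Confluent-specific capability, so as a generic, hedged sketch of the same idea the snippet below shows a minimal Apache Kafka MirrorMaker 2 configuration that replicates selected topics from an on-premise cluster to a cloud cluster. All cluster aliases, bootstrap addresses, and topic patterns are placeholder assumptions.

```properties
# Minimal MirrorMaker 2 configuration (run with connect-mirror-maker.sh).
# Replicates topics matching the patterns from the "onprem" cluster to the "cloud" cluster.
clusters = onprem, cloud
onprem.bootstrap.servers = broker1.dc1.example.internal:9092
cloud.bootstrap.servers = kafka.cloud.example.com:9092

onprem->cloud.enabled = true
onprem->cloud.topics = payments.*, customers.*

replication.factor = 3
checkpoints.topic.replication.factor = 3
heartbeats.topic.replication.factor = 3
offset-syncs.topic.replication.factor = 3
```

By default MirrorMaker 2 prefixes replicated topics with the source alias (for example onprem.payments in the target cluster), whereas Cluster Linking preserves topic names and offsets and does not require a separate Connect or MirrorMaker deployment, which is one reason it is attractive for migration and disaster recovery.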

Apache Kafka Cluster Type Deployment Strategies

Organizations usually start their data streaming adoption with a single Apache Kafka cluster for the first use cases. The need for group-wide data governance and security, combined with different SLA, latency, and infrastructure requirements, then introduces additional Kafka clusters. Multiple Kafka clusters are the norm, not an exception. Use cases include hybrid integration, aggregation, migration, and disaster recovery. This blog post explores real-world success stories and cluster strategies for different Kafka deployments across industries.

Message Broker and Apache Kafka: Trade-Offs, Integration, Migration

A message broker has very different characteristics and use cases than a data streaming platform like Apache Kafka. Data integration, processing, governance, and security must be reliable and scalable across the entire business process. This blog post explores the capabilities of message brokers, their relation to the JMS standard, the trade-offs compared to data streaming with Apache Kafka, and typical integration and migration scenarios.
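A typical integration scenario is a small bridge that consumes from a JMS queue and republishes each message as a Kafka event. The sketch below is a minimal, hedged example against the generic javax.jms (JMS 2.0) API; the queue name, topic name, and broker connection factory are assumptions, and in practice a ready-made Kafka Connect source connector for the respective JMS broker is usually preferred over hand-written bridge code.

```java
import java.util.Properties;
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class JmsToKafkaBridge {

    public static void main(String[] args) throws Exception {
        // Any JMS-compliant broker (IBM MQ, ActiveMQ, TIBCO EMS, ...) provides its own
        // ConnectionFactory implementation; how it is obtained is vendor-specific.
        ConnectionFactory factory = lookupConnectionFactory();

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (Connection connection = factory.createConnection();
             KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {

            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("ORDERS.QUEUE");
            MessageConsumer consumer = session.createConsumer(queue);
            connection.start();

            while (true) {
                TextMessage message = (TextMessage) consumer.receive();
                // Forward each JMS message into a Kafka topic, where it becomes a
                // replayable event for many independent consumers.
                producer.send(new ProducerRecord<>("orders",
                        message.getJMSCorrelationID(), message.getText()));
            }
        }
    }

    private static ConnectionFactory lookupConnectionFactory() {
        // Placeholder: typically a JNDI lookup or a vendor-specific factory class.
        throw new UnsupportedOperationException("wire in the broker-specific ConnectionFactory");
    }
}
```

Once downstream consumers read from the Kafka topic, they gain replay and independent scaling, which a queue-based JMS broker does not provide; the legacy sender can later be migrated to produce to Kafka directly.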

Kafka for Cybersecurity (Part 6 of 6) – SIEM / SOAR Modernization

This blog series explores use cases and architectures for Apache Kafka in the cybersecurity space, including situational awareness, threat intelligence, forensics, air-gapped and zero trust environments, and SIEM / SOAR modernization. This post is part six: SIEM / SOAR modernization and integration.
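A common building block in SIEM modernization is curating the event stream before it reaches the often expensive SIEM index. The following minimal Kafka Streams sketch assumes hypothetical topics security.raw-logs and security.siem-curated and a simplistic severity rule; it illustrates the pattern rather than the exact architecture from the post.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

public class SiemCurationApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "siem-curation");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // All raw logs land in one high-volume topic; only events matching a simple
        // severity rule are forwarded to the topic the SIEM sink indexes, while the
        // complete raw stream stays in Kafka for replay and forensics.
        builder.stream("security.raw-logs")
               .filter((key, value) -> value != null && value.toString().contains("\"severity\":\"HIGH\""))
               .to("security.siem-curated");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

A sink connector (for example to Elasticsearch or Splunk) then indexes only the curated topic, keeping SIEM ingest costs down while other consumers can still process the full raw stream.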