Kafka Proxy Demystified: Use Cases, Benefits, and Trade-offs

A Kafka proxy adds centralized security and governance for Apache Kafka. Solutions like Kroxylicious, Conduktor, and Confluent enable encryption, access control, and compliance without modifying clients or brokers. This article explores key use cases, best practices, and alternatives such as API gateways and service meshes.
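
The "without modifying clients or brokers" point is what makes the pattern attractive in practice: with a proxy such as Kroxylicious in front of the cluster, an application typically just swaps its bootstrap address and picks up encryption, access control, and audit policies enforced in the middle. A minimal sketch with the standard Java producer, assuming a hypothetical proxy endpoint kafka-proxy.internal:9092 and an orders topic:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProxyAwareProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The only client-side change: bootstrap against the proxy's endpoint
        // instead of the brokers (address is hypothetical).
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-proxy.internal:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The proxy can transparently apply encryption, ACL checks, or
            // audit logging before the record ever reaches a broker.
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"amount\": 19.99}"));
        }
    }
}
```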

How Stablecoins Use Blockchain and Data Streaming to Power Digital Money

Stablecoins are reshaping digital money by linking traditional finance with blockchain technology. Built for stability and speed, they enable real-time payments, settlements, and programmable financial services. To operate reliably at scale, stablecoin systems require continuous data movement and analysis across ledgers, compliance tools, and banking platforms. A Data Streaming Platform using technologies like Apache Kafka and Apache Flink can provide this foundation by ensuring transactions, reserves, and risk signals are processed instantly and consistently. Together, blockchain, data streaming, and AI pave the way for a new financial infrastructure that runs on live, contextual data.

Cybersecurity with a Digital Twin: Why Real-Time Data Streaming Matters

Cyberattacks on critical infrastructure and manufacturing are growing, with ransomware and manipulated sensor data creating severe risks. Digital twins combined with data streaming provide real-time visibility, continuous monitoring, and proactive defense across both IT and OT environments. Using technologies like Kafka, Flink, and Sigma, organizations can detect anomalies instantly, strengthen resilience, and secure digital transformation.

How Siemens, SAP, and Confluent Shape the Future of AI-Ready Integration – Highlights from the Rojo Event in Amsterdam

Many enterprises want to become AI-ready but are limited by slow, batch-based integration platforms that prevent real-time insight and automation. The Rojo “Future of Integration” event in Amsterdam addressed this challenge by bringing together Siemens, SAP, Rojo, and Confluent to show how event-driven and intelligent data architectures solve it. The discussions revealed how data streaming with Apache Kafka and Flink complements traditional integration tools, enabling continuous data flow, scalability, and the foundation for AI and automation. This blog summarizes the key learnings from the event, including my presentation “AI-Ready Integration with Data Streaming,” and insights from Siemens, SAP, and Rojo on how enterprises can build truly connected, AI-ready ecosystems.

Scaling Kafka Consumers: Proxy vs. Client Library for High-Throughput Architectures

Apache Kafka’s pull-based model and decoupled architecture offer unmatched flexibility for event-driven systems. But as data volumes and consumer applications grow, new challenges emerge, from head-of-line blocking and rising operational overhead to complex failure handling. This post explores real-world lessons from companies like Wix and Uber, highlighting common consumer scalability issues and two main solutions: push-based consumer proxies and enhanced client libraries like Confluent’s Parallel Consumer. It concludes with a vision for a serverless Kafka consumption model that reduces total cost of ownership while preserving Kafka’s core strengths.
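
To make the head-of-line blocking issue concrete, here is a minimal sketch of the conventional Java consumer loop, with a hypothetical topic and downstream call: one slow record stalls everything behind it on the same partition, and the usual remedy of adding partitions and consumer instances is exactly the operational overhead that consumer proxies and the Parallel Consumer aim to remove.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SequentialConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-service");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                // One slow call (e.g. a 30-second database or REST timeout) stalls
                // every record behind it on the same partition: head-of-line blocking.
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    callDownstreamApi(record.value()); // blocking I/O per record
                }
            }
        }
    }

    static void callDownstreamApi(String payload) {
        // hypothetical slow REST or database call
    }
}
```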

Square, SumUp, Shopify: Real-Time Point-of-Sale (POS) in the Age of Data Streaming

Point-of-Sale systems are evolving into real-time, connected platforms that go far beyond payments. Mobile solutions from Square, SumUp, and Shopify give even the smallest merchants access to integrated sales channels, inventory management, and customer insights. Powered by Apache Kafka and Apache Flink, data streaming is transforming retail with instant decisions, automated actions, and the foundation for Agentic AI in the future of POS.

Online Feature Store for AI and Machine Learning with Apache Kafka and Flink

Real-time personalization requires more than just smart models. It demands fresh data, fast processing, and scalable infrastructure. This blog post explores how Wix.com rebuilt its online feature store using Apache Kafka and Flink, turning its AI architecture into a real-time powerhouse that supports personalized experiences for millions of users.
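
As a rough illustration (not Wix's actual pipeline), a streaming feature can be computed directly from a Kafka topic with Flink's DataStream API. The sketch below assumes a hypothetical user-clicks topic whose records carry a user ID, and derives a 10-minute click-count feature per user that an online feature store would serve at inference time:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class ClickCountFeatureJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Raw interaction events from Kafka; topic name and payload are assumptions.
        KafkaSource<String> clicks = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("user-clicks")
                .setGroupId("feature-store-ingest")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(clicks, WatermarkStrategy.noWatermarks(), "user-clicks")
                // Count clicks per user over 10-minute windows: a "fresh" feature
                // the online store would serve to the model at inference time.
                .map(userId -> Tuple2.of(userId, 1L))
                .returns(Types.TUPLE(Types.STRING, Types.LONG))
                .keyBy(t -> t.f0)
                .window(TumblingProcessingTimeWindows.of(Time.minutes(10)))
                .sum(1)
                .print(); // in practice: sink the feature back to Kafka or a low-latency store

        env.execute("click-count-feature");
    }
}
```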

In-Place Kafka Cluster Upgrades from ZooKeeper to KRaft Are Not Possible with Amazon MSK

The Apache Kafka community introduced KIP-500 to remove ZooKeeper and replace it with KRaft, a new built-in consensus layer. This was a major milestone. It simplified operations, improved scalability, and reduced complexity. Importantly, Kafka supports smooth, zero-downtime migrations from ZooKeeper to KRaft, even for large, business-critical clusters. But NOT with Amazon MSK…
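
For reference, the open-source migration (KIP-866) is driven by configuration and rolling restarts rather than a new cluster: a KRaft controller quorum is provisioned with migration enabled, then the existing brokers are rolled with settings along the lines of the sketch below (addresses and IDs are placeholders; the exact procedure is in the Apache Kafka documentation).

```properties
# Existing ZooKeeper-based broker, rolled one node at a time (no downtime)
broker.id=1
zookeeper.connect=zk1:2181,zk2:2181,zk3:2181

# Point the broker at the new KRaft controller quorum (placeholder addresses)
controller.quorum.voters=3000@controller1:9093,3001@controller2:9093,3002@controller3:9093
controller.listener.names=CONTROLLER
listener.security.protocol.map=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT

# Enables the ZooKeeper-to-KRaft metadata migration (KIP-866)
zookeeper.metadata.migration.enable=true
```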

How Global Payment Processors like Stripe and PayPal Use Data Streaming to Scale

This blog post explores how leading payment processors like Stripe, PayPal, Payoneer, and Worldline are leveraging data streaming with Apache Kafka to power real-time, scalable, and secure financial systems. As the industry shifts from batch processing to event-driven architecture, data streaming has become essential for handling transactions, fraud detection, compliance, and embedded services. The post highlights why Kafka is at the core of modern payment infrastructure – and how it enables innovation, resilience, and operational agility in the evolving fintech landscape.

The Rise of Diskless Kafka: Rethinking Brokers, Storage, and the Kafka Protocol

Apache Kafka has evolved from a data lake pipeline into the backbone of real-time transactional systems. The shift from broker-based storage to Tiered Storage and now to Diskless Kafka using cloud object storage redefines Kafka’s role. This blog explores the business value, technical architecture, and use cases of running Kafka without brokers, using the Kafka protocol as the foundation for scalable, cost-efficient event streaming.
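
Diskless Kafka itself is still an evolving proposal, but the intermediate Tiered Storage step (KIP-405) is already tangible: brokers keep only a short local tail on disk and offload the rest to object storage, enabled per topic. A hedged sketch using the Java Admin client, with placeholder topic name and retention values, assuming the brokers already run with remote.log.storage.system.enable=true and an object-storage plugin:

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTieredTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (Admin admin = Admin.create(props)) {
            // Keep roughly 1 hour on local broker disks, 30 days in object storage.
            NewTopic payments = new NewTopic("payments", 6, (short) 3)
                    .configs(Map.of(
                            "remote.storage.enable", "true",
                            "local.retention.ms", "3600000",    // 1 hour on broker disk
                            "retention.ms", "2592000000"));     // 30 days total (tiered)
            admin.createTopics(List.of(payments)).all().get();
        }
    }
}
```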