
Online Feature Store for AI and Machine Learning with Apache Kafka and Flink

Real-time personalization requires more than just smart models. It demands fresh data, fast processing, and scalable infrastructure. This blog post explores how Wix.com rebuilt its online feature store using Apache Kafka and Flink, turning its AI architecture into a real-time powerhouse that supports personalized experiences for millions of users.
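
For illustration, here is a minimal Apache Flink (Java) sketch of the pattern described above: a streaming job that reads raw interaction events from a Kafka topic and maintains a per-user feature. The topic name `page-views`, the bootstrap address, and the running count are assumptions for the sketch, not details from the Wix architecture; a production feature pipeline would window the aggregation and sink the results to the online feature store.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class PageViewFeatureJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical Kafka topic carrying one user ID per raw interaction event.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("page-views")
                .setGroupId("feature-store-sketch")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Running count of interactions per user; a real pipeline would window the
        // aggregation and write the feature to the online store instead of printing.
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "page-views")
                .map(userId -> Tuple2.of(userId, 1L))
                .returns(Types.TUPLE(Types.STRING, Types.LONG))
                .keyBy(t -> t.f0)
                .sum(1)
                .print();

        env.execute("online-feature-sketch");
    }
}
```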

Amazon MSK Forces a Kafka Cluster Migration from ZooKeeper to KRaft

The Apache Kafka community introduced KIP-500 to remove ZooKeeper and replace it with KRaft, a built-in consensus layer that simplifies operations, improves scalability, and reduces complexity. Kafka itself supports smooth, zero-downtime migrations from ZooKeeper to KRaft, even for large, mission-critical clusters. But Amazon MSK does not. Instead, MSK forces users to perform a disruptive full migration to a new cluster. This article explores the challenges of MSK’s approach, highlights the risks for client applications, and outlines key evaluation criteria and recommendations for choosing the right data streaming platform in a rapidly evolving ecosystem.
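
As a small illustration of what a completed migration looks like from a client's point of view, the following Java sketch uses Kafka's AdminClient to describe the KRaft metadata quorum (available since Apache Kafka 3.3). The bootstrap address is a placeholder; on a cluster still running ZooKeeper the call fails, while on a KRaft-based cluster it returns the controller quorum.

```java
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.QuorumInfo;

public class KraftQuorumCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder bootstrap address; point this at the migrated cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (Admin admin = Admin.create(props)) {
            // Describes the KRaft metadata quorum; fails on a ZooKeeper-based cluster.
            QuorumInfo quorum = admin.describeMetadataQuorum().quorumInfo().get();
            System.out.println("Controller leader: " + quorum.leaderId());
            quorum.voters().forEach(v ->
                    System.out.println("Voter " + v.replicaId()
                            + " logEndOffset=" + v.logEndOffset()));
        }
    }
}
```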

How Global Payment Processors like Stripe and PayPal Use Data Streaming to Scale

This blog post explores how leading payment processors like Stripe, PayPal, Payoneer, and Worldline are leveraging data streaming with Apache Kafka to power real-time, scalable, and secure financial systems. As the industry shifts from batch processing to event-driven architecture, data streaming has become essential for handling transactions, fraud detection, compliance, and embedded services. The post highlights why Kafka is at the core of modern payment infrastructure – and how it enables innovation, resilience, and operational agility in the evolving fintech landscape.
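
To make the exactly-once requirement in payments concrete, here is a hedged Java sketch of a transactional Kafka producer for payment events. The topic name `payments`, the key, and the JSON payload are hypothetical; the pattern itself (idempotence, a `transactional.id`, and commit/abort semantics) is standard Kafka.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Idempotence plus a transactional.id gives exactly-once writes from this producer.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "payments-producer-1");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                // Hypothetical payment event, keyed by account so per-account ordering holds.
                producer.send(new ProducerRecord<>("payments", "account-42",
                        "{\"amount\": 99.95, \"currency\": \"EUR\", \"status\": \"AUTHORIZED\"}"));
                producer.commitTransaction();
            } catch (KafkaException e) {
                // Abort so the whole transaction can be retried safely; fatal errors
                // (e.g. a fenced producer) would instead require closing the producer.
                producer.abortTransaction();
                throw e;
            }
        }
    }
}
```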

The Rise of Diskless Kafka: Rethinking Brokers, Storage, and the Kafka Protocol

Apache Kafka has evolved from a data lake pipeline into the backbone of real-time transactional systems. The shift from broker-based storage to Tiered Storage and now to Diskless Kafka using cloud object storage redefines Kafka’s role. This blog explores the business value, technical architecture, and use cases of running Kafka without brokers, using the Kafka protocol as the foundation for scalable, cost-efficient event streaming.
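
One practical consequence worth illustrating: because the Kafka protocol remains the contract, client code does not change when the storage layer moves from broker disks to tiered storage or object storage. The Java consumer below is a generic sketch with placeholder topic and bootstrap address, not code tied to any specific diskless implementation.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ProtocolCompatibleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The bootstrap endpoint is the only thing that changes between a classic
        // broker deployment and a diskless, object-storage-backed service.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-reader");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(5))) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```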

Multi-Region Kafka using Synchronous Replication for Disaster Recovery with Zero Data Loss (RPO=0)

Apache Kafka is the backbone of real-time data streaming. Choosing the right deployment model – self-managed, fully managed, or bring-your-own-cloud (BYOC) – is a strategic decision. It affects performance, compliance, and cost. This article explains the most common Kafka deployment strategies and highlights the innovation of synchronous multi-region replication to achieve zero data loss (RPO=0). Alternatives like stretched Kafka clusters, Confluent Multi-Region Clusters (MRC), and WarpStream offer different paths to RPO=0. They support critical workloads with strong durability and high availability. For mission-critical and regulated use cases, zero data loss is no longer a future goal. It is now achievable with the right architecture.
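
The multi-region replication itself (stretched clusters, MRC, WarpStream) is configured on the platform side, but zero data loss also depends on client-side durability settings. The Java producer sketch below shows the usual client half of the story – `acks=all`, idempotence, and blocking on the acknowledgment – with placeholder topic and addresses; a sufficient `min.insync.replicas` must additionally be set on the topic or broker.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ZeroDataLossProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all: the write is acknowledged only after all in-sync replicas have it.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        // Idempotence avoids duplicates when retries kick in after a failover.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        // Never drop a record silently; keep retrying until delivery.timeout.ms expires.
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("transactions", "tx-1", "debit 100.00 EUR"))
                    .get(); // Block until the acknowledged, fully replicated write returns.
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```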

Driving the Future: How Real-Time Data Streaming Is Powering Automotive Innovation

The automotive industry is rapidly shifting toward a software-defined, data-driven future. Real-time technologies like Apache Kafka and Apache Flink are now critical to powering connected vehicles, smart factories, autonomous platforms, and personalized mobility services. This blog explores how leading OEMs and suppliers use data streaming to drive digital transformation – from edge processing and AI to predictive maintenance and customer experience. As the industry moves toward intelligent, adaptive systems, event-driven architecture becomes a strategic foundation.

Building Agentic AI with Amazon Bedrock AgentCore and Data Streaming Using Apache Kafka and Flink

Agentic AI goes beyond chatbots. These are autonomous systems that observe, reason, and act – continuously and in real time. At AWS Summit New York 2025, Amazon launched Bedrock AgentCore to build and operate secure, scalable AI agents. But to run in production, agents also need real-time data, continuous context, and flexible integration. That’s where data streaming with Apache Kafka and Apache Flink comes in. Combined with open standards like MCP and A2A, they provide the event-driven foundation for always-on, enterprise-grade AI agents.
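
The sketch below shows only the event-driven wiring such an agent needs, not the Bedrock AgentCore API itself: a Kafka consumer feeds each business event to the agent and publishes the resulting action to another topic. The `invokeAgent` method, the topic names, and the payloads are hypothetical placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventDrivenAgentLoop {

    // Placeholder for a call into the agent runtime (e.g. Bedrock AgentCore via the AWS SDK).
    static String invokeAgent(String event) {
        return "{\"action\":\"noop\",\"reason\":\"sketch only\"}";
    }

    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "agent-loop");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            consumer.subscribe(List.of("business-events"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                    // The agent observes the event, reasons, and returns an action.
                    String action = invokeAgent(record.value());
                    producer.send(new ProducerRecord<>("agent-actions", record.key(), action));
                }
            }
        }
    }
}
```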

Inside FourKites Logistics Platform: Data Streaming for AI and End-to-End Visibility in the Supply Chain

Global supply chains face constant disruption. Trade conflicts, wars, inflation, and shifting regulations are making logistics more unpredictable than ever. Traditional systems can’t keep up with the speed and complexity of today’s challenges. This blog post shows how FourKites, a leader in supply chain visibility, uses data streaming with Apache Kafka in the cloud, combined with AI, to power real-time logistics. With over 3 million shipments tracked daily, FourKites delivers more than visibility – it enables fast, autonomous decisions at global scale.

The Rise of Kappa Architecture in the Era of Agentic AI and Data Streaming

The shift from Lambda to Kappa architecture reflects the growing demand for unified, real-time data pipelines that serve both analytical and operational needs. With the rise of Agentic AI and streaming-first systems, Kappa – powered by Apache Kafka and Apache Flink – delivers low-latency, event-driven infrastructure that supports modern applications, from scalable data products to autonomous AI agents. Open table formats and Shift Left principles further establish Kappa as the foundation for consistent, governed, and future-ready data platforms.
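
As a rough illustration of the Kappa idea, the Flink Table API sketch below (Java) defines a single streaming Kafka source and runs a continuous windowed aggregate over it – no separate batch layer. The table name, fields, topic, and addresses are assumptions for the sketch.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KappaPipelineSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Single streaming source of truth: every event lands once in Kafka.
        tEnv.executeSql(
            "CREATE TABLE orders (" +
            "  order_id STRING, " +
            "  amount DOUBLE, " +
            "  ts TIMESTAMP(3), " +
            "  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND" +
            ") WITH (" +
            "  'connector' = 'kafka', " +
            "  'topic' = 'orders', " +
            "  'properties.bootstrap.servers' = 'localhost:9092', " +
            "  'scan.startup.mode' = 'earliest-offset', " +
            "  'format' = 'json')");

        // The same stream feeds an analytical aggregate, with no separate batch layer.
        tEnv.executeSql(
            "SELECT window_start, SUM(amount) AS revenue " +
            "FROM TABLE(TUMBLE(TABLE orders, DESCRIPTOR(ts), INTERVAL '1' MINUTE)) " +
            "GROUP BY window_start, window_end")
            .print();
    }
}
```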