JMS Message Broker vs Apache Kafka Data Streaming

Message Broker and Apache Kafka: Trade-Offs, Integration, Migration

A message broker has very different characteristics and use cases than a data streaming platform like Apache Kafka. Data integration, processing, governance, and security must be reliable and scalable across the business process. This blog post explores the capabilities of message brokers, their relation to the JMS standard, trade-offs compared to data streaming with Apache Kafka, and typical integration and migration scenarios.
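
To make the API difference tangible, here is a minimal sketch that publishes the same payload once through a JMS 2.0 producer and once through the Kafka producer client. The connection factory, broker address, and queue/topic names are placeholder assumptions and not taken from the full post:

```java
import javax.jms.ConnectionFactory; // newer JMS versions use the jakarta.jms package
import javax.jms.JMSContext;
import javax.jms.Queue;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class SendComparison {

    // JMS (e.g. ActiveMQ, IBM MQ): the broker delivers the message to a queue
    // and removes it once a consumer has acknowledged it.
    static void sendViaJms(ConnectionFactory factory) { // the factory is provider-specific
        try (JMSContext context = factory.createContext()) {
            Queue queue = context.createQueue("orders"); // hypothetical queue name
            context.createProducer().send(queue, "order-123");
        }
    }

    // Kafka: the record is appended to a partitioned, replicated log and retained
    // for the configured time, so multiple consumers can read and replay it.
    static void sendViaKafka() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-123", "{\"state\":\"created\"}"));
        }
    }
}
```

The sketch only shows the send path; the retention and replay semantics behind the Kafka call are what drive most of the trade-offs discussed in the post.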
Read More
Request Response Data Exchange with Apache Kafka vs CQRS and Event Sourcing

When to use Request-Response with Apache Kafka?

How can I do request-response communication with Apache Kafka? That’s one of the questions I get most often. This blog post explores when (not) to use this message exchange pattern, the differences between synchronous and asynchronous communication, the pros and cons compared to CQRS and event sourcing, and how to implement request-response within the data streaming infrastructure.
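
As a teaser for the implementation part: one common way to emulate request-response on top of Kafka is a reply topic plus a correlation ID carried in record headers. The sketch below uses the plain Java clients with assumed topic names ("requests", "replies") and a deliberately simplified blocking wait; frameworks such as Spring Kafka wrap the same idea:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.Header;

import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.List;
import java.util.UUID;

public class KafkaRequestReply {

    // Sends a request and blocks until a reply with the same correlation ID arrives.
    // Blocking on a poll loop is only for illustration; in practice the reply is
    // usually handled asynchronously.
    static String requestReply(KafkaProducer<String, String> producer,
                               KafkaConsumer<String, String> replyConsumer,
                               String request) {
        String correlationId = UUID.randomUUID().toString();

        ProducerRecord<String, String> record =
                new ProducerRecord<>("requests", null, request);            // hypothetical request topic
        record.headers().add("correlation-id", correlationId.getBytes(StandardCharsets.UTF_8));
        record.headers().add("reply-to", "replies".getBytes(StandardCharsets.UTF_8));
        producer.send(record);

        replyConsumer.subscribe(List.of("replies"));                        // hypothetical reply topic
        while (true) {
            ConsumerRecords<String, String> records = replyConsumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> reply : records) {
                Header header = reply.headers().lastHeader("correlation-id");
                if (header != null
                        && correlationId.equals(new String(header.value(), StandardCharsets.UTF_8))) {
                    return reply.value();
                }
            }
        }
    }
}
```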
Read More
How to do Error Handling in Data Streaming

Error Handling via Dead Letter Queue in Apache Kafka

Recognizing and handling errors is essential for any reliable data streaming pipeline. This blog post explores best practices for implementing error handling using a Dead Letter Queue in an Apache Kafka infrastructure. The options include a custom implementation, Kafka Streams, Kafka Connect, the Spring framework, and the Parallel Consumer. Real-world case studies show how Uber, CrowdStrike, Santander Bank, and Robinhood build reliable real-time error handling at extreme scale.
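
As a small preview of the custom implementation option, the sketch below consumes from an input topic and forwards records that fail processing to a dead letter topic, enriched with error metadata in headers. The topic names and header keys are illustrative assumptions:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.List;

public class DeadLetterQueueExample {

    static void run(KafkaConsumer<String, String> consumer,
                    KafkaProducer<String, String> dlqProducer) {
        consumer.subscribe(List.of("orders"));                      // hypothetical input topic
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                try {
                    process(record.value());                        // application-specific logic
                } catch (Exception e) {
                    // Forward the poisoned record to a dead letter topic instead of
                    // blocking the partition, and attach error metadata as headers.
                    ProducerRecord<String, String> dlqRecord =
                            new ProducerRecord<>("orders.DLQ", record.key(), record.value());
                    dlqRecord.headers().add("error-message",
                            String.valueOf(e.getMessage()).getBytes(StandardCharsets.UTF_8));
                    dlqRecord.headers().add("source",
                            (record.topic() + "-" + record.partition() + "@" + record.offset())
                                    .getBytes(StandardCharsets.UTF_8));
                    dlqProducer.send(dlqRecord);
                }
            }
            consumer.commitSync();
        }
    }

    static void process(String value) {
        // Placeholder for business logic that may throw on malformed input.
    }
}
```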
Read More
Innovation in Financial Services and Open Banking with Apache Kafka

Apache Kafka in the Financial Services Industry

Event streaming in financial services is growing like crazy. Continuous real-time data integration and processing are mandatory for many use cases. Apache Kafka is deployed across financial services business departments for mission-critical transactional workloads and big data analytics. High scalability, high reliability, and an elastic open infrastructure are the key reasons for the success of Kafka. This blog post explores different use cases, architectures, and real-world examples in the FinServ sector.
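
For the mission-critical transactional workloads mentioned above, Kafka's idempotent and transactional producer is typically the starting point. The following sketch shows the standard transactions API with assumed configuration values (broker address, transactional.id, topic name), not a setup from any of the referenced deployments:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class TransactionalPayments {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");             // no duplicates on retry
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "payments-producer-1"); // hypothetical transactional.id

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            try {
                // Both records become visible to read_committed consumers atomically.
                producer.send(new ProducerRecord<>("payments", "tx-42", "debit:100"));
                producer.send(new ProducerRecord<>("payments", "tx-42", "credit:100"));
                producer.commitTransaction();
            } catch (Exception e) {
                producer.abortTransaction();
            }
        }
    }
}
```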
Read More