I am happy that my first official Confluent blog post has been published, and I want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models in a lightweight but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
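To give a rough idea of the pattern, here is a minimal sketch of a Kafka Streams application that embeds a trained model and scores every incoming event. The topic names, serdes, and the `predict()` helper are placeholders I made up for illustration; in the real examples the helper would be replaced by the framework-specific inference call (TensorFlow, H2O, DeepLearning4J) and the model would be loaded once at startup.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ModelScoringStreamsApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-scoring-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, score each one with the embedded model, and write the result out.
        KStream<String, String> events = builder.stream("input-events");
        events.mapValues(ModelScoringStreamsApp::predict)
              .to("scored-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    // Placeholder for the framework-specific inference call (TensorFlow, H2O, DL4J, ...).
    // A real application would load the trained model once and reuse it for every record.
    private static String predict(String rawEvent) {
        return rawEvent + ",prediction=0.0";
    }
}
```

Because the model runs inside the stream processor itself, there is no remote model-serving call per event; scaling out is a matter of starting more instances of the same application.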