I am happy that my first official Confluent blog post has been published, and I want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models in a lightweight but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
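The core pattern behind those examples is simple: load a pre-trained model once when the streaming application starts, then apply it to every incoming event as a stateless map operation. The sketch below illustrates that pattern in plain Java without a Kafka dependency, so it runs standalone; the `model` function is a hypothetical stand-in for a real TensorFlow, H2O, or DeepLearning4J model, and the simulated event list stands in for records arriving on a `KStream`.

```java
import java.util.List;
import java.util.function.Function;

public class StreamScoringSketch {

    // Hypothetical stand-in for a pre-trained model (TensorFlow, H2O, DL4J, ...).
    // In the real examples the model is loaded once at startup and reused per record.
    static final Function<double[], String> model =
            features -> features[0] > 0.5 ? "FRAUD" : "OK";

    public static void main(String[] args) {
        // Simulated input events; in a Kafka Streams app these arrive on a KStream.
        List<double[]> events = List.of(
                new double[]{0.9, 1.2},
                new double[]{0.1, 0.4});

        // The scoring pattern: a stateless map over the stream, one prediction per
        // event. With the Streams API this is roughly:
        //   stream.mapValues(model::apply).to("scored-events");
        events.forEach(event -> System.out.println(model.apply(event)));
    }
}
```

Because the model is embedded in the application itself, inference needs no remote call per event, which is what keeps the streaming app lightweight and lets it scale by simply running more instances.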