I am happy that my first official Confluent blog post was published and want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models in a lightweight, but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
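The core pattern behind these examples is simple: load a trained model once at start-up, then apply it to every record flowing through the stream. Here is a minimal Python sketch of that pattern; the `load_model` stub and the averaging "model" are hypothetical stand-ins for a real TensorFlow, H2O, or DeepLearning4J model embedded in a Kafka Streams application.

```python
def load_model():
    # Stand-in for deserializing a trained model (e.g. an H2O POJO or a
    # TensorFlow SavedModel) when the streaming application starts.
    # The returned callable plays the role of model.predict(features).
    return lambda features: sum(features) / len(features)

def score_stream(events, model):
    # Analogous to stream.mapValues(value -> model.predict(value)) in the
    # Kafka Streams API: one prediction per incoming record, no remote
    # model-serving call needed.
    for features in events:
        yield {"features": features, "prediction": model(features)}

model = load_model()
results = list(score_stream([[1.0, 3.0], [2.0, 4.0]], model))
print(results[0]["prediction"])  # 2.0
```

Because the model is embedded in the application itself, scoring happens locally per event, which is what makes the approach lightweight and scalable in the post's sense.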