I am happy that my first official Confluent blog post has been published, and I want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models within a lightweight but scalable, mission-critical streaming application.
If you want to take a look at the source code directly, head over to my GitHub project on Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
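To give a feel for the pattern, here is a minimal sketch of such a Kafka Streams application: the model is loaded once at startup and then applied to every incoming event inside the topology. The topic names, the application id, and the predict() stub are placeholders I chose for illustration; in the real examples from the GitHub project, the prediction step delegates to an embedded TensorFlow, H2O, or DeepLearning4J model.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ModelInferenceApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "ml-inference-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw input events (e.g. sensor readings serialized as strings).
        KStream<String, String> input = builder.stream("input-events");

        // Apply the pre-trained model to every event and forward the prediction.
        KStream<String, String> predictions =
                input.mapValues(ModelInferenceApp::predict);

        predictions.to("predictions");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    // Placeholder for the embedded model call; in the real examples this
    // delegates to a TensorFlow, H2O, or DeepLearning4J model loaded at startup.
    private static String predict(String value) {
        return "prediction-for:" + value;
    }
}
```

Because the model is embedded directly in the Streams application, inference scales the same way the application does: start more instances, and Kafka rebalances the partitions across them.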