I am happy that my first official Confluent blog post was published and want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models to a lightweight, but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.