I am happy that my first official Confluent blog post was published and want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models within a lightweight but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
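The core idea of the pattern is to embed the pre-trained model directly inside the Kafka Streams application rather than calling out to a remote model server, so every event is scored locally with low latency. Below is a minimal sketch of that pattern in Java; the topic names, the `Model` interface, and the placeholder scoring logic are illustrative assumptions of mine, not code taken from the linked project:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ModelScoringApp {

    // Hypothetical stand-in for a model trained offline with
    // TensorFlow, H2O, DeepLearning4J, etc. and loaded at startup.
    interface Model {
        String predict(String input);
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-scoring-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Placeholder scoring logic; in practice this wraps the
        // framework-specific inference call.
        Model model = input -> "prediction-for:" + input;

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("input-events");

        // Score each event synchronously inside the stream processor;
        // no remote RPC to a model server is needed.
        events.mapValues(model::predict)
              .to("predictions");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the model lives inside the Streams application, scaling out is just a matter of starting more instances of the same app: Kafka rebalances the input partitions across them automatically.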