Model Serving: Stream Processing vs. RPC / REST with Java, gRPC, Apache Kafka, TensorFlow

Posted in Analytics, Apache Kafka, Big Data, Confluent, Deep Learning, Java / JEE, Kafka Streams, KSQL, Machine Learning, Microservices, Open Source, Stream Processing on July 9th, 2018 by Kai Wähner

Machine Learning / Deep Learning models can be used in different ways to make predictions. My preferred way is to deploy an analytic model directly into a stream processing application (like Kafka Streams or KSQL), for example using the TensorFlow for Java API. This gives you the best latency and independence from external services. Several examples can be found in my Github project: Model Inference within Kafka Streams Microservices using TensorFlow, H2O.ai, Deeplearning4j (DL4J).
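To illustrate the idea, here is a minimal sketch of what such an embedded deployment can look like: a Kafka Streams topology that loads a TensorFlow SavedModel once at startup and scores every record in-process, with no RPC/REST call to a model server. The topic names, model path and the "input"/"output" operation names are placeholder assumptions, not taken from the linked project; the APIs are the TensorFlow for Java (1.x) and Kafka Streams APIs of that time.

// Minimal sketch: TensorFlow model embedded in a Kafka Streams application.
// Topic names, the SavedModel path and the op names "input"/"output" are placeholders.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;

import java.util.Properties;

public class ModelInferenceStream {

    public static void main(String[] args) {
        // Load the trained model once at startup; it is embedded in the app,
        // so no remote model server is needed at prediction time.
        SavedModelBundle model = SavedModelBundle.load("/models/my_model", "serve");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-inference-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("input-events");

        // Apply the model to every record; prediction happens in-process.
        events.mapValues(value -> predict(model, value))
              .to("predictions");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

    private static String predict(SavedModelBundle model, String csvValue) {
        // Parse the incoming record into the feature vector the model expects.
        String[] parts = csvValue.split(",");
        float[][] features = new float[1][parts.length];
        for (int i = 0; i < parts.length; i++) {
            features[0][i] = Float.parseFloat(parts[i].trim());
        }

        try (Tensor<?> input = Tensor.create(features);
             Tensor<?> output = model.session().runner()
                     .feed("input", input)   // placeholder op name
                     .fetch("output")        // placeholder op name
                     .run().get(0)) {
            float[][] prediction = new float[1][1];
            output.copyTo(prediction);
            return String.valueOf(prediction[0][0]);
        }
    }
}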


Apache Kafka + Machine Learning => Confluent Blog Post and Github Project

Posted in Apache Kafka, Big Data, Confluent, Deep Learning, Kafka Streams, KSQL, Machine Learning, Open Source, Stream Processing on October 27th, 2017 by Kai Wähner

I am happy that my first official Confluent blog post was published and want to link to it from my blog:

How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka

The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models in a lightweight but scalable, mission-critical streaming application.

Apache Kafka Ecosystem for Machine Learning

Github Examples for Apache Kafka + Machine Learning

If you want to take a look directly at the source code, go to my Github project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O or Deeplearning4j. A rough sketch of the H2O variant of this pattern follows below.
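As a taste of the H2O variant of this pattern: a model exported as a MOJO can be loaded via the h2o-genmodel runtime and scored in-process, for example inside a Kafka Streams mapValues() call. This is only a sketch; the MOJO path, class name and feature column names are hypothetical, not taken from the repository.

// Minimal sketch: in-process scoring with an H2O model exported as a MOJO.
// The MOJO file name and feature column names are hypothetical.
import hex.genmodel.MojoModel;
import hex.genmodel.easy.EasyPredictModelWrapper;
import hex.genmodel.easy.RowData;
import hex.genmodel.easy.prediction.BinomialModelPrediction;

public class H2oScorer {

    private final EasyPredictModelWrapper model;

    public H2oScorer(String mojoPath) throws Exception {
        // Load the exported MOJO once; scoring is then a plain local method call.
        this.model = new EasyPredictModelWrapper(MojoModel.load(mojoPath));
    }

    /** Scores a single record, e.g. from a Kafka Streams mapValues() call. */
    public String score(double feature1, double feature2) throws Exception {
        RowData row = new RowData();
        row.put("feature1", feature1); // hypothetical feature column names
        row.put("feature2", feature2);

        BinomialModelPrediction p = model.predictBinomial(row);
        return p.label + " (p=" + p.classProbabilities[1] + ")";
    }
}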
