I am happy that my first official Confluent blog post was published and want to link to it from my blog:
How to Build and Deploy Scalable Machine Learning in Production with Apache Kafka
The post explains in detail how you can leverage Apache Kafka and its Streams API to deploy analytic models to a lightweight but scalable, mission-critical streaming application.
If you want to take a look directly at the source code, go to my GitHub project about Kafka + Machine Learning. It contains several examples of how to combine Kafka Streams with frameworks like TensorFlow, H2O, or DeepLearning4J.
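The core pattern in those examples is simple: load the trained model once at startup and apply it to every event flowing through the stream processor. Below is a minimal, self-contained sketch of that per-record scoring pattern in plain Java; the model here is just a hypothetical threshold function standing in for a real TensorFlow or H2O model, and the topic names in the comment are placeholders. In an actual Kafka Streams topology, the scoring step is the same shape, just expressed via `mapValues`.

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ModelScoringSketch {

    // Stand-in for a model loaded once when the application starts
    // (e.g. a TensorFlow or H2O model imported into the JVM).
    // Here it is just a threshold rule for illustration.
    static final Function<Double, String> model =
            reading -> reading > 0.5 ? "ALERT" : "OK";

    public static void main(String[] args) {
        // In a Kafka Streams app the same step would look like:
        //   builder.stream("sensor-events").mapValues(model::apply).to("predictions");
        // Here we score an in-memory list to show the per-record pattern.
        List<Double> events = List.of(0.2, 0.7, 0.9, 0.1);
        List<String> predictions = events.stream()
                .map(model)
                .collect(Collectors.toList());
        System.out.println(predictions); // [OK, ALERT, ALERT, OK]
    }
}
```

Because the model lives inside the streaming application itself, there is no remote scoring service to call per event, which is what keeps the deployment lightweight and scalable.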