Kafka’s Streams API for Highly Scalable Machine Learning & Deep Learning in Real Time by Kai Waehner

Intelligent real-time applications are a game changer in any industry. This session explains how companies from different industries build such applications. The first part shows how to build analytic models with R, Python or Scala, leveraging open source machine learning / deep learning frameworks like TensorFlow, DeepLearning4J or H2O.ai. The second part discusses how to deploy these models into your own applications or microservices by leveraging the Apache Kafka cluster and Kafka's Streams API, instead of setting up a new, complex stream processing cluster. The session focuses on live demos and teaches lessons learned for executing analytic models in a highly scalable, mission-critical and performant way.
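The deployment pattern described above — embedding a pre-trained analytic model directly inside a Kafka Streams application instead of calling out to a separate processing cluster — can be sketched roughly as follows. This is a minimal sketch, not the session's actual demo code: the topic names (`input-events`, `predictions`) and the `score` placeholder are assumptions, and a real application would load an H2O POJO or TensorFlow model there once at startup. It assumes the `kafka-streams` dependency on the classpath.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class ModelScoringApp {

    // Placeholder for the embedded analytic model. In the talk's scenario this
    // would be an H2O-generated POJO or a TensorFlow model, loaded once and
    // then applied to every event — no remote model-server call per record.
    static String score(String event) {
        return event.isEmpty() ? "unknown" : "prediction-for:" + event;
    }

    // Build the stream topology: read events, apply the model, write predictions.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        // Topic names are hypothetical examples.
        KStream<String, String> input = builder.stream("input-events");
        input.mapValues(ModelScoringApp::score).to("predictions");
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-scoring-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // The app is an ordinary Java process; scaling out means starting more
        // instances with the same application.id — no extra cluster needed.
        KafkaStreams streams = new KafkaStreams(buildTopology(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the model lives inside the microservice, scoring adds no network hop, and the application inherits Kafka's scalability and fault-tolerance guarantees.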

Key takeaways for the audience:
– Insights are hidden in historical data on big data platforms such as Hadoop
– Machine learning and deep learning find these insights by building analytic models
– Streaming analytics uses these models (without redeveloping them) to act in real time
– See different open source frameworks for machine learning and stream processing, such as TensorFlow, DeepLearning4J and H2O.ai
– Understand how to leverage Kafka Streams to use analytic models in your own streaming microservices
– Learn best practices for building and deploying analytic models in real time with the open source Kafka Streams library

You can find the Java code examples and analytic models for H2O and TensorFlow in my GitHub project:

ABOUT CONFLUENT
Confluent, founded by the creators of Apache Kafka®, enables organizations to harness the business value of live data. The Confluent Platform manages the barrage of stream data and makes it available throughout an organization. It provides various industries, from retail, logistics and manufacturing, to financial services and online social networking, a scalable, unified, real-time data pipeline that enables applications ranging from large-volume data integration, to big data analysis with Hadoop, to real-time stream processing. To learn more, please visit