Make Data Science Actionable, Machine Learning Inference

7 March 2019


Added 02-May-2019

Are you ready to take your machine learning algorithms and make them operational within your business in real time? In this talk, we will walk through the general theory of stream processing and present an architecture for taking a machine learning model from training to deployment for inference on an open-source platform for real-time stream processing.
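As a rough illustration of the training-to-inference pattern described above, here is a minimal sketch (all names hypothetical, and the model deliberately trivial) of a model trained offline and then applied to events one at a time, the way a stream-processing stage would score records as they arrive:

```python
# Minimal sketch of offline training followed by streaming inference.
# The "model" is a trivial least-squares line fit so the example stays
# self-contained; a real deployment would load a serialized model
# produced by the data-science workflow.

def train(samples):
    """Fit y = a*x + b by ordinary least squares over (x, y) pairs."""
    n = len(samples)
    sx = sum(x for x, _ in samples)
    sy = sum(y for _, y in samples)
    sxx = sum(x * x for x, _ in samples)
    sxy = sum(x * y for x, y in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def score_stream(model, events):
    """Apply the trained model to each event as it arrives (inference)."""
    a, b = model
    for x in events:
        yield x, a * x + b

# Offline phase: train on historical data (learns y = 2x + 1).
model = train([(0, 1), (1, 3), (2, 5), (3, 7)])

# Online phase: score a (simulated) stream of incoming events.
for event, prediction in score_stream(model, [10, 20]):
    print(event, prediction)
```

In a real pipeline the `events` iterable would be replaced by a streaming source, and `score_stream` would run as a mapping stage inside the stream-processing engine.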

We will also cover:

- The typical workflow from data exploration to model training through to real-time model inference (aka scoring) on streaming data.
- Important considerations to keep deployments flexible enough to run in Cloud-Native, Microservices, and Edge/Fog architectures.
- How technologies such as an in-memory data grid (IMDG) and a streaming engine combine to deliver performance at scale.
- A live demonstration of a working machine learning model scoring streaming data within Hazelcast Jet.
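One way to picture the IMDG-plus-streaming-engine combination is a streaming stage that enriches each event with reference data held entirely in memory, avoiding a round trip to slower external storage. The sketch below is only an analogy: a plain dict stands in for the distributed in-memory map (a real Hazelcast IMap would be partitioned and replicated across the cluster), and all names are hypothetical:

```python
# Hedged sketch: a plain dict stands in for an IMDG map (e.g. a
# Hazelcast IMap, which would be partitioned across a cluster).
# The streaming stage joins each event with in-memory reference data.

reference_data = {
    "sensor-1": {"location": "plant-a"},
    "sensor-2": {"location": "plant-b"},
}  # the stand-in "IMDG"

def enrich(events):
    """Streaming stage: merge each event with in-memory reference data."""
    for event in events:
        meta = reference_data.get(event["sensor"], {})
        yield {**event, **meta}

stream = [
    {"sensor": "sensor-1", "value": 7.5},
    {"sensor": "sensor-2", "value": 3.1},
]

for enriched in enrich(stream):
    print(enriched)
```

Because the lookup is a local in-memory read rather than a network call to a database, this enrichment step stays fast even at high event rates, which is the essence of the performance-at-scale claim above.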