Applications to Conversational AI

Event


13 June 2019

New York


Lysandre Debut, Machine Learning Engineer @ Hugging Face, will be joining us to discuss transfer learning in NLP and its applications to conversational AI.

Abstract: Natural Language Processing has long been dominated by supervised learning and task-specific architectures. These approaches usually require large task-specific datasets to perform well.

Recent trends show that training large models without a task-specific architecture, on very large datasets and in an unsupervised way, can yield general and universal representations. The latest developments leverage this large-scale pre-training to significantly improve a wide range of NLP tasks, from research to production applications.

The availability of these pre-trained weights, along with a wide range of easy-to-integrate tools and libraries, explains the widespread adoption of transfer learning in NLP.

We will present how using these pre-trained, task-agnostic models in conjunction with small task-specific architectures leads to state-of-the-art (SoTA) results on several tasks. We will present an open-source contribution that makes some of the most impactful work in transfer learning easy to use and integrate (pytorch_pretrained_BERT). We will then show how we leverage transfer learning for end-to-end neural dialogue systems (language understanding and language generation), and finally how transfer learning can be useful from a multi-task point of view.
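As a rough illustration of the pattern described above (not material from the talk itself), the sketch below loads pre-trained, task-agnostic BERT weights and pairs them with a small sequence-classification head via the pytorch_pretrained_BERT library. The model name, label count, and example sentence are illustrative assumptions.

    import torch
    from pytorch_pretrained_bert import BertTokenizer, BertForSequenceClassification

    # Load pre-trained, task-agnostic weights; BertForSequenceClassification adds
    # a small task-specific head (a single linear layer) on top of the encoder.
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
    model.eval()

    # Tokenize an example sentence and map tokens to vocabulary ids.
    tokens = ["[CLS]"] + tokenizer.tokenize("Transfer learning works well for NLP.") + ["[SEP]"]
    input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

    # Without labels, the model returns classification logits; in practice the head
    # (and optionally the encoder) would be fine-tuned on a small task-specific dataset.
    with torch.no_grad():
        logits = model(input_ids)
    print(logits.shape)  # torch.Size([1, 2])

In this setup the heavy lifting comes from the pre-trained encoder, so only a modest amount of labelled task data is needed to fine-tune the small head, which is the point made in the abstract.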
