ODSC West 2020: Transfer Learning in NLP
Transfer learning enables leveraging knowledge acquired from related data to improve performance on a target task. Advances in deep learning, together with large labeled datasets such as ImageNet, have made high-performing pre-trained computer vision models possible. In computer vision, transfer learning, in particular fine-tuning a pre-trained model on a target task, has become far more common than training from scratch.
In NLP, starting in 2018, large language models (ULMFiT, OpenAI GPT, the BERT family, etc.) pre-trained on large corpora made transfer learning the new paradigm, producing new state-of-the-art results on many NLP tasks.
In this session, we'll cover the different types of transfer learning, the architectures of these pre-trained language models, and how different transfer learning techniques can be used to solve various NLP tasks. We'll also show a variety of problems that can be solved using these language models and transfer learning.
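As a flavor of the fine-tuning approach described above, here is a minimal sketch of adapting a pre-trained BERT model to a sentiment-classification task. It assumes the Hugging Face `transformers` and `datasets` libraries; the specific checkpoint (`bert-base-uncased`), dataset (IMDB), and hyperparameters are illustrative choices, not the session's prescribed setup.

```python
# Minimal fine-tuning sketch: pre-trained BERT -> binary sentiment classifier.
# Assumes: pip install transformers datasets
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset

# Load a pre-trained encoder with a fresh classification head (2 labels).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Tokenize the target-task data (IMDB reviews, chosen here as an example).
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

# Fine-tune: all pre-trained weights are updated on the target task.
args = TrainingArguments(output_dir="bert-imdb",
                         num_train_epochs=1,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=encoded["test"].select(range(500)))
trainer.train()
```

Because the encoder starts from pre-trained weights, even a single epoch on a small labeled subset typically outperforms training the same architecture from scratch, which is the core appeal of transfer learning for NLP.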
Joan Xiao, PhD