Course Abstract

Training duration: 90 minutes

In recent years, transfer learning, in which a deep learning model trained on a large dataset is reused to perform similar tasks on another dataset, has become standard practice; such models are called pre-trained models. Demand for transfer learning techniques in NLP is at an all-time high. In this session, we will start by introducing the recent breakthroughs in NLP that resulted from the combination of transfer learning and Transformer architectures. Then, we'll learn to use the open-source tools released by HuggingFace, such as the Transformers and Tokenizers libraries and the distilled models.
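As a taste of what the session covers, here is a minimal sketch of loading a distilled pre-trained model through the Transformers pipeline API (assumes the `transformers` package is installed; the checkpoint name is the library's usual distilled sentiment-analysis default, not necessarily the one used in the course):

```python
from transformers import pipeline

# Load a distilled pre-trained model (DistilBERT fine-tuned on SST-2)
# through the high-level pipeline API.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Run inference on a new sentence with no task-specific training of our own.
print(classifier("Transfer learning makes NLP models far easier to build."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```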

DIFFICULTY LEVEL: ADVANCED

Learning Objectives

  • Understanding Transfer Learning in NLP

  • How the Transformers and Tokenizers libraries are organized

  • How to use Transformers and Tokenizers for downstream tasks like text classification, NER, and text generation (see the sketch after this list)
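For illustration, a minimal sketch of two of these downstream tasks using the same pipeline API (checkpoint choices and keyword arguments are assumptions based on common library defaults, not the course's exact materials):

```python
from transformers import pipeline

# Named entity recognition: aggregate sub-word tokens into whole entities.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("HuggingFace is based in New York City."))

# Text generation with GPT-2, a common pre-trained baseline.
generator = pipeline("text-generation", model="gpt2")
print(generator("Transfer learning in NLP", max_length=30))
```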

Background knowledge

  • This course is for current and aspiring Data Scientists, NLP and ML Engineers, and AI Product Managers

  • Knowledge of the following tools and concepts is useful:

      • Familiarity with Python and Jupyter notebooks

      • A basic understanding of Natural Language Processing techniques

Real-world applications

  • Transformers are extensively used for natural language generation in Conversational AI.

  • More recently, Facebook AI Research released a Transformer-based deep learning model with 669.2M parameters that learns embeddings for protein sequences.

  • ABN AMRO bank is already using Transformers and NLP-powered chatbots to provide better customer service.