The field of Natural Language Processing has seen rapid progress in model quality over the last few years. Most state-of-the-art models in the field are based on the Transformer architecture. Examples include BERT (which, when applied to Google Search, resulted in what Google calls "one of the biggest leaps forward in the history of Search") and OpenAI's GPT-2 and GPT-3 (which are able to generate coherent text and essays).
This talk by the author of the popular "Illustrated Transformer" guide introduces the Transformer architecture and its various applications. It is a visual presentation accessible to people with all levels of ML experience.



Session Overview

  • ODSC Europe 2020: A Gentle Intro to Transformer Neural Networks

    • Overview and Author Bio

    • A Gentle Intro to Transformer Neural Networks

Instructor Bio:

Jay Alammar

Machine Learning Research Engineer | jalammar.github.io

A passionate analytical expert in building and scaling great Internet companies and products, Jay learns, codes, illustrates, and teaches machine learning topics at every opportunity. His hands-on expertise covers the entire product life cycle, from initial research, focus groups, user experience design, product prototyping, and user testing through product release, marketing, and acting on deep analytics insights.