Learn Generative AI and Large Language Models

Upskill to the Next Frontier of AI

Tutorial Overview

Fine Tuning an Existing LLM

This workshop explores the process of fine-tuning Large Language Models (LLMs) for Natural Language Processing (NLP) tasks. It covers the motivations for fine-tuning, such as task adaptation, transfer learning, and handling low-data scenarios, using a Yelp Review dataset as the working example. The accompanying notebook uses the HuggingFace Transformers library and walks through tokenization with AutoTokenizer, selecting a data subset, and choosing a model (a BERT-based model). Hyperparameter tuning, the evaluation strategy, and evaluation metrics are also introduced. Finally, the workshop briefly covers DeepSpeed for training optimization and Parameter Efficient Fine-Tuning (PEFT) for resource-efficient fine-tuning, providing a comprehensive introduction to fine-tuning LLMs for NLP tasks.
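
As a preview of this workflow, the sketch below fine-tunes a BERT-based classifier on the Yelp Review dataset with the HuggingFace Transformers, Datasets, and Evaluate libraries. The checkpoint name, subset sizes, and hyperparameters are illustrative assumptions, not the tutorial's exact settings.

```python
# Minimal fine-tuning sketch: Yelp Reviews + BERT + HuggingFace Trainer.
# Checkpoint, subset sizes, and hyperparameters are illustrative assumptions.
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

# Load the Yelp Review dataset (5 star-rating classes).
dataset = load_dataset("yelp_review_full")

# Tokenize the review text with AutoTokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)

tokenized = dataset.map(tokenize, batched=True)

# Select small subsets so the demonstration trains quickly.
train_subset = tokenized["train"].shuffle(seed=42).select(range(1000))
eval_subset = tokenized["test"].shuffle(seed=42).select(range(1000))

# BERT-based model with a 5-label classification head.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=5
)

# Accuracy as a simple evaluation metric.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)

# Basic hyperparameters and an epoch-level evaluation strategy.
# (Newer transformers releases rename evaluation_strategy to eval_strategy.)
training_args = TrainingArguments(
    output_dir="yelp_finetune",
    num_train_epochs=3,
    per_device_train_batch_size=8,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_subset,
    eval_dataset=eval_subset,
    compute_metrics=compute_metrics,
)

trainer.train()
```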

Tutorial Topics

  • Fine Tuning a Large Language Model
  • Natural Language Processing (NLP)
  • HuggingFace Transformers library
  • Tokenization with AutoTokenizer
  • DeepSpeed for optimization 
  • Parameter Efficient Fine-Tuning (PEFT), illustrated in the sketch after this list
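
The PEFT topic can be previewed with a short sketch using the HuggingFace peft library. LoRA is shown here as one common PEFT method; the rank, alpha, and dropout values are illustrative assumptions rather than the tutorial's configuration.

```python
# Hedged PEFT sketch using LoRA: freeze the base model and train only small
# low-rank adapter matrices. Settings here are illustrative assumptions.
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=5
)

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,  # sequence classification task
    r=8,                         # low-rank adapter dimension
    lora_alpha=16,
    lora_dropout=0.1,
)

peft_model = get_peft_model(base_model, lora_config)
peft_model.print_trainable_parameters()  # only a small fraction of weights are trainable

# peft_model can then be passed to the same Trainer workflow used for full fine-tuning.
```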


Meet your instructor

Senior Machine Learning Engineer / Data Science Consultant

Mary Grace Moesta

Mary Grace Moesta is a senior data science consultant at Databricks. She has worked in the big data and data science space for several years, collaborating across several verticals, with the majority of her work focused on the Retail and CPG space. Prior to Databricks, Mary Grace contributed to several machine learning applications, including personalization use cases, forecasting, recommendation engines, and customer experience measures.

Course Curriculum

  • 1

    Welcome to the Tutorial!

    • Welcome to the tutorial!

    • What You'll Learn in This Tutorial

    • How to use this tutorial

    • Tutorial Prerequisites

  • 2

    Fine Tuning Part II: Models + LLMs

    • Fine tuning an Existing Large Language Model

    • Fine Tuning Part II - Quiz

    • Lesson Notebook: Fine Tuning Part II

CODE TO LEARN

A Hands-on Tutorial

This hands-on tutorial goes beyond the basics, offering you an interactive Coding Notebook crucial to your educational journey. It immerses you in an engaging process of writing, generating, and executing code, enabling a comprehensive exploration of the tutorial's core concepts through practical coding exercises. By applying these concepts in real time, you'll witness the immediate impact of your coding choices. This hands-on approach is not just about learning to code; it's about coding to learn, solidifying your understanding as you seamlessly generate and execute code.

Enroll now!

Accelerate your journey to Generative AI by enrolling in our program today!