Description
Hyperparameter Optimization (HPO), also known as hyperparameter search, tuning, or sweeping, is a widely used technique in model-based methods. In practice, HPO is usually geared toward improving model performance on fixed input data (i.e., tweaking parameters on the same dataset). It is rarely used in R&D settings, however, as a systematic way to explore the parameter space.
There are two reasons for this: computational cost and infrastructure prerequisites. In other words, training many models is expensive, and orchestrating a large search requires considerable engineering support.
With open-source tooling, however, you can not only reduce the costs but also overcome the infrastructure hurdles entirely, all with about two extra lines of code added to the training script.
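For context, the "two extra lines" typically refer to registering the existing training script as an experiment so it can later be cloned and re-run with different hyperparameters. A minimal sketch using ClearML's Task.init (the project and task names here are placeholders, not values from the webinar):

```python
from clearml import Task

# The ~two extra lines: register this run as a ClearML Task so its
# arguments, metrics, and artifacts are captured and the run becomes
# a reusable template for later hyperparameter searches.
task = Task.init(project_name="examples", task_name="my_training_run")
```

The rest of the training script stays unchanged; the captured task is what an HPO controller later clones with new parameter values.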
In this webinar, we will show how to organize and own the HPO process as a component of your MLOps, and allow for easy “ramp-up” from working code to intricate searches.
We will briefly present HPO and MLOps, and detail an MLOps-enabled HPO workflow. Then, we will use the open-source ClearML to integrate HPO with an existing codebase.
Following the integration, we will show both low-code and no-code approaches to perform HPO using Optuna with AWS auto-scaling on spot instances, incorporating early-stopping for underperforming hyperparameter combinations.
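The early-stopping idea mentioned above can be sketched without any of these services: a stdlib-only random search over a toy training loop, with a median rule standing in for Optuna-style pruning. Everything here (the `train` function, the `lr` range, the scoring) is invented for illustration and is not the ClearML or Optuna API.

```python
import random

def train(lr, epochs=5, prune_cb=None):
    """Toy stand-in for a training run: the score grows each epoch,
    faster the closer lr is to a pretend optimum of 0.1."""
    score = 0.0
    for epoch in range(epochs):
        score += max(0.0, 1.0 - abs(lr - 0.1))
        if prune_cb and prune_cb(epoch, score):
            return score, True   # early-stopped: not worth finishing
    return score, False

def random_search(n_trials=20, seed=0):
    rng = random.Random(seed)
    history = {}   # epoch -> scores reported so far at that epoch
    results = []
    for _ in range(n_trials):
        lr = rng.uniform(0.001, 1.0)

        def prune(epoch, score):
            # Median rule (the idea behind Optuna's pruners): stop a
            # trial scoring below the median of earlier trials at the
            # same epoch, freeing compute for better combinations.
            seen = history.setdefault(epoch, [])
            worse = bool(seen) and score < sorted(seen)[len(seen) // 2]
            seen.append(score)
            return worse

        score, pruned = train(lr, prune_cb=prune)
        results.append((lr, score, pruned))
    # Best hyperparameter combination first
    return sorted(results, key=lambda r: r[1], reverse=True)
```

In the webinar's setup, the same loop is distributed: each trial becomes a cloned task executed by auto-scaled workers, and the pruning decision is made by the HPO controller instead of a local callback.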
Instructor's Bio
Ariel Biller
Evangelist at ClearML
Researcher first, developer second. Over the last five years, Ariel has worked on projects ranging from quantum chemistry and massively parallel supercomputing to deep-learning computer vision. With Allegro AI, he helped build an open-source R&D platform (Allegro Trains), and later led a data-first transition for a revolutionary nanochemistry startup (StoreDot). Answering his calling to spread the word, he recently took up the mantle of Evangelist at ClearML. Ariel received his PhD in Chemistry in 2014 from the Weizmann Institute of Science, and recently made the transition to the bustling startup scene of Tel Aviv and to cutting-edge deep learning research.
ON-DEMAND WEBINAR: Hyper-Parameter Optimization: A First-Class Citizen In MLOps
Ai+ Training
Webinar recording