1. Calculus I: Limits & Derivatives

This topic, Calculus I: Limits & Derivatives, introduces the mathematical field of calculus -- the study of rates of change -- from the ground up. It is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including the backpropagation and stochastic gradient descent techniques used in deep learning.

Through the measured exposition of theory paired with interactive examples, you’ll develop a working understanding of how calculus is used to compute limits and differentiate functions. You’ll also learn how to apply automatic differentiation within the popular TensorFlow 2 and PyTorch machine learning libraries. The content covered in this class is itself foundational for several other topics in the Machine Learning Foundations series, especially Calculus II and Optimization.

Over the course of studying this topic, you'll:

• Develop an understanding of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning.
• Be able to more intimately grasp the details of machine learning papers as well as many of the other subjects that underlie ML, including partial-derivative calculus, statistics and optimization algorithms.
• Compute the derivatives of functions, including by using AutoDiff in the popular TensorFlow 2 and PyTorch libraries.
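
As a small preview of the AutoDiff material, here is a minimal sketch (assuming PyTorch is installed; not the course's own code) of automatically differentiating y = x² at x = 5:

```python
import torch

# Define x as a tensor that tracks gradients
x = torch.tensor(5.0, requires_grad=True)

# y = x^2; by the power rule, dy/dx = 2x
y = x ** 2

# Backpropagate to populate x.grad with dy/dx
y.backward()

print(x.grad)  # tensor(10.) -- matches 2 * 5
```

TensorFlow 2 offers the analogous capability through its `tf.GradientTape` API.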

2. Calculus II: Partial Derivatives & Integrals

This class, Calculus II: Partial Derivatives & Integrals, builds on single-variable derivative calculus to introduce gradients and integral calculus. Gradients, which are computed with partial-derivative calculus, are the basis of training most machine learning algorithms with data via stochastic gradient descent (SGD). Paired with the chain rule (also covered in this class), SGD enables the backpropagation algorithm to train deep neural networks.

Integral calculus, meanwhile, comes in handy for myriad tasks associated with machine learning, such as finding the area under the so-called “ROC curve” -- a prevailing metric for evaluating classification models. The content covered in this class is itself foundational for several other classes in the Machine Learning Foundations series, especially Probability & Information Theory and Optimization.

Over the course of studying this topic, you'll:

• Develop an understanding of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning.
• Be able to grasp the details of the partial-derivative, multivariate calculus that is common in machine learning papers, as well as in many of the other subjects that underlie ML, including information theory and optimization algorithms.
• Use integral calculus to determine the area under any given curve, a recurring task in ML that is applied, for example, to evaluating model performance via the ROC AUC metric.
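
For instance, the area under a ROC curve can be approximated numerically. Here is a minimal sketch (with hypothetical curve points, assuming NumPy is installed; not the course's own code) that applies the trapezoidal rule:

```python
import numpy as np

# Hypothetical points along a ROC curve: false positive rate (x-axis)
# and true positive rate (y-axis), running from (0, 0) to (1, 1)
fpr = np.array([0.0, 0.2, 0.5, 1.0])
tpr = np.array([0.0, 0.6, 0.8, 1.0])

# The AUC is the integral of TPR with respect to FPR, approximated
# here with the trapezoidal rule: width of each interval times the
# average height of its two endpoints
auc = np.sum(np.diff(fpr) * (tpr[:-1] + tpr[1:]) / 2)

print(auc)  # 0.72 for these illustrative points
```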

## Instructor Bio:

Dr. Jon Krohn

### Chief Data Scientist, Author of Deep Learning Illustrated | untapt


Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the 2019 book Deep Learning Illustrated, an instant #1 bestseller that was translated into six languages. Jon is renowned for his compelling lectures, which he offers in person at Columbia University, New York University, and the NYC Data Science Academy. Jon holds a Ph.D. in neuroscience from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited over a thousand times.

## Course Outline

1. Limits

• What Calculus Is
• A Brief History of Calculus
• The Method of Exhaustion
• Calculating Limits
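
As a preview of calculating limits, here is a minimal plain-Python sketch (illustrative only, not the course's own code): sin(x)/x is undefined at x = 0, yet its limit as x approaches 0 is 1, which we can see by evaluating the ratio at ever-smaller x:

```python
import math

# sin(x)/x cannot be evaluated at x = 0 (division by zero), but the
# ratio approaches 1 as x shrinks toward 0
for x in [0.1, 0.01, 0.001]:
    print(x, math.sin(x) / x)
```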

2. Computing Derivatives with Differentiation

• The Delta Method
• The Differentiation Equation
• Derivative Notation
• The Power Rule
• The Constant Multiple Rule
• The Sum Rule
• The Product Rule
• The Quotient Rule
• The Chain Rule
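
As a taste of how these rules can be sanity-checked, here is a minimal plain-Python sketch (illustrative, not the course's own code) of the delta method: approximating the derivative of f(x) = x³ at x = 2 numerically, and comparing it with the power rule's answer, 3x² = 12:

```python
def delta_method(f, x, delta=1e-6):
    """Approximate f'(x) via the rise-over-run definition of the derivative."""
    return (f(x + delta) - f(x)) / delta

f = lambda x: x ** 3           # f(x) = x^3
approx = delta_method(f, 2.0)  # numerical estimate of f'(2)
exact = 3 * 2.0 ** 2           # power rule: f'(x) = 3x^2, so f'(2) = 12

print(approx, exact)  # the estimate lands very close to 12
```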

3. Automatic Differentiation

• AutoDiff with PyTorch
• AutoDiff with TensorFlow 2
• Machine Learning via Differentiation
• Cost (or Loss) Functions
• The Future: Differentiable Programming
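
The "Machine Learning via Differentiation" and "Cost Functions" ideas can be sketched in a few lines of plain Python (an illustrative toy, not the course's code): fitting the slope of y = mx to data by gradient descent on a squared-error cost:

```python
# Toy data generated with the true slope m = 2
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

m = 0.0    # initial guess for the slope
lr = 0.01  # learning rate

for _ in range(1000):
    # Cost C = sum((m*x - y)^2); its derivative dC/dm = sum(2*(m*x - y)*x)
    grad = sum(2 * (m * x - y) * x for x, y in zip(xs, ys))
    m -= lr * grad  # step against the gradient to reduce the cost

print(m)  # converges toward the true slope, 2.0
```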

4. Review of Introductory Calculus

• The Delta Method
• Differentiation with Rules
• AutoDiff: Automatic Differentiation

5. Partial Derivatives

• Partial Derivatives of Multivariate Functions
• The Partial-Derivative Chain Rule
• Backpropagation
• Higher-Order Partial Derivatives
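
Partial derivatives can be approximated numerically much like the single-variable delta method. Here is a minimal plain-Python sketch (illustrative only, not the course's code) computing the partials of z = x²y at the point (3, 4), where analytically ∂z/∂x = 2xy = 24 and ∂z/∂y = x² = 9:

```python
def partial(f, point, i, delta=1e-6):
    """Approximate the partial derivative of f with respect to its i-th argument,
    nudging only that argument while holding the others fixed."""
    bumped = list(point)
    bumped[i] += delta
    return (f(*bumped) - f(*point)) / delta

f = lambda x, y: x ** 2 * y        # z = x^2 * y

dz_dx = partial(f, (3.0, 4.0), 0)  # analytically 2xy = 24
dz_dy = partial(f, (3.0, 4.0), 1)  # analytically x^2 = 9

print(dz_dx, dz_dy)
```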

6. Integrals

• Binary Classification
• The Confusion Matrix
• The Receiver-Operating Characteristic (ROC) Curve
• Calculating Integrals Manually
• Numeric Integration with Python
• Finding the Area Under the ROC Curve
• Resources for Further Study of Calculus
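
As a small taste of the integrals material, here is a minimal plain-Python sketch (illustrative, not the course's code) comparing a manual definite integral with a simple numeric approximation: the area under f(x) = x² from 0 to 1:

```python
# Manually: the antiderivative of x^2 is x^3 / 3, so the definite
# integral from 0 to 1 is 1/3
exact = 1 / 3

# Numerically: sum the areas of many thin rectangles (a Riemann sum)
n = 100_000
width = 1 / n
numeric = sum((i * width) ** 2 * width for i in range(n))

print(exact, numeric)  # the two agree to several decimal places
```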