English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 113 lectures (16h 25m) | 6.50 GB

Essential Linear Algebra and Calculus Hands-On in NumPy, TensorFlow, and PyTorch

Mathematics forms the core of data science and machine learning. Thus, to be the best data scientist you can be, you must have a working understanding of the most relevant math.

Getting started in data science is easy thanks to high-level libraries like Scikit-learn and Keras. But understanding the math behind the algorithms in those libraries opens up a world of possibilities: from diagnosing modeling issues to inventing new, more powerful solutions, a grasp of the underlying math can dramatically increase the impact you make over the course of your career.

Led by deep learning guru Dr. Jon Krohn, this course gives you a firm grasp of the mathematics (namely, linear algebra and calculus) that underlies machine learning algorithms and data science models.

Course Sections

- Linear Algebra Data Structures
- Tensor Operations
- Matrix Properties
- Eigenvectors and Eigenvalues
- Matrix Operations for Machine Learning
- Limits
- Derivatives and Differentiation
- Automatic Differentiation
- Partial-Derivative Calculus
- Integral Calculus

Throughout each of the sections, you’ll find plenty of hands-on assignments, Python code demos, and practical exercises to get your math game in top form!

This Mathematical Foundations of Machine Learning course is complete, but in the future, we intend to add bonus content from related subjects beyond math, namely: probability, statistics, data structures, algorithms, and optimization. Enrollment now includes free, unlimited access to all of this future course content — over 25 hours in total.

What you’ll learn

- Understand the fundamentals of linear algebra and calculus, critical mathematical subjects underlying all of machine learning and data science
- Manipulate tensors using all three of the most important Python tensor libraries: NumPy, TensorFlow, and PyTorch
- Apply all of the essential vector and matrix operations for machine learning and data science
- Reduce the dimensionality of complex data to the most informative elements with eigenvectors, SVD, and PCA
- Solve for unknowns with both simple techniques (e.g., elimination) and advanced techniques (e.g., pseudoinversion)
- Appreciate how calculus works, from first principles, via interactive code demos in Python
- Intimately understand advanced differentiation rules like the chain rule
- Compute the partial derivatives of machine-learning cost functions by hand as well as with TensorFlow and PyTorch
- Grasp exactly what gradients are and appreciate why they are essential for enabling ML via gradient descent
- Use integral calculus to determine the area under any given curve
- Be able to grasp the details of cutting-edge machine learning papers
- Develop an understanding of what’s going on under the hood of machine learning algorithms, including those used for deep learning
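
As a taste of the skills listed above, here is a minimal NumPy sketch of solving for unknowns two ways: directly for an exactly determined system, and via the pseudoinverse for an overdetermined one. The matrices and values are illustrative examples of mine, not taken from the course.

```python
import numpy as np

# Exactly determined system: solve Ax = b directly
A = np.array([[2., 1.], [1., 3.]])
b = np.array([5., 10.])
x = np.linalg.solve(A, b)   # x = [1., 3.]

# Overdetermined system: least-squares solution via the pseudoinverse
X = np.array([[1., 1.], [1., 2.], [1., 3.]])
y = np.array([2., 3., 4.])
w = np.linalg.pinv(X) @ y   # recovers intercept 1 and slope 1
```

`np.linalg.solve` fails if `A` is singular, while the pseudoinverse always returns the least-squares answer — which is why the course covers both.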

## Table of Contents

**Data Structures for Linear Algebra**

1 Introduction

2 What Linear Algebra Is

3 Plotting a System of Linear Equations

4 Linear Algebra Exercise

5 Tensors

6 Scalars

7 Vectors and Vector Transposition

8 Norms and Unit Vectors

9 Basis, Orthogonal, and Orthonormal Vectors

10 Matrix Tensors

11 Generic Tensor Notation

12 Exercises on Algebra Data Structures
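
The data structures in this section map directly onto NumPy arrays of increasing rank. A small sketch, with example values of my own choosing:

```python
import numpy as np

x = np.array(25)                     # rank-0 tensor: a scalar
v = np.array([1., 2., 3.])           # rank-1 tensor: a vector
M = np.array([[1., 0.], [0., 1.]])   # rank-2 tensor: a matrix

unit = v / np.linalg.norm(v)         # unit vector: L2 norm of 1

# The rows of the identity matrix are orthonormal:
# mutually orthogonal (dot product 0) and each of unit length
dot = M[0] @ M[1]
```

The same constructors exist almost verbatim in TensorFlow (`tf.constant`) and PyTorch (`torch.tensor`), which is how the course moves between all three libraries.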

**Tensor Operations**

13 Segment Intro

14 Tensor Transposition

15 Basic Tensor Arithmetic, incl. the Hadamard Product

16 Tensor Reduction

17 The Dot Product

18 Exercises on Tensor Operations

19 Solving Linear Systems with Substitution

20 Solving Linear Systems with Elimination

21 Visualizing Linear Systems
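
The core operations from this section — transposition, the Hadamard product, reduction, and the dot product — are each one line of NumPy. The example values are mine, for illustration only:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

H = A * B               # Hadamard (element-wise) product
s = A.sum()             # tensor reduction: sum over all elements
d = np.dot(A[0], B[0])  # dot product of the two first-row vectors
T = A.T                 # tensor transposition
```

Note the distinction the course emphasizes: `A * B` multiplies element-wise, whereas `A @ B` is true matrix multiplication.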

**Matrix Properties**

22 Segment Intro

23 The Frobenius Norm

24 Matrix Multiplication

25 Symmetric and Identity Matrices

26 Matrix Multiplication Exercises

27 Matrix Inversion

28 Diagonal Matrices

29 Orthogonal Matrices

30 Orthogonal Matrix Exercises
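
The matrix properties covered here can be checked numerically. A brief sketch, assuming an invertible example matrix of my own choosing:

```python
import numpy as np

A = np.array([[4., 7.], [2., 6.]])
I = np.eye(2)                  # the 2x2 identity matrix

fro = np.linalg.norm(A)        # Frobenius norm: sqrt of summed squares
A_inv = np.linalg.inv(A)       # matrix inverse (exists since det(A) = 10)
P = A @ A_inv                  # multiplying by the inverse recovers I
```
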

**Eigenvectors and Eigenvalues**

31 Segment Intro

32 Applying Matrices

33 Affine Transformations

34 Eigenvectors and Eigenvalues

35 Matrix Determinants

36 Determinants of Larger Matrices

37 Determinant Exercises

38 Determinants and Eigenvalues

39 Eigendecomposition

40 Eigenvector and Eigenvalue Applications
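
Eigendecomposition and its link to the determinant can be verified in a few lines of NumPy. The matrix below is an illustrative example of mine (its eigenvalues work out to 5 and 2):

```python
import numpy as np

A = np.array([[4., 2.], [1., 3.]])
evals, evecs = np.linalg.eig(A)

# Each eigenvector v satisfies A @ v = lambda * v:
# applying the matrix only scales it, never rotates it
v0 = evecs[:, 0]
check = A @ v0

# The determinant equals the product of the eigenvalues
det = np.linalg.det(A)
```
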

**Matrix Operations for Machine Learning**

41 Segment Intro

42 Singular Value Decomposition

43 Data Compression with SVD

44 The Moore-Penrose Pseudoinverse

45 Regression with the Pseudoinverse

46 The Trace Operator

47 Principal Component Analysis (PCA)

48 Resources for Further Study of Linear Algebra
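
This section's centerpiece — regression via the Moore-Penrose pseudoinverse, built from the SVD — can be sketched directly in NumPy. The tiny dataset below is mine and fits the line y = 1 + 2x exactly, so the recovered weights are easy to check:

```python
import numpy as np

X = np.array([[1., 0.], [1., 1.], [1., 2.]])  # design matrix: bias column + x
y = np.array([1., 3., 5.])                    # generated by y = 1 + 2x

# Singular value decomposition of X
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# The Moore-Penrose pseudoinverse assembled from the SVD factors
X_pinv = Vt.T @ np.diag(1. / S) @ U.T
w = X_pinv @ y    # least-squares regression weights: [1., 2.]
```

`np.linalg.pinv(X)` computes the same pseudoinverse in one call; building it from the SVD by hand shows what that call does.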

**Limits**

49 Segment Intro

50 Intro to Differential Calculus

51 Intro to Integral Calculus

52 The Method of Exhaustion

53 Calculus of the Infinitesimals

54 Calculus Applications

55 Calculating Limits

56 Exercises on Limits
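
Limits can be explored numerically before they are treated formally. A classic illustrative example (my choice, not necessarily the one used in the lectures): sin(x)/x is undefined at x = 0, yet its limit there is 1.

```python
import numpy as np

def f(x):
    return np.sin(x) / x   # undefined at x = 0, but the limit exists

# Approach 0 from both sides; the values converge toward 1
xs = np.array([0.1, 0.01, 0.001, -0.001])
vals = f(xs)
```
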

**Derivatives and Differentiation**

57 Segment Intro

58 The Delta Method

59 How Derivatives Arise from Limits

60 Derivative Notation

61 The Derivative of a Constant

62 The Power Rule

63 The Constant Multiple Rule

64 The Sum Rule

65 Exercises on Derivative Rules

66 The Product Rule

67 The Quotient Rule

68 The Chain Rule

69 Advanced Exercises on Derivative Rules

70 The Power Rule on a Function Chain
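
The delta method from this section — derivatives as the limit of a difference quotient — takes only a few lines to sketch. The function and evaluation point are illustrative choices of mine:

```python
def f(x):
    return x ** 3   # by the power rule, f'(x) = 3x^2

def delta_method(f, x, delta=1e-6):
    # Slope of the secant line; as delta shrinks, it
    # approaches the slope of the tangent line
    return (f(x + delta) - f(x)) / delta

approx = delta_method(f, 2.0)   # analytic answer: 3 * 2^2 = 12
```
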

**Automatic Differentiation**

71 Segment Intro

72 What Automatic Differentiation Is

73 Autodiff with PyTorch

74 Autodiff with TensorFlow

75 The Line Equation as a Tensor Graph

76 Machine Learning with Autodiff
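
The course demonstrates autodiff with PyTorch and TensorFlow; to keep this sketch dependency-free, here is the underlying idea in pure Python via forward-mode dual numbers (a minimal toy of my own, not the library mechanism itself):

```python
class Dual:
    """Track a value and its derivative together; rules like the
    product rule are applied automatically at each operation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

x = Dual(2.0, 1.0)       # seed dx/dx = 1
y = x * x * x + 5 * x    # y = x^3 + 5x, so dy/dx = 3x^2 + 5 = 17 at x = 2
```

Reverse-mode autodiff (what `loss.backward()` in PyTorch and `tf.GradientTape` implement) applies the same chain-rule bookkeeping in the opposite direction, which is far more efficient for functions with many inputs and one output, like ML cost functions.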

**Partial-Derivative Calculus**

77 Segment Intro

78 What Partial Derivatives Are

79 Partial Derivative Exercises

80 Calculating Partial Derivatives with Autodiff

81 Advanced Partial Derivatives

82 Advanced Partial-Derivative Exercises

83 Partial Derivative Notation

84 The Chain Rule for Partial Derivatives

85 Exercises on the Multivariate Chain Rule

86 Point-by-Point Regression

87 The Gradient of Quadratic Cost

88 Descending the Gradient of Cost

89 The Gradient of Mean Squared Error

90 Backpropagation

91 Higher-Order Partial Derivatives

92 Exercise on Higher-Order Partial Derivatives
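
The arc of this section — partial derivatives of a quadratic cost, then descending its gradient — can be sketched for a single training point. The data point, learning rate, and iteration count are illustrative choices of mine:

```python
# Gradient descent on the quadratic cost C(w, b) = (w*x + b - y)^2
# for a single training point (x, y) = (2, 7)
x_pt, y_pt = 2.0, 7.0
w, b = 0.0, 0.0
lr = 0.05

for _ in range(200):
    y_hat = w * x_pt + b
    # partial derivatives of C with respect to each parameter
    dC_dw = 2 * (y_hat - y_pt) * x_pt
    dC_db = 2 * (y_hat - y_pt)
    # step each parameter against its partial derivative
    w -= lr * dC_dw
    b -= lr * dC_db

cost = (w * x_pt + b - y_pt) ** 2   # approaches zero as we descend
```

In the lectures, the same gradients are computed automatically by TensorFlow and PyTorch rather than written out by hand.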

**Integral Calculus**

93 Segment Intro

94 Binary Classification

95 The Confusion Matrix

96 The Receiver-Operating Characteristic (ROC) Curve

97 What Integral Calculus Is

98 The Integral Calculus Rules

99 Indefinite Integral Exercises

100 Definite Integrals

101 Numeric Integration with Python

102 Definite Integral Exercise

103 Finding the Area Under the ROC Curve

104 Resources for the Further Study of Calculus

105 Congratulations
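
Numeric integration — the technique this section applies to find the area under the ROC curve — can be sketched with a hand-rolled trapezoid rule. The function and interval below are an illustrative example of mine with a known analytic answer:

```python
import numpy as np

def trapezoid_area(f, a, b, n=10_000):
    # Approximate the definite integral of f from a to b by
    # summing the areas of n thin trapezoids
    xs = np.linspace(a, b, n + 1)
    ys = f(xs)
    width = (b - a) / n
    return width * (ys[0] / 2 + ys[1:-1].sum() + ys[-1] / 2)

# Integral of x^2 from 0 to 3 is 3^3 / 3 = 9 exactly
area = trapezoid_area(lambda x: x ** 2, 0.0, 3.0)
```

Feed in the (false-positive rate, true-positive rate) points of a ROC curve instead, and the same rule yields the AUC metric.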

**Probability**

106 Probability & Information Theory

107 A Brief History of Probability Theory

108 What Probability Theory Is

109 Events and Sample Spaces

110 Multiple Independent Observations

111 Combinatorics

112 Exercises on Event Probabilities

113 More Lectures are on their Way
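
As a preview of the combinatorics and event-probability lectures, here is a minimal sketch (example values mine) using the standard library's `math.comb`:

```python
from math import comb

def p_heads(k, n):
    # Probability of exactly k heads in n fair coin flips:
    # count the favorable outcomes, divide by the 2^n total
    return comb(n, k) * 0.5 ** n

p = p_heads(2, 3)   # 3 of the 8 equally likely outcomes have exactly 2 heads
total = sum(p_heads(k, 5) for k in range(6))   # probabilities sum to 1
```
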
