Probability and Statistics for Machine Learning LiveLessons

English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 8h 58m | 3.70 GB

Jon Krohn is Chief Data Scientist at the machine learning company untapt. He presents a popular series of deep learning tutorials published by Addison-Wesley and is the author of the bestselling book Deep Learning Illustrated. Jon teaches his deep learning curriculum in the classroom at the New York City Data Science Academy and guest lectures at Columbia University and New York University. He holds a doctorate in neuroscience from Oxford University and has been publishing on machine learning in leading journals since 2010.

Table of Contents

1 Probability and Statistics for Machine Learning – Introduction
2 Topics
3 Orientation to the Machine Learning Foundations Series
4 What Probability Theory Is
5 Events and Sample Spaces
6 Multiple Observations
7 Factorials and Combinatorics
8 Exercises
9 The Law of Large Numbers and the Gambler’s Fallacy
10 Probability Distributions in Statistics
11 Bayesian versus Frequentist Statistics
12 Applications of Probability to Machine Learning
13 Topics
14 Discrete and Continuous Variables
15 Probability Mass Functions
16 Probability Density Functions
17 Exercises on Probability Functions
18 Expected Value
19 Exercises on Expected Value
20 Topics
21 The Mean, a Measure of Central Tendency
22 Medians
23 Modes
24 Quantiles – Percentiles, Quartiles, and Deciles
25 Box-and-Whisker Plots
26 Variance, a Measure of Dispersion
27 Standard Deviation
28 Standard Error
29 Covariance, a Measure of Relatedness
30 Correlation
31 Topics
32 Joint Probability Distribution
33 Marginal Probability
34 Conditional Probability
35 Exercises
36 Chain Rule of Probabilities
37 Independent Random Variables
38 Conditional Independence
39 Topics
40 Uniform
41 Gaussian – Normal and Standard Normal
42 The Central Limit Theorem
43 Log-Normal
44 Exponential and Laplace
45 Binomial and Multinomial
46 Poisson
47 Mixture Distributions
48 Preprocessing Data for Model Input
49 Exercises
50 Topics
51 What Information Theory Is
52 Self-Information, Nats, and Bits
53 Shannon and Differential Entropy
54 Kullback-Leibler Divergence and Cross-Entropy
55 Topics
56 Applications of Statistics to Machine Learning
57 Review of Essential Probability Theory
58 z-scores and Outliers
59 Exercises on z-scores
60 p-values
61 Exercises on p-values
62 Topics
63 Single-Sample t-tests and Degrees of Freedom
64 Independent t-tests
65 Paired t-tests
66 Applications to Machine Learning
67 Exercises
68 Confidence Intervals
69 ANOVA – Analysis of Variance
70 Topics
71 The Pearson Correlation Coefficient
72 R-squared Coefficient of Determination
73 Correlation versus Causation
74 Correcting for Multiple Comparisons
75 Topics
76 Independent versus Dependent Variables
77 Linear Regression to Predict Continuous Values
78 Fitting a Line to Points on a Cartesian Plane
79 Linear Least Squares Exercise
80 Ordinary Least Squares
81 Categorical ‘Dummy’ Features
82 Logistic Regression to Predict Categories
83 Open-Ended Exercises
84 Topics
85 Machine Learning versus Frequentist Statistics
86 When to Use Bayesian Statistics
87 Prior Probabilities
88 Bayes’ Theorem
89 Resources for Further Study of Probability and Statistics
90 Probability and Statistics for Machine Learning – Summary
