Machine Learning & AI Foundations: Linear Regression

English | MP4 | AVC 1280×720 | AAC 48 kHz 2ch | 3h 57m | 627 MB

Having a solid understanding of linear regression, a method of modeling the relationship between one dependent variable and one or more other variables, can help you solve a multitude of real-world problems. Application areas include predicting virtually any numeric value, such as housing values, customer spend, and stock prices. This course reveals the concepts behind the most important linear regression techniques and shows how to use them effectively. Throughout the course, instructor Keith McCormick uses IBM SPSS Statistics as he walks through each concept, so some exposure to that software is assumed, but the emphasis is on understanding the concepts rather than the mechanics of the software. SPSS users will have the added benefit of being exposed to virtually every regression feature in SPSS.
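
To make the core idea concrete outside of SPSS, below is a minimal sketch of a simple linear regression in Python. It is not part of the course, and it assumes the statsmodels library plus a small synthetic housing dataset invented purely for illustration.

    # Minimal simple linear regression sketch (illustrative only; the course itself uses SPSS).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    sqft = rng.uniform(500, 3500, size=200)                          # one predictor: house size
    price = 50_000 + 120 * sqft + rng.normal(0, 25_000, size=200)    # synthetic "housing values"

    X = sm.add_constant(sqft)        # add the intercept column
    model = sm.OLS(price, X).fit()   # ordinary least squares fit

    print(model.params)              # intercept and slope (the regression coefficients)
    print(model.rsquared)            # proportion of variance in price explained by sqft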

Instructor Keith McCormick covers simple linear regression, explaining how to build effective scatter plots and calculate and interpret regression coefficients. He also dives into the challenges and assumptions of multiple regression and steps through three distinct regression strategies. To wrap up, he discusses some alternatives to regression, including regression trees and time series forecasting.

Topics include:

  • Building effective scatter plots in Chart Builder
  • Challenges and assumptions of multiple regression
  • Checking assumptions visually
  • Creating dummy codes
  • Creating and testing interaction terms
  • Understanding partial and part correlations
  • Spotting problems and taking corrective action
  • Dealing with multicollinearity

Table of Contents

Introduction
1 Welcome
2 What you should know
3 Using the exercise files

Simple Linear Regression
4 Building effective scatter plots in Chart Builder
5 Adding labels and spikes to a scatter plot
6 Create a 3D scatter plot
7 Bubble chart with GPL
8 Residuals and R²
9 Calculating and interpreting regression coefficients
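
As a language-agnostic companion to videos 8 and 9, the quantities SPSS reports can be computed directly from their textbook definitions. The NumPy sketch below is illustrative only and uses made-up numbers.

    # Least-squares slope, intercept, residuals, and R-squared computed by hand.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)   # slope = cov(x, y) / var(x)
    b0 = y.mean() - b1 * x.mean()                         # intercept

    y_hat = b0 + b1 * x
    residuals = y - y_hat
    ss_res = np.sum(residuals ** 2)                       # unexplained variation
    ss_tot = np.sum((y - y.mean()) ** 2)                  # total variation
    r_squared = 1 - ss_res / ss_tot                       # R-squared

    print(b0, b1, r_squared)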

Introduction to Multiple Linear Regression
10 Challenges and assumptions of multiple regression
11 Checking assumptions visually
12 Checking assumptions with Explore
13 Checking assumptions – Durbin-Watson
14 Checking assumptions – Levene's test
15 Checking assumptions – Correlation matrix
16 Checking assumptions – Residuals plot
17 Checking assumptions – Summary
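
The course runs these checks through SPSS dialogs; purely to show what the statistics are, here is a rough Python analogue of videos 13 through 16, assuming statsmodels, SciPy, and synthetic data.

    # Illustrative assumption checks: Durbin-Watson, Levene's test, and a correlation matrix.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 300
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    group = rng.integers(0, 2, size=n)                  # a binary factor for the Levene check
    y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()
    resid = fit.resid

    print("Durbin-Watson:", durbin_watson(resid))       # near 2 suggests no autocorrelation
    print("Levene:", stats.levene(resid[group == 0], resid[group == 1]))  # equal-variance check
    print("Predictor correlations:\n", np.corrcoef(x1, x2))               # screen for collinearity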

Dummy Codes and Interaction Terms
18 Creating dummy codes
19 Dummy coding with the R extension
20 Detecting variable interactions
21 Creating and testing interaction terms
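
Video 19 uses an R extension to automate dummy coding inside SPSS. Outside of SPSS, the same two ideas (indicator coding and an interaction term) look roughly like this; pandas and the statsmodels formula API are assumed, and the column names are invented for the example.

    # Dummy coding a categorical predictor and testing an interaction term.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "region": rng.choice(["north", "south", "west"], size=200),
        "sqft": rng.uniform(500, 3500, size=200),
        "age": rng.uniform(0, 50, size=200),
    })
    df["price"] = 40_000 + 100 * df["sqft"] - 800 * df["age"] + rng.normal(0, 20_000, size=200)

    # Dummy (indicator) coding: k categories become k-1 columns, one level is the reference.
    dummies = pd.get_dummies(df["region"], prefix="region", drop_first=True)
    print(dummies.head())

    # The formula interface dummy-codes region and builds the sqft-by-age interaction for us.
    model = smf.ols("price ~ C(region) + sqft * age", data=df).fit()
    print(model.params)   # includes the sqft:age interaction coefficient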

Three Regression Strategies
22 Three regression strategies and when to use them
23 Understanding partial correlations
24 Understanding part correlations
25 Visualizing part and partial correlations
26 Simultaneous regression – Setting up the analysis
27 Simultaneous regression – Interpreting the output
28 Hierarchical regression – Setting up the analysis
29 Hierarchical regression – Interpreting the output
30 Creating a train-test partition in SPSS
31 Stepwise regression – Setting up the analysis
32 Stepwise regression – Interpreting the output
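
Two ideas from this chapter translate into short formulas: a partial correlation removes a control variable from both the predictor and the outcome, a part (semipartial) correlation removes it from the predictor only, and hierarchical regression reads the R-squared change between nested blocks. The sketch below is an illustration with simulated data, assuming statsmodels.

    # Partial vs. part correlation via residualization, and R-squared change across blocks.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 500
    z = rng.normal(size=n)                        # control variable
    x = 0.6 * z + rng.normal(size=n)              # predictor of interest
    y = 0.5 * x + 0.4 * z + rng.normal(size=n)    # outcome

    def resid(a, b):
        """Residuals of a after regressing it on b (with an intercept)."""
        return sm.OLS(a, sm.add_constant(b)).fit().resid

    partial = np.corrcoef(resid(y, z), resid(x, z))[0, 1]   # z removed from both y and x
    part = np.corrcoef(y, resid(x, z))[0, 1]                # z removed from x only
    print("partial:", partial, "part:", part)

    # Hierarchical regression as R-squared change between nested blocks.
    block1 = sm.OLS(y, sm.add_constant(z)).fit()                         # control block only
    block2 = sm.OLS(y, sm.add_constant(np.column_stack([z, x]))).fit()   # add the predictor
    print("R-squared change:", block2.rsquared - block1.rsquared)        # equals part ** 2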

Spotting Problems and Taking Corrective Action
33 Collinearity diagnostics
34 Dealing with multicollinearity – Factor analysis/PCA
35 Dealing with multicollinearity – Manually combine IVs
36 Diagnosing outliers and influential points
37 Dealing with outliers – Studentized deleted residuals
38 Dealing with outliers – Should cases be removed?
39 Detecting curvilinearity
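
The diagnostics in this chapter also have standard textbook forms: variance inflation factors for multicollinearity, and studentized deleted residuals plus Cook's distance for outliers and influential points. The Python sketch below is an illustrative analogue (statsmodels assumed), not the SPSS workflow shown in the videos.

    # Collinearity and influence diagnostics on a deliberately collinear synthetic dataset.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(4)
    n = 200
    x1 = rng.normal(size=n)
    x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)      # nearly collinear with x1 on purpose
    y = 1.0 + x1 + x2 + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()

    for i in (1, 2):                              # skip column 0, the constant
        print(f"VIF x{i}:", variance_inflation_factor(X, i))   # large values flag multicollinearity

    influence = fit.get_influence()
    sdr = influence.resid_studentized_external    # studentized deleted residuals
    cooks_d = influence.cooks_distance[0]         # Cook's distance per case
    print("cases with |SDR| > 3:", np.where(np.abs(sdr) > 3)[0])
    print("max Cook's distance:", cooks_d.max())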

Other Approaches to Regression
40 Regression options
41 Automatic linear modeling
42 Regression trees
43 Time series forecasting
44 Categorical regression with optimal scaling
45 Comparing regression to Neural Nets
46 Logistic regression
47 Structural equation modeling (SEM)
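
For orientation, here is what two of the alternatives surveyed in this chapter (a regression tree and logistic regression) look like in scikit-learn; the library and the toy data are assumptions for illustration, since the course demonstrates the SPSS equivalents instead.

    # A regression tree and a logistic regression on synthetic data (scikit-learn sketch).
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    X = rng.normal(size=(300, 2))
    y_numeric = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(size=300)    # continuous target
    y_binary = (y_numeric > 0).astype(int)                          # categorical target

    # Regression tree: piecewise-constant predictions, no linearity assumption.
    tree = DecisionTreeRegressor(max_depth=3).fit(X, y_numeric)
    print("tree R-squared:", tree.score(X, y_numeric))

    # Logistic regression: for a categorical outcome rather than a numeric one.
    logit = LogisticRegression().fit(X, y_binary)
    print("classification accuracy:", logit.score(X, y_binary))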

Conclusion
48 What's next