Whew! Now, using matrix algebra and calculus, you have derived the squared-error-minimizing formula for multiple regression. Not only that, you can use the matrix form, in R, to calculate the estimated slope and intercept coefficients, predict Y, and even calculate the regression residuals. We’re on our way to true Geekdome!
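The matrix-form calculations described above can be sketched in R as follows. This is a minimal illustration using made-up data (the variables `x1`, `x2`, and `y` are hypothetical, not from the text); it computes b = (X′X)⁻¹X′y, the predicted values, and the residuals, then checks the result against `lm()`.

```r
set.seed(42)
n  <- 100
x1 <- rnorm(n)
x2 <- rnorm(n)
y  <- 2 + 0.5 * x1 - 1.3 * x2 + rnorm(n)   # simulated outcome

# Design matrix: a column of 1s for the intercept, then the predictors
X <- cbind(1, x1, x2)

# Estimated coefficients: b = (X'X)^(-1) X'y
b <- solve(t(X) %*% X) %*% t(X) %*% y

# Predicted values and residuals
y_hat <- X %*% b
e     <- y - y_hat

# The matrix result should match lm()'s coefficients
coef(lm(y ~ x1 + x2))
```

Computing `b` by hand this way is for understanding; in practice `lm()` (or `solve(crossprod(X), crossprod(X, y))`, which is more numerically stable) does the same work.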
Next stop: the key assumptions necessary for OLS to provide the best linear unbiased estimates (BLUE) and the basis for statistical controls using multiple independent variables in regression models.
- It is useful to keep in mind the difference between “multiple regression” and “multivariate regression”. The latter predicts two or more dependent variables using one or more independent variables.↩
- The use of “prime” in matrix algebra should not be confused with the use of “prime” in the expression of a derivative, as in X′.↩