# 13.11: Chapter 13 Review

### 13.3 Linear Equations

The most basic type of association is a linear association. This type of relationship can be defined algebraically by the equations used, numerically with actual or predicted data values, or graphically from a plotted curve. (Lines are classified as straight curves.) Algebraically, a linear equation typically takes the form $$\bf{y = mx + b}$$, where $$\bf m$$ and $$\bf b$$ are constants, $$\bf x$$ is the independent variable, and $$\bf y$$ is the dependent variable. In a statistical context, a linear equation is written in the form $$\bf{y = a + bx}$$, where $$\bf a$$ and $$\bf b$$ are the constants. This form is used to help readers distinguish the statistical context from the algebraic context. In the equation $$y = a + bx$$, the constant $$b$$ that multiplies the $$\bf x$$ variable ($$b$$ is called a coefficient) is called the slope. The slope describes the rate of change between the independent and dependent variables; in other words, it describes the change that occurs in the dependent variable as the independent variable changes. In the equation $$y = a + bx$$, the constant $$a$$ is called the $$y$$-intercept. Graphically, the $$y$$-intercept is the $$y$$-coordinate of the point where the graph of the line crosses the $$y$$-axis. At this point $$x = 0$$.

The slope of a line is a value that describes the rate of change between the independent and dependent variables. The slope tells us how the dependent variable ($$y$$) changes for every one-unit increase in the independent ($$x$$) variable, on average. The $$\bf y$$-intercept is used to describe the dependent variable when the independent variable equals zero. Graphically, the slope is represented by one of three line types in elementary statistics: a positive slope rises from left to right, a negative slope falls from left to right, and a zero slope produces a horizontal line.
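The slope and intercept described above can be estimated directly from data. As a minimal sketch (the data values here are hypothetical, not from the text), the following computes $$b$$ and $$a$$ for $$y = a + bx$$ using the standard least-squares formulas:

```python
# Estimating a and b in y = a + b*x by ordinary least squares.
# The data below are hypothetical, chosen so the fit is nearly y = 0.05 + 1.99x.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Slope: covariance of x and y divided by the variance of x.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# Intercept: the fitted line passes through the point of means.
a = y.mean() - b * x.mean()

print(f"y = {a:.2f} + {b:.2f}x")
```

Here $$b$$ is the average change in $$y$$ per one-unit increase in $$x$$, and $$a$$ is the predicted value of $$y$$ when $$x = 0$$, matching the interpretations above.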

### 13.4 The Regression Equation

It is hoped that this discussion of regression analysis has demonstrated the tremendous potential value it has as a tool for testing models and helping us better understand the world around us. The regression model has its limitations, especially the requirement that the underlying relationship be approximately linear. To the extent that the true relationship is nonlinear, it may be approximated with a linear relationship, or the nonlinear form may be transformed so that it can be estimated with linear techniques. A double logarithmic transformation of the data provides an easy way to test for this particular shape of relationship. A reasonably good quadratic form (the shape of the total cost curve from Microeconomics Principles) can be generated by the equation:

$$Y = a + b_1 X + b_2 X^2$$

where the values of $$X$$ are simply squared and put into the equation as a separate variable.
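The step described above, squaring $$X$$ and entering it as a separate variable, can be sketched as follows. This is a minimal illustration with hypothetical, noise-free data generated from known coefficients, so the fit recovers them exactly:

```python
# Fitting Y = a + b1*X + b2*X^2 by treating X**2 as a second regressor.
# The data are hypothetical, generated from a = 5, b1 = 2, b2 = 0.5.
import numpy as np

X = np.linspace(0.0, 10.0, 11)
Y = 5.0 + 2.0 * X + 0.5 * X**2

# Design matrix: a column of ones (for a), X, and X squared as a separate variable.
A = np.column_stack([np.ones_like(X), X, X**2])
a, b1, b2 = np.linalg.lstsq(A, Y, rcond=None)[0]

print(f"a = {a:.2f}, b1 = {b1:.2f}, b2 = {b2:.2f}")
```

The same machinery handles the double logarithmic form mentioned above: regress `np.log(Y)` on a constant and `np.log(X)` instead.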

There are many more econometric "tricks" that can bypass some of the more troublesome assumptions of the general regression model. This statistical technique is so valuable that further study would pay any student significant (indeed, statistically significant) dividends.