Linear regression for two variables is based on a linear equation with one independent variable. The equation has the form

\(y = a + bx\)

where \(a\) and \(b\) are constant numbers.
The variable \(\bf x\) is the independent variable, and \(\bf y\) is the dependent variable. Another way to think about this equation is as a statement of cause and effect: the \(x\) variable is the cause and the \(y\) variable is the hypothesized effect. Typically, you choose a value to substitute for the independent variable and then solve for the dependent variable.
The following are examples of linear equations:

\(y = 3 + 2x\)

\(y = -0.01 + 1.2x\)
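Substituting a value for the independent variable and solving for the dependent variable can be sketched in a few lines of Python; the coefficients \(a = 3\) and \(b = 2\) below are illustrative values, not prescribed by the text.

```python
def linear(x, a=3, b=2):
    """Evaluate the linear equation y = a + b*x."""
    return a + b * x

# Choose a value for the independent variable x, then solve for y.
print(linear(0))  # 3 (the y-intercept, since x = 0)
print(linear(5))  # 3 + 2*5 = 13
```

For any chosen \(x\), the equation produces exactly one \(y\), which is what makes the relationship a function.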
The graph of a linear equation of the form \(y = a + bx\) is a straight line. Any line that is not vertical can be described by this equation.
Slope and Y-Intercept of a Linear Equation
For the linear equation \(y = a + bx\), \(b\) is the slope and \(a\) is the \(y\)-intercept. Recall from algebra that the slope is a number that describes the steepness of a line, and the \(y\)-intercept is the \(y\)-coordinate of the point \((0, a)\) where the line crosses the \(y\)-axis. From calculus, the slope is the first derivative of the function: for a linear function, \(dy/dx = b\), which we can read as "a change \(dx\) in \(x\) produces a change in \(y\) of \(dy = b \cdot dx\)".
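The claim that \(dy/dx = b\) can be checked numerically: for a line, the ratio of the change in \(y\) to the change in \(x\) is the same between any two points and equals \(b\). The coefficients below are arbitrary illustrative values.

```python
# Verify that for y = a + b*x, the slope (change in y) / (change in x)
# equals b regardless of which two x values are chosen.
a, b = 1.5, -0.75

def y(x):
    return a + b * x

x1, x2 = 2.0, 7.0
slope = (y(x2) - y(x1)) / (x2 - x1)
print(slope)  # -0.75, which equals b
```

Picking a different pair of points would give the same value, which is the defining property of a straight line.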
Figure 13.5 Three possible graphs of \(y = a + bx\). (a) If \(b > 0\), the line slopes upward to the right. (b) If \(b = 0\), the line is horizontal. (c) If \(b < 0\), the line slopes downward to the right.
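The three cases in Figure 13.5 can be confirmed by comparing \(y\) at two values of \(x\); the intercept \(a = 2\) and the slopes used below are illustrative.

```python
def y(x, a, b):
    return a + b * x

# b > 0: the line slopes upward, so y increases as x increases.
print(y(1, a=2, b=3) < y(2, a=2, b=3))    # True

# b = 0: the line is horizontal, so y is constant.
print(y(1, a=2, b=0) == y(2, a=2, b=0))   # True

# b < 0: the line slopes downward, so y decreases as x increases.
print(y(1, a=2, b=-3) > y(2, a=2, b=-3))  # True
```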