15.2.1: Using Linear Equations
Before we start practicing calculating all of the variables in a regression line equation, let's work a little with just the equation on its own.
Regression Line Equations
As we just learned, linear regression for two variables is based on a linear equation:
\[\widehat{Y} = a + bX \nonumber \]
where \(a\) and \(b\) are constant numbers. This means that within a sample, the intercept (\(a\)) and the slope (\(b\)) are the same for every score; only the X score changes, and that change is what affects the predicted Y (\(\widehat{Y}\)). Some treat the predictor variable (X) as an IV and the outcome variable (Y) as the DV, but be careful that you aren't confusing prediction with causation!
We also just learned that the graph of a linear equation of the form \(\widehat{Y} = a + bX\) is a straight line.
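To make the equation concrete, here is a minimal Python sketch. The function name `predict` and the sample values for \(a\), \(b\), and \(x\) are made up for illustration, not taken from the text:

```python
def predict(a, b, x):
    """Return the predicted Y (Y-hat) for a score x, given intercept a and slope b."""
    return a + b * x

# For any one sample, a and b stay fixed; only x changes.
print(predict(2, 3, 0))  # 2 -> the intercept: predicted Y when x = 0
print(predict(2, 3, 1))  # 5
print(predict(2, 3, 2))  # 8 -> each 1-unit increase in x adds b = 3
```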
Is the following an example of a linear equation? Why or why not?
Answer
No, the graph is not a straight line; therefore, it is not a linear equation.
The minimum criterion for using a linear regression formula is that there be a linear relationship between the predictor and the criterion (outcome) variables.
What statistic shows us whether two variables are linearly related?
Answer
Pearson's r (correlation).
If two variables aren't linearly related, then you can't use linear regression to predict one from the other! The stronger the linear relationship (the larger Pearson's correlation), the more accurate the predictions based on linear regression will be.
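As a quick sketch of that check, you could compute Pearson's r before fitting anything. The data values below are hypothetical, chosen only to illustrate the idea:

```python
import numpy as np

# Hypothetical paired scores; in practice these would be your X and Y data.
x = np.array([1, 2, 3, 4, 5, 6])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.0])

r = np.corrcoef(x, y)[0, 1]  # Pearson's correlation coefficient
print(round(r, 3))  # close to 1, so linear regression is reasonable here
```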
Slope and Y-Intercept of a Linear Equation
As we learned previously, \(b =\) slope and \(a = y\)-intercept. Recall from algebra that the slope is a number that describes the steepness of a line, and the \(y\)-intercept is the \(y\)-coordinate of the point \((0, a)\) where the line crosses the \(y\)-axis. Figure \(\PageIndex{2}\) shows three possible graphs of the regression equation (\(y = a + bx\)). Panel (a) shows what the regression line looks like when the slope is positive (\(b > 0\)): the line slopes upward to the right. Panel (b) shows what it looks like when there is no slope (\(b = 0\)): the line is horizontal. Finally, Panel (c) shows what it looks like when the slope is negative (\(b < 0\)): the line slopes downward to the right.
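A short sketch (with arbitrary example numbers) shows how the sign of \(b\) produces the three patterns in the panels: rising, flat, or falling predictions as \(x\) increases:

```python
a = 10  # same intercept for all three lines

for b, label in [(2, "positive slope: line rises"),
                 (0, "zero slope: line is horizontal"),
                 (-2, "negative slope: line falls")]:
    # Predicted y at x = 0, 1, 2, 3
    ys = [a + b * x for x in range(4)]
    print(label, ys)
```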
I get it; everything has been pretty theoretical so far. So let's get practical and try constructing the regression line equation even when you don't have the scores for either variable. First, we'll start by identifying the variables in the examples.
Svetlana tutors to make extra money for college. For each tutoring session, she charges a one-time fee of $25 plus $15 per hour of tutoring. A linear equation that expresses the total amount of money Svetlana earns for each session she tutors is \(y = 25 + 15x\).
What are the predictor and criterion (outcome) variables? What is the \(y\)-intercept and what is the slope? Answer using complete sentences.
Answer
The predictor variable, \(x\), is the number of hours Svetlana tutors each session. The criterion (outcome) variable, \(y\), is the amount, in dollars, Svetlana earns for each session.
The \(y\)-intercept is the constant, the one-time fee of $25 (\(a = 25\)). The slope is 15 (\(b = 15\)) because Svetlana earns $15 for each hour she tutors.
Although it doesn't quite make sense in these examples, the \(y\)-intercept (\(a\)) is determined when \(x = 0\). I guess with Svetlana, you could say that she keeps the $25 fee for any session you miss or don't cancel ahead of time. But geometrically and mathematically, the \(y\)-intercept is the value of \(y\) when the predictor variable (\(x\)) equals zero.
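Here is a quick, hypothetical sketch of Svetlana's equation in code; plugging in \(x = 0\) returns exactly the \(y\)-intercept, as described above:

```python
def session_earnings(hours):
    """Svetlana's earnings: $25 one-time fee plus $15 per hour (y = 25 + 15x)."""
    return 25 + 15 * hours

print(session_earnings(0))  # 25 -> the y-intercept: the fee alone, when x = 0
print(session_earnings(2))  # 55 -> the $25 fee plus 2 hours at $15 each
```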
Jamal repairs household appliances like dishwashers and refrigerators. For each visit, he charges $25 plus $20 per hour of work. A linear equation that expresses the total amount of money Jamal earns per visit is \(y = 25 + 20x\).
What are the predictor and criterion (outcome) variables? What is the \(y\)-intercept and what is the slope? Answer using complete sentences.
Answer
The predictor variable, \(x\), is the number of hours Jamal works each visit. The criterion (outcome) variable, \(y\), is the amount, in dollars, Jamal earns for each visit.
The \(y\)-intercept is 25 (\(a = 25\)); at the start of a visit, Jamal charges a one-time fee of $25 (this is the charge when \(x = 0\)). The slope is 20 (\(b = 20\)) because Jamal earns $20 for each hour he works.
Now, we can start constructing the regression line equations.
Alejandra's Word Processing Service (AWPS) does word processing. The rate for services is $32 per hour plus a $31.50 one-time charge. The total cost to a customer depends on the number of hours it takes to complete the job.
Find the equation that expresses the total cost in terms of the number of hours required to complete the job. For this example,
- \(x =\) the number of hours it takes to get the job done.
- \(y =\) the total cost to the customer.
Answer
The $31.50 is a fixed cost. This is the number that you add after calculating the rest, so it must be the intercept (a).
If it takes \(x\) hours to complete the job, then \((32)(x)\) is the cost of the word processing only.
Thus, the total cost is \(y = 31.50 + 32x\).
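As a sanity check on the equation we just built, here is a brief sketch (the 4-hour job length is a made-up example):

```python
def total_cost(hours):
    """AWPS total cost: $31.50 one-time charge plus $32 per hour (y = 31.50 + 32x)."""
    return 31.50 + 32 * hours

print(total_cost(4))  # 159.5 -> a 4-hour job costs $31.50 + 4 * $32 = $159.50
```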
Let's try another example of constructing the regression line equation.
Elektra's Extreme Sports hires hang-gliding instructors and pays them a fee of $50 per class as well as $20 per student in the class. The total cost Elektra pays depends on the number of students in a class. Find the equation that expresses the total cost in terms of the number of students in a class.
Answer
For this example,
- \(x =\) number of students in class
- \(y =\) the total cost
The constant is $50 per class, so that must be the intercept (\(a\)). That leaves the $20 per student, which varies with the number of students, as the slope (\(b\)).
The resulting regression equation is \(y = 50 + 20x\).
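One more brief, hypothetical sketch, this time confirming what the slope means for Elektra's equation: each additional student adds exactly $20 to the cost:

```python
def class_cost(students):
    """Elektra's cost per class: $50 instructor fee plus $20 per student (y = 50 + 20x)."""
    return 50 + 20 * students

for n in range(1, 5):
    # The difference between consecutive class sizes is always the slope, b = 20.
    print(n, class_cost(n), class_cost(n) - class_cost(n - 1))
```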
You can also use the regression equation to graph the line by plugging scores from your X variable into the equation and plotting the resulting predicted Y values. Let's see what that might look like in Figure \(\PageIndex{3}\) for the equation \(y = -1 + 2x\).
In the example in Figure \(\PageIndex{3}\), the intercept (\(a\)) is replaced by -1 and the slope (\(b\)) is replaced by 2 to get the regression equation \(y = -1 + 2x\). Right now, you are being provided these constants. Soon, you'll be calculating them yourself!
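If you'd like to reproduce a graph like Figure \(\PageIndex{3}\) yourself, here is a minimal sketch using Matplotlib (assuming you have it installed; the range of x values is arbitrary):

```python
import numpy as np
import matplotlib.pyplot as plt

a, b = -1, 2               # intercept and slope from the example
x = np.linspace(0, 5, 50)  # arbitrary range of predictor values
y = a + b * x              # the regression equation y = -1 + 2x

plt.plot(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.title("y = -1 + 2x")
plt.show()
```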
Summary
The most basic type of association is a linear association. This type of relationship can be defined algebraically by the equation used, numerically with actual or predicted data values, or graphically from a plot of the data. Algebraically, a linear equation typically takes the form \(y = mx + b\), where \(m\) and \(b\) are constants, \(x\) is the independent variable, and \(y\) is the dependent variable. In a statistical context, a linear equation is written in the form \(y = a + bx\), where \(a\) and \(b\) are the constants. This form is used to help readers distinguish the statistical context from the algebraic context. In the equation \(y = a + bx\), the constant \(b\) that multiplies the \(x\) variable (\(b\) is called a coefficient) is the slope, and the constant \(a\) is the \(y\)-intercept.
The slope of a line is a value that describes the rate of change between the two quantitative variables. The slope tells us how the criterion variable (\(y\)) changes for every one unit increase in the predictor (\(x\)) variable, on average. The \(y\)-intercept is used to describe the criterion variable when the predictor variable equals zero.