13.12: Testing a Hypothesis with Regression in SPSS
This section focuses on how to analyze data for a simple regression using SPSS. SPSS version 29 was used for this book; if you are using a different version, you may see some variation from what is shown here.
Entering Data
Data for correlation and regression are set up the same way. See the section on entering data in Chapter 12 as needed. Here is how Data Set 12.1 looks after data are entered into SPSS:
Once all the variables have been specified and the data have been entered, you can begin analyzing the data using SPSS.
Conducting a Simple (Bivariate) Regression in SPSS
The steps to running a simple, bivariate regression in SPSS are:
- Click Analyze -> Regression -> Linear from the pull-down menus.
- Drag the name of the predictor (\(X\)-variable) into the box labeled “Independent(s)” and the name of the predicted variable (\(Y\)-variable) into the box labeled “Dependent.” You can also do this by clicking a variable name to highlight it and then clicking the arrow to move it to the desired location.
- Click “OK” to run the analyses.
- The output (the page of calculated results) will appear in a new SPSS window known as the Output Viewer. The results appear in three tables, as shown below.
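If you prefer syntax to the menus, the same analysis can be run from a syntax window (File -> New -> Syntax). A minimal sketch is shown below; the variable names HoursOfSleep and QuizScores are assumptions for Data Set 12.1, so substitute whatever names you used when entering the data.

```
* Simple (bivariate) regression predicting quiz scores from hours of sleep.
* Variable names are hypothetical; match them to your own data file.
REGRESSION
  /STATISTICS COEFF OUTS R ANOVA
  /DEPENDENT QuizScores
  /METHOD=ENTER HoursOfSleep.
```

Running this produces the same three output tables as the point-and-click steps above.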
Model Summary

| Model | \(R\) | \(R\) Square | Adjusted \(R\) Square | Std. Error of the Estimate |
|---|---|---|---|---|
| 1 | .948a | .898 | .885 | 4.94945 |

a. Predictors: (Constant), Hours of Sleep
ANOVAa

| Model | | Sum of Squares | \(df\) | Mean Square | \(F\) | Sig. |
|---|---|---|---|---|---|---|
| 1 | Regression | 1728.024 | 1 | 1728.024 | 70.540 | <.001b |
| | Residual | 195.976 | 8 | 24.497 | | |
| | Total | 1924.000 | 9 | | | |

a. Dependent Variable: Quiz Scores

b. Predictors: (Constant), Hours of Sleep
Coefficientsa

| Model | | Unstandardized B | Std. Error | Standardized Beta | \(t\) | Sig. |
|---|---|---|---|---|---|---|
| 1 | (Constant) | 38.553 | 5.177 | | 7.447 | <.001 |
| | Hours of Sleep | 6.376 | .759 | .948 | 8.399 | <.001 |

a. Dependent Variable: Quiz Scores
Note: Slight error is introduced by rounding during hand-calculation, which can cause those values to differ from the SPSS results in the third or fourth decimal place.
- If a scatterplot is desired to visualize the regression, click Graphs -> Scatter/Dot -> Simple Scatter -> Define. Move the names of the \(X\)- and \(Y\)-variables from the left into their respective boxes on the right, then click “OK.” The graph will appear in the output window.
- If a regression line is desired, double-click the graph to open the Chart Editor, then click the Fit Line button. Choose the “Linear” option for the fit line and click “Apply.” If the mean of \(Y\) line is also desired, click the Fit Line button again and select “Mean of \(Y\).” You can also double-click an existing fit line to reopen the window and change its type (such as from a linear regression line to a mean of \(Y\) line).
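The scatterplot can also be requested with syntax. A sketch using the same hypothetical variable names as before (substitute your own):

```
* Scatterplot of quiz scores (Y) against hours of sleep (X).
GRAPH
  /SCATTERPLOT(BIVAR)=HoursOfSleep WITH QuizScores.
```

The fit line itself is still added interactively through the Chart Editor, as described in the step above.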
Reading SPSS Output for a Simple Regression
The Model Summary table shows the correlation coefficient, which is \(r\) = .948. This rounds to .95 at the hundredths place for reporting purposes. It also shows the \(r\)-squared value, which is .898, or 89.8%. These match the results of the hand-calculations performed in Chapter 12 for Data Set 12.1.
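The two values in this table are directly related: squaring the correlation coefficient gives the \(r\)-squared value (SPSS squares the unrounded \(r\), so hand checks agree approximately):

\[ r^2 = (.948)^2 \approx .898 \]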
The ANOVA table shows the parts that go into the \(F\) formula, the resulting \(F\)-value, and the “Sig.” The computed values match those hand-calculated earlier in this chapter to the hundredths place. Recall that “Sig.” in SPSS refers to the \(p\)-value. When it is less than the alpha level of .05, the result is statistically significant and the hypothesis is supported. When it is greater than or equal to the alpha level of .05, the result is not statistically significant and the null hypothesis is retained.
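The pieces of the ANOVA table fit the \(F\) formula directly; dividing the mean square for the regression by the mean square for the residual reproduces the reported \(F\)-value:

\[ F = \frac{MS_{regression}}{MS_{error}} = \frac{1728.024}{24.497} \approx 70.540 \]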
The Coefficients table is used to find the slope and the \(y\)-intercept, as well as to assess whether the t-test was significant. The slope (\(b_1\)) and \(y\)-intercept (\(b_0\)) appear in the B column: the slope in the bottom row, next to the name of the \(X\)-variable, and the \(y\)-intercept in the top row, next to the label “(Constant).” These are the values that are put into the linear equation when \(X\) is used to predict a \(Y\)-value. The t-test results for testing the slope appear on the bottom row of the table; the last two columns present the obtained t-value and the corresponding \(p\)-value (“Sig.”). Note that the table does not show the \(df\) for the t-test; this is because \(df_E\) is used for the t-test in a regression, so the \(df\) can be found in the ANOVA table.
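Putting the two B-column values into the linear equation gives the prediction equation for Data Set 12.1:

\[ \hat{Y} = b_0 + b_1 X = 38.553 + 6.376X \]

The t-test on the slope is simply the coefficient divided by its standard error (again, SPSS uses unrounded values, so hand checks agree approximately):

\[ t = \frac{b_1}{SE_{b_1}} = \frac{6.376}{.759} \approx 8.399 \]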
- What information is provided in the Model Summary Table of SPSS output for simple regression?
- What information is provided in the ANOVA Table of SPSS output for simple regression?
- What information is provided in the Coefficients Table of SPSS output for simple regression?
- Which kind of graph and fit lines should be used to visualize a simple regression?