13.3: Two-Way ANOVA Summary Table
Factorial designs are designs, not a statistical analysis. Research designs are about how data are collected, not how data are analyzed. However, factorial designs are often analyzed with ANOVAs, so we will walk through ANOVA Summary Tables for factorial designs with at least two IVs with at least two levels each ("two-way" means that there are two variables, each with two or more levels). The main point of factorial designs is how the different IVs and their levels are combined, which is a research design issue.
ANOVA Summary Table Refresher
Remember the ANOVA Summary Table for Between Groups designs (when the groups were independent) shown in Table \(\PageIndex{1}\)?
| Source | \(SS\) | \(df\) | \(MS\) | \(F\) |
|---|---|---|---|---|
| Between Groups | \(SS_{B}\) | \(k-1\) | \(\frac{SS_{B}}{df_{B}}\) | \(\frac{MS_{B}}{MS_{W}}\) |
| Within Groups (Error) | \(SS_{W}\) | \(N-k\) | \(\frac{SS_{W}}{df_{W}}\) | N/A |
| Total | \(SS_{T}\) | \(N-1\) | N/A | N/A |
Once you had the Sum of Squares (SS) and the Degrees of Freedom (df), you could easily calculate the Mean Square (MS), and then the final F.
What was added for the Repeated Measures ANOVA Summary Table (example shown in Table \(\PageIndex{2}\))?
| Source | \(SS\) | \(df\) | \(MS\) | \(F\) |
|---|---|---|---|---|
| Between Groups | formula elsewhere | \(k-1\) | \(\frac{SS_{BG}}{df_{BG}}\) | \(\frac{MS_{BG}}{MS_{WG}}\) |
| Participants | formula elsewhere | \(P-1\) | N/A | N/A |
| Within Groups (Error) | \(SS_{WG} = SS_{T} - SS_{BG} - SS_{P}\) | \((k-1) \times (P-1)\) | \(\frac{SS_{WG}}{df_{WG}}\) | N/A |
| Total | formula elsewhere | \(N-1\) | N/A | N/A |
Right, we added one more row to account for the variation within each participant. But just like the BG ANOVA Summary Table, once you had all of the Sums of Squares (SS) and the Degrees of Freedom (df), you could easily calculate the Mean Square (MS), and then the final F.
The good news is that the Two-Way ANOVA Summary Table, the ANOVA Summary Table used when your study has two IVs, has similar properties. The bad news is that, just like the BG and RM ANOVA Summary Tables, the SS and df have some new formulas.
In sum, whatever the type of ANOVA you have, your general process will be:
1. Calculate the Degrees of Freedom.
2. Calculate the Sums of Squares.
3. Use the SS to calculate the Mean Square (SS/df).
4. Calculate the F as a ratio of the \(MS_{BG}\) divided by the \(MS_{WG}\) (or \(MS_{Error}\); they are the same thing).
   - The only difference with a two-way ANOVA is that you have two Between Groups MS's, one for each variable, and a row for the interaction.
5. Locate a critical value.
6. Determine whether you retain or reject the null hypothesis.
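Although nothing in this chapter requires software, the general process above can be sketched in a few lines of Python. The design size (2x3 with N = 30) and all of the SS values below are made up purely for illustration:

```python
# A minimal sketch of filling in a two-way ANOVA Summary Table.
# The design size (2x3, N = 30) and all SS values are hypothetical.
k1, k2, N = 2, 3, 30

# Step 1: Degrees of Freedom
df_1 = k1 - 1                # first IV
df_2 = k2 - 1                # second IV
df_int = df_1 * df_2         # interaction
df_total = N - 1             # total
df_cells = (k1 * k2) - 1     # cells (used in calculation, not shown in the table)
df_wg = df_total - df_cells  # within groups (error)

# Steps 2-3: Mean Squares (each row's SS divided by that row's df)
ms_1 = 100.0 / df_1
ms_2 = 90.0 / df_2
ms_int = 60.0 / df_int
ms_wg = 240.0 / df_wg

# Step 4: each F is that row's MS divided by the within-groups (error) MS
f_1, f_2, f_int = ms_1 / ms_wg, ms_2 / ms_wg, ms_int / ms_wg
```

Each calculated F would then be compared against its own critical value, as the last two steps describe.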
What would that kind of ANOVA Summary Table look like? Let's look at one in Table \(\PageIndex{3}\):
| Source | \(SS\) | \(df\) | \(MS\) | \(F\) |
|---|---|---|---|---|
| IV 1 | | | | |
| IV 2 | | | | |
| Interaction | | | | |
| Within Groups (Error) | | | | |
| Total | | | | |
Sum of Squares and Degrees of Freedom
With independent sample Between Groups ANOVAs, we had three Sums of Squares and three degrees of freedom.
Exercise \(\PageIndex{1}\)
What were the names of the three rows called? In other words, what were the names of the three Sums of Squares?
 Answer

 Between Groups
 Within Groups (or Error)
 Total
With dependent sample Repeated Measures ANOVAs, we needed four Sums of Squares and four degrees of freedom.
Exercise \(\PageIndex{2}\)
What were the names of the four rows called? In other words, what were the names of the four Sums of Squares?
 Answer

 Between Groups
 Participants
 Within Groups (or Error)
 Total
With two-way ANOVAs, you need six Sums of Squares and six degrees of freedom! If you look at Table \(\PageIndex{3}\), though, you'll only see five rows. What's the deal? There's an extra calculation called "Cells" that allows you to fill in the ANOVA Summary Table. So, the Sums of Squares and Degrees of Freedom that are needed are:
 Cells: This is the SS between all of the groups. (This is not listed on the ANOVA Summary Table, but is used to calculate some of the others.)
 Between Groups for one variable (IV1)
 Between Groups for the other variable (IV2)
 Interaction
 Within group
 Total
The next major section shows how to calculate the Sums of Squares using tables, but it is exceedingly unlikely that you would ever calculate a factorial ANOVA by hand. That page is unmodified from the original source (Dr. Crump's Answering Questions with Data), and uses a statistical analysis software called R. It's so unlikely that you'll ever do this by hand that the page is not updated or formatted for this textbook. So let's look at how to calculate each Degree of Freedom.
Formulas for Degrees of Freedom
The following are the formulas for each Degree of Freedom.
 Cells: \((k_1 \times k_2) - 1\)
 Remembering that "k" is the number of groups, \(k_1\) is the number of levels of the first IV and \(k_2\) is the number of levels of the other IV.
 Between Groups for one variable (IV1): \(k_1 - 1\)
 Between Groups for the other variable (IV2): \(k_2 - 1\)
 Interaction: \(df_1 \times df_2\)
 Within Groups: \(df_{Total} - df_{Cells}\)
 Total: \(N - 1\)
 With N being the number of scores.
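As a quick check of these formulas, here is a hypothetical 3x4 design with N = 60 scores worked through in Python; notice that the component dfs sum back to the total:

```python
# Degree-of-freedom formulas for a hypothetical 3x4 factorial design, N = 60.
k1, k2, N = 3, 4, 60

df_cells = (k1 * k2) - 1     # Cells: (k1 x k2) - 1
df_1 = k1 - 1                # Between Groups, first IV
df_2 = k2 - 1                # Between Groups, second IV
df_int = df_1 * df_2         # Interaction
df_total = N - 1             # Total
df_wg = df_total - df_cells  # Within Groups (Error)

# Calculation check: the component dfs should add up to the total df.
assert df_1 + df_2 + df_int + df_wg == df_total
```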
Dr. MO has been skating around the issue, but a two-factor ANOVA can be Between Groups (with all IV levels being independent), Repeated Measures (with all IV levels being repeated or related), or a combination (some IVs are independent and some IVs are repeated). To make things easier in our calculations, we'll just use the Two-Way ANOVA Summary Table as if all of the IVs are independent, even though it's actually pretty common to have a combination of IVs. In any case, use N as the number of scores, which is usually the number of participants.
When we have an example, we’ll replace the subscript numbers with subscript letters that represent the name of the IVs.
Mean Square
In two-way ANOVAs, you need four Mean Squares.
Formulas for Mean Square
Regardless of which Mean Square you are calculating, it will be the same as all of the Mean Squares that we’ve covered in the last couple chapters:
\[ \text{MS} = \dfrac{SS}{df} \nonumber \]
You use the Sum of Squares of the row that you are working with, and divide it by the Degrees of Freedom of the row that you are working with to get the Mean Square for that row.
Calculated F
In two-way ANOVAs, you end up with three calculated F's, one for each variable and one for the interaction. For factorial designs with more IVs, you would end up with more calculated F-scores.
Formulas for Calculated F-Scores
Each F-score is calculated the same way as the F in a Between Groups ANOVA Summary Table or a Repeated Measures ANOVA Summary Table: \(MS_{IV} / MS_{WG\ or\ Error}\).
For a two-way ANOVA, that looks like:
\[ F_1 = \dfrac{MS_1}{MS_{WG}} \nonumber \]
\[ F_2 = \dfrac{MS_2}{MS_{WG}} \nonumber \]
\[ F_{Interaction} = \dfrac{MS_{Interaction}}{MS_{WG}} \nonumber \]
Exercise \(\PageIndex{3}\)
What is similar to past F-score calculations? What is similar amongst all three of these calculations?
 Answer

As with prior F calculations, the ratio is the Mean Square of the variable that we're looking at divided by the Mean Square of the within-groups error term. That is what is similar amongst all three of these calculations, too.
Each calculated F will be compared to a different critical value based on the Degrees of Freedom of the numerator and the denominator.
Two-Way ANOVA Summary Table with Formulas
Here's the ANOVA Summary Table for a two-way factorial design with all of the formulas included, other than the Sums of Squares. The formulas for the Sums of Squares aren't provided because they are beyond the scope of this introduction to statistics (and you'd never do them by hand, anyway). Note that there will again be three blank cells, labeled "N/A," in Table \(\PageIndex{4}\).
| Source | \(SS\) | \(df\) | \(MS\) | \(F\) |
|---|---|---|---|---|
| IV 1 | formula elsewhere | \(k_1 - 1\) | \(\dfrac{SS_1}{df_1}\) | \(\dfrac{MS_{1}}{MS_{WG}}\) |
| IV 2 | formula elsewhere | \(k_2 - 1\) | \(\dfrac{SS_2}{df_2}\) | \(\dfrac{MS_{2}}{MS_{WG}}\) |
| Interaction | formula elsewhere | \(df_1 \times df_2\) | \(\dfrac{SS_{INT}}{df_{INT}}\) | \(\dfrac{MS_{Interaction}}{MS_{WG}}\) |
| Within Groups (Error) | formula elsewhere | \(df_{Total} - df_{Cells}\) | \(\dfrac{SS_{WG}}{df_{WG}}\) | N/A |
| Total | formula elsewhere | \(N - 1\) | N/A | N/A |
To complete the table, you also need to calculate the degrees of freedom of Cells \((df_{Cells} = (k_1 \times k_2) - 1)\), although there's no place in the table to include this information.
Normally, we put the variable with fewer levels before (above) the variable with more levels in the notation and in the ANOVA Summary Table.
Practice with a Two-Way ANOVA Summary Table
Example \(\PageIndex{1}\)
Complete the following ANOVA Summary Table with the information provided. The Sums of Squares are already in Table \(\PageIndex{5}\).
Participants (N = 48) joined a weight-loss program designed to increase the time people exercised. The program lasted either one month, two months, or three months. At the beginning and end of their program, the participants measured their weight to see how many pounds they lost. Because hormones affect weight loss, the gender of each participant was recorded and used as a variable in this 2x3 factorial design.
8 men & 8 women who exercised 1 month
8 men & 8 women who exercised 2 months
8 men & 8 women who exercised 3 months
Solution
To complete the calculations, you'll find:
 The Mean Square for one IV
 The Mean Square for the other IV
 The Mean Square for the interaction of the two groups
 The Mean Square for the Error (the denominator of the calculated F-score), which is sometimes called the Mean Square for Within Groups
 The F-ratio for one group: \( \dfrac{MS_1}{MS_{WG}} \)
 The F-ratio for the other group: \( \dfrac{MS_2}{MS_{WG}} \)
 The F-ratio for the interaction of the two groups: \( \dfrac{MS_{Interaction}}{MS_{WG}} \)
| Source | \(SS\) | \(df\) | \(MS\) | \(F\) |
|---|---|---|---|---|
| IV1 (Gender) | 330.75 | \(k_1 - 1 = 2 - 1 = 1\) | \(\dfrac{SS_1}{df_1} = \dfrac{330.75}{1} = 330.75\) | \(\dfrac{MS_{1}}{MS_{WG}} = \dfrac{330.75}{14.79} = 22.36\) |
| IV2 (Length of Program: 1, 2, or 3 months) | 1065.50 | \(k_2 - 1 = 3 - 1 = 2\) | \(\dfrac{SS_2}{df_2} = \dfrac{1065.50}{2} = 532.75\) | \(\dfrac{MS_{2}}{MS_{WG}} = \dfrac{532.75}{14.79} = 36.02\) |
| Interaction | 350.00 | \(df_1 \times df_2 = 1 \times 2 = 2\) | \(\dfrac{SS_{INT}}{df_{INT}} = \dfrac{350.00}{2} = 175.00\) | \(\dfrac{MS_{Interaction}}{MS_{WG}} = \dfrac{175.00}{14.79} = 11.83\) |
| Within Groups (Error) | 621.00 | \(df_{Total} - df_{Cells} = 47 - 5 = 42\) | \(\dfrac{SS_{WG}}{df_{WG}} = \dfrac{621.00}{42} = 14.79\) | N/A |
| Total | 2367.25 | \(N - 1 = 48 - 1 = 47\) | N/A | N/A |
\(df_{Cells} = (k_1 \times k_2) - 1 = (2 \times 3) - 1 = 6 - 1 = 5\)
As before, we can do a Calculation Check to see if we did all of the Degrees of Freedom correctly because they all should add up to the total again!
The rest of the process is the same as we've been doing; we just have to do it for each of the calculated F-scores: compare each of the three calculated F-scores to three critical values from the Critical Values of F Table to decide if we retain or reject the null hypothesis.
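If you'd like to double-check Table \(\PageIndex{5}\), the MS and F columns can be reproduced in a few lines of Python (using the rounded \(MS_{WG}\) of 14.79, as the table does):

```python
# Verify the worked example: SS values from Table 5, a 2x3 design, N = 48.
ss_gender, ss_length, ss_int, ss_wg, ss_total = 330.75, 1065.50, 350.00, 621.00, 2367.25

df_gender, df_length = 2 - 1, 3 - 1
df_int = df_gender * df_length
df_total = 48 - 1
df_cells = (2 * 3) - 1
df_wg = df_total - df_cells

# Calculation checks: the dfs and the SS components add up to their totals.
assert df_gender + df_length + df_int + df_wg == df_total
assert abs((ss_gender + ss_length + ss_int + ss_wg) - ss_total) < 1e-9

ms_gender = ss_gender / df_gender       # 330.75
ms_length = ss_length / df_length       # 532.75
ms_int = ss_int / df_int                # 175.00
ms_wg = round(ss_wg / df_wg, 2)         # 14.79 (rounded, as shown in the table)

f_gender = round(ms_gender / ms_wg, 2)  # 22.36
f_length = round(ms_length / ms_wg, 2)  # 36.02
f_int = round(ms_int / ms_wg, 2)        # 11.83
```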
Like in the Between Groups ANOVA and the Repeated Measures ANOVA, the null hypothesis says that all of the means are similar. For factorial designs, this includes the interaction; the null hypothesis is saying that we will find no main effects, and that the means for each combination of IV levels will also be similar. Just for fun, let’s look at what the combination of IVs would look like in a Punnett’s Square.
Example \(\PageIndex{2}\)
Complete a grid showing the factorial design IVs and DVs for the scenario in Example \(\PageIndex{1}\).
Solution
| IV Levels | IV1 (Gender): Men | IV1 (Gender): Women |
|---|---|---|
| IV2 (Time): 1 Month | Men who exercised for one month | Women who exercised for one month |
| IV2 (Time): 2 Months | Men who exercised for two months | Women who exercised for two months |
| IV2 (Time): 3 Months | Men who exercised for three months | Women who exercised for three months |
What would we do next?
For each calculated F, we would determine if the null hypothesis was retained or not. Let’s do that!
Example \(\PageIndex{3}\)
Find the critical values for each of the calculated F-scores in the Critical Values of F Table (which can be found in the Common Critical Value Tables page at the end of the book), then determine if you should retain or reject the null hypothesis.
Solution
For the first IV of gender, the calculated F-score was 22.36. The critical value from the Critical Values of F Table, using the Degrees of Freedom of Gender and of Within Groups (Error) (Gender df = 1, Within Groups df = 42), is 4.08 (using a denominator df of 40 because that is the closest value to 42 when rounding down on the table). Because the following is still true, we reject the null hypothesis and say that the average pounds lost differed between men and women.
Critical < Calculated == Reject null == At least one mean is different from at least one other mean. == p<.05
Critical > Calculated == Retain null == All of the means are similar. == p>.05
For the second IV of length of the exercise program, the calculated F-score was 36.02. The critical value from the Critical Values of F Table, using the Degrees of Freedom of Exercise Length and of Within Groups (Error) (Exercise Length df = 2, Within Groups df = 42), is 3.23 (using a denominator df of 40 because that is the closest value to 42 when rounding down on the table). Because the critical value is smaller than the calculated value, we reject the null hypothesis and say that the average pounds lost differed between at least two of the three groups (1 month, 2 months, or 3 months).
For the interaction between gender and length of exercise program, the calculated F-score was 11.83. The critical value from the Critical Values of F Table, using the Degrees of Freedom of the Interaction and of Within Groups (Error) (Interaction df = 2, Within Groups df = 42), is 3.23 (using a denominator df of 40 because that is the closest value to 42 when rounding down on the table). Because the critical value is smaller than the calculated value, we reject the null hypothesis and say that the average pounds lost differed between at least two of the combinations of gender and exercise length.
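The three comparisons above all boil down to the same decision rule. Here it is sketched in Python, with the calculated F-scores from the worked example and the critical values read from the table:

```python
# Decision rule: reject the null hypothesis when the calculated F
# is larger than the critical F from the table.
def decide(f_calculated, f_critical):
    return "reject" if f_calculated > f_critical else "retain"

# Calculated Fs from the worked example; critical values use a
# denominator df of 40 (the closest table row below 42).
decisions = {
    "Gender": decide(22.36, 4.08),
    "Length of Program": decide(36.02, 3.23),
    "Interaction": decide(11.83, 3.23),
}
# All three null hypotheses are rejected.
```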
What about Pairwise Comparisons?
Is your next step pairwise comparisons? Maybe!
Had we retained any of the null hypotheses, we would not have conducted pairwise comparisons for that variable. Just like in the other ANOVAs, if we don't find a difference between the means, we don't look for where the difference might be. In this case, though, we could have retained one variable (say, the Interaction) but still rejected the null hypotheses for each IV. In that case, we would have needed to conduct some pairwise comparisons. But maybe not as many as you'd think!
If the IV only has two levels and the F-score results in rejecting the null hypothesis, then pairwise comparisons are still not needed. The ANOVA is enough. The ANOVA says that at least two means are different from each other, and when we only have two means, we have all we need to know. So, with our Gender IV, we would not need to conduct pairwise comparisons even though we rejected the null hypothesis.
If the IV has three or more levels and the F-score results in rejecting the null hypothesis, then we would need to conduct pairwise comparisons to find which means are different from each other. This statistically significant main effect shows that at least two means are different from each other, but we need to figure out which specific means.
If the F-score results in rejecting the null hypothesis for the Interaction, then we would need to conduct pairwise comparisons to find which means are different from each other. This statistically significant interaction shows that at least two means are different from each other, but we'll have at minimum four means (a 2x2 factorial design results in four combinations), so we need to figure out which means differ from each other.
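These three rules can be captured in a tiny helper function (hypothetical, just to make the logic concrete):

```python
# Pairwise comparisons are needed only when the null hypothesis was
# rejected AND there are more than two means to compare.
def needs_pairwise(num_means, rejected_null):
    return rejected_null and num_means > 2

# Our example: Gender has 2 levels, Length has 3, the Interaction has 6 cells.
assert needs_pairwise(2, True) is False   # Gender: rejected, but only 2 means
assert needs_pairwise(3, True) is True    # Length: rejected, 3 means
assert needs_pairwise(6, True) is True    # Interaction: rejected, 6 cell means
assert needs_pairwise(3, False) is False  # retained null: no follow-up tests
```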
Phew, that was a lot! Next up is looking at calculating Sums of Squares for a two-way ANOVA. If you are never going to do that by hand, skip that section and start with the section on Post-Hoc Pairwise Comparisons.
Contributors and Attributions
This page was extensively adapted from Foster et al. (University of Missouri-St. Louis, Rice University, & University of Houston, Downtown Campus).