The One-Way ANOVA Test

Learning Objectives

  • Understand the shortcomings of comparing multiple means as pairs of hypotheses.
  • Understand the steps of the ANOVA method and the method's advantages.
  • Compare the means of three or more populations using the ANOVA method.
  • Calculate pooled standard deviations and confidence intervals as estimates of standard deviations of populations.

Introduction

Previously, we have discussed analyses that allow us to test if the means and variances of two populations are equal. Suppose a teacher is testing multiple reading programs to determine the impact on student achievement. There are five different reading programs, and her 31 students are randomly assigned to one of the five programs. The mean achievement scores and variances for the groups are recorded, along with the means and the variances for all the subjects combined.

We could conduct a series of t-tests to determine if all of the sample means came from the same population. However, this would be tedious and has a major flaw, which we will discuss shortly. Instead, we use something called the Analysis of Variance (ANOVA), which allows us to test the hypothesis that multiple population means and variances of scores are equal. Theoretically, we could test hundreds of population means using this procedure.

Shortcomings of Comparing Multiple Means Using Previously Explained Methods

As mentioned, to test whether pairs of sample means differ by more than we would expect due to chance, we could conduct a series of separate t-tests in order to compare all possible pairs of means. This would be tedious, but we could use a computer or a TI-83/84 calculator to compute these quickly and easily. However, there is a major flaw with this reasoning.

When more than one t-test is run, each at its own level of significance, the probability of making one or more type I errors compounds across the tests. Recall that a type I error occurs when we reject the null hypothesis when we should not. The level of significance, \alpha, is the probability of a type I error in a single test. When testing more than one pair of samples, the probability of making at least one type I error is 1-(1-\alpha)^c, where \alpha is the level of significance for each t-test and c is the number of independent t-tests. Using the example from the introduction, if our teacher conducted separate t-tests to examine the means of the populations, she would have to conduct 10 separate t-tests (one for each pair of the five programs). If she performed these tests with \alpha=0.05, the probability of committing at least one type I error would not be 0.05, as one might initially expect. Instead, it would be approximately 0.40, which is extremely high!
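To see where the 0.40 figure comes from, here is a minimal Python sketch of the calculation (the variable names are ours, chosen only for illustration):

# Probability of making at least one type I error across c independent
# t-tests, each conducted at significance level alpha: 1 - (1 - alpha)^c
alpha = 0.05          # significance level of each individual t-test
c = 10                # number of pairwise comparisons among 5 groups (5 * 4 / 2)
family_wise_error = 1 - (1 - alpha) ** c
print(round(family_wise_error, 2))    # prints 0.4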

The Steps of the ANOVA Method

With the ANOVA method, we are actually analyzing the total variation of the scores, including the variation of the scores within the groups and the variation between the group means. Since we are interested in two different types of variation, we first calculate each type of variation independently and then calculate the ratio between the two. We use the F-distribution as our sampling distribution and set our critical values and test our hypothesis accordingly.

When using the ANOVA method, we are testing the null hypothesis that the means and the variances of our samples are equal. When we conduct a hypothesis test, we are testing the probability of obtaining an extreme F-statistic by chance. If we reject the null hypothesis that the means and variances of the samples are equal, then we are saying that the difference that we see could not have happened just by chance.

To test a hypothesis using the ANOVA method, there are several steps that we need to take. These include:

1. Calculating the mean squares between groups, MS_B. The MS_B measures the variation among the means of the various samples. If we hypothesize that the group means are equal, then they must also equal the population mean. Under our null hypothesis, we state that the means of the different samples are all equal and come from the same population, but we understand that there may be fluctuations due to sampling error. When we calculate the MS_B, we must first determine the SS_B, which is the sum of the squared differences between each group mean and the overall mean, weighted by each group's sample size. To calculate this sum, we use the following formula:

SS_B=\sum^m_{k=1} n_k (\bar{x}_k-\bar{x})^2

where:

k is the group number.

n_k is the sample size of group k.

\bar{x}_k is the mean of group k.

\bar{x} is the overall mean of all the observations.

m is the total number of groups.

When simplified, the formula becomes:

SS_B=\sum^m_{k=1} \frac{T^2_k}{n_k}-\frac{T^2}{n}

where:

T_k is the sum of the observations in group k.

T is the sum of all the observations.

n is the total number of observations.

Once we calculate this value, we divide by the number of degrees of freedom, m-1, to arrive at the MS_B. That is, MS_B=\frac{SS_B}{m-1}
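As an illustration of this first step, here is a minimal Python sketch of the SS_B and MS_B calculations, assuming the data for each group are stored in a list of lists (the function name and structure are ours, not part of the text):

def between_groups_mean_square(groups):
    """Return (SS_B, MS_B) for a list of samples, one inner list per group."""
    m = len(groups)                            # number of groups
    n = sum(len(g) for g in groups)            # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    # SS_B: for each group, n_k times the squared distance of its mean
    # from the grand mean
    ss_b = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ms_b = ss_b / (m - 1)                      # divide by m - 1 degrees of freedom
    return ss_b, ms_b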

2. Calculating the mean squares within groups, MS_W. The mean squares within groups calculation is also called the pooled estimate of the population variance. Remember that when we square the standard deviation of a sample, we are estimating population variance. Therefore, to calculate this figure, we sum the squared deviations within each group and then divide by the sum of the degrees of freedom for each group.

To calculate the MS_W, we first find the SS_W, which is the sum of the squared deviations of each observation from its own group mean:

SS_W=\sum(x_{i1}-\bar{x}_1)^2+\sum (x_{i2}-\bar{x}_2)^2+ \ldots + \sum (x_{im}-\bar{x}_m)^2

A computational shortcut for this sum is:

SS_W=\sum^m_{k=1} \sum^{n_k}_{i=1} x^2_{ik}-\sum^m_{k=1} \frac{T^2_k}{n_k}

where:

T_k is the sum of the observations in group k.

Essentially, this formula sums the squares of all the observations and then subtracts, for each group, the square of the group total divided by that group's sample size. Finally, we divide this value, SS_W, by the total number of degrees of freedom in the scenario, n-m.

MS_W=\frac{SS_W}{n-m}
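Continuing the sketch from step 1, the within-groups calculation can be written as follows before we move on to step 3 (again, the function name is our own):

def within_groups_mean_square(groups):
    """Return (SS_W, MS_W) for a list of samples, one inner list per group."""
    m = len(groups)                            # number of groups
    n = sum(len(g) for g in groups)            # total number of observations
    # SS_W: sum of every observation squared, minus each group's total
    # squared divided by that group's sample size
    sum_of_squared_obs = sum(x ** 2 for g in groups for x in g)
    sum_of_group_terms = sum(sum(g) ** 2 / len(g) for g in groups)
    ss_w = sum_of_squared_obs - sum_of_group_terms
    ms_w = ss_w / (n - m)                      # divide by n - m degrees of freedom
    return ss_w, ms_w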

3. Calculating the test statistic. The formula for the test statistic is as follows:

F=\frac{MS_B}{MS_W}

4. Finding the critical value of the F-distribution. As mentioned above, m-1 degrees of freedom are associated with MS_B, and n-m degrees of freedom are associated with MS_W. In an F-distribution table, the degrees of freedom for MS_B (the numerator) are read across the columns, and the degrees of freedom for MS_W (the denominator) are read down the rows.

5. Interpreting the results of the hypothesis test. In ANOVA, the last step is to decide whether to reject the null hypothesis and then provide clarification about what that decision means.
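The remaining steps can be sketched in Python as well. The routine below is our own illustration, not part of the text: it reuses the two helper functions sketched above and uses SciPy's F-distribution (scipy.stats.f) in place of a printed table to find the critical value.

from scipy.stats import f

def one_way_anova(groups, alpha=0.05):
    """Steps 3-5: compute the F-statistic, look up the critical value, decide."""
    m = len(groups)
    n = sum(len(g) for g in groups)
    _, ms_b = between_groups_mean_square(groups)
    _, ms_w = within_groups_mean_square(groups)
    f_statistic = ms_b / ms_w                          # step 3: the test statistic
    critical_value = f.ppf(1 - alpha, m - 1, n - m)    # step 4: F critical value
    reject_null = f_statistic > critical_value         # step 5: the decision
    return f_statistic, critical_value, reject_null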

The primary advantage of using the ANOVA method is that it compares all of the group means in a single test, taking both types of variation into account, rather than relying on many separate pairwise comparisons. In addition, we can use technological tools, including computer programs, such as SAS, SPSS, and Microsoft Excel, as well as the TI-83/84 graphing calculator, to easily perform the calculations and test our hypothesis. We use these technological tools quite often when using the ANOVA method.

Example: Let’s go back to the example in the introduction with the teacher who is testing multiple reading programs to determine the impact on student achievement. There are five different reading programs, and her 31 students are randomly assigned to one of the five programs. She collects the following data:

Method 1: 1, 4, 3, 2, 5, 1, 6
Method 2: 8, 6, 7, 4, 3, 5
Method 3: 7, 6, 4, 9, 8, 5, 7, 5
Method 4: 9, 10, 8, 6, 5
Method 5: 10, 12, 9, 11, 8

Compare the means of these different groups by calculating the mean squares between groups, and use the variation within our samples to calculate the mean squares within groups, the pooled estimate of the population variance.

To solve for SS_B, it is necessary to calculate several summary statistics from the data above:

Method 1 Method 2 Method 3 Method 4 Method 5 Totals
Number (n_k) 7 6 8 5 5 31
Total (T_k) 22 33 51 38 50 194
Mean (\bar{x}_k) 3.14 5.50 6.38 7.60 10.00 6.26
Sum of Squared Obs. (\sum^{n_k}_{i=1} x^2_{ik}) 92 199 345 306 510 1,452
Sum of Obs. Squared/Number of Obs. (\frac{T^2_k}{n_k}) 69.14 181.50 325.13 288.80 500.00 1,364.57

Using this information, we find that the sum of squares between groups is equal to the following:

SS_B = \sum^m_{k=1} \frac{T^2_k}{n_k}-\frac{T^2}{n} \approx 1,364.57 - \frac{(194)^2}{31} \approx 150.5

Since there are four degrees of freedom for this calculation (the number of groups minus one), the mean squares between groups is as shown below:

MS_B=\frac{SS_B}{m-1} \approx \frac{150.5}{4} \approx 37.6

Next, we calculate the mean squares within groups, MS_W, which is also known as the pooled estimate of the population variance, \sigma^2.

To calculate the mean squares within groups, we first use the following formula to calculate SS_W:

SS_W=\sum^m_{k=1} \sum^{n_k}_{i=1} x^2_{ik}-\sum^m_{k=1} \frac{T^2_k}{n_k}

Using our summary statistics from above, we can calculate SS_W as shown below:

SS_W = \sum^m_{k=1} \sum^{n_k}_{i=1} x^2_{ik}-\sum^m_{k=1} \frac{T^2_k}{n_k} \approx 1,452 - 1,364.57 \approx 87.43

This means that we have the following for MS_W:

MS_W=\frac{SS_W}{n-m} \approx \frac{87.43}{26} \approx 3.36

Therefore, our F-ratio is as shown below:

F=\frac{MS_B}{MS_W} \approx \frac{37.6}{3.36} \approx 11.19

We would then compare this test statistic to a critical value. Using the F-distribution table with \alpha=0.01 and 4 and 26 degrees of freedom, we find the critical value to be 4.140. Since our test statistic of 11.19 exceeds the critical value of 4.140, we reject the null hypothesis. Therefore, we can conclude that not all of the population means of the five programs are equal and that obtaining an F-ratio this extreme by chance is highly improbable.
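As an arithmetic check on this example, the short Python sketch below (again using the hypothetical helper functions from the previous section) reproduces these values and cross-checks them against SciPy's built-in f_oneway routine:

# Reading scores for the five programs, taken from the table above.
groups = [
    [1, 4, 3, 2, 5, 1, 6],        # Method 1
    [8, 6, 7, 4, 3, 5],           # Method 2
    [7, 6, 4, 9, 8, 5, 7, 5],     # Method 3
    [9, 10, 8, 6, 5],             # Method 4
    [10, 12, 9, 11, 8],           # Method 5
]

ss_b, ms_b = between_groups_mean_square(groups)   # roughly 150.5 and 37.6
ss_w, ms_w = within_groups_mean_square(groups)    # roughly 87.43 and 3.36
print(ms_b / ms_w)                                # roughly 11.19

# Cross-check with SciPy's built-in one-way ANOVA.
from scipy.stats import f_oneway
print(f_oneway(*groups))                          # F is roughly 11.19 as well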

On the Web

http://preview.tinyurl.com/36j4by6 F-distribution tables with \alpha=0.01.

Technology Note: Calculating a One-Way ANOVA with Excel

Here is the procedure for performing a one-way ANOVA in Excel using this set of data.

Copy and paste the table into an empty Excel worksheet.

Select 'Data Analysis' from the Tools menu and choose 'ANOVA: Single-factor' from the list that appears.

Place the cursor in the 'Input Range' field and select the entire table.

Place the cursor in the 'Output Range' field and click somewhere in a blank cell below the table.

Click 'Labels' only if you have also included the labels in the table. This will cause the names of the predictor variables to be displayed in the table.

Click 'OK', and the results shown below will be displayed.

Anova: Single Factor

SUMMARY
Groups Count Sum Average Variance
Column 1 7 22 3.142857 3.809524
Column 2 6 33 5.5 3.5
Column 3 8 51 6.375 2.839286
Column 4 5 38 7.6 4.3
Column 5 5 50 10 2.5
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 150.5033 4 37.62584 11.18893 2.05e-05 2.742594
Within Groups 87.43214 26 3.362775
Total 237.9355 30
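The Total row provides a quick check on the output: the between-groups and within-groups sums of squares, and their degrees of freedom, add up to the totals shown.

SS_B+SS_W \approx 150.50+87.43 \approx 237.94 \qquad \text{and} \qquad 4+26=30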

Technology Note: One-Way ANOVA on the TI-83/84 Calculator

Enter raw data from population 1 into L1, population 2 into L2, population 3 into L3, population 4 into L4, and so on.

Now press [STAT], scroll right to TESTS, scroll down to 'ANOVA(', and press [ENTER]. Then enter the lists to produce a command such as 'ANOVA(L1, L2, L3, L4)' and press [ENTER].

Lesson Summary

When testing multiple independent samples to determine if they come from the same population, we could conduct a series of separate t-tests in order to compare all possible pairs of means. However, this inflates the probability of making a type I error; a better approach is the Analysis of Variance (ANOVA).

In ANOVA, we analyze the total variation of the scores, which includes the variation of the scores within the groups and the variation of the group means around the total mean of all the groups (also known as the grand mean).

In this analysis, we calculate the F-ratio, which is the mean square between groups divided by the mean square within groups.

The mean square within groups is also known as the pooled estimate of the population variance. We find this value by analyzing the variances (the squared standard deviations) of each of the samples.

Review Questions

  1. What does the ANOVA acronym stand for?
  2. If we are testing whether pairs of sample means differ by more than we would expect due to chance using multiple t-tests, the probability of making a type I error would ___.
  3. In the ANOVA method, we use the ___ distribution.
    1. Student’s t-
    2. normal
    3. F-
  4. In the ANOVA method, we complete a series of steps to evaluate our hypothesis. Put the following steps in chronological order.
    1. Calculate the mean squares between groups and the mean squares within groups.
    2. Determine the critical values in the F-distribution.
    3. Evaluate the hypothesis.
    4. Calculate the test statistic.
    5. State the null hypothesis.
  5. A school psychologist is interested in whether or not teachers affect the anxiety scores among students taking the AP Statistics exam. The data below are the scores on a standardized anxiety test for students with three different teachers.
Teacher's Name and Anxiety Scores
Ms. Jones Mr. Smith Mrs. White
8 23 21
6 11 21
4 17 22
12 16 18
16 6 14
17 14 21
12 15 9
10 19 11
11 10
13

(a) State the null hypothesis.

(b) Using the data above, fill out the missing values in the table below.

Ms. Jones Mr. Smith Mrs. White Totals
Number (n_k) 8 =
Total (T_k) 131 =
Mean (\bar{x}) 14.6 =
Sum of Squared Obs. (\sum^{n_k}_{i=1} x^2_{ik}) =
Sum of Obs. Squared/Number of Obs. (\frac{T^2_k}{n_k}) =

(c) What is the value of the mean squares between groups, MS_B?

(d) What is the value of the mean squares within groups, MS_W?

(e) What is the F-ratio of these two values?

(f) With \alpha=0.05, use the F-distribution to set a critical value.

(g) What decision would you make regarding the null hypothesis? Why?

Original text