
One-Way ANOVA

One-Way Analysis of Variance

One-Way ANOVA

The one-way analysis of variance is used to test the claim that two or more population means are equal
- The one-way ANOVA uses an F statistic and is therefore often called the ANOVA F

This is an extension of the two-independent-samples t-test. In fact, with only two groups the t-test and the one-way ANOVA are equivalent and will always give the same p-value.

t-tests vs ANOVA
The difference between the t-test and the ANOVA F is that the t-test works in the same units as the original scores, whereas the ANOVA F works in squared units. Thus, with only two groups: F = t².
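As a quick check of this relationship, here is a minimal Python sketch (assuming SciPy is installed) that runs both tests on two of the groups used later in these slides:

```python
from scipy import stats

# Two of the groups from the seating example later in these slides
left  = [82, 83, 97, 93, 55, 67, 53]
right = [38, 59, 55, 66, 45, 52, 52, 61]

# Independent-samples t-test (pooled variance) and one-way ANOVA on the same two groups
t, p_t = stats.ttest_ind(left, right, equal_var=True)
F, p_F = stats.f_oneway(left, right)

print(round(t**2, 4), round(F, 4))   # t squared equals F
print(round(p_t, 6), round(p_F, 6))  # and the two p-values match
```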

The Logic and the Process of Analysis of Variance


The purpose of ANOVA is much the same as the t tests presented in the preceding chapters. The goal is to determine whether the mean differences that are obtained for the sample data are sufficiently large to justify a conclusion that there are mean differences between the populations from which the samples were obtained.

Multiple t-tests versus ANOVA


The difference between the ANOVA F and the t tests is that ANOVA can be used in situations where there are two or more means being compared, whereas the t tests are limited to situations where only two means are involved. Analysis of variance is necessary to protect researchers from excessive risk of a Type I error in situations where a study is comparing more than two population means.

Multiple t-tests vs ANOVA, cont'd

Comparing more than two groups with t tests would require a series of t tests to evaluate all of the mean differences.
- Remember, a t test can compare only two means at a time

Although each t test can be done with a specific α-level (risk of a Type I error), the α-levels accumulate over a series of tests, so the final experimentwise α-level can be quite large

Multiple t-tests vs ANOVA, cont'd


ANOVA allows researchers to evaluate all of the mean differences in a single hypothesis test using a single α-level and thereby keeps the risk of a Type I error under control no matter how many means are being compared. However, what if we just compared each of the groups in a pairwise manner using a testwise α-level (i.e., an α-level for each test) of:

- α / (number of tests)
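A minimal Python sketch of both ideas (the 1 − (1 − α)^c formula assumes the tests are independent, so it is only an approximation for pairwise comparisons):

```python
alpha = 0.05
num_tests = 3  # e.g., L vs M, L vs R, M vs R

# Approximate experimentwise Type I error rate if every test uses alpha = .05
experimentwise = 1 - (1 - alpha) ** num_tests
print(round(experimentwise, 4))  # about 0.143

# Bonferroni-style testwise alpha that keeps the experimentwise rate near .05
testwise = alpha / num_tests
print(round(testwise, 4))        # about 0.017
```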

Multiple t-tests vs ANOVA Example


An instructor wants to see if students' test scores differ depending on where they sit in the room (left side, middle, right side). Would you recommend that we compare all of the groups simultaneously with an ANOVA F at α = .05, or compare the groups in a pairwise manner (e.g., L vs M, L vs R, M vs R) at α = .05/3 ≈ .017?

One-way Independent Groups ANOVA


Although ANOVA can be used in a variety of different research situations, this chapter discusses only independent-groups designs involving one independent variable. In other words, each of the groups has a separate (and unrelated) sample of subjects.

Variables in a One-Way ANOVA


The response (or dependent) variable is the variable you're comparing the groups on (e.g., anxiety). The factor (or independent) variable is the categorical variable being used to define the groups.

- We will assume k samples (groups)
- The k samples are the levels of the factor

The "one-way" is because each value is classified in exactly one way (i.e., there is only one factor variable).

Assumptions of the One-Way ANOVA

Assumptions
- The data are randomly sampled
- The variances of the populations are equal
- The distribution of scores in each population is normal in shape

We will come back to these assumptions after going through the steps for the ANOVA

Null and Alternate Hypotheses

The null hypothesis is that the means are all equal


H0: μ1 = μ2 = ... = μk. For example, with three groups: H0: μ1 = μ2 = μ3

The alternative hypothesis is that at least one of the means is different from another

In other words, H1: μ1 ≠ μ2 ≠ ... ≠ μk would not be an acceptable way to write the alternate hypothesis (this slightly contradicts Gravetter & Wallnau, but technically there is no way to test this specific alternative hypothesis with a one-way ANOVA)

One-Way ANOVA Example


A classroom is divided into three sections: left, middle, and right. The instructor wants to see if the students differ in test scores depending on where they sit in the room. H0: μL = μM = μR. H1: The test scores are not the same for all sections.

One-Way ANOVA
A random sample of the students in each section was taken. The test scores were recorded:

- Left: 82, 83, 97, 93, 55, 67, 53
- Middle: 83, 78, 68, 61, 77, 54, 69, 51, 63
- Right: 38, 59, 55, 66, 45, 52, 52, 61

One-Way ANOVA
The summary statistics for the grades of each section are shown in the table below.

Section   Sample size   Mean    St. Dev   Variance
Left          7         75.71    17.63     310.90
Middle        9         67.11    10.95     119.86
Right         8         53.50     8.96      80.29
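As a cross-check on the hand calculations that follow, here is a minimal Python sketch (assuming SciPy is installed) that reproduces the summary statistics and the overall F test from the raw scores:

```python
import statistics
from scipy import stats

left   = [82, 83, 97, 93, 55, 67, 53]
middle = [83, 78, 68, 61, 77, 54, 69, 51, 63]
right  = [38, 59, 55, 66, 45, 52, 52, 61]

# Sample size, mean, and sample standard deviation for each section
for name, scores in [("Left", left), ("Middle", middle), ("Right", right)]:
    print(name, len(scores),
          round(statistics.mean(scores), 2),
          round(statistics.stdev(scores), 2))

# One-way ANOVA on the three sections
F, p = stats.f_oneway(left, middle, right)
print(round(F, 2), round(p, 4))  # should match the F of about 5.90 and p of about .009 found below
```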

One-Way ANOVA

Variation
- Variation is the sum of the squared deviations between each value and the mean of the values: SS = Σ(X − M)²
- Sum of Squares (SS) is the term used to represent this variation

One-Way ANOVA - Total SS

Are all of the values identical?


- No, so there is some variation in the data
- This is called the total variation
- Denoted SStotal (or SST) for the total Sum of Squares (variation)
- Sum of Squares is another name for variation

One-Way ANOVA - Between Group SS

Are all of the sample means identical?


- No, so there is some variation between the groups
- This is called the between-group variation
- Sometimes called the variation due to the factor
- Denoted SSB for the Sum of Squares (variation) between the groups

One-Way ANOVA - Within Group SS

Are all of the values within each group identical?


- No, there is some variation within the groups
- This is called the within-group variation
- Sometimes called the error variation
- Denoted SSW for the Sum of Squares (variation) within the groups

One-Way ANOVA - Sources of Variation

Therefore, there are two sources of variation


- the variation between the groups, SSB
  - In other words, the variation due to the factor
- the variation within the groups, SSW
  - In other words, the variation that can't be explained by the factor (error variation)
- Note that the sum of the between-group and within-group SS equals the total SS:
  SST = SSB + SSW
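Using the values computed in the next few slides, this additivity can be checked directly (the last decimal differs only because of rounding):

$$SS_B + SS_W = 1901.52 + 3386.32 \approx 5287.83 = SS_T$$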

One-Way ANOVA

Here is the basic one-way ANOVA table:

Source    SS    df    MS    F    p
Between
Within
Total

One-Way ANOVA

Total Sum of Squares, SST


- The total variation in the scores, regardless of group
- Note that the sum of the scores for each group is denoted T1 ... Tk (TL = 530, TM = 604, TR = 428)
- The sum of all the scores in the study is G = ΣT = 1562
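In this T and G notation, one common computational formula for the total sum of squares (with ΣX² = 106,948 computed from the raw scores listed earlier) is:

$$SS_T = \sum X^2 - \frac{G^2}{N} = 106948 - \frac{1562^2}{24} \approx 5287.83$$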

One-Way ANOVA

Within-Group Variation, SSW


- The Within-Group Variation represents the variation within the groups (if that was not obvious ...)
- Note that the SS within each group for our example are: SSL = 1865.43, SSM = 958.89, SSR = 562.00
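Adding the three within-group SS values gives the within-group variation:

$$SS_W = SS_L + SS_M + SS_R = 1865.43 + 958.89 + 562.00 = 3386.32$$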

One-Way ANOVA

Between-Group Variation, SSB


The between-group variation is the variation among the sample means. Each group's contribution is weighted by its sample size.
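In symbols (using the grand mean M_G = G/N = 1562/24 ≈ 65.08), and with the computational form based on the group totals:

$$SS_B = \sum n(M - M_G)^2 = \sum \frac{T^2}{n} - \frac{G^2}{N} = \frac{530^2}{7} + \frac{604^2}{9} + \frac{428^2}{8} - \frac{1562^2}{24} \approx 1901.52$$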

One-Way ANOVA

After filling in the sums of squares, we have:

Source    SS         df    MS
Between   1901.52
Within    3386.32
Total     5287.83

One-Way ANOVA - Degrees of Freedom

Degrees of Freedom, df
- A degree of freedom occurs for each value that can vary before the rest of the values are predetermined
- For example, if you had six numbers that had an average of 40, you would know that the total had to be 240. Five of the six numbers could be anything, but once the first five are known, the last one is fixed so that the sum is 240. The df would be 6 − 1 = 5
- The df is often one less than the number of values

One-Way ANOVA - Degrees of Freedom

The between-group df is one less than the number of groups

- We have three groups, so dfB = 2

The within-group df is the sum of the individual dfs of each group
- The sample sizes are 7, 9, and 8
- dfW = 6 + 8 + 7 = 21
- Alternatively, dfW = N − k = 24 − 3 = 21

The total df is one less than the total sample size

- dfT = 24 − 1 = 23

One-Way ANOVA

Filling in the degrees of freedom gives this:

Source    SS         df    MS
Between   1901.52     2
Within    3386.32    21
Total     5287.83    23

One-Way ANOVA - MS

Variances

The variances are also called Mean Squares and abbreviated MS, often with an accompanying subscript: MSB or MSW. They are an average squared deviation from the mean and are found by dividing the variation by the degrees of freedom: MS = SS / df

Variance = Variation / df

One-Way ANOVA - MS
MSB = 1901.52 / 2 = 950.76
MSW = 3386.32 / 21 = 161.25
MST = 5287.83 / 23 = 229.91

- Notice that the MS(Total) is NOT the sum of MS(Between) and MS(Within)
- This additivity works for the SS(Total), but not for the mean square MS(Total)
- The MS(Total) is often not presented in an ANOVA summary table

One-Way ANOVA Summary Table

Completing the MS column gives:

Source    SS         df    MS
Between   1901.52     2    950.76
Within    3386.32    21    161.25
Total     5287.83    23    229.91

One-Way ANOVA

F test statistic
- An F test statistic is the ratio of two sample variances
- Specifically, F is the ratio of MSB to MSW
  - In other words, how much variability there is among the group means relative to the variability within each group
- F = MSB / MSW

For our data, F = 950.76 / 161.25 = 5.90

One-Way ANOVA

Adding F to the table:

Source    SS         df    MS       F
Between   1901.52     2    950.76   5.90
Within    3386.32    21    161.25
Total     5287.83    23    229.91

One-Way ANOVA

The F test is always a one-tailed test


- In other words, since F is a ratio of two variances it can never be less than 0
- Further, small values of F indicate small differences between the means, whereas large values of F indicate large differences between the means

The F test statistic has an F distribution with dfB and dfW degrees of freedom

One-way ANOVA - Decision about the Null Hypothesis


Fcrit with α = .05, dfB = 2, and dfW = 21 is 3.47. Therefore, since our obtained F (5.90) is greater than Fcrit, we reject the null hypothesis. We conclude that test scores are not the same for people who sit in the left, middle, and right sections.
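Instead of a printed F table, the critical value and the p-value can be obtained from SciPy's F distribution; a minimal sketch:

```python
from scipy import stats

df_between, df_within = 2, 21
F_obtained = 5.90

# Critical value cutting off the upper 5% of the F(2, 21) distribution
F_crit = stats.f.ppf(0.95, df_between, df_within)

# p-value: probability of an F this large or larger when H0 is true
p_value = stats.f.sf(F_obtained, df_between, df_within)

print(round(F_crit, 2), round(p_value, 4))  # about 3.47 and .009
```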

One-way ANOVA - Decision about the Null Hypothesis with a p-value


If we were using SPSS, we would have obtained a p-value. Here is an example output:

           Df    SS       MS      F        p
Section     2    1901.5   950.8   5.8961   0.009284
Within     21    3386.3   161.3

Therefore, since p < α (.05), we reject the null hypothesis

One-Way ANOVA

Completing the table with the p-value:

Source    SS         df    MS       F      p
Between   1901.52     2    950.76   5.90   .009
Within    3386.32    21    161.25
Total     5287.83    23    229.91

One-Way ANOVA
There is enough evidence to support the claim that there is a difference in the mean scores of the left, middle, and right sections of the class. However, there are still a few important points to consider:

- What about effect sizes?
- How do we know which sections differ in terms of mean test scores?
- What about assumption violations?
