2. Analysis of Variance (ANOVA)
When an F test is used to test a
hypothesis concerning the means of
three or more populations, the
technique is called analysis of
variance (ANOVA).
Although the t test is commonly used
to compare two means, it should not
be used to compare three or more
means, since performing repeated t tests
inflates the overall probability of a
type I error.
3. Assumptions for the F Test for
Comparing Three or More Means
The populations from which the samples
were obtained must be normally or
approximately normally distributed.
The samples must be independent of each
other.
The variances of the populations must be
equal.
Although means are being compared in this F
test, the test statistic is computed from
variances rather than from the means directly.
Two different estimates of the population
variance are made.
4. Analysis of Variance
Between-group variance - this involves
computing the variance by using the
means of the groups or between the
groups.
Within-group variance - this involves
computing the variance by using all the
data and is not affected by differences
in the means.
5. F-test
If there is no difference in the means, the
between-group variance will be approximately
equal to the within-group variance, and the F
test value will be close to 1 – the null
hypothesis will not be rejected
When the means differ significantly, the
between-group variance will be much larger
than the within-group variance; the F test will
be significantly greater than 1 – the null
hypothesis will be rejected
6. Hypothesis in Analysis of
Variance
The following hypotheses should be
used when testing for the difference
between three or more means.
H0: μ1 = μ2 = … = μk
H1: At least one mean is different from
the others.
7. Degrees of Freedom in
Analysis of Variance
d.f.N. = k – 1, where k is the number of
groups.
d.f.D. = N – k, where N is the sum of the
sample sizes of the groups.
The sample sizes do not need to be
equal
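As a quick sketch of this bookkeeping, assuming a hypothetical setup of k = 3 groups with 6 observations each and using SciPy's F distribution for the critical value:

```python
from scipy.stats import f

# Hypothetical setup: k = 3 groups, 6 observations each, so N = 18
k = 3
N = 18

df_N = k - 1   # numerator degrees of freedom
df_D = N - k   # denominator degrees of freedom

# Right-tail critical value for alpha = 0.05
critical = f.ppf(1 - 0.05, df_N, df_D)
print(df_N, df_D, round(critical, 2))   # 2 15 3.68
```

This matches the critical value 3.68 used in the worked example later in these notes.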
8. Procedure for Finding
F Test Value
Step 1- Find the mean and variance of
each sample
Step 2- Find the grand mean
Step 3- Find the between-group
variance
Step 4- Find the within-group variance
Step 5- Find the F test value
9. Analysis of Variance
Summary Table
Source      Sum of Squares   d.f.    Mean Squares   F
Between     SS_B             k – 1   MS_B           MS_B / MS_W
Within      SS_W             N – k   MS_W
Total
10. Sum of Squares Between
Groups
The sum of squares between
groups, denoted SS_B, is found using
the following formula:

SS_B = Σ nᵢ (X̄ᵢ − X̄_GM)²

where nᵢ and X̄ᵢ are the size and mean of
the i-th sample. Dividing SS_B by its
degrees of freedom, k − 1, gives the
between-group variance (the mean square MS_B).
11. Sum of Squares Between
Groups
The grand mean, denoted by X̄_GM, is the
mean of all values in the samples:

X̄_GM = ΣX / N
12. Sum of Squares Within
Groups
The sum of squares within groups,
denoted SS_W, is found using the following
formula:

SS_W = Σ (nᵢ − 1) sᵢ²

Note: Dividing SS_W by its degrees of freedom,
N − k = Σ(nᵢ − 1), gives the within-group
variance, a weighted average of the individual
sample variances; it does not involve
differences of the means
13. The Mean Squares
The mean square values are equal to the
sums of squares divided by their degrees
of freedom:

MS_B = SS_B / (k − 1)    MS_W = SS_W / (N − k)

The F test value is F = MS_B / MS_W.
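The five-step procedure and the formulas above can be sketched in plain Python; the three samples below are made up for illustration:

```python
# Hypothetical data: three independent samples
groups = [[4, 5, 6], [6, 7, 8], [8, 9, 10]]
k = len(groups)
sizes = [len(g) for g in groups]
N = sum(sizes)

# Step 1: mean and variance of each sample
means = [sum(g) / len(g) for g in groups]
variances = [sum((x - m) ** 2 for x in g) / (len(g) - 1)
             for g, m in zip(groups, means)]

# Step 2: grand mean of all the values
grand_mean = sum(x for g in groups for x in g) / N

# Step 3: between-group variance MS_B = SS_B / (k - 1)
SS_B = sum(n * (m - grand_mean) ** 2 for n, m in zip(sizes, means))
MS_B = SS_B / (k - 1)

# Step 4: within-group variance MS_W = SS_W / (N - k)
SS_W = sum((n - 1) * s2 for n, s2 in zip(sizes, variances))
MS_W = SS_W / (N - k)

# Step 5: F test value
F = MS_B / MS_W
print(F)   # 12.0 for this data
```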
14. Analysis of Variance -Example
A cereal chemist studied the effect of
wheat variety on test weight; he
studied three wheat varieties. The test
weights of wheat are shown on the
table (next slide).
Is there a significant difference in the
mean test weights of the three
varieties, using α = 0.05?
15. Analysis of Variance -Example
variety A variety B variety C
75 77 78
74 78 79
76 75 80
75 76 77
77 78 78
76 77 77
16. Step 1: State the hypotheses and identify
the claim.
H0: μ1 = μ2 = μ3
H1: At least one mean is different from the
others (claim).
Analysis of Variance -Example
17. Step 2: Find the critical value. Since
k = 3, N = 18, and α = 0.05, d.f.N. = k – 1 =
3 – 1 = 2, d.f.D. = N – k = 18 – 3 = 15. The
critical value is 3.68.
Step 3: Compute the test value. From the
MINITAB output, F = 8.35.
Analysis of Variance -Example
18. Step 4: Make a decision. Since F = 8.35 >
3.68, the decision is to reject the null
hypothesis.
Step 5: Summarize the results. There is
enough evidence to support the claim that
there is a difference among the means.
Analysis of Variance -Example
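As a cross-check on the MINITAB output, SciPy's one-way ANOVA routine reproduces the test value for the wheat data (assuming `scipy` is installed):

```python
from scipy.stats import f_oneway

# Test weights from slide 15
variety_a = [75, 74, 76, 75, 77, 76]
variety_b = [77, 78, 75, 76, 78, 77]
variety_c = [78, 79, 80, 77, 78, 77]

F, p = f_oneway(variety_a, variety_b, variety_c)
print(round(F, 2), p < 0.05)   # 8.35 True
```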
20. Tukey Test
The Tukey test can also be used after
the analysis of variance has been
completed to make pairwise comparisons
between means when the
groups have the same sample size
The symbol for the test value in the
Tukey test is q
21. Formula for Tukey Test
where X̄ᵢ and X̄ⱼ are the means of the samples
being compared, n is the size of each sample, and
s_W² is the within-group variance:

q = (X̄ᵢ − X̄ⱼ) / √(s_W² / n)
22. Tukey Test Results
When the absolute value of q is
greater than the critical value for the
Tukey test, there is a significant
difference between the two means
being compared
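A sketch of this comparison for the wheat example, testing variety A against variety C; the critical value comes from SciPy's studentized range distribution (available in SciPy 1.7+):

```python
from math import sqrt
from scipy.stats import studentized_range

# Wheat data from slide 15
groups = {
    "A": [75, 74, 76, 75, 77, 76],
    "B": [77, 78, 75, 76, 78, 77],
    "C": [78, 79, 80, 77, 78, 77],
}
k = len(groups)            # number of groups
n = 6                      # common sample size
N = k * n

means = {name: sum(v) / n for name, v in groups.items()}

# Within-group variance s_W^2 (= MS_W), pooled over all three groups
SS_W = sum((x - means[name]) ** 2 for name, v in groups.items() for x in v)
s_W2 = SS_W / (N - k)

# Tukey test value for varieties A and C
q_AC = (means["A"] - means["C"]) / sqrt(s_W2 / n)

# Critical value of the studentized range for k means, d.f. = N - k, alpha = 0.05
q_crit = studentized_range.ppf(0.95, k, N - k)
print(round(abs(q_AC), 2), abs(q_AC) > q_crit)   # 5.78 True
```

Since |q| exceeds the critical value, varieties A and C differ significantly.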
23. Two-Way Analysis of
Variance
The two-way analysis of variance is an
extension of the one-way analysis of
variance already discussed; it involves
two independent variables
The independent variables are also
called factors
24. Two-Way Analysis of
Variance
Using the two-way analysis of
variance, the researcher is able to test
the effects of two independent
variables or factors on one dependent
variable
In addition, the interaction effect of
the two variables can be tested
25. Two-Way ANOVA Terms
Each variable, or factor, takes on two
or more levels – the different
treatments
The groups for a two-way ANOVA are
sometimes called treatment groups
27. Two-Way ANOVA
Null Hypotheses
A two-way ANOVA has several null
hypotheses
There is one for each independent
variable and one for the interaction
28. Two-Way ANOVA Summary
Table
Source           Sum of Squares   d.f.             Mean Square   F
A                SS_A             a – 1            MS_A          F_A
B                SS_B             b – 1            MS_B          F_B
A × B            SS_AxB           (a – 1)(b – 1)   MS_AxB        F_AxB
Within (error)   SS_W             ab(n – 1)        MS_W
Total
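The sums of squares and F values in the table can be computed directly for a small balanced design; the sketch below uses NumPy and made-up data for a 2 × 2 layout (a = b = 2) with n = 3 observations per cell:

```python
import numpy as np

# Hypothetical data, shape (a, b, n): factor A levels x factor B levels x replicates
y = np.array([[[10., 12., 11.], [14., 15., 13.]],
              [[11., 10., 12.], [20., 19., 21.]]])
a, b, n = y.shape

grand = y.mean()
mA = y.mean(axis=(1, 2))       # level means of factor A
mB = y.mean(axis=(0, 2))       # level means of factor B
mCell = y.mean(axis=2)         # cell means

SS_A = b * n * np.sum((mA - grand) ** 2)
SS_B = a * n * np.sum((mB - grand) ** 2)
SS_AB = n * np.sum((mCell - mA[:, None] - mB[None, :] + grand) ** 2)
SS_W = np.sum((y - mCell[..., None]) ** 2)

MS_A = SS_A / (a - 1)
MS_B = SS_B / (b - 1)
MS_AB = SS_AB / ((a - 1) * (b - 1))
MS_W = SS_W / (a * b * (n - 1))

F_A, F_B, F_AB = MS_A / MS_W, MS_B / MS_W, MS_AB / MS_W
print(F_A, F_B, F_AB)
```

With `statsmodels` installed, fitting an OLS model with the formula `y ~ C(A) * C(B)` and calling `anova_lm` would produce the same table.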
29. Assumptions for the
Two-Way ANOVA
The population from which the samples were
obtained must be normally or approximately
normally distributed
The samples must be independent
The variances of the population from which
the samples were selected must be equal
The groups must be equal in sample size
30. Graphing Interactions
To interpret the results of a two-way
analysis of variance, researchers
suggest drawing a graph, plotting the
means of each group, analyzing the
graph, and interpreting the results
31. Disordinal Interaction
If the graph of the means has lines that
intersect each other, the interaction is
said to be disordinal
When there is a disordinal interaction,
one should not interpret the main
effects without considering the
interaction effect
33. Ordinal Interaction
An ordinal interaction is evident when the
lines of the graph neither cross nor are
parallel
If the F test value for the interaction is
significant and the lines do not cross each
other, then the interaction is said to be
ordinal and the main effects can be
interpreted independently of each other
35. No Interaction
When there is no significant interaction
effect, the lines in the graph will be
parallel or approximately parallel
When this situation occurs, the main
effects can be interpreted independently of
each other because there is no
significant interaction