Psychology in the Schools, Vol. 52(2), 2015 © 2014 Wiley Periodicals, Inc.
View this article online at wileyonlinelibrary.com/journal/pits
DOI: 10.1002/pits.21815
TRAINING TEACHERS TO USE EVIDENCE-BASED
PRACTICES FOR AUTISM:
EXAMINING PROCEDURAL IMPLEMENTATION FIDELITY
AUBYN C. STAHMER AND SARAH RIETH
Rady Children’s Hospital, San Diego and University of
California, San Diego
EMBER LEE
Rady Children’s Hospital, San Diego
ERICA M. REISINGER AND DAVID S. MANDELL
The Children’s Hospital of Philadelphia Center for Autism
Research
JAMES E. CONNELL
AJ Drexel Autism Institute
The purpose of this study was to examine the extent to which
public school teachers implemented
evidence-based interventions for students with autism in the
way these practices were designed.
Evidence-based practices for students with autism are rarely
incorporated into community settings,
and little is known about the quality of implementation. An
indicator of intervention quality is
procedural implementation fidelity (the degree to which a
treatment is implemented as prescribed).
Procedural fidelity likely affects student outcomes. This project
examined procedural implemen-
tation fidelity of three evidence-based practices used in a
randomized trial of a comprehensive
program for students with autism in partnership with a large,
urban school district. Results indicate
that teachers in public school special education classrooms can
learn to implement evidence-based
strategies; however, they require extensive training, coaching,
and time to reach and maintain
moderate procedural implementation fidelity. Procedural
fidelity over time and across intervention
strategies is examined. © 2014 Wiley Periodicals, Inc.
Special education enrollment for children with autism in the
United States has quadrupled
since 2000 (Scull & Winkler, 2011), and schools struggle to
provide adequate programming to these
students. A growing number of interventions for children with
autism have been proven efficacious
in university-based research settings, but much less attention
has been given to practical issues
of implementing these programs in the classroom, where most
children with autism receive the
majority of their care (Sindelar, Brownell, & Billingsley, 2010).
In general, evidence-based practices
for children with autism are rarely incorporated into community
settings (Stahmer & Ingersoll, 2004).
Teachers in public schools report receiving inadequate training
and rate their personal efficacy in
working with children with autism as low (Jennett, Harris, &
Mesibov, 2003). Training public
educators to provide evidence-based practices to children with
autism is a central issue facing the
field (Simpson, de Boer-Ott, & Smith-Myles, 2003).
One major challenge to implementing evidence-based practices
for children with autism in
community settings is the complexity of these practices.
Strategies based on the principles of
applied behavior analysis have the strongest evidence to support
their use (National Standards
This research was funded by grants from the National Institute
of Mental Health (5R01MH083717) and the
Institute of Education Sciences (R324A080195). We thank the
School District of Philadelphia and its teachers and
families for their collaboration and support. Additionally, Dr.
Stahmer is an investigator with the Implementation
Research Institute at the George Warren Brown School of Social
Work, Washington University, St. Louis, through an
award from the National Institute of Mental Health
(R25MH080916).
Correspondence to: Aubyn C. Stahmer, Child and Adolescent
Services Research Center & Autism Discovery
Institute, Rady Children’s Hospital, San Diego, 3020 Children’s
Way, MC5033, San Diego, CA 92123. E-mail:
[email protected]
Project, 2009). These practices vary greatly in structure and
difficulty. Some strategies, such as
discrete trial teaching (DTT; Leaf & McEachin, 1999; Lovaas,
1987), are highly structured and
occur in one-on-one settings, whereas others are naturalistic,
can be conducted individually or
during daily activities, and tend to be more complex to
implement (e.g., incidental teaching; Fenske,
Krantz, & McClannahan, 2001; or pivotal response training
[PRT]; Koegel et al., 1989). There are also
classroom-wide strategies and structures based on applied
behavior analysis, such as teaching within
functional routines (FR; Brown, Evans, Weed, & Owen, 1987;
Cooper, Heron, & Heward, 1987;
Marcus, Schopler, & Lord, 2000; McClannahan & Krantz,
1999). Although all of these evidence-
based practices share the common foundational principles of
applied behavior analysis, each is made
up of different techniques. These and other intervention
techniques are often packaged together as
“comprehensive interventions” (Odom, Boyd, Hall, & Hume,
2010) or used in combination in the
field to facilitate learning and expand the conditions under
which new student behaviors occur (Hess,
Morrier, Heflin, & Ivey, 2008; Stahmer, 2007).
Teachers can learn these evidence-based strategies within the
context of a research study (e.g.,
Suhrheinrich, 2011); however, studies report a highly variable
number of hours of training needed
to master the intervention strategy. For example, the amount of
time required to train classroom
educators in DTT in published studies ranges from 3 hours
(Sarokoff & Sturmey, 2004) at its most
brief, to recommendations of 26 to 60 hours of supervised
experience (Koegel, Russo, & Rincover,
1977; Smith, Buch, & Gamby, 2000; Smith, Parker, Taubman, &
Lovaas, 1992). Teachers have been
trained to fidelity in PRT in 8 to 20 hours (Suhrheinrich, 2011).
To achieve concurrent mastery of
several different intervention techniques and to incorporate the
development of appropriate student
goals, some researchers have suggested that teachers may need a
year or more of full-time, supervised
practicum training (Smith, Donahoe, & Davis, 2000).
There are several reasons why teachers may not implement
evidence-based practices the way
they were designed. First, teachers typically receive limited
instruction in specific interventions. For
example, instruction often comprises attendance at a didactic
workshop and receipt of a manual.
Teachers are then expected to implement evidence-based
practices without the ongoing coaching
and feedback that is critical for intervention mastery (Bush,
1984; Cornett & Knight, 2009). Second,
most evidence-based practices were not designed for school
settings and therefore may be difficult
to implement appropriately in the classroom (Stahmer,
Suhrheinrich, Reed, Bolduc, & Schreibman,
2011). Perhaps as a result, teachers often report that they
combine or modify evidence-based practices
to meet the specific needs of their classroom and students
(Stahmer, Collings, & Palinkas, 2005).
Finally, school administrators sometimes mandate the use of
programs that may not align with
teachers’ classroom environment, beliefs, or pedagogy
(Dingfelder & Mandell, 2011).
A major indication of the quality of the implementation of any
evidence-based practices is
treatment fidelity, also known as implementation fidelity
(Gersten et al., 2005; Horner et al., 2005;
Noell, Duhon, Gatti, & Connell, 2002; Noell et al., 2005;
Proctor et al., 2011; Schoenwald et al.,
2011). Implementation fidelity is the degree to which a
treatment is implemented as prescribed, or
the level of adherence to the specific procedures of the
intervention (e.g., Gresham, 1989; Rabin,
Brownson, Haire-Joshu, Kreuter, & Weaver, 2008; Schoenwald
et al., 2011). There are several types
of implementation fidelity. Procedural fidelity (Odom et al.,
2010; also called program adherence;
Schoenwald et al., 2011) is the degree to which the provider
uses procedures required to execute the
treatment as intended. Other types of fidelity include treatment
differentiation (the extent to which
treatments differ from one another), therapist competence (the
level of skill and judgment used
in executing the treatment; Schoenwald et al., 2011), and
dosage (Odom et al., 2010). Although,
ideally, all types of fidelity would be examined to determine the
fit of an intervention in a school
program (Harn, Parisi, & Stoolmiller, 2013), procedural fidelity
provides one important avenue for
measuring the extent to which an intervention resembles an
evidence-based practice or elements of
evidence-based practice (Garland, Bickman, & Chorpita, 2010).
Procedural implementation fidelity is a potential
mediating variable affecting student
outcomes, with higher fidelity resulting in better outcomes
(Durlak & DuPre, 2008; Gresham,
MacMillan, Beebe-Frankenberger, & Bocian, 2000; Stahmer &
Gist, 2001); however, it is not often
measured. In behavioral services research, three separate
reviews of reported implementation fidelity
data have been published. In the Journal of Applied Behavior
Analysis, fidelity data were reported
in only 16% to 30% of published articles (Gresham, Gansle, &
Noell, 1993; McIntyre, Gresham,
DiGennaro, & Reed, 2007; Peterson, Homer, & Wonderlich,
1982). Three separate reviews indicated
that only 13% to 32% of autism intervention studies included
fidelity measures (Odom & Wolery,
2003; Wheeler, Baggett, Fox, & Blevins, 2006; Wolery &
Garfinkle, 2002). A recent review of
special education journals found that fewer than half (47%) of
intervention articles reported any type
of fidelity scores (Swanson, Wanzek, Haring, Ciullo, &
McCulley, 2011). Indeed, limited reporting
of implementation adherence is evident across a diverse body of
fields (Gresham, 2009). The lack of
reporting (and therefore, the presumable lack of actual
measurement of implementation) limits the
conclusions that can be drawn regarding the association between
student outcomes and the specific
treatment provided. Therefore, examination of implementation
fidelity, although complicated, is
important to advance the understanding of how evidence-based
interventions are being implemented
in school settings.
Our research team recently completed a large-scale randomized
trial of a comprehensive pro-
gram for students with autism in partnership with a large, urban
public school district. Procedural
implementation fidelity of the overall program (which includes
three evidence-based practices) was
highly variable, ranging from 12% to 92% (Mandell et al.,
2013). The three strategies included in
this program, DTT, PRT, and FR (see description in the Method
section), share an underlying theo-
retical base, but rely on different specific techniques. The
purpose of this study was to examine the
extent to which public school teachers implemented evidence-
based interventions for students with
autism in the way these practices were designed. Examining
implementation fidelity of each strategy
individually may provide insight into whether specific
interventions are more easily implemented in
the classroom environment. In particular, we examined whether
special education classroom teach-
ers and staff: 1) mastered specific strategies that form the
backbone of applied behavioral analysis
programs for autism; 2) used the strategies in their classroom;
and 3) maintained their procedural
fidelity to these strategies over time.
METHOD
Participants
Participants were classroom teachers and staff in an urban
school district’s kindergarten-
through-second-grade autism support classrooms (each in a
different school) participating in a
larger trial of autism services. Of the 67 total autism support
classrooms in the district at the time
of the study, teachers and staff from 57 (85%) of the schools
participated. Each classroom included
one participating teacher and 0 to 2 classroom assistants (M =
1). Throughout the district, staff were
required to participate in intervention training as part of
professional development, but were not
required to consent to participate in the study. Data from the
current study are reported only for the
57 teachers and staff who consented to participate.
Teachers received intensive training in Strategies for Teaching
Based on Autism Research
(STAR) during their first year of participation in the project.
During the second year, continuing
teachers received in-classroom coaching every other week.
From the original 57, 38 teachers (67%)
Table 1
Teacher Demographic Characteristics

N    % Female    Total Years Teaching, M (range)    Years Teaching Children with ASD, M (range)    Education Level (% Bachelor's/% Master's)
57   97.3        10.8 (1–38)                        6.8 (1–33)                                     30/70
participated in the second year of the study. See Table 1 for
teacher demographics. A complete
description of adult and student participants can be found
elsewhere (Mandell et al., 2013).
Intervention
Strategies for Teaching Based on Autism Research. The goal of
the Strategies for Teaching
Based on Autism Research (STAR) program is to develop
children’s skills in a highly structured
environment and then generalize those skills to more
naturalistic settings. The program includes a
curriculum in which each skill is matched to a specific
instructional strategy. The STAR program
includes three evidence-based strategies: DTT, PRT, and FR.
DTT relies on highly structured, teacher-directed, one-on-one
interactions between the teacher
and student. In these interactions, the teacher initiates a specific
stimulus to evoke the child’s
response, generally a discrete skill, which is an element of a
larger behavioral repertoire (Krug
et al., 1979; Krug, Rosenblum, Almond, & Arick, 1981; Lovaas,
1981, 1987; Smith, 2001). DTT is
used in STAR for teaching pre-academic and receptive language
skills, where the desired behavior
takes a very specific form, such as learning to identify colors,
sequencing events from a story into
a first-next-then-last structure or counting with one-to-one
correspondence. The consequence of the
desired behavior is an external reinforcer, such as a token or a
preferred edible (Lovaas, 2003; Lovaas
& Buch, 1997).
PRT can occur in both one-on-one interactions and small-group
interactions with the teacher.
It is considered student directed because it occurs in the regular
classroom environment, where the
teaching area is pre-arranged to include highly preferred
activities or toys that the student will be
motivated to acquire. In PRT, students initiate the teaching
episode by indicating interest in an item or
activity or selecting among available teaching materials.
Materials are varied frequently to enhance
student motivation and generalization of skills and make PRT
appropriate for targeting expressive
and spontaneous language (Koegel, O’Dell, & Koegel, 1987;
Koegel et al., 1989; Laski, Charlop,
& Schreibman, 1988; Pierce & Schreibman, 1997; Schreibman
& Koegel, 1996). After the student
expresses interest in an activity or item, he or she is required to
perform a specific behavior related
to the item. The consequence of the desired behavior is getting
access to the activity or item. For
example, students’ attempts to label and request items are
reinforced by the delivery of the item,
which may then provide the opportunity to focus on other skills,
such as joint attention, imitation,
play skills, and generalization of other skills learned in the DTT
format.
FR are the least structured of the STAR instructional strategies.
FR strategies are routines that
occur throughout the day and include school arrival and
dismissal, mealtime, toileting, transitions
between classroom activities, and recreational activities. Each
routine is broken into discrete steps
called a task analysis and then chained together using behavior
analytic procedures such as stimulus
prompts (visual and verbal) and reinforcement of each step in
the routine (Brown et al., 1987; Cooper
et al., 1987; Marcus et al., 2000; McClannahan & Krantz, 1999).
For example, a routine to change
activities may include cuing the transition (verbal prompt),
checking a schedule (visual prompt),
pulling a picture card from the schedule to indicate the next
activity, taking the card to the location of
the new activity, putting the card into a pocket utilizing a
match-to-sample technique, and beginning
the new activity, followed by a token for routine completion.
The advantage of this strategy is that
each transition component is taught within the context of
performing the routine, so that the child
learns to respond to natural cues and reinforcers. FR strategies
are conducted in both individual and
group formats, depending on the skills being taught (e.g.,
toileting versus appropriate participation
in snack time).
Training
STAR training occurred in accordance with the STAR
developers’ training protocols. The
research team contracted with the program developers to
provide training directly to the teachers.
Training included workshops, help with classroom setup, and
observation and coaching throughout
the first academic year of STAR implementation (described in
detail in the following sections).
Six local coaches also were trained by the STAR developers to
provide ongoing consultation to
classroom staff during the second year of STAR
implementation. The training protocol for STAR is
manualized and publicly available. Additional information
about the STAR program can be found
at www.starautismsupport.com. Training provided to classroom
teachers and staff included the
following components:
Workshops. The STAR program developers provided a series of
trainings on the use of the
STAR program. The training began in September and consisted
of 28 hours of intensive workshops
that covered the STAR program, including the use of the
curriculum assessment, classroom setup,
and training in DTT, PRT, and FR. Workshops included didactic
teaching, video examples, role-
playing, and a visit to each classroom to help with classroom
setup. STAR workshops took place
outside the school day (i.e., during professional development
days, at night, and on the weekends).
Observation and coaching. During the first year, program
developers observed classroom staff
during regular school hours and provided feedback on use of
STAR strategies with students. Trainers
provided 5 days of observation and coaching immediately
following training, 3 days of follow-up
coaching throughout the academic year, and ongoing advising
and coaching by e-mail and phone.
On average, classrooms received 26.5 (range, 1.5–36) hours of
coaching over 5.7 (range, 3–7) visits
in the first year. During the second year, local coaches trained
by the STAR developers provided
coaching in the STAR strategies. Coaching was provided
September through May on a monthly
basis. On average, classrooms received 36.1 (range, 0–59) hours
of coaching over 10 (range, 0–10)
visits in the second year.
Data Collection Procedures
Data on adherence to the instructional strategies used in STAR
were collected throughout the
academic year via video recording of teaching interactions with
students for coding of implementa-
tion fidelity in each of the three STAR intervention methods.
Classroom staff members were filmed for 30 minutes every
month in Years 1 and 2. Research
assistants trained in filming methods recorded the intervention
on a specified date each month.
Visits were timed to coincide with regularly scheduled use of
each of the intervention methods.
The 30-minute film was composed of 10 minutes of DTT, 10
minutes of PRT, and 10 minutes of
FR to provide a sample of the use of each intervention.
Recording included any consented staff
member providing the intervention. The staff member filmed by
the research staff varied depending
on which staff member (i.e., teacher or paraprofessional) was
conducting the intervention that day.
The primary classroom teacher conducted the intervention in
86% of the videos collected, and
paraprofessional staff conducted the intervention in the
remaining 14% of videos. There were no
statistically significant differences in the proportion of videos
collected by intervention provider
(teacher vs. paraprofessional) for any strategy or time period (p
> .05).
Implementation Fidelity Measures
Coding procedures. The primary method for assessing fidelity
of STAR strategies was through
video recordings of teachers and aides interacting with students.
Coding relied on different criteria
based on specific coding definitions created for each
instructional component, as well as general
teaching strategies (see following sections). Coding schemes for
each method were developed by
the first author and were reviewed by the STAR program
developers.
Trained research assistants blinded to the study hypotheses
coded all video recordings. For each
intervention method, the core research team established correct
codes for a subset of videos through
consensus coding (keys). Each research assistant coder then
learned one coding system (i.e., DTT,
PRT, or FR) and was required to achieve 80% reliability across
two keys before beginning to code
any classroom sessions independently. One third of all tapes
were double coded to ensure ongoing
reliability of data coding throughout the duration of the project.
The core research team also re-coded
two tapes for each research assistant every other month,
providing a measure of criterion validity. If
there was less than 80% agreement between the reliability coder
and the research assistant, additional
training and coaching were provided until criterion was
achieved and previous videos were re-coded.
Coding involved direct computer entry while viewing videos
using “The Observer Video-
Pro” software (Noldus Information Technology, Inc., 2008), a
computerized system for collection,
analysis, and management of direct observation data. For each
instructional strategy, the coder
observed the 10-minute segment and subsequently rated the
adults’ use of each component of the
strategy on a 1 to 5 Likert scale, with 1 indicating Adult does
not implement throughout segment
and 5 indicating Adult implements consistently throughout the
segment. These Likert ratings were
found to have high concordance with more detailed trial-by-trial
coding of each strategy component
(88% agreement) used in previous research (Stahmer, 2010). A
score of 4 or 5 on a component
was considered passing and correlated with 80% correct use of
strategies in the more detailed
coding scheme. Following are the individual components
included in each strategy. Complete coding
definitions are available from the first author.
Discrete trial teaching. For DTT, coders examined the use of the
following components:
gaining the student’s attention, choosing appropriate target
skills, using clear and appropriate cues,
using accurate prompting strategies, providing clear and correct
consequences, using appropriate
inter-trial intervals, and utilizing error correction procedures
effectively (error correction evaluated
against procedures described in Arick, Loos, Falco, & Krug,
2004). The criterion for passing
implementation fidelity was defined as the correct use of 80%
of components (score of 4 or 5) during
the observation.
Pivotal response training. For PRT, coders examined the use of
the following components:
gaining the student’s attention, providing clear and
developmentally appropriate cues related to the
activity, providing the student a choice of stimuli/activities,
interspersing a mixture of maintenance
(previously acquired) and acquisition (not yet mastered) tasks,
taking turns to model appropriate be-
havior, providing contingent consequences, rewarding goal-
directed attempts, and using reinforcers
directly related to the teaching activity. The criterion for
passing implementation fidelity was defined
as the correct use of 80% of components (score of 4 or 5) during
the observation.
Functional routines. For FR, coders examined adherence to each
step of the FR used in class-
rooms during group and individual routines. The use of the
following components was coded: using
error correction procedures appropriately, adhering to FR lesson
plan, and supporting transitions
between activities. The criterion for passing implementation
fidelity was defined as correct use of
80% of components (score of 4 or 5) during the observation.
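To make the pass criterion concrete, the following minimal Python sketch applies the 80% rule described above for all three strategies. This is a hypothetical illustration only (the study coded data in The Observer Video-Pro, not Python), and the component ratings shown are invented.

```python
# Hypothetical sketch of the fidelity criterion described above: a strategy
# "passes" an observation when at least 80% of its components are rated 4 or 5.

def meets_fidelity(component_scores, pass_score=4, criterion=0.80):
    """Return True if the share of components rated >= pass_score meets the criterion."""
    passed = sum(1 for s in component_scores if s >= pass_score)
    return passed / len(component_scores) >= criterion

# Example: seven DTT component ratings from one 10-minute observation (invented).
dtt_scores = [5, 4, 4, 3, 5, 4, 4]
print(meets_fidelity(dtt_scores))  # True: 6 of 7 components (86%) pass
```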
Reliability of Data Recording
Inter-rater reliability, as measured by percent agreement within
1 Likert point, was calculated
for coding of each instructional strategy and each month of
videos by having a second coder, blinded
to the initial codes, score one third of the videos per strategy
for each month. The average overall
percent agreement for each strategy was 86% for DTT (range,
60%–100%); 90% for PRT (range,
75%–100%); and 90% for FR (range, 67%–100%). A primary
coder was assigned to each strategy,
and those codes were used in the analyses.
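As an illustration of this agreement metric, here is a minimal Python sketch assuming paired Likert ratings from two coders; the ratings shown are invented, and the study's actual computation is not published.

```python
# Percent agreement within 1 Likert point, as described above.
def percent_agreement_within_one(coder_a, coder_b):
    """Share (as %) of paired ratings differing by no more than 1 point."""
    agree = sum(1 for a, b in zip(coder_a, coder_b) if abs(a - b) <= 1)
    return 100 * agree / len(coder_a)

primary = [4, 5, 3, 4, 2, 5]  # invented primary-coder ratings
second  = [4, 4, 3, 2, 2, 5]  # invented reliability-coder ratings
print(percent_agreement_within_one(primary, second))  # 83.3: 5 of 6 pairs agree
```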
Data Reduction and Analyses
Data were examined across four periods. Time 1 included the
first measurement for available
classrooms in Year 1, which was conducted in October,
November, or December of 2008. Filming
occurred after the initial training workshops. Coaching was
ongoing throughout the year. If class-
rooms were filmed in more than one of those months, both the
average and the best performance
were analyzed. All classroom staff participated in their initial
training prior to the Time 1 measure-
ment. Time 2 was defined as the performance from the last three
measurements of the school year
(February, March, or April 2009) for Year 1. The same
procedures were used for Year 2 (Times 3 and
4). Time 3 included the first observation in Year 2 (October,
November, or December 2009). Time
4 included the performance during the last 3 months of
observations (February, March, or April,
2010). Both average and best performance from each period
were utilized to provide an estimate of
the staff’s capacity to implement the strategy in the classroom
environment (best) and variability in
competency of use (average).
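A brief pandas sketch of this reduction, assuming a long-format table with one row per monthly fidelity observation; the column names and values are invented, not the study's dataset.

```python
import pandas as pd

# One row per monthly fidelity observation; all values invented.
df = pd.DataFrame({
    "classroom": [1, 1, 1, 2, 2, 2],
    "period":    ["T1", "T1", "T2", "T1", "T2", "T2"],
    "fidelity":  [62.0, 70.0, 75.0, 55.0, 60.0, 80.0],
})

# Average (variability in competency) and best (capacity) per classroom and period.
summary = df.groupby(["classroom", "period"])["fidelity"].agg(
    average="mean", best="max")
print(summary)
```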
Data from Year 1 and Year 2 were analyzed. One-way within-
subject (or repeated measures)
analyses of variance (ANOVAs) were conducted for each
intervention strategy to examine change in
implementation fidelity scores over time. Post-hoc
comparisons were made using paired sample
t tests between time periods when ANOVA results indicated
statistically significant differences.
In addition, we examined differences in fidelity of
implementation across intervention strategies
using a one-way ANOVA with paired sample t tests to follow up
on significant results. Type I error
probability was maintained at .05 (two-tailed) for all analyses
using a Bonferroni correction.
Pearson correlations were conducted to examine the relationship
between fidelity of implemen-
tation of each intervention strategy and teaching experience,
experience working with children with
autism spectrum disorder (ASD), level of education, and number
of hours of coaching received.
RESULTS
Use of the Strategies
Because teachers who did not allow filming in their classrooms
cited staffing difficulties or
lack of preparation as the reason, they were considered not to be
implementing DTT, PRT, or FR in
their classrooms on a regular basis. At Time 1, two teachers
(4%) explicitly indicated that they did
not use DTT at any time, and 13 teachers (23%) indicated that they
did not use PRT at any time. The
percentage of classrooms filmed using the strategy is displayed
in Figure 1. In Year 1, classrooms
were filmed most often conducting DTT at both Time 1 (70% of
classrooms) and Time 2 (96%).
Only 23% of classrooms were filmed conducting PRT at Time 1,
and 68% were filmed at Time 2.
FR was filmed in 67% of classrooms at Time 1 and 81% at Time
2. In Year 2, filming was much
more consistent across strategies. DTT and PRT were both
filmed in 92% of classrooms at Time 3
FIGURE 1. The percentage of classrooms using the strategy
during each time period.
and 97% of classrooms at Time 4. For FR, 89% of classrooms
were filmed at Time 3 and 97% at
Time 4.
Overall Competence in the Instructional Strategies
Discrete trial training. The percentage of DTT components on
which teachers met fidelity
(i.e., a score of 4 or 5 during the observation) was used as the
dependent variable for these analyses.
Mean results are displayed in Table 2. No statistically
significant changes were found in
average or best DTT fidelity over time. In general, classrooms
had a relatively high average and best
DTT fidelity during all time periods. The range of scores for
individual performance was variable at
both time periods, as evidenced by the large standard
deviations.
The percentage of classrooms in which teachers met DTT
fidelity (i.e., correct implementation
of 80% of the DTT strategies during the observation) was
examined. Fifty-six percent of classrooms
met fidelity at Time 1 based on the average of all observations
at Time 1, 47% at Time 2, 46%
at Time 3, and 59% at Time 4. When considering only the best
example, 65% of classrooms met
fidelity at Time 1, and this increased to 81% by Time 4 (see
Figure 2).
Pivotal response training. The dependent variable for these
analyses was the percentage of
PRT components on which teachers met fidelity (i.e., a …
Medical Case Study with Minitab for solutions
Background: You work for a government agency and your
management asked you to take a look at data from prescription
drugs administered at hospitals in your geography.
She asked you to analyze the data with some common tools and
build a DMAIC model for how you would work with the
hospitals to improve results, since their performance is below
the average. She would like a simple model for you to present
to her that you will propose to representatives from the
hospitals. The hospital representatives will have to be brought
on board and understand the issues and their role in the study.
Use the DMAIC model from the course material to create a
model of the effort to be completed by the hospitals.
Define:
1. What would you say about the DMAIC model to the hospital
staff on your team?
2. Write a problem statement for the work you are considering.
3. Develop a team charter so that each of the representatives
understands what is expected of them and can brainstorm
improvements to it.
4. What are the key deliverables of the define step that you
expect of the team?
Measure:
1. What activities would you propose that the team work on?
2. What measures would you propose to the team to pursue?
3. What data collection would you propose?
4. What are the key steps to get to items 1-3 above?
Analyze: Prepare data to show the team the extent of the
problem (a Python sketch of these analyses follows the test data below):
1. Create a Pareto chart of the errors from the Error Type data below.
   a. What would you suggest the team focus upon?
   b. What would you tell the team about the data they need to
      collect and what will be done with it?
2. Another example of a measure is the administration of Drug A,
   which must be given every 30 minutes, no more than 3 minutes
   early or late (i.e., between 27 and 33 minutes). Make a histogram
   of the data below (Time between administration of Drug). What
   does it say about the process?
3. Run a normality test. Is the distribution normal?
Improve:
1. You don’t have a process flow or any information on how
hospitals administer drugs, or their improvement plans, if any.
What would you tell the participants about what is expected in
this phase of the program?
Control:
1. What are the key steps for control?
2. Develop a sample response plan that you would use to show
the team what is expected to be done.
3. What are the key deliverables for this step?
Test data in Excel format:
Error Type: Type of High Alert Medication Error

Omission                       8461
Improper dose/quantity         7124
Unauthorized/wrong drug        5463
Prescribing error              2923
Wrong time                     2300
Extra dose                     2256
Wrong patient                  1786
Mislabeling                     636
Wrong dosage form               586
Wrong administration            335
Drug prepared incorrectly       311
Wrong route                     252
Other                           113
Total                         32546
Time between administration of Drug A (minutes), observations 1–50:

 1: 35.5    2: 26.2    3: 31.6    4: 26.4    5: 28.5
 6: 24.6    7: 26.1    8: 29.4    9: 33.6   10: 38.8
11: 27.0   12: 27.2   13: 19.9   14: 32.0   15: 23.7
16: 28.3   17: 25.6   18: 26.7   19: 24.5   20: 28.6
21: 23.4   22: 29.5   23: 27.1   24: 28.3   25: 31.3
26: 27.4   27: 25.0   28: 24.6   29: 27.9   30: 29.2
31: 28.6   32: 23.4   33: 29.5   34: 27.1   35: 28.3
36: 31.3   37: 27.4   38: 25.0   39: 24.6   40: 27.9
41: 28.6   42: 23.4   43: 29.5   44: 27.1   45: 28.3
46: 31.3   47: 27.4   48: 25.0   49: 24.6   50: 40.0
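The Analyze-phase charts and test requested above would normally be produced in Minitab; the sketch below shows one way to do the same in Python with matplotlib and scipy, using the test data from this document. Note it uses the Shapiro-Wilk normality test, whereas Minitab's default is Anderson-Darling.

```python
import matplotlib.pyplot as plt
import numpy as np
from scipy import stats

# Pareto chart of high-alert medication errors (data from the table above,
# already sorted in descending order with "Other" last).
errors = {
    "Omission": 8461, "Improper dose/quantity": 7124,
    "Unauthorized/wrong drug": 5463, "Prescribing error": 2923,
    "Wrong time": 2300, "Extra dose": 2256, "Wrong patient": 1786,
    "Mislabeling": 636, "Wrong dosage form": 586,
    "Wrong administration": 335, "Drug prepared incorrectly": 311,
    "Wrong route": 252, "Other": 113,
}
counts = np.array(list(errors.values()))
cum_pct = 100 * counts.cumsum() / counts.sum()

fig, ax1 = plt.subplots()
ax1.bar(range(len(errors)), counts)
ax1.set_xticks(range(len(errors)))
ax1.set_xticklabels(list(errors.keys()), rotation=90)
ax2 = ax1.twinx()
ax2.plot(range(len(errors)), cum_pct, marker="o", color="red")
ax2.set_ylabel("Cumulative %")
plt.tight_layout()
plt.show()

# Histogram of time between administrations, with the 27/33-minute spec limits.
times = [35.5, 26.2, 31.6, 26.4, 28.5, 24.6, 26.1, 29.4, 33.6, 38.8,
         27.0, 27.2, 19.9, 32.0, 23.7, 28.3, 25.6, 26.7, 24.5, 28.6,
         23.4, 29.5, 27.1, 28.3, 31.3, 27.4, 25.0, 24.6, 27.9, 29.2,
         28.6, 23.4, 29.5, 27.1, 28.3, 31.3, 27.4, 25.0, 24.6, 27.9,
         28.6, 23.4, 29.5, 27.1, 28.3, 31.3, 27.4, 25.0, 24.6, 40.0]
plt.figure()
plt.hist(times, bins=10)
plt.axvline(27, color="red", linestyle="--", label="Spec limits (27-33 min)")
plt.axvline(33, color="red", linestyle="--")
plt.xlabel("Minutes between administrations")
plt.ylabel("Frequency")
plt.legend()
plt.show()

# Normality test (Shapiro-Wilk; a small p-value suggests non-normality).
stat, p = stats.shapiro(times)
print(f"Shapiro-Wilk W={stat:.3f}, p={p:.4f}")
```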
The Journal of Special Education
2016, Vol. 50(1) 27–36
© Hammill Institute on Disabilities 2015
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/0022466915613592
journalofspecialeducation.sagepub.com
Article
In the field of special education, a commitment to the logic
and ethics of using research to inform decisions about prac-
tice has been reflected in the field’s efforts to identify and
use evidence-based practices (EBPs) as a standard for the
profession (Council for Exceptional Children, 2014; Odom
et al., 2005). As in other fields, this focus has led inexorably
back to what some commentators have termed the “wicked”
problem of implementation (Cook & Odom, 2013).
Fixsen and his colleagues (following Rittel & Webber,
1973) described wicked problems as those that are "difficult
to define and fight back when you try to solve them" (Fixsen,
Blase, Metz, & Van Dyke, 2013, p. 218). Indeed, the obser-
vation that "interests vested in the system-as-is suddenly
appear and typically deter attempts to change the system"
(Fixsen et al., 2013, p. 218) has been made by ecologically
oriented observers of human behavior since the time of Marx
(Bronfenbrenner, 1979; Lewin, 1951; Marx, 1888/1984).
One implication of this view, of course, is that the problem
of (non)implementation of EBP may be most usefully
viewed not simply as a “deficit” in the knowledge, skills, or
ideological commitments of practitioners but as a product
of the set of social, organizational, and material conditions
that operate in a given human service setting. In this article,
we draw on interviews conducted with special education
practitioners to investigate how these kinds of contextual
factors (and others) may affect the ways in which practitio-
ners interpret and respond to contemporary press for imple-
mentation of EBP.
We are by no means the first to recognize the importance
of seeking practitioner perspectives in understanding the
challenges of implementing EBP in special education. For
example, Landrum, Cook, Tankersley, and Fitzgerald (2002)
surveyed 127 teachers (60 special educators, 67 general edu-
cators) to assess their views about the value of four sources
of information about practice: university coursework,
1University of Washington, Seattle, USA
2Northern Illinois University, DeKalb, USA
3Central Michigan University, Mount Pleasant, USA
4American Institutes for Research, Washington, DC, USA
Corresponding Author:
Roxanne F. Hudson, Area of Special Education, University of
Washington, P.O. Box 353600, Seattle, WA 98195, USA.
E-mail: [email protected]
A Socio-Cultural Analysis of Practitioner
Perspectives on Implementation of
Evidence-Based Practice in Special
Education
Roxanne F. Hudson, PhD1, Carol A. Davis, EdD1, Grace Blum,
MEd1,
Rosanne Greenway, MEd1, Jacob Hackett, MEd1, James
Kidwell, MEd1,
Lisa Liberty, PhD1,2, Megan McCollow, PhD1,3, Yelena Patish,
MEd1,
Jennifer Pierce, PhD1,4, Maggie Schulze, MEd1, Maya M.
Smith, PhD1,
and Charles A. Peck, PhD1
Abstract
Despite the central role “evidence-based practice” (EBP) plays
in special education agendas for both research and policy,
it is widely recognized that achieving implementation of EBPs
remains an elusive goal. In an effort to better understand this
problem, we interviewed special education practitioners in four
school districts, inquiring about the role evidence and EBP
played in their work. Our data suggest that practitioners’
responses to policies that press for increased use of EBP are
mediated by a variety of factors, including their interpretations
of the EBP construct itself, as well as the organizational
conditions of their work, and their access to relevant knowledge
and related tools to support implementation. We
interpret these findings in terms of their implications for
understanding the problem of implementation through a more
contextual and ecological lens than has been reflected in much
of the literature to date.
Keywords
evidence-based practices, implementation, special education
practitioners
research journals, teaching colleagues, and in-service/pro-
fessional development workshops. Their data indicated that
research journals and university courses (presumably
sources of relatively reliable information about EBP) were
viewed as less useful, less trustworthy, and less accessible
than either information from colleagues or information
received via professional development. Similarly, Boardman,
Argüelles, Vaughn, Hughes, and Klingner (2005) reported
that teachers often expressed the belief that the extant
research was not relevant to the populations they served in
their classrooms, and reported relying on colleagues for rec-
ommendations about practice.
In a more recent study, Jones (2009) investigated the
views of 10 novice special educators regarding EBP. Based
on interview, classroom observation, and rating scale data,
Jones suggested that the novice teachers she studied fell
into three broad groups. “Definitive supporters” expressed
clear and positive views about the importance of research in
decisions about classroom practice. “Cautious consumers”
felt research could be useful, but often did not reflect char-
acteristics and needs of their individual students. A third
group, “The Critics,” expressed skepticism about the value
of research for decisions about classroom practice.
Taken together, these studies (and others) provide a
rather robust picture of the tensions between research and
practice in special education. While significant variation
exists among special education practitioners in their views
about the value and relevance of research to their work in
the classroom, many express more confidence in the knowl-
edge and expertise of local colleagues than in information
they might receive from university coursework and/or
researchers. This result is consistent with research from
other fields and suggests that much remains to be learned
about the conditions under which practitioners utilize
knowledge from research in decisions about practice
(Aarons & Palinkas, 2007; Glasgow, Lichtenstein, &
Marcus, 2003).
In our review of the special education research on this
topic, we noted that most researchers have framed their
analysis of practitioner perspectives related to implementa-
tion of EBP in essentially individualistic and personological
terms—placing teachers (and, in some cases, administra-
tors) in the center of their analysis of the implementation
process. For example, as noted earlier, Jones (2009) parsed
individual teachers into groups such as “the Critics” and
“the Supporters.” Also focusing on individual practitioners,
Landrum et al. (2002) argued,
Only when we have confidence that teachers learn about
empirically sound practice in both their initial preparation and
ongoing professional development, and that their skills reflect
this training, can we predict that students with disabilities will
be afforded the most appropriate learning opportunities
available. (p. 48)
We do not entirely disagree with these conclusions, and
others like them that underscore the importance of persono-
logical variables (e.g., practitioner knowledge, prior train-
ing, attitudes) affecting implementation of EBP. But we
would also argue that in foregrounding characteristics of
individual practitioners as a focus of analysis, these studies
reflect a set of implicit assumptions about the nature of
practice and how it is constructed that narrows our view of
the problems of implementation, and the range of actions to
be considered in engaging those problems. In the present
study, we follow recent recommendations (Harn, Parisi, &
Stoolmiller, 2013; Klingner, Boardman, & McMaster, 2013;
Peck & McDonald, 2014) in undertaking a more holistic
and contextual approach to understanding how practitioner
perspectives on EBP are shaped by the conditions in which
they work.
Theoretical Framing
In conceptualizing “a more contextual” approach to under-
standing practitioner interpretation and implementation of
EBP, we drew on some of the precepts of sociocultural the-
ory as a general framework for investigating ways in which
social and material conditions shape workplace learning
and practice (Billett, 2003; Engeström, 2001; Scribner,
1997; Vygotsky, 1978). Our choice of a sociocultural per-
spective was based on several of its key precepts that we
believed would be useful in understanding practitioner per-
spectives on implementation of EBP. First, sociocultural
theory foregrounds analysis of relationships between indi-
vidual and collective dimensions of social practice—in this
case, the analysis of the transactions that take place between
individual practitioners and the organizations in which they
work (Engeström, 2001). Second, this view assumes that
human thought processes (including, of course, one’s views
about EBP) are shaped by the demands of the practical
activities in which people are regularly engaged. A third
assumption of this stream of sociocultural theory is that par-
ticipation in social practice is affected by the affordances
and constraints of the conceptual and material tools avail-
able (e.g., the characteristics and representations of EBP
available in local school districts and other professional
resources; Falmagne, 1995; Leontev, 1975/1978; Scribner,
1997). Overall, the sociocultural perspective suggests the
value of undertaking a more focused analysis of the social
and organizational conditions in which decisions about
practice are made than has been reflected in much of the
extant research on the problem of implementation. We used
the following research questions to guide our inquiry:
Research Question 1: How do special education practi-
tioners interpret the meaning of EBP in the context of
decisions they make about curriculum and instruction?
Research Question 2: What contextual factors are asso-
ciated with practitioner interpretations of the role EBP
can and should play in their decisions about instruction?
Method
We used a qualitative methodology (Merriam, 2009) to
investigate the perspectives—that is, the values, beliefs,
and attitudes—held by special education practitioners with
regard to their views about EBP, and the role research
played in their decisions about curriculum and instruction.
We elected this methodological approach because of the
hypothesis-generating, rather than hypothesis-testing, pur-
poses of the study (Glaser & Strauss, 1967).
Participants
A total of 27 special education practitioners participated in
our study. We contacted directors of special education via
email and invited participation from four school districts in
the Seattle/Puget Sound area. Demographics for these dis-
tricts are presented in Table 1.
Teacher participants were nominated by special educa-
tion directors, who were asked to identify individuals they
believed would be interested in being interviewed for the
study. In each district, we requested nominations of teachers
working in three types of programs or settings: resource
rooms serving students with a wide range of disability
labels placed primarily in general education classrooms,
self-contained classes serving students with emotional/
behavioral disabilities (EBD), and self-contained class-
rooms serving students with low-incidence developmental
disabilities. Table 2 reports the number, working context,
and experience level of study participants in each of the dis-
tricts in which we collected data.
Data Collection and Analysis
Interviews. The primary data source for our study consisted
of face-to-face interviews we conducted individually with
the 27 special educators who agreed to participate in the
study. We used semistructured interview protocols for each
of the four types of practitioners we interviewed: special
education directors, resource room teachers, EBD teachers,
and teachers of students with low-incidence developmental
disabilities. While the protocols for administrators and
teachers varied in some ways, both were structured to pro-
ceed from general, context-descriptive questions such as
“Tell me about the work you do,” to more focused questions
about daily practice (“Tell me about a typical day in your
classroom”). We asked each informant to define the term
EBP and tell us what it meant to them in terms of their deci-
sions about curriculum and instruction. Interview protocols
also included a series of questions about district policies
related to EBP in both general and special education, and
how these affected the decisions our informants made in the
classroom. Interviews were generally between 45 min and an
hour in length. Interviews were audio-recorded and subse-
quently transcribed verbatim for analysis. Transcripts were
entered into a web-based platform for qualitative and
mixed-method data analysis (http://www.dedoose.com).
Data analysis. We used the standard procedures for induc-
tive data analysis described by Charmaz (2002), Strauss and
Corbin (1997), and others. Thus, we began our analysis by
having each of the 11 members of our research team read
through the interview transcripts, identifying text segments
of potential relevance to our research questions. Each of
these segments was tagged using low inference descriptors,
such as “classroom assessment” or “progress monitoring.”
Members of the research team then met to discuss examples
of the text segments they had tagged, identifying and defin-
ing codes emerging from individual analysis to be formal-
ized and used collectively. The remainder of the interviews
were then coded, followed by an additional round of team
meetings in which examples of each code were discussed,
with some codes combined, others modified or deleted
based on their perceived value relative to our research ques-
tions. A set of interpretive categories was developed
through this process which were used to aggregate coded
data segments and which became the basis for further anal-
ysis. These categories were then used as a basis for develop-
ing a series of data displays (Miles & Huberman, 1994)
organized by district and by each type of participant (i.e.,
resource room teachers, special education directors, etc.).
Team members met to discuss the implications of these
analyses and to develop a set of analytic memos which inte-
grated the categorical data into larger and more interpretive
case summaries. These summaries were used to develop the
set of cross-case findings described below.
Results
Our findings suggest that personal characteristics (particularly values and beliefs about EBP), the features of organizations (particularly practitioner positionality within these organizations), and access to relevant tools all affected the ways practitioners interpreted the relevance of the EBP to decisions they made about practice. We interpreted these as dimensions of practical activity that were inseparable and mutually constitutive (Billett, 2006). As depicted in Figure 1, our data suggest these factors operate in a highly interdependent manner. We use this conceptual model to understand both the points of the triangle and the interactions that take place between points as represented by the lines of the triangle.

Table 1. School District Characteristics.

District   Enrollment   Special education enrollment (%)   Students eligible for free or reduced-price meals (%)
A          18,123       9.70                               22.10
B          20,659       13.60                              35.10
C          17,973       13.60                              66.90
D          8,920        12.40                              26.00
In the following sections, we will present findings both
related to the points of the triangle and the intersections of
elements. First, we use excerpts from our interviews to
illustrate how the practitioners we interviewed interpreted
the idea of EBP, the organizational contexts they worked in,
and the tools and resources available to them. Second, we
present findings that illuminate the connections and interac-
tions between them.
People: Practitioner Definitions of EBP
We asked each of our informants how they defined EBP in
the context of their work in special education. The predomi-
nance of responses to this question reflected the notion that
EBP meant that “someone” had researched a specific pro-
gram or practice and found it to be effective:
There’s obviously been research and studies so what I picture
in my mind is that they have a curriculum and they conduct a
study where they have kids who participate in the study and
then they probably have some pre- and posttest to see if they’ve
made gains.
I’d say evidence-based would be like, that it’s been tried in lots
of different settings, across you know lots of different
populations and there’s been demonstrated success using that
curriculum or whatever the thing is you’re talking about, you
know, the social skills sheet or something. So it’s used with lots
of people and over different settings.
We noticed that our participants typically defined EBP in
ways that emphasized its external origins, and its ostensive
function as a “prescription” for their practice, rather than as
a resource for their own decision making (Cook & Odom,
2013). In some cases, this interpretation was also congruent
with the stance taken by district administrators:
We have adults that want to continue to do what they’ve done
in the past. And it is not research-based nor if you look from a
data perspective has it been particularly effective and that’s not
going to happen and we say, “This is the research, this is what
you’re going to do.” (Special Education Director, District A)
This strong ideological commitment to use of EBP in the
classroom was shared by some teachers:
I believe that by using research-based instruction, and teaching
with fidelity, then you’re more likely to have an outcome that
is specific to the research, as long as we use the curriculum as
it’s designed. Um, I think it’s vital, I think it’s vital that we are
not pulling things out of a hat, that we are using. (Resource
Room Teacher, District A)
More often, however, we found that practitioner views
about research in general, and the value of EBP in decision
making about classroom practice in particular, were more
ambivalent. Perhaps the most widely shared concern about
EBP expressed by our informants had to do with the ten-
sions they perceived between the “general case” and the
specifics of local context, including the special needs of the
children served in the schools and classrooms in which they
worked (Cook, Tankersley, Cook, & Landrum, 2008).
While the value of research and the relevance of EBP were
often acknowledged in abstract terms, both teachers and
administrators were quick to identify what they perceived
to be limitations in the relevance of research for local deci-
sion making and special populations:
. . .well what makes me question it—I’m always curious about
what the norm population is because it is talking about typically
developing kids and research-based practices that are used for
those types of kids. It’s very different for my kids. So when I’m
looking at an evidenced-based practice I want to be clear on
what evidence [is about] Gen Ed versus the Special Ed
population. (Self-Contained Classroom Teacher, District B)
Table 2. Participant Characteristics.

District   Special education director   EBD teacher   Resource room teacher   Self-contained teacher   Total participants   Median years in position
A          1                            1             2                       2                        6                    7
B          1                            1             2                       3                        7                    10
C          1                            2             2                       2                        7                    6
D          1                            2             2                       2                        7                    6

Note. EBD = emotional/behavioral disabilities.
For many teachers, ambivalence about EBP included
particular tension about who makes decisions about the rel-
evance of evidence to their classroom practice. These teach-
ers often made reference to local perspectives as “forgotten”
or “overlooked” in decisions about practice:
. . . evidence-based is very important because you do need to
look at what you’re doing but there is just the day-to-day
knowledge that is overlooked in the evidence-based piece.
(Self-Contained Classroom Teacher, District B)
Most of the teachers and many administrators we inter-
viewed appeared to locate the authority for making evi-
dence-based decisions about curriculum and instruction
with the district central office or with “general education.”
For example, one director of special education reported,
“for our resource room students . . . we always start with the
Gen Ed and then if we have curriculum, if a part of that cur-
riculum has a supported intervention component to it we
start with that.” Many teachers similarly viewed the locus
of decisions about curriculum and instruction as external to
their classrooms. As a Resource Room Teacher in District D
puts it,
They tell us what to teach and when to teach it. I mean, we
have a calendar and a pacing guide. We can’t, we really don’t
make the decisions too much. I would hope . . . that it’s
supported and making sure that students learn but I don’t
really know.
In some cases, these teachers expressed confidence that
the judgments of district curriculum decision makers were
grounded in appropriate evidence:
I kind of just trust that the district is providing me with
evidence-based stuff. So I’m trusting that the curriculum that
they’ve chosen and that my colleagues have done test work on
is really what they say it is. (EBD Teacher, District D)
However, in other cases, teachers expressed more skepti-
cal views about the trustworthiness of the data district offi-
cials used to make decisions about curriculum:
. . . over the years we’ve had so many evidence-based, research
based and so many changes, that . . . if you just want my honest
[opinion] . . . I know that there’s data behind it, but if it’s
evidence based or research based, why are we always changing?
(EBD Teacher, District B)
Figure 1. Relationships between people, organizations, and
tools. Adapted from McDiarmid & Peck (2012).
To summarize, similar to earlier studies (Boardman
et al., 2005; Jones, 2009; Landrum et al., 2002), we found
that the personal characteristics of practitioners—that is,
their experiences, values, beliefs, and attitudes—functioned
as a powerful filter through which they interpreted the
meaning of EBP and evaluated the relevance of this con-
struct for their decision making about curriculum and
instruction. Practitioner definitions of EBP often reflected
the assumption that the locus of authority regarding EBP
lies outside the classroom, and the ostensive function of
EBP was to provide prescriptions for classroom practice.
In the following sections, we report findings related to
our second research question, describing ways in which
contextual features such as organization and access to tools
and resources may influence the way practitioners interpret
the value and relevance of EBP in their daily work.
Organizational Contexts of EBP
Our data suggest that our interviewees’ views about the
value and relevance of evidence in decision making about
practice were often part of a larger process of coping with
the organizational conditions of their work. Several specific
issues were salient in the interviews we conducted. One of
these, of course, had to do with district policies about evi-
dence-based decision making (Honig, 2006). In some dis-
tricts, special education administrators described strong
district commitments related to the use of research evidence
in decision making:
In this district, it’s [EBP] becoming really big. You don’t ever
hear them talk about any initiative without looking at the
research and forming some sort of committee to look at what
practices are out there and what does the research tell us about
it. And then identifying what are the things we’re after and how
well does this research say they support those specific things
we want to see happen. I would say that work has started, and
that is the lens that comes from Day One of anything we do.
(Special Education Administrator, District C)
However, strong district commitments to evidence-based
curriculum decisions in general education were sometimes
viewed as raising dilemmas for special education teachers.
A teacher of students with emotional and behavioral prob-
lems in District B described the problem this way:
I try to make our classroom setting as much like a general Ed
classroom as I can, because the goal is to give them strategies to
work with behaviors so that they can function in Gen Ed
classrooms. Which is a challenge, because they were in Gen Ed
classrooms before they came here, so something wasn’t
working.
Some special education teachers described being caught
between curriculum decisions made in general education
and practices they saw as more beneficial for their students
with disabilities:
. . . if (general education teachers) change their curriculum then
I need to follow it so my kids can be a part of it. Especially
with
my kids being more part of the classroom. So you know 4 years
ago I was not doing the Math Expressions, and now I am doing
the Math Expressions and it’s hard because I’m trying to follow
the Gen Ed curriculum and there’s times where the one lesson
ends up being a 2- or 3-day lesson with some added worksheets
because they just aren’t getting the skill and, being a spiraling
curriculum, it goes pretty fast sometimes too. (Self-Contained
Teacher, District B)
While the tensions between curriculum models in gen-
eral education and special education (with each claiming its
own evidentiary warrants) were problematic for many of
the resource room teachers we interviewed, these dilemmas
were less salient to teachers of children in self-contained
classrooms, including those serving students with EBD and
those serving students with low-incidence disabilities.
Instead, the challenges that these teachers described had to
do with isolation and disconnection from colleagues serv-
ing students like theirs. A Self-Contained Classroom
Teacher, District B, said, “Well, this job is very isolating.
I’m the only one that does it in this building . . . so uh I’m
kind of alone in the decisions I make.” Another Self-
Contained Classroom Teacher, District D, said,
Sometimes it makes me feel like it’s less than professional . . .
I don’t know, I just sometimes wish that, I feel like there’s not
always a lot of oversight as far as what am I doing. Is there a
reason behind what I’m doing? And did it work? I wish there
was more.
In cases where teachers felt isolated and disconnected
from district colleagues, they often reported relying on
other self-contained classroom teachers for support and
consultation, rather than resources in their district or from
the research literature:
When you’re in an EBD class you can get pretty isolated . . .
but the other beauty of being in an EBD room is you have
other adults in the room that you can talk to, or they can have
other ideas, or you can call other teachers. (EBD Teacher,
District B)
Resources and Tools Related to EBP
District commitments to use of EBPs in the classroom were
in many cases accompanied by allocation of both material
and conceptual resources. The resources and supports most
often cited by both administrators and teachers in this con-
nection were focused on curriculum materials:
Susan, who is our Special Ed curriculum developer, and
Louisa, who’s our curriculum facilitator . . . they recognize that
we’ve shifted practice to a really research-based practice . . .
We never really did this [before] we just bought books and
stuff. And I said, “Well, I don’t operate that way. We’re going
to shift practice and I’m going to help you” . . . and she has
actually been very, very successful at reaching out and
capturing the attention of folks, authors that publish . . . and
some other materials and some other research and then digging
deep. (Special Education Director, District A)
I think there’s something really powerful about having that
scope and sequence and that repetition that gradually builds on
itself. Yeah so instead of me trying to create it as I go, having
that research-based program, and of course I’m going to see if
it’s not working, then I’m flexible to change it, but I’m going to
have a good base to at least start with. (Resource Room
Teacher, District B)
While district curriculum resources (which were often
assumed to be evidence-based) were important tools for
many resource room teachers …
Exceptional Children, Vol. 79, No. 2, pp. 135-144.
© 2013 Council for Exceptional Children.
Evidence-Based Practices
and Implementation Science
in Special Education
BRYAN G. COOK
University of Hawaii
SAMUEL L. ODOM
University of North Carolina at Chapel Hill
ABSTRACT: Establishing a process for identifying evidence-
based practices (EBPs) in special educa-
tion has been a significant advance for the field because it has
the potential for generating more
effective educational programs and producing more positive
outcomes for students with disabilities.
However, the potential benefit of EBPs is bounded by the
quality, reach, and maintenance of
implementation. The cross-disciplinary field of implementation
science has great relevance for
translating the promise of EBPs into positive outcomes for
children and youth with disabilities.
This article examines the history, extent, and limitations of
EBPs and describes the emergence and
current state of implementation science as applied in special
education. Subsequent articles in this
special issue of Exceptional Children address a range of issues
related to implementation science in
special education: the research-to-practice gap, dissemination
and diffusion, adherence and sustain-
ability, scaling up, a model for state-level implementation, and
fostering implementation through
professional development.
Educators generally agree that
broad implementation of prac-
tices shown by scientific
research to reliably cause
increased student performance
(i.e., evidence-based practices; EBPs) will result in
increased student outcomes (Cook, Smith, &
Tankersley, 2012; Slavin, 2008b). Despite special
educators' general affinity for the concept of
EBPs, as Odom and colleagues (2005) suggested,
the devil of EBPs is in the details. Odom et al.
were referring to the difficulties involved in identi-
fying EBPs (e.g., How many studies must support
an EBP? What research designs should be consid-
ered? What quality indicators are necessary for a
trustworthy study? What effects must a practice
have to be considered an EBP?). Although these
issues continue to be debated (see Cook, Tankers-
ley, & Landrum, 2009a; Slavin, 2008a), there has
been considerable progress in generating and ap-
plying guidelines for identifying EBPs in general
(e.g., What Works Clearinghouse, WWC, 2011)
and special (e.g., National Professional Develop-
ment Center on Autism Spectrum Disorders, n.d.;
National Secondary Transition Technical Assis-
tance Center, n.d.) education. However, the EBP
movement may have leapt from the frying pan
into the fire: The progress made in identifying
EBPs has highlighted the devilish details involved
with implementation of EBPs, which now need to
be addressed.
The gap—described by some as a chasm (e.g.,
Donovan & Cross, 2002)—between research and
practice is a recurring theme in special education.
Indeed, we suspect that the gap has been present
in special education as long as research and prac-
tice have co-existed. Attempts to bridge the re-
search-to-practice gap by identifying and
implementing effective practices are a rich part of
special education's history (Mostert & Crockett,
1999-2000). Despite considerable focus on the
research-to-practice gap (e.g., Carnine, 1997;
Greenwood & Abbott, 2001) and on identifying
EBPs as means to bridge it (e.g., Cook et al.,
2009b; Odom et al., 2005), there is little evidence
suggesting that the gap has been meaningfully re-
duced. For example, a U.S. Department of Educa-
tion report (Crosse et al., 2011) noted that only
7.8% of prevention programs related to substance
abuse and school crime used in over 5,300 schools
met their standards for an EBP. And, in special ed-
ucation, practitioners have reported using instruc-
tional practices shown by research to be ineffective
(e.g., learning styles) with similar or greater fre-
quency than some research-based practices (e.g.,
mnemonics; Burns & Ysseldyke, 2009).
This special issue of Exceptional Children fo-
cuses on addressing some of the devilish details
related to bridging the research-to-practice gap by
achieving broad, sustained, and high-quality im-
plementation of EBPs. There is an emerging field
of implementation science (Eccles & Mittman,
2006) that can be applied in special education to
enhance the utilization of EBPs. To contextualize
consideration of implementation science related
to EBPs in special education, it is important to
define what an EBP is, as well as to be aware of
critical caveats and controversies related to EBPs
in the field of special education.
EVIDENCE-BASED PRACTICES
WHAT ARE EBPS?
Emerging from the field of medicine in the early
1990s (Sackett, Rosenberg, Gray, Haynes, &
Richardson, 1996), EBPs are practices and pro-
grams shown by high-quality research to have
meaningful effects on student outcomes. The
logic behind EBPs is simple: Identifying and
using the most generally effective practices will
increase consumer (e.g., student) outcomes. This
logic rests on the assumptions that the most effec-
tive practices were not previously identified, im-
plemented, or both; and that certain types of
research (i.e., high-quality studies using designs
from which causality can be inferred) are the best
tools to determine effectiveness. Although not
without detractors (e.g., Gallagher, 2004; Ham-
mersley, 2005), this logic has been generally
accepted (Slavin, 2008b) and even written into
law (i.e., the No Child Left Behind Act of 2001's
emphasis on "scientifically based research").
Unlike previous approaches for identifying ef-
fective practices in education (e.g., best practices,
research-based practices), supporting research for
EBPs must meet prescribed, rigorous standards
(Cook & Cook, 2011). Although specific stan-
dards for EBPs vary between and within fields,
research support for EBPs generally must meet
standards along several dimensions, including
research design, quality, and quantity. Typical
guidelines require that for a practice to be consid-
ered evidence-based it must be supported by
multiple, high-quality, experimental or quasi-
experimental (often including single-case research)
studies demonstrating that the practice has a
meaningful impact on consumer (e.g., student)
outcomes.
Discussion and promotion of EBPs have be-
come seemingly ubiquitous in education in recent
years (Detrich, 2008)—EBPs are promoted in na-
tional, state, and local educational policies; in pro-
fessional conferences, university courses, and
professional development; in professional stan-
dards; and in informal discussions among educa-
tors. The federally funded WWC (http://ies.ed.gov/ncee/wwc/), established in 2002, is perhaps
the most comprehensive and well known source of
EBPs for education. Until recently, however, the
WWC did not identify EBPs for students with dis-
abilities, and now does so only for certain disability
groups. (The WWC has begun reviewing the evi-
dence base of practices for students with learning
disabilities, in early childhood special education,
and with emotional and behavioral disorders.)
To address the need for standards for EBPs
designed for and by special educators, Gersten et
al. (2005) and Horner et al. (2005) generated
standards for identifying EBPs in special educa-
tion using group experimental and single-subject
research, respectively, in a special issue of Excep-
tional Children (Odom, 2005). Since that special
issue, various organizations and teams of special
education scholars have used the standards pro-
posed by Gersten et al. and Horner et al. (2005;
e.g., Cook et al., 2009a), used standards adapted
from Gersten et al. and Horner et al. (Odom,
Collet-Klingenberg, Rogers, & Hatton, 2010),
and developed independent sets of standards (e.g.,
National Autism Center, 2009) to begin to iden-
tify a corpus of EBPs in special education. These
and other ongoing efforts to establish EBPs in
special education represent an important advance
for the field. However, EBPs are not a panacea,
and considerable and fundamental work remains
to be done if they are to meaningfully improve
outcomes for children and youth with disabilities.
CAVEATS AND CONTROVERSIES
The introduction of EBPs in any field seems to be
inexorably followed by a period of questioning
and resistance, which certainly has occurred in
education (e.g., Hammersley, 2007; Thomas &
Pring, 2004). Although a complete discussion of
caveats and controversies regarding EBPs in (special) education is beyond the scope of this article (see Cook et al., 2012, for an extended discus-
sion), we focus our attention here on a few
prominent issues of which special educators
should be aware: EBPs are not guaranteed to
work for everyone, identification of EBPs is in-
complete and variable, and EBPs will not be im-
plemented automatically or easily in the "real
world" of schools and classrooms.
EBPs Are Not Guaranteed to Work for Every-
one. No practice will work for every single stu-
dent; this is a reality of education (indeed, for all
social sciences) of which special educators are
keenly aware. As such, when educational re-
searchers speak of causality, they do so in a proba-
bilistic rather than absolute sense. That is, saying
that an instructional practice causes improved ed-
ucational outcomes means that the practice reli-
ably results in improved outcomes for the vast
majority, but not all, students who receive the in-
tervention. For example, Torgesen (2000) esti-
mated that the most effective early reading inter-
ventions do not positively impact between 2%
and 6% of children. Researchers typically refer to
students for whom effective practices do not cause
meaningfully improved outcomes as treatment
resisters or nonresponders. Although EBPs have
relatively low rates of nonresponders, it is impor-
tant to recognize that even when implemented
with fidelity and over time EBPs will not result in
optimal outcomes for all students. Thus, when se-
lecting practices to use in special education pro-
grams, EBPs are a good place to start; but the
application of an EBP, like any other instructional
practice, represents an experiment of sorts in
which special educators must validate its effective-
ness for each individual child.
Incomplete and Variable Identification of EBPs.
Although more and more EBPs are being identi-
fied in both general and special education, be-
cause of the considerable time and expertise it
takes to complete an evidence-based review (i.e.,
apply standards for EBPs to the body of research
literature examining the effectiveness of a prac-
tice) many practices have not yet been reviewed.
And because of the relative scarcity of high-
quality, experimental research in the educational
literature (Berliner, 2002; Seethaler & Fuchs,
2005), many evidence-based reviews result in the
conclusion that there is simply not enough high-
quality research utilizing appropriate designs to
meaningfully determine whether a practice is evi-
dence-based. In other words, just because a prac-
tice is not considered an EBP does not necessarily
mean that it is ineffective. It is then important to
distinguish between practices that are not consid-
ered evidence-based because (a) they have been
shown by multiple, high-quality research studies
from which causality can be inferred to be ineffec-
tive and (b) an evidence-based review has not
been conducted or there is insufficient research to
conclusively determine whether the practice is
effective (Cook & Smith, 2012). The former
practices should rarely if ever be used, whereas the
latter might be implemented when relevant EBPs
have not been identified or a student has been
shown to be a nonresponder to identified EBPs.
Special educators also should recognize that there are many different approaches for identifying and categorizing EBPs. For example, Horner et al. (2005) proposed dichotomously categorizing practices (i.e., evidence-based or not evidence-based), Gersten et al. (2005) proposed a three-tiered approach (i.e., evidence-based, promising, and not evidence-based), and the WWC (2011) uses six classifications (i.e., practices with positive, potentially positive, mixed, indeterminate, potentially negative, and negative effects) to categorize the evidence base of practices. Moreover, approaches for identifying EBPs in education vary on specific standards for research design, quality of research, quantity of research, and effect size (see Cook et al., 2012, for an extended discussion). Accordingly, the evidence-based status of some practices will likely vary across EBP sources (Cook & Cook, 2011). It is important, then, to consider EBPs within the context of the specific standards used to identify them.
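
To see how one evidence base can earn different labels under these schemes, consider the following Python sketch. It is illustrative only: the numeric thresholds are invented for this example and are not the actual decision rules of Horner et al. (2005), Gersten et al. (2005), or the WWC (2011).

# Illustrative only: thresholds below are hypothetical stand-ins,
# not the real decision rules of the three schemes named above.

def horner_label(n_quality_studies, mean_effect):
    # Dichotomous scheme: evidence-based or not evidence-based.
    if n_quality_studies >= 5 and mean_effect > 0:
        return "evidence-based"
    return "not evidence-based"

def gersten_label(n_quality_studies, mean_effect):
    # Three-tiered scheme: evidence-based, promising, or not evidence-based.
    if n_quality_studies >= 4 and mean_effect > 0.25:
        return "evidence-based"
    if n_quality_studies >= 2 and mean_effect > 0:
        return "promising"
    return "not evidence-based"

def wwc_label(mean_effect, findings_consistent):
    # Six classifications, from positive effects down to negative effects.
    if not findings_consistent:
        return "mixed effects"
    if mean_effect > 0.25:
        return "positive effects"
    if mean_effect > 0:
        return "potentially positive effects"
    if mean_effect == 0:
        return "indeterminate effects"
    if mean_effect > -0.25:
        return "potentially negative effects"
    return "negative effects"

# One hypothetical practice: 3 high-quality studies, mean effect of 0.30.
studies, effect = 3, 0.30
print(horner_label(studies, effect))   # not evidence-based
print(gersten_label(studies, effect))  # promising
print(wwc_label(effect, True))         # positive effects

Under these made-up thresholds, a single hypothetical practice is at once "not evidence-based," "promising," and showing "positive effects," which is precisely the cross-source variation described above.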
Implementation. The research-to-practice gap underlies what is probably the most vexing caveat related to EBPs: the difficulty in translating research findings to the everyday practices of teachers in typical classrooms. As EBPs in education began to be identified, relatively little attention was given to how to implement them, perhaps under the assumption that school personnel would eagerly and readily apply identified EBPs. However, as Fixsen, Blase, Horner, and Sugai (2009) noted, "choosing an evidence-based practice is one thing, implementation of that practice is another thing altogether" (p. 5). The problem of implementation is not unique to EBPs and likely underlies the generally disappointing outcomes associated with most school reform efforts (e.g., Sarason, 1993). Implementing and sustaining new practices involves a host of complex and interrelated problems, including issues related to the practice being promoted (e.g., relevance and fit to target environment, efficiency and practicality), users (e.g., available time, mistrust of research, knowledge of EBPs, skills), and the institutional context (e.g., available resources, organizational structures and culture, staffing, coaching, training, administrative support; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Nelson, Leffler, & Hansen, 2009; Tseng, 2012).
Implementation issues have been referred to as "wicked" problems (e.g., Fixsen, Blase, Duda, Naoom, & Van Dyke, 2009; Signal et al., 2012) because, among other characteristics, they are moving targets that fight back (Rittel & Webber, 1973). For example, Fixsen, Blase, Metz, and Van Dyke (this issue) noted that organizational systems work to sustain the status quo by "overwhelm[ing] virtually any attempt to use new evidence-based programs" (i.e., fight back). In contrast, tame issues may be complex, but they tend not to change or actively resist being solved. As difficult as it may be to address the tame issue of how to identify EBPs, it is a fixed, circumscribed issue that, once solved, stays solved. It is hardly surprising, then, that typical, passive approaches for promoting the implementation of EBPs (e.g., "train and hope") that do not provide systematic and ongoing supports almost invariably fail to address the wicked problems of implementation and therefore seldom result in broad, sustained change (Fixsen et al., 2005).
Implementation is the critical link between research and practice. Fixsen et al. (this issue) proposed a simple formula to represent the critical interaction of research efficacy and practice (implementation) in generating outcomes:

effective interventions × effective implementation = improved outcomes

The implication of this formula is that in the absence of implementation, even the most effective intervention will not yield desired outcomes. Glasgow, Vogt, and Boles (1999) conceptualized the slightly more elaborate RE-AIM framework to represent the importance of multiple dimensions of implementation in determining a practice's real-world impact. The RE-AIM model considers four aspects of implementation in addition to a practice's efficacy in determining impact—R × E × A × I × M = impact, where:
• Reach: the proportion of the target population reached by a practice.
• Efficacy: the success rate of a practice when implemented appropriately.
• Adoption: the proportion of targeted settings that adopt the practice.
• Implementation: the proportion of interventionists who implement the practice with fidelity in real-world settings.
• Maintenance: the proportion of organizations (e.g., schools) and interventionists (e.g., teachers) who maintain implementation of the practice over time.
Imagine, for example, that a school district
adopts an EBP for its students with learning dis-
abilities in elementary schools. District personnel
are understandably excited to begin the new year
by rolling out a practice that has been shown by
multiple, high-quality studies to meaningfully im-
prove outcomes for, say, 95% of elementary chil-
dren with learning disabilities. However, only
80% of elementary schools agree to participate in
the project (reach). Further, given problems re-
lated to training, planning and instructional time,
and reluctance to adopt new practices, only 70%
of teachers within targeted schools end up using
the practice at all (adoption). Due to sometimes
ineffectual training and lack of ongoing support,
perhaps only 60% of teachers who adopt the
practice implement it with fidelity; and only 50%
of those maintain their use of the practice over
the entire school year. In this scenario, actual im-
pact is calculated as
.95 (efficacy) × .80 (reach) × .70 (adoption) × .60 (implementation) × .50 (maintenance) = .16
In other words, due to problems at various levels
of implementation, the EBP actually had the de-
sired impact on slightly less than 16% of elemen-
tary students with learning disabilities—a far cry
from the rosy 95% efficacy that district adminis-
trators found so attractive.
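
The arithmetic generalizes to any set of RE-AIM proportions. Below is a minimal Python sketch of the calculation; the function name is ours, and the scenario values come from the worked example above, not from Glasgow et al. (1999).

def re_aim_impact(reach, efficacy, adoption, implementation, maintenance):
    # Multiply the five RE-AIM proportions (each between 0 and 1) to
    # estimate the share of the target population that actually receives
    # the practice's demonstrated benefit.
    return reach * efficacy * adoption * implementation * maintenance

# The district scenario from the text:
impact = re_aim_impact(reach=0.80, efficacy=0.95, adoption=0.70,
                       implementation=0.60, maintenance=0.50)
print(f"{impact:.2%}")  # prints 15.96%, i.e., "slightly less than 16%"

Because the five factors multiply, the weakest link caps the product: in this scenario, raising maintenance from .50 to .90 would nearly double the actual impact, from about 16% to about 29%.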
After considering these numbers, it may
seem that special educators would be better served
by pursuing practices that appeal to teachers and
are easily implemented, but which are less effec-
tive (i.e., typical practice), than by chasing the
large effects of EBPs that may be difficult to real-
ize. However, special educators sell themselves
short—and, more important, do a disservice to
the students they serve—by settling for practices
with limited effects. Efficacy and implementation
both set a ceiling for real-world impact. Just as a
highly efficacious intervention that is not imple-
mented will have no real effect, an ineffective
practice that is broadly implemented remains an
ineffective practice that will, at best, have limited
impact. When considering the importance of im-
plementation, educators should not disregard the
importance of efficacy, but rather realize the sym-
biotic relationship of efficacy and implementation
in determining impact.
The recent emphasis on EBPs in special edu-
cation is laudable, encouraging, and necessary, but
identification of EBPs is insufficient without sup-
porting their use in common practice (Odom,
2009). The challenge is how to achieve high levels
of implementation of the most effective practices.
Unfortunately, because sound research investigat-
ing implementation has been sparse, "we are faced
with the paradox of non-evidence-based imple-
mentation of evidence-based programs" (Drake,
Gorman, & Torrey; as cited in Fixsen et al., 2005,
p. 35). Special educators do not yet have com-
plete, empirically substantiated guidelines for sup-
porting implementation of EBPs. The emerging
field of implementation science has begun to
address this issue by conducting research and gen-
erating theories regarding the implementation of
EBPs.
IMPLEMENTATION SCIENCE
In the inaugural issue of Implementation Science,
Eccles and Mittman (2006) defined implementa-
tion science as "the scientific study of methods to
promote the systematic uptake of research find-
ings and other evidence-based practices into rou-
tine practice" (p. 1). A number of related terms
have been used to refer to this area of study (e.g.,
knowledge utilization, knowledge transfer, knowl-
edge translation, implementation research, transla-
tional research, diffusion, uptake; Straus, Tetroe, &
Graham, 2009). We use implementation science
because, in our experience, it is the most fre-
quently used term by contemporary education
scholars. This is not meant to suggest that a
definitive corpus of knowledge has been estab-
lished in the area of implementation (i.e., a sci-
ence of implementation); rather, it denotes a field
of scientific inquiry in which issues related to
implementation are investigated.
Implementation science, which draws on a
rich history of foundational research investigating
implementation in various fields (e.g., Rogers,
1962; see Weatherly & Lipsky, 1977, for an
example in special education), is associated most
closely with the second of two phases of transla-
tion research. The first phase of translating
research into practice involves the relatively neat,
orderly, and relatively well funded, endeavors of
conducting and synthesizing applied research to
determine what works in real-world settings (i.e.,
establishing EBPs; Hiss, 2004). Hiss suggested
that Phase 2 translation research, which investi-
gates adopting and sustaining the EBPs identified
in Phase 1 translation research, tends to be messy
and poorly funded. However, with the recent
increase in attention being paid to implemen-
tation (or lack thereof), funding appears to be
increasing. For example, the W. T. Grant Founda-
tion recently funded 15 research projects in gen-
eral education designed to examine how research
is used to inform policy and practice in local
schools (Tseng, 2012).
Essentially, the goal of inquiry in implemen-
tation science is to research and understand how
innovations are adopted and maintained, so that
implementation moves from "letting it happen"
to "making it happen" (Greenhalgh, Robert, Mac-
Farlane, Bate, & Kyriakidou, 2004). As has been
the case with the vast majority of previous educa-
tion reforms, letting EBPs happen (i.e., assuming
that they will be implemented by virtue of their
identification) has proven largely unsuccessful
(Tseng, 2012). To bring about the broad and sus-
tained implementation of EBPs, special educators
need to (a) look to the lessons learned thus far
from implementation science and (b) identify
what is not known about making EBP implemen-
tation happen and conduct research to systemati-
cally fill those gaps in our knowledge base.
Based on their comprehensive review of the
literature in implementation science, Fixsen et al.
(2005) concluded that the relatively sparse experi-
mental research in implementation science indi-
cates that providing guidelines, policies,
information, and training are not enough to
"make it happen." In contrast, long-term, multi-
level strategies tend to result in successful imple-
mentation. The authors gleaned seven core
implementation components (or implementation
drivers) that, when in place and functioning at a
high level, can routinely change and improve
practitioner behavior related to the implementa-
tion of EBPs: staff selection, preservice and inservice training, ongoing consultation and coaching, staff evaluation, program evaluation, facilitative
administrative support, and systems interventions
(i.e., "strategies to work with external systems to
ensure the availability of the financial, organiza-
tional, and human resources required to support
the work of the practitioners," p. 29). They sug-
gested that purveyors—change agents who are
experts at identifying and addressing obstacles to
implementation—are critical for utilizing core
implementation components to achieve broad
and sustained implementation of EBPs.
Schoolwide positive behavior support
(SWPBS) is a good example of a program used in
special education that incorporates lessons from
implementation science into its design (see McIntosh, Filter, Bennett, Ryan, & Sugai, 2010). Indeed, SWPBS implementation is guided by a model incorporating five principles drawn from implementation science: contextual fit, priority, effectiveness, efficiency, and using data for continuous regeneration (McIntosh, Horner, & Sugai, 2009). For example, SWPBS practices are modified to maximize fit with the environment in
which they will be implemented, although modifi-
cations are made with a strong understanding of
SWPBS such that they do not violate the integrity
of core components of the intervention (i.e.,
fidelity with flexibility; see Harn, Parisi, &
Stoolmiller, this issue). Moreover, SWPBS fre-
quently utilizes structures such as state leadership
teams that lead and coordinate training, coaching,
and evaluation to systematically support and scale
up SWPBS (see Fixsen et al., this issue; Sugai &
Horner, 2006). Such attention to the principles of
implementation science has, no doubt, contribu-
ted to SWPBS's extensive, sustained, and effective
application (e.g., Horner, Sugai, & Anderson,
2010).
Fixsen et al. (2005) defined implementation
broadly: "activities designed to put into practice
an activity or program" (p. 5). Thus, virtually any
activity involved in the implementation process
might be considered under the purview of imple-
mentation science. The topics addressed in this
special issue of Exceptional Children (i.e., a theo-
retical framework for linking research to practice,
dissemination, balancing fidelity with flexibility
and fit, scaling-up implementation efforts,
statewide implementation efforts, and profes-
sional development) are by no means exhaustive
of the many and varied elements of implementa-
tion science that have application for special edu-
cation. We have included topics that represent
what we believe to be among the most critical
areas for improving the implementation of EBPs
in special education.
ARTICLES IN THIS SPECIAL ISSUE
The purpose of this special issue, and each of the
articles in it, is two-fold: (a) review emerging evi-
dence in the area of implementation science that
special education scholars, policy makers, admin-
istrators, and other stakeholders can apply to
advance the implementation of EBPs and (b) pro-
vide a framework for identifying unanswered
questions for future research to explore related to
implementation of EBPs in special education. In
the first article, Smith, Schmidt, Edelen-Smith,
and Cook propose a conceptual framework for
understanding and bridging the research-to-prac-
tice gap. Drawing from Stokes's (1997) Pasteur's
quadrant model, they posit that rather than di-
chotomizing research as either rigorous or rele-
vant, research must be both rigorous and relevant
to be translated into practice and positively im-
pact student outcomes. Smith et al. propose that
educational design research conducted within
communities of practice is a promising approach
for conducting relevant and rigorous inquiry that
will facilitate implementation of EBPs.
One of the critical stages of translating re-
search to practice is disseminating and diffusing
EBPs. Unfortunately, EBPs are primarily dissemi-
nated in traditional and passive ways (e.g., journal
articles, research briefs) that hold little sway with
the practitioners who actually implement the
practices. In the second article, Cook, Cook, and
Landrum explore a variety of approaches for
actively and effectively disseminating research-
validated practices. They utilize Heath and
Heath's (2008) SUCCESs model, which posits
that dissemination efforts that "stick" are simple,
unexpected, concrete, credible, emotional, and
conveyed as stories. They provide theoretically
and empirically validated dissemination ap-
proaches that might be utilized and researched
further by special educators in each of these areas.
If practitioners do not implement EBPs with
fidelity or as designed, the practices may not have
the same positive effect demonstrated in research
studies. However, in the third article, Harn, Parisi,
and Stoolmiller note that demanding rigid adher-
ence to predetermined procedures will decrease
the likelihood that practitioners will adopt and
sustain a practice. Moreover, practitioners being
more concerned with adherence than meeting the
needs of their students may actually decrease EBP
effectiveness. Harn et al. discuss different aspects
of the multifaceted construct of implementation
fidelity and how programs and practices can be
designed flexibly so that they can be implemented
with fidelity but still meet the needs of different
students in varying educational contexts.
In the fourth article, Klingner, Boardman,
and McMaster discuss issues related to scaling up
EBPs. The issue of scale is of critical importance in
implementation science. Although implementing
an EBP in a single school will positively impact
the outcomes of a limited number of students
with disabilities, if implementation of EBPs is
addressed one school at a time, the research-to-
practice gap is likely to remain wide. Klingner et
al. propose a model of scaling up at the district
level that involves district-researcher partnerships,
integrating new practices with other district initia-
tives, tailoring the EBP to the …

Más contenido relacionado

Similar a Psychology in the Schools, Vol. 52(2), 2015 C© 2014 Wiley Peri.docx

Evidence based practices for asd a review (2015)
Evidence based practices for asd a review (2015)Evidence based practices for asd a review (2015)
Evidence based practices for asd a review (2015)Jeane Araujo
 
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...Applying An Observational Lens To Identify Parental Behaviours Associated Wit...
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...Karin Faust
 
A Comparison Of The Mystery Motivator And The Get Em On Task Interventions F...
A Comparison Of The Mystery Motivator And The Get  Em On Task Interventions F...A Comparison Of The Mystery Motivator And The Get  Em On Task Interventions F...
A Comparison Of The Mystery Motivator And The Get Em On Task Interventions F...Addison Coleman
 
Using Action Research to Identify Data During Clinical Experience (main)
Using Action Research to Identify Data During Clinical Experience (main)Using Action Research to Identify Data During Clinical Experience (main)
Using Action Research to Identify Data During Clinical Experience (main)Antwuan Stinson
 
1JOURNAL SUMMARY .docx
1JOURNAL SUMMARY                                             .docx1JOURNAL SUMMARY                                             .docx
1JOURNAL SUMMARY .docxdrennanmicah
 
Identifying Effective Math Interventions for Early Elementary Students.docx
Identifying Effective Math Interventions for Early Elementary Students.docxIdentifying Effective Math Interventions for Early Elementary Students.docx
Identifying Effective Math Interventions for Early Elementary Students.docxwrite4
 
The efficacy of using HW in PEKory Hill.docx
The efficacy of using HW in PEKory Hill.docxThe efficacy of using HW in PEKory Hill.docx
The efficacy of using HW in PEKory Hill.docxmehek4
 
PSY 550 Response Paper RubricRequirements of submission Respon.docx
PSY 550 Response Paper RubricRequirements of submission  Respon.docxPSY 550 Response Paper RubricRequirements of submission  Respon.docx
PSY 550 Response Paper RubricRequirements of submission Respon.docxamrit47
 
Approaches To Working With Children And Families A Review Of The Evidence Fo...
Approaches To Working With Children And Families  A Review Of The Evidence Fo...Approaches To Working With Children And Families  A Review Of The Evidence Fo...
Approaches To Working With Children And Families A Review Of The Evidence Fo...Ashley Carter
 
Behavioral Modification
Behavioral ModificationBehavioral Modification
Behavioral Modificationglwilliams99
 
Maestro en servicio revision de videos
Maestro en servicio revision de videosMaestro en servicio revision de videos
Maestro en servicio revision de videosSisercom SAC
 
Inclusion And Its Effect On Preschool Children With[1]
Inclusion And Its  Effect On  Preschool  Children With[1]Inclusion And Its  Effect On  Preschool  Children With[1]
Inclusion And Its Effect On Preschool Children With[1]rhepadmin
 
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docxRunning Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docxagnesdcarey33086
 
TeachingAccountability
TeachingAccountabilityTeachingAccountability
TeachingAccountabilityBarry Duncan
 
I have an a reflection assignment on professional issue, what Ive.docx
I have an a reflection assignment on professional issue, what Ive.docxI have an a reflection assignment on professional issue, what Ive.docx
I have an a reflection assignment on professional issue, what Ive.docxwilcockiris
 
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxIntervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxnormanibarber20063
 

Similar a Psychology in the Schools, Vol. 52(2), 2015 C© 2014 Wiley Peri.docx (20)

Evidence based practices for asd a review (2015)
Evidence based practices for asd a review (2015)Evidence based practices for asd a review (2015)
Evidence based practices for asd a review (2015)
 
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...Applying An Observational Lens To Identify Parental Behaviours Associated Wit...
Applying An Observational Lens To Identify Parental Behaviours Associated Wit...
 
Wheatleyetal2009
Wheatleyetal2009Wheatleyetal2009
Wheatleyetal2009
 
A Comparison Of The Mystery Motivator And The Get Em On Task Interventions F...
A Comparison Of The Mystery Motivator And The Get  Em On Task Interventions F...A Comparison Of The Mystery Motivator And The Get  Em On Task Interventions F...
A Comparison Of The Mystery Motivator And The Get Em On Task Interventions F...
 
Using Action Research to Identify Data During Clinical Experience (main)
Using Action Research to Identify Data During Clinical Experience (main)Using Action Research to Identify Data During Clinical Experience (main)
Using Action Research to Identify Data During Clinical Experience (main)
 
1JOURNAL SUMMARY .docx
1JOURNAL SUMMARY                                             .docx1JOURNAL SUMMARY                                             .docx
1JOURNAL SUMMARY .docx
 
FAB Strategies
FAB StrategiesFAB Strategies
FAB Strategies
 
Identifying Effective Math Interventions for Early Elementary Students.docx
Identifying Effective Math Interventions for Early Elementary Students.docxIdentifying Effective Math Interventions for Early Elementary Students.docx
Identifying Effective Math Interventions for Early Elementary Students.docx
 
DATA COLLECTION TOOLS .docx
DATA COLLECTION TOOLS                                             .docxDATA COLLECTION TOOLS                                             .docx
DATA COLLECTION TOOLS .docx
 
The efficacy of using HW in PEKory Hill.docx
The efficacy of using HW in PEKory Hill.docxThe efficacy of using HW in PEKory Hill.docx
The efficacy of using HW in PEKory Hill.docx
 
PSY 550 Response Paper RubricRequirements of submission Respon.docx
PSY 550 Response Paper RubricRequirements of submission  Respon.docxPSY 550 Response Paper RubricRequirements of submission  Respon.docx
PSY 550 Response Paper RubricRequirements of submission Respon.docx
 
Approaches To Working With Children And Families A Review Of The Evidence Fo...
Approaches To Working With Children And Families  A Review Of The Evidence Fo...Approaches To Working With Children And Families  A Review Of The Evidence Fo...
Approaches To Working With Children And Families A Review Of The Evidence Fo...
 
BCProposalPDF
BCProposalPDFBCProposalPDF
BCProposalPDF
 
Behavioral Modification
Behavioral ModificationBehavioral Modification
Behavioral Modification
 
Maestro en servicio revision de videos
Maestro en servicio revision de videosMaestro en servicio revision de videos
Maestro en servicio revision de videos
 
Inclusion And Its Effect On Preschool Children With[1]
Inclusion And Its  Effect On  Preschool  Children With[1]Inclusion And Its  Effect On  Preschool  Children With[1]
Inclusion And Its Effect On Preschool Children With[1]
 
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docxRunning Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
 
TeachingAccountability
TeachingAccountabilityTeachingAccountability
TeachingAccountability
 
I have an a reflection assignment on professional issue, what Ive.docx
I have an a reflection assignment on professional issue, what Ive.docxI have an a reflection assignment on professional issue, what Ive.docx
I have an a reflection assignment on professional issue, what Ive.docx
 
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxIntervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
 

Más de denneymargareta

For this piece of the Humanities Project, submit your topic choice a.docx
For this piece of the Humanities Project, submit your topic choice a.docxFor this piece of the Humanities Project, submit your topic choice a.docx
For this piece of the Humanities Project, submit your topic choice a.docxdenneymargareta
 
For this weeks discussion, pick ONE of the following to answer in a.docx
For this weeks discussion, pick ONE of the following to answer in a.docxFor this weeks discussion, pick ONE of the following to answer in a.docx
For this weeks discussion, pick ONE of the following to answer in a.docxdenneymargareta
 
For this weeks discussion, review the normal profiles for childre.docx
For this weeks discussion, review the normal profiles for childre.docxFor this weeks discussion, review the normal profiles for childre.docx
For this weeks discussion, review the normal profiles for childre.docxdenneymargareta
 
For this project, you will be exploring the developments in material.docx
For this project, you will be exploring the developments in material.docxFor this project, you will be exploring the developments in material.docx
For this project, you will be exploring the developments in material.docxdenneymargareta
 
For this Discussion, you will explore the nature of globalization an.docx
For this Discussion, you will explore the nature of globalization an.docxFor this Discussion, you will explore the nature of globalization an.docx
For this Discussion, you will explore the nature of globalization an.docxdenneymargareta
 
For this Discussion, you will assess the roles of culture and divers.docx
For this Discussion, you will assess the roles of culture and divers.docxFor this Discussion, you will assess the roles of culture and divers.docx
For this Discussion, you will assess the roles of culture and divers.docxdenneymargareta
 
For this discussion, choose one of challenges to overcome in helping.docx
For this discussion, choose one of challenges to overcome in helping.docxFor this discussion, choose one of challenges to overcome in helping.docx
For this discussion, choose one of challenges to overcome in helping.docxdenneymargareta
 
For this Discussion imagine that you are speaking to a group of pare.docx
For this Discussion imagine that you are speaking to a group of pare.docxFor this Discussion imagine that you are speaking to a group of pare.docx
For this Discussion imagine that you are speaking to a group of pare.docxdenneymargareta
 
For this Discussion Board, write about some part of the videos that .docx
For this Discussion Board, write about some part of the videos that .docxFor this Discussion Board, write about some part of the videos that .docx
For this Discussion Board, write about some part of the videos that .docxdenneymargareta
 
For this Assignment,choose one sexual attitude (e.g., abstinence.docx
For this Assignment,choose one sexual attitude (e.g., abstinence.docxFor this Assignment,choose one sexual attitude (e.g., abstinence.docx
For this Assignment,choose one sexual attitude (e.g., abstinence.docxdenneymargareta
 
For this assignment, you will use your Intercultural Interview from .docx
For this assignment, you will use your Intercultural Interview from .docxFor this assignment, you will use your Intercultural Interview from .docx
For this assignment, you will use your Intercultural Interview from .docxdenneymargareta
 
For this assignment, you will research an issue related to informati.docx
For this assignment, you will research an issue related to informati.docxFor this assignment, you will research an issue related to informati.docx
For this assignment, you will research an issue related to informati.docxdenneymargareta
 
For this assignment, you will explore the official sources of crime .docx
For this assignment, you will explore the official sources of crime .docxFor this assignment, you will explore the official sources of crime .docx
For this assignment, you will explore the official sources of crime .docxdenneymargareta
 
For this assignment, prepare a paper to evaluate the following quest.docx
For this assignment, prepare a paper to evaluate the following quest.docxFor this assignment, prepare a paper to evaluate the following quest.docx
For this assignment, prepare a paper to evaluate the following quest.docxdenneymargareta
 
For this assignment, Conduct a thorough case study analysis of the c.docx
For this assignment, Conduct a thorough case study analysis of the c.docxFor this assignment, Conduct a thorough case study analysis of the c.docx
For this assignment, Conduct a thorough case study analysis of the c.docxdenneymargareta
 
For this Assignment, choose a discreet environmental policy issue th.docx
For this Assignment, choose a discreet environmental policy issue th.docxFor this Assignment, choose a discreet environmental policy issue th.docx
For this Assignment, choose a discreet environmental policy issue th.docxdenneymargareta
 
For this assignment, you are going to use your skills of research an.docx
For this assignment, you are going to use your skills of research an.docxFor this assignment, you are going to use your skills of research an.docx
For this assignment, you are going to use your skills of research an.docxdenneymargareta
 
For the Final Project, you will assume the role of a classroom teach.docx
For the Final Project, you will assume the role of a classroom teach.docxFor the Final Project, you will assume the role of a classroom teach.docx
For the Final Project, you will assume the role of a classroom teach.docxdenneymargareta
 
For the first part of your final project, the critical analysis  por.docx
For the first part of your final project, the critical analysis  por.docxFor the first part of your final project, the critical analysis  por.docx
For the first part of your final project, the critical analysis  por.docxdenneymargareta
 
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docx
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docxFOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docx
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docxdenneymargareta
 

Más de denneymargareta (20)

For this piece of the Humanities Project, submit your topic choice a.docx
For this piece of the Humanities Project, submit your topic choice a.docxFor this piece of the Humanities Project, submit your topic choice a.docx
For this piece of the Humanities Project, submit your topic choice a.docx
 
For this weeks discussion, pick ONE of the following to answer in a.docx
For this weeks discussion, pick ONE of the following to answer in a.docxFor this weeks discussion, pick ONE of the following to answer in a.docx
For this weeks discussion, pick ONE of the following to answer in a.docx
 
For this weeks discussion, review the normal profiles for childre.docx
For this weeks discussion, review the normal profiles for childre.docxFor this weeks discussion, review the normal profiles for childre.docx
For this weeks discussion, review the normal profiles for childre.docx
 
For this project, you will be exploring the developments in material.docx
For this project, you will be exploring the developments in material.docxFor this project, you will be exploring the developments in material.docx
For this project, you will be exploring the developments in material.docx
 
For this Discussion, you will explore the nature of globalization an.docx
For this Discussion, you will explore the nature of globalization an.docxFor this Discussion, you will explore the nature of globalization an.docx
For this Discussion, you will explore the nature of globalization an.docx
 
For this Discussion, you will assess the roles of culture and divers.docx
For this Discussion, you will assess the roles of culture and divers.docxFor this Discussion, you will assess the roles of culture and divers.docx
For this Discussion, you will assess the roles of culture and divers.docx
 
For this discussion, choose one of challenges to overcome in helping.docx
For this discussion, choose one of challenges to overcome in helping.docxFor this discussion, choose one of challenges to overcome in helping.docx
For this discussion, choose one of challenges to overcome in helping.docx
 
For this Discussion imagine that you are speaking to a group of pare.docx
For this Discussion imagine that you are speaking to a group of pare.docxFor this Discussion imagine that you are speaking to a group of pare.docx
For this Discussion imagine that you are speaking to a group of pare.docx
 
For this Discussion Board, write about some part of the videos that .docx
For this Discussion Board, write about some part of the videos that .docxFor this Discussion Board, write about some part of the videos that .docx
For this Discussion Board, write about some part of the videos that .docx
 
For this Assignment,choose one sexual attitude (e.g., abstinence.docx
For this Assignment,choose one sexual attitude (e.g., abstinence.docxFor this Assignment,choose one sexual attitude (e.g., abstinence.docx
For this Assignment,choose one sexual attitude (e.g., abstinence.docx
 
For this assignment, you will use your Intercultural Interview from .docx
For this assignment, you will use your Intercultural Interview from .docxFor this assignment, you will use your Intercultural Interview from .docx
For this assignment, you will use your Intercultural Interview from .docx
 
For this assignment, you will research an issue related to informati.docx
For this assignment, you will research an issue related to informati.docxFor this assignment, you will research an issue related to informati.docx
For this assignment, you will research an issue related to informati.docx
 
For this assignment, you will explore the official sources of crime .docx
For this assignment, you will explore the official sources of crime .docxFor this assignment, you will explore the official sources of crime .docx
For this assignment, you will explore the official sources of crime .docx
 
For this assignment, prepare a paper to evaluate the following quest.docx
For this assignment, prepare a paper to evaluate the following quest.docxFor this assignment, prepare a paper to evaluate the following quest.docx
For this assignment, prepare a paper to evaluate the following quest.docx
 
For this assignment, Conduct a thorough case study analysis of the c.docx
For this assignment, Conduct a thorough case study analysis of the c.docxFor this assignment, Conduct a thorough case study analysis of the c.docx
For this assignment, Conduct a thorough case study analysis of the c.docx
 
For this Assignment, choose a discreet environmental policy issue th.docx
For this Assignment, choose a discreet environmental policy issue th.docxFor this Assignment, choose a discreet environmental policy issue th.docx
For this Assignment, choose a discreet environmental policy issue th.docx
 
For this assignment, you are going to use your skills of research an.docx
For this assignment, you are going to use your skills of research an.docxFor this assignment, you are going to use your skills of research an.docx
For this assignment, you are going to use your skills of research an.docx
 
For the Final Project, you will assume the role of a classroom teach.docx
For the Final Project, you will assume the role of a classroom teach.docxFor the Final Project, you will assume the role of a classroom teach.docx
For the Final Project, you will assume the role of a classroom teach.docx
 
For the first part of your final project, the critical analysis  por.docx
For the first part of your final project, the critical analysis  por.docxFor the first part of your final project, the critical analysis  por.docx
For the first part of your final project, the critical analysis  por.docx
 
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docx
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docxFOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docx
FOR THE CRIMINAL ( Name will be given when bid is accepted)  DO THE .docx
 

Project, 2009). These practices vary greatly in structure and difficulty. Some strategies, such as
discrete trial teaching (DTT; Leaf & McEachin, 1999; Lovaas, 1987), are highly structured and occur in one-on-one settings, whereas others are naturalistic, can be conducted individually or during daily activities, and tend to be more complex to implement (e.g., incidental teaching; Fenske, Krantz, & McClannahan, 2001; or pivotal response training [PRT]; Koegel et al., 1989). There are also classroom-wide strategies and structures based on applied behavior analysis, such as teaching within functional routines (FR; Brown, Evans, Weed, & Owen, 1987; Cooper, Heron, & Heward, 1987; Marcus, Schopler, & Lord, 2000; McClannahan & Krantz, 1999). Although all of these evidence-based practices share the common foundational principles of applied behavior analysis, each is made up of different techniques. These and other intervention techniques are often packaged together as "comprehensive interventions" (Odom, Boyd, Hall, & Hume, 2010) or used in combination in the field to facilitate learning and expand the conditions under which new student behaviors occur (Hess, Morrier, Heflin, & Ivey, 2008; Stahmer, 2007).

Teachers can learn these evidence-based strategies within the context of a research study (e.g., Suhrheinrich, 2011); however, studies report a highly variable number of hours of training needed to master the intervention strategy. For example, the amount of time required to train classroom educators in DTT in published studies ranges from 3 hours (Sarokoff & Sturmey, 2004) at its most brief, to recommendations of 26 to 60 hours of supervised experience (Koegel, Russo, & Rincover, 1977; Smith, Buch, & Gamby, 2000; Smith, Parker, Taubman, & Lovaas, 1992). Teachers have been trained to fidelity in PRT in 8 to 20 hours (Suhrheinrich, 2011). To achieve concurrent mastery of several different intervention techniques and to incorporate the development of appropriate student goals, some researchers have suggested that teachers may need a year or more of full-time, supervised practicum training (Smith, Donahoe, & Davis, 2000).

There are several reasons why teachers may not implement evidence-based practices the way they were designed. First, teachers typically receive limited instruction in specific interventions. For example, instruction often comprises attendance at a didactic workshop and receipt of a manual. Teachers are then expected to implement evidence-based practices without the ongoing coaching and feedback that is critical for intervention mastery (Bush, 1984; Cornett & Knight, 2009). Second, most evidence-based practices were not designed for school settings and therefore may be difficult to implement appropriately in the classroom (Stahmer, Suhrheinrich, Reed, Bolduc, & Schreibman, 2011). Perhaps as a result, teachers often report that they combine or modify evidence-based practices to meet the specific needs of their classroom and students (Stahmer, Collings, & Palinkas, 2005). Finally, school administrators sometimes mandate the use of programs that may not align with teachers' classroom environment, beliefs, or pedagogy (Dingfelder & Mandell, 2011).

A major indication of the quality of the implementation of any evidence-based practice is treatment fidelity, also known as implementation fidelity (Gersten et al., 2005; Horner et al., 2005; Noell, Duhon, Gatti, & Connell, 2002; Noell et al., 2005;
Proctor et al., 2011; Schoenwald et al., 2011). Implementation fidelity is the degree to which a treatment is implemented as prescribed, or the level of adherence to the specific procedures of the intervention (e.g., Gresham, 1989; Rabin, Brownson, Haire-Joshu, Kreuter, & Weaver, 2008; Schoenwald et al., 2011). There are several types of implementation fidelity. Procedural fidelity (Odom et al., 2010; also called program adherence; Schoenwald et al., 2011) is the degree to which the provider uses procedures required to execute the treatment as intended. Other types of fidelity include treatment differentiation (the extent to which treatments differ from one another), therapist competence (the level of skill and judgment used in executing the treatment; Schoenwald et al., 2011), and dosage (Odom et al., 2010). Although, ideally, all types of fidelity would be examined to determine the fit of an intervention in a school program (Harn, Parisi, & Stoolmiller, 2013), procedural fidelity provides one important avenue for measuring the extent to which an intervention resembles an evidence-based practice or elements of evidence-based practice (Garland, Bickman, & Chorpita, 2010).

Procedural implementation fidelity is likely a potential mediating variable affecting student outcomes, with higher fidelity resulting in better outcomes (Durlak & DuPre, 2008; Gresham, MacMillan, Beebe-Frankenberger, & Bocian, 2000; Stahmer & Gist, 2001); however, it is not often measured. In behavioral services research, three separate reviews of reported implementation fidelity data have been published. In the Journal of Applied Behavior Analysis, fidelity data were reported in only 16% to 30% of published articles (Gresham, Gansle, & Noell, 1993; McIntyre, Gresham, DiGennaro, & Reed, 2007; Peterson, Homer, & Wonderlich, 1982). Three separate reviews indicated that only 13% to 32% of autism intervention studies included fidelity measures (Odom & Wolery, 2003; Wheeler, Baggett, Fox, & Blevins, 2006; Wolery & Garfinkle, 2002). A recent review of special education journals found that fewer than half (47%) of intervention articles reported any type of fidelity scores (Swanson, Wanzek, Haring, Ciullo, & McCulley, 2011). Indeed, limited reporting of implementation adherence is evident across a diverse body of fields (Gresham, 2009). The lack of reporting (and, therefore, the presumable lack of actual measurement of implementation) limits the conclusions that can be drawn regarding the association between student outcomes and the specific treatment provided. Therefore, examination of implementation fidelity, although complicated, is important to advance the understanding of how evidence-based interventions are being implemented in school settings.

Our research team recently completed a large-scale randomized trial of a comprehensive program for students with autism in partnership with a large, urban public school district. Procedural implementation fidelity of the overall program (which includes three evidence-based practices) was
highly variable, ranging from 12% to 92% (Mandell et al., 2013). The three strategies included in this program, DTT, PRT, and FR (see description in the Method section), share an underlying theoretical base, but rely on different specific techniques. The purpose of this study was to examine the extent to which public school teachers implemented evidence-based interventions for students with autism in the way these practices were designed. Examining implementation fidelity of each strategy individually may provide insight into whether specific interventions are more easily implemented in the classroom environment. In particular, we examined whether special education classroom teachers and staff: 1) mastered specific strategies that form the backbone of applied behavior analysis programs for autism; 2) used the strategies in their classroom; and 3) maintained their procedural fidelity to these strategies over time.

METHOD

Participants

Participants were classroom teachers and staff in an urban school district's kindergarten-through-second-grade autism support classrooms (each in a different school) participating in a larger trial of autism services. Of the 67 total autism support classrooms in the district at the time of the study, teachers and staff from 57 (85%) of the schools participated. Each classroom included one participating teacher and 0 to 2 classroom assistants (M = 1). Throughout the district, staff were required to participate in intervention training as part of professional development, but were not required to consent to participate in the study. Data from the current study are reported only for the 57 teachers and staff who consented to participate.

Teachers received intensive training in Strategies in Teaching Based on Autism Research (STAR) during their first year of participation in the project. During the second year, continuing teachers received in-classroom coaching every other week. From the original 57, 38 teachers (67%)
participated in the second year of the study. See Table 1 for teacher demographics. A complete description of adult and student participants can be found elsewhere (Mandell et al., 2013).

Table 1
Teacher Demographic Characteristics

N                                              57
% Female                                       97.3
Total Years Teaching, M (range)                10.8 (1-38)
Years Teaching Children with ASD, M (range)    6.8 (1-33)
Education Level (% Bachelor's/% Master's)      30/70

Intervention

Strategies for Teaching Based on Autism Research. The goal of the Strategies for Teaching Based on Autism Research (STAR) program is to develop children's skills in a highly structured environment and then generalize those skills to more naturalistic settings. The program includes a curriculum in which each skill is matched to a specific instructional strategy. The STAR program includes three evidence-based strategies: DTT, PRT, and FR.

DTT relies on highly structured, teacher-directed, one-on-one interactions between the teacher and student. In these interactions, the teacher initiates a specific stimulus to evoke the child's response, generally a discrete skill, which is an element of a larger behavioral repertoire (Krug et al., 1979; Krug, Rosenblum, Almond, & Arick, 1981; Lovaas, 1981, 1987; Smith, 2001). DTT is used in STAR for teaching pre-academic and receptive language skills, where the desired behavior takes a very specific form, such as learning to identify colors, sequencing events from a story into a first-next-then-last structure, or counting with one-to-one correspondence. The consequence of the desired behavior is an external reinforcer, such as a token or a preferred edible (Lovaas, 2003; Lovaas & Buch, 1997).

PRT can occur in both one-on-one interactions and small-group interactions with the teacher. It is considered student directed because it occurs in the regular classroom environment, where the teaching area is pre-arranged to include highly preferred activities or toys that the student will be motivated to acquire. In PRT, students initiate the teaching episode by indicating interest in an item or activity or selecting among available teaching materials. Materials are varied frequently to enhance student motivation and generalization of skills, and make PRT appropriate for targeting expressive and spontaneous language (Koegel, O'Dell, & Koegel, 1987; Koegel et al., 1989; Laski, Charlop, & Schreibman, 1988; Pierce & Schreibman, 1997; Schreibman & Koegel, 1996). After the student expresses interest in an activity or item, he or she is required to perform a specific behavior related to the item. The consequence of the desired behavior is getting access to the activity or item. For example, students' attempts to label and request items are reinforced by the delivery of the item, which may then provide the opportunity to focus on other skills, such as joint attention, imitation, play skills, and generalization of other skills learned in the DTT format.

FR are the least structured of the STAR instructional strategies. FR strategies are routines that occur throughout the day and include school arrival and dismissal, mealtime, toileting, transitions between classroom activities, and recreational activities. Each routine is broken into discrete steps, called a task analysis, and then chained together using behavior analytic procedures such as stimulus prompts (visual and verbal) and reinforcement of each step in the routine (Brown et al., 1987; Cooper et al., 1987; Marcus et al., 2000; McClannahan & Krantz, 1999). For example, a routine to change activities may include cuing the transition (verbal prompt), checking a schedule (visual prompt), pulling a picture card from the schedule to indicate the next activity, taking the card to the location of the new activity, putting the card into a pocket utilizing a match-to-sample technique, and beginning the new activity, followed by a token for routine completion. The advantage of this strategy is that each transition component is taught within the context of performing the routine, so that the child learns to respond to natural cues and reinforcers. FR strategies are conducted in both individual and group formats, depending on the skills being taught (e.g., toileting versus appropriate participation in snack time).
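Because each routine is specified as an ordered chain of prompted, reinforced steps, a task analysis can be written down compactly. The sketch below is purely illustrative: the routine, step wording, and prompt labels are invented for this example and are not STAR curriculum materials.

```python
# Illustrative representation of a functional routine as a task analysis:
# an ordered chain of steps, each paired with its prompt type and whether
# completion of the step is reinforced. All labels here are invented
# examples, not STAR materials.

from dataclasses import dataclass

@dataclass
class Step:
    behavior: str
    prompt: str        # e.g., "verbal", "visual", or "none"
    reinforced: bool   # reinforcement delivered for completing the step

transition_routine = [
    Step("attend to transition cue", prompt="verbal", reinforced=True),
    Step("check picture schedule", prompt="visual", reinforced=True),
    Step("pull picture card for next activity", prompt="visual", reinforced=True),
    Step("carry card to new activity location", prompt="none", reinforced=True),
    Step("match card to pocket at the location", prompt="visual", reinforced=True),
    Step("begin new activity", prompt="none", reinforced=True),  # e.g., token here
]

for i, step in enumerate(transition_routine, start=1):
    print(f"{i}. {step.behavior} (prompt: {step.prompt})")
```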
Training

STAR training occurred in accordance with the STAR developers' training protocols. The research team contracted with the program developers to provide training directly to the teachers. Training included workshops, help with classroom setup, and observation and coaching throughout the first academic year of STAR implementation (described in detail in the following sections). Six local coaches also were trained by the STAR developers to provide ongoing consultation to classroom staff during the second year of STAR implementation. The training protocol for STAR is manualized and publicly available. Additional information about the STAR program can be found at www.starautismsupport.com. Training provided to classroom teachers and staff included the following components:

Workshops. The STAR program developers provided a series of trainings on the use of the STAR program. The training began in September and consisted of 28 hours of intensive workshops that covered the STAR program, including the use of the curriculum assessment, classroom setup, and training in DTT, PRT, and FR. Workshops included didactic teaching, video examples, role-playing, and a visit to each classroom to help with classroom setup. STAR workshops took place outside the school day (i.e., during professional development days, at night, and on the weekends).

Observation and coaching. During the first year, program developers observed classroom staff during regular school hours and provided feedback on use of STAR strategies with students. Trainers provided 5 days of observation and coaching immediately following training, 3 days of follow-up coaching throughout the academic year, and ongoing advising and coaching by e-mail and phone. On average, classrooms received 26.5 (range, 1.5-36) hours of coaching over 5.7 (range, 3-7) visits in the first year. During the second year, local coaches trained by the STAR developers provided coaching in the STAR strategies. Coaching was provided September through May on a monthly basis. On average, classrooms received 36.1 (range, 0-59) hours of coaching over 10 (range, 0-10) visits in the second year.

Data Collection Procedures

Data on adherence to the instructional strategies used in STAR were collected throughout the academic year via video recording of teaching interactions with students, for coding of implementation fidelity in each of the three STAR intervention methods. Classroom staff members were filmed for 30 minutes every month in Years 1 and 2. Research assistants trained in filming methods recorded the intervention during a specified date each month. Visits were timed to coincide with regularly scheduled use of each of the intervention methods. The 30-minute film was composed of 10 minutes of DTT, 10 minutes of PRT, and 10 minutes of FR to provide a sample of the use of each intervention. Recording included any consented staff member providing the intervention. The staff member filmed by the research staff varied depending on which staff member (i.e., teacher or paraprofessional) was conducting the intervention that day. The primary classroom teacher conducted the intervention in 86% of the videos collected, and paraprofessional staff conducted the intervention in the remaining 14% of videos. There were no
statistically significant differences in the proportion of videos collected by intervention provider (teacher vs. paraprofessional) for any strategy or time period (p > .05).

Implementation Fidelity Measures

Coding procedures. The primary method for assessing fidelity of STAR strategies was through video recordings of teachers and aides interacting with students. Coding relied on different criteria based on specific coding definitions created for each instructional component, as well as general teaching strategies (see following sections). Coding schemes for each method were developed by the first author and were reviewed by the STAR program developers. Trained research assistants blinded to the study hypotheses coded all video recordings. For each intervention method, the core research team established correct codes for a subset of videos through consensus coding (keys). Each research assistant coder then learned one coding system (i.e., DTT, PRT, or FR) and was required to achieve 80% reliability across two keys before beginning to code any classroom sessions independently. One third of all tapes were double coded to ensure ongoing reliability of data coding throughout the duration of the project. The core research team also re-coded two tapes for each research assistant every other month, providing a measure of criterion validity. If there was less than 80% agreement between the reliability coder and the research assistant, additional training and coaching were provided until criterion was achieved, and previous videos were re-coded.

Coding involved direct computer entry while viewing videos using "The Observer Video-Pro" software (Noldus Information Technology, Inc., 2008), a computerized system for collection, analysis, and management of direct observation data. For each instructional strategy, the coder observed the 10-minute segment and subsequently rated the adults' use of each component of the strategy on a 1 to 5 Likert scale, with 1 indicating Adult does not implement throughout segment and 5 indicating Adult implements consistently throughout the segment. These Likert ratings were found to have high concordance with more detailed trial-by-trial coding of each strategy component (88% agreement) used in previous research (Stahmer, 2010). A score of 4 or 5 on a component was considered passing and correlated with 80% correct use of strategies in the more detailed coding scheme. Following are the individual components included in each strategy. Complete coding definitions are available from the first author.

Discrete trial teaching. For DTT, coders examined the use of the following components: gaining the student's attention, choosing appropriate target skills, using clear and appropriate cues, using accurate prompting strategies, providing clear and correct consequences, using appropriate inter-trial intervals, and utilizing error correction procedures effectively (error correction evaluated against procedures described in Arick, Loos, Falco, & Krug, 2004). The criterion for passing implementation fidelity was defined as the correct use of 80% of components (score of 4 or 5) during the observation.

Pivotal response training. For PRT, coders examined the use of the following components: gaining the student's attention, providing clear and developmentally appropriate cues related to the activity, providing the student a choice of stimuli/activities, interspersing a mixture of maintenance (previously acquired) and acquisition (not yet mastered) tasks, taking turns to model appropriate behavior, providing contingent consequences, rewarding goal-directed attempts, and using reinforcers directly related to the teaching activity. The criterion for passing implementation fidelity was defined as the correct use of 80% of components (score of 4 or 5) during the observation.

Functional routines. For FR, coders examined adherence to each step of the FR used in classrooms during group and individual routines. The use of the following components was coded: using error correction procedures appropriately, adhering to the FR lesson plan, and supporting transitions between activities. The criterion for passing implementation fidelity was defined as correct use of 80% of components (score of 4 or 5) during the observation.
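The passing rule used across all three strategies is mechanical enough to state precisely. The sketch below illustrates it under stated assumptions: the component list and ratings are hypothetical examples, not the study's coding manual or data. A component counts as correctly used at a rating of 4 or 5, and an observation meets fidelity when at least 80% of components pass.

```python
# Illustrative sketch of the passing rule described above: a component
# "passes" at a Likert rating of 4 or 5, and an observation meets
# procedural fidelity when >= 80% of components pass.
# Component names and ratings below are hypothetical, not study data.

DTT_COMPONENTS = [
    "gaining attention", "appropriate targets", "clear cues",
    "accurate prompting", "clear consequences",
    "inter-trial intervals", "error correction",
]

def component_passes(rating: int) -> bool:
    """A rating of 4 or 5 counts as correct use of the component."""
    return rating >= 4

def meets_fidelity(ratings: dict, threshold: float = 0.80) -> bool:
    """Observation passes when >= `threshold` of components are rated 4-5."""
    n_pass = sum(component_passes(r) for r in ratings.values())
    return n_pass / len(ratings) >= threshold

# Hypothetical 10-minute DTT segment: 6 of 7 components rated 4 or 5.
example = dict(zip(DTT_COMPONENTS, [5, 4, 4, 3, 5, 4, 4]))
print(meets_fidelity(example))  # True: 6/7 = 0.86 >= 0.80
```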
Reliability of Data Recording

Inter-rater reliability, as measured by percent agreement within 1 Likert point, was calculated for coding of each instructional strategy and each month of videos by having a second coder, blinded to the initial codes, score one third of the videos per strategy for each month. The average overall percent agreement for each strategy was 86% for DTT (range, 60%-100%), 90% for PRT (range, 75%-100%), and 90% for FR (range, 67%-100%). A primary coder was assigned to each strategy, and those codes were used in the analyses.
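Agreement "within 1 Likert point" can be computed directly. A minimal sketch follows; the two rating vectors are invented for illustration.

```python
# Percent agreement within 1 Likert point between two independent coders,
# as described above. The ratings here are invented for illustration.

def within_one_agreement(coder_a: list, coder_b: list) -> float:
    """Share of paired ratings that differ by at most 1 Likert point."""
    assert len(coder_a) == len(coder_b)
    hits = sum(abs(a - b) <= 1 for a, b in zip(coder_a, coder_b))
    return hits / len(coder_a)

coder_a = [5, 4, 3, 4, 2, 5, 4, 1]
coder_b = [4, 4, 1, 5, 2, 3, 4, 2]
print(f"{within_one_agreement(coder_a, coder_b):.0%}")  # 75% (6 of 8 pairs)
```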
Data Reduction and Analyses

Data were examined across four periods. Time 1 included the first measurement for available classrooms in Year 1, which was conducted in October, November, or December of 2008. Filming occurred after the initial training workshops. Coaching was ongoing throughout the year. If classrooms were filmed in more than one of those months, both the average and the best performance were analyzed. All classroom staff participated in their initial training prior to the Time 1 measurement. Time 2 was defined as the performance from the last three measurements of the school year (February, March, or April 2009) for Year 1. The same procedures were used for Year 2 (Times 3 and 4). Time 3 included the first observation in Year 2 (October, November, or December 2009). Time 4 included the performance during the last 3 months of observations (February, March, or April 2010). Both average and best performance from each period were utilized to provide an estimate of the staff's capacity to implement the strategy in the classroom environment (best) and variability in competency of use (average).

Data from Year 1 and Year 2 were analyzed. One-way within-subject (repeated measures) analyses of variance (ANOVAs) were conducted for each intervention strategy to examine change in implementation fidelity scores over time. Post-hoc comparisons were made using paired-sample t tests between time periods when ANOVA results indicated statistically significant differences. In addition, we examined differences in fidelity of implementation across intervention strategies using a one-way ANOVA with paired-sample t tests to follow up on significant results. Type I error probability was maintained at .05 (two-tailed) for all analyses using a Bonferroni correction. Pearson correlations were conducted to examine the relationship between fidelity of implementation of each intervention strategy and teaching experience, experience working with children with autism spectrum disorder (ASD), level of education, and number of hours of coaching received.
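A rough sketch of this analysis pipeline with standard Python tools appears below. The file names, column names, and long-format layout are assumptions made for illustration; they are not the study's actual data structures.

```python
# Sketch of the analysis pipeline described above, using standard tools.
# File and column names are assumptions for illustration only.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# One row per classroom x time period, long format:
# columns "classroom", "time" (1-4), "fidelity" (% of components met).
df = pd.read_csv("fidelity_scores.csv")  # hypothetical file

# One-way repeated-measures ANOVA across the four time periods
# (AnovaRM assumes balanced data: every classroom at every time).
anova = AnovaRM(df, depvar="fidelity", subject="classroom", within=["time"]).fit()
print(anova)

# Post-hoc paired t tests with a Bonferroni correction
# (6 pairwise comparisons among 4 time points).
pairs = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
alpha = 0.05 / len(pairs)  # Bonferroni-adjusted two-tailed criterion
wide = df.pivot(index="classroom", columns="time", values="fidelity")
for t1, t2 in pairs:
    t, p = stats.ttest_rel(wide[t1], wide[t2])
    print(f"Time {t1} vs {t2}: t = {t:.2f}, p = {p:.4f}, "
          f"significant after Bonferroni: {p < alpha}")

# Pearson correlation between fidelity and, e.g., hours of coaching,
# assuming a per-classroom summary table with these columns.
summary = pd.read_csv("classroom_summary.csv")  # hypothetical file
r, p = stats.pearsonr(summary["mean_fidelity"], summary["coaching_hours"])
print(f"r = {r:.2f}, p = {p:.4f}")
```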
RESULTS

Use of the Strategies

Because teachers who did not allow filming in their classrooms cited staffing difficulties or lack of preparation as the reason, they were considered not to be implementing DTT, PRT, or FR in their classrooms on a regular basis. At Time 1, two teachers (4%) explicitly indicated that they did not use DTT at any time, and 13 teachers (23%) indicated that they did not use PRT at any time. The percentage of classrooms filmed using the strategy is displayed in Figure 1. In Year 1, classrooms were filmed most often conducting DTT at both Time 1 (70% of classrooms) and Time 2 (96%). Only 23% of classrooms were filmed conducting PRT at Time 1, and 68% were filmed at Time 2. FR was filmed in 67% of classrooms at Time 1 and 81% at Time 2. In Year 2, filming was much more consistent across strategies. DTT and PRT were both filmed in 92% of classrooms at Time 3 and 97% of classrooms at Time 4. For FR, 89% of classrooms were filmed at Time 3 and 97% at Time 4.

[FIGURE 1. The percentage of classrooms using the strategy during each time period.]

Overall Competence in the Instructional Strategies

Discrete trial training. The percentage of DTT components on which teachers met fidelity (i.e., a score of 4 or 5 during the observation) was used as the dependent variable for these analyses. Mean results are displayed in Table 2. No statistically significant changes were found in average or best DTT fidelity over time. In general, classrooms had a relatively high average and best DTT fidelity during all time periods. The range of scores for individual performance was variable at both time periods, as evidenced by the large standard deviations. The percentage of classrooms in which teachers met DTT fidelity (i.e., correct implementation of 80% of the DTT strategies during the observation) was examined. Fifty-six percent of classrooms met fidelity at Time 1 based on the average of all observations at Time 1, 47% at Time 2, 46% at Time 3, and 59% at Time 4. When considering only the best example, 65% of classrooms met fidelity at Time 1, and this increased to 81% by Time 4 (see Figure 2).
  • 21. had a relatively high average and best DTT fidelity during all time periods. The range of scores for individual performance was variable at both time periods, as evidenced by the large standard deviations. The percentage of classrooms in which teachers met DTT fidelity (i.e., correct implementation of 80% of the DTT strategies during the observation) was examined. Fifty-six percent of classrooms met fidelity at Time 1 based on the average of all observations at Time 1, 47% at Time 2, 46% at Time 3, and 59% at Time 4. When considering only the best example, 65% of classrooms met fidelity at Time 1, and this increased to 81% by Time 4 (see Figure 2). Pivotal response training. The dependent variable for these analyses was the percentage of PRT components on which teachers met fidelity (i.e., a … Medical Case Study with Minitab for solutions Background: You work for a government agency and your management asked you to take a look at data from prescription drugs administered at hospitals in your geography. She asked you to analyze the data with some common tools and build a DMAIC model for how you would work with the hospitals to improve results, since their performance is below the average. She would like a simple model for you to present to her that you will propose to representatives from the hospitals. The hospital representatives will have to be brought on board and understand the issues and their role in the study. Use the DMAIC model from the course material to create a model of the effort to be completed by the hospitals.
Define:
1. What would you say about the DMAIC model to the hospital staff on your team?
2. Write a problem statement for the work you are considering.
3. Develop a team charter so that each of the representatives understands what is expected of them and can brainstorm improvements upon it.
4. What are the key deliverables of the Define step that you expect of the team?

Measure:
1. What activities would you propose that the team work on?
2. What measures would you propose that the team pursue?
3. What data collection would you propose?
4. What are the key steps to get to items 1-3 above?

Analyze: Prepare data to show the team the extent of the problem:
1. A Pareto chart of the errors from the Error Type chart below.
   a. What would you suggest the team focus upon?
   b. What would you tell the team about the data they need to collect and what will be done with it?
2. Another example of measures is the administration of Drug A, which needs to be administered every 30 minutes. The requirement is that the drug be administered no more than 3 minutes early or 3 minutes late, that is, between 27 and 33 minutes. Make a histogram of the data below (Time between administration of drug chart). What is it saying about the process?
3. Do a normality test. Is the distribution normal?
Improve:
1. You don't have a process flow or any information on how hospitals administer drugs, or their improvement plans, if any. What would you tell the participants about what is expected in this phase of the program?

Control:
1. What are the key steps for control?
2. Develop a sample response plan that you would use to show the team what is expected to be done.
3. What are the key deliverables for this step?

Test data in Excel format:

Error Type (Type of High Alert Medication Error)   Count
Omission                                           8461
Improper dose/quantity                             7124
Unauthorized/wrong drug                            5463
Prescribing error                                  2923
Wrong Time                                         2300
Extra Dose                                         2256
Wrong patient                                      1786
Mislabeling                                         636
Wrong dosage form                                   586
Wrong administration                                335
Drug prepared incorrectly                           311
Wrong route                                         252
Other                                               113
Total                                             32546

Observation   Time between administration of Drug (minutes)
1             35.5
2             26.2
3             31.6
4             26.4
5             28.5
6             24.6
…             (observations 7-43 are not captured in this copy; a value of
              29.5 appears immediately before observation 44)
44            27.1
45            28.3
46            31.3
47            27.4
48            25.0
49            24.6
50            40.0
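The three Analyze items map directly onto standard tools. The sketch below uses Python rather than Minitab, with the error counts from the table above and only the timing observations recoverable from this copy (observations 7-43 are missing), so its output is illustrative rather than a complete analysis. Note that Minitab's normality test defaults to Anderson-Darling; Shapiro-Wilk is used here because SciPy reports a direct p value for it (scipy.stats.anderson is the closer analogue).

```python
# Sketch of the three Analyze items using the data above:
# (1) Pareto chart of error types, (2) histogram of the timing data
# against the 27-33 minute spec, (3) a normality test.
import matplotlib.pyplot as plt
import pandas as pd
from scipy import stats

errors = pd.Series({
    "Omission": 8461, "Improper dose/quantity": 7124,
    "Unauthorized/wrong drug": 5463, "Prescribing error": 2923,
    "Wrong Time": 2300, "Extra Dose": 2256, "Wrong patient": 1786,
    "Mislabeling": 636, "Wrong dosage form": 586,
    "Wrong administration": 335, "Drug prepared incorrectly": 311,
    "Wrong route": 252, "Other": 113,
}).sort_values(ascending=False)

# (1) Pareto chart: bars for counts, line for cumulative percentage.
cum_pct = errors.cumsum() / errors.sum() * 100
fig, ax1 = plt.subplots()
errors.plot.bar(ax=ax1)
ax1.set_ylabel("Count")
ax2 = ax1.twinx()
cum_pct.plot(ax=ax2, color="red", marker="o")
ax2.set_ylabel("Cumulative %")
plt.tight_layout()

# (2) Histogram of time between administrations, with spec limits.
# Only the observations recoverable from the table are included.
times = [35.5, 26.2, 31.6, 26.4, 28.5, 24.6,            # obs 1-6
         29.5, 27.1, 28.3, 31.3, 27.4, 25.0, 24.6, 40.0]  # tail of the table
fig, ax = plt.subplots()
ax.hist(times, bins=8)
for spec in (27, 33):  # +/- 3 minutes around the 30-minute target
    ax.axvline(spec, color="red", linestyle="--")
in_spec = sum(27 <= t <= 33 for t in times) / len(times)
print(f"Within spec: {in_spec:.0%}")

# (3) Normality test (Shapiro-Wilk; suited to small samples).
w, p = stats.shapiro(times)
print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.4f}")
plt.show()
```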
  • 28. 29.5 44 27.1 45 28.3 46 31.3 47 27.4 48 25.0 49 24.6 50 40.0 1 The Journal of Special Education 2016, Vol. 50(1) 27 –36 © Hammill Institute on Disabilities 2015 Reprints and permissions: sagepub.com/journalsPermissions.nav DOI: 10.1177/0022466915613592 journalofspecialeducation.sagepub.com Article In the field of special education, a commitment to the logic and ethics of using research to inform decisions about prac- tice has been reflected in the field’s efforts to identify and
  • 29. use evidence-based practices (EBPs) as a standard for the profession (Council for Exceptional Children, 2014; Odom et al., 2005). As in other fields, this focus has led inexorably back to what some commentators have termed the “wicked” problem of implementation (Cook & Odom, 2013). Fixen and his colleagues (following Rittel & Webber, 1973) described wicked problems as those that are “difficult to define and fight back when you try to solve them” (Fixen, Blaze, Metz, & Van Dyke, 2013, p. 218). Indeed, the obser- vation that “interests vested in the system-as-is suddenly appear and typically deter attempts to change the system” (Fixen et al., 2013, p. 218) has been made by ecologically oriented observers of human behavior since time of Marx (Bronfenbrenner, 1979; Lewin, 1951; Marx, 1888/1984). One implication of this view, of course, is that the problem of (non)implementation of EBP may be most usefully viewed not simply as a “deficit” in the knowledge, skills, or ideological commitments of practitioners but as a product of the set of social, organizational, and material conditions that operate in a given human service setting. In this article, we draw on interviews conducted with special education practitioners to investigate how these kinds of contextual factors (and others) may affect the ways in which practitio- ners interpret and respond to contemporary press for imple- mentation of EBP. We are by no means the first to recognize the importance of seeking practitioner perspectives in understanding the challenges of implementing EBP in special education. For example, Landrum, Cook, Tankersley, and Fitzgerald (2002) surveyed 127 teachers (60 special educators, 67 general edu- cators) to assess their views about the value of four sources of information about practice: university coursework,
  • 30. 613592 SEDXXX10.1177/0022466915613592The Journal of Special EducationHudson et al. research-article2015 1University of Washington, Seattle, USA 2Northern Illinois University, DeKalb, USA 3Central Michigan University, Mount Pleasant, USA 4American Institutes for Research, Washington, DC, USA Corresponding Author: Roxanne F. Hudson, Area of Special Education, University of Washington, P.O. Box 353600, Seattle, WA 99195, USA. E-mail: [email protected] A Socio-Cultural Analysis of Practitioner Perspectives on Implementation of Evidence-Based Practice in Special Education Roxanne F. Hudson, PhD1, Carol A. Davis, EdD1, Grace Blum, MEd1, Rosanne Greenway, MEd1, Jacob Hackett, MEd1, James Kidwell, MEd1, Lisa Liberty, PhD1,2, Megan McCollow, PhD1,3, Yelena Patish, MEd1, Jennifer Pierce, PhD1,4, Maggie Schulze, MEd1, Maya M. Smith, PhD1, and Charles A. Peck, PhD1 Abstract Despite the central role “evidence-based practice” (EBP) plays in special education agendas for both research and policy, it is widely recognized that achieving implementation of EBPs remains an elusive goal. In an effort to better understand this problem, we interviewed special education practitioners in four school districts, inquiring about the role evidence and EBP played in their work. Our data suggest that practitioners’
  • 31. responses to policies that press for increased use of EBP are mediated by a variety of factors, including their interpretations of the EBP construct itself, as well as the organizational conditions of their work, and their access to relevant knowledge and related tools to support implementation. We interpret these findings in terms of their implications for understanding the problem of implementation through a more contextual and ecological lens than has been reflected in much of the literature to date. Keywords evidence-based practices, implementation, special education practitioners mailto:[email protected] http://crossmark.crossref.org/dialog/?doi=10.1177%2F00224669 15613592&domain=pdf&date_stamp=2015-11-08 28 The Journal of Special Education 50(1) research journals, teaching colleagues, and in-service/pro- fessional development workshops. Their data indicated that research journals and university courses (presumably sources of relatively reliable information about EBP) were viewed as less useful, less trustworthy, and less accessible than either information from colleagues or information received via professional development. Similarly, Boardman, Argüelles, Vaughn, Hughes, and Klingner (2005) reported that teachers often expressed the belief that the extant research was not relevant to the populations they served in their classrooms, and reported relying on colleagues for rec- ommendations about practice. In a more recent study, Jones (2009) investigated the views of 10 novice special educators regarding EBP. Based
  • 32. on interview, classroom observation, and rating scale data, Jones suggested that the novice teachers she studied fell into three broad groups. “Definitive supporters” expressed clear and positive views about the importance of research in decisions about classroom practice. “Cautious consumers” felt research could be useful, but often did not reflect char- acteristics and needs of their individual students. A third group, “The Critics,” expressed skepticism about the value of research for decisions about classroom practice. Taken together, these studies (and others) provide a rather robust picture of the tensions between research and practice in special education. While significant variation exists among special education practitioners in their views about the value and relevance of research to their work in the classroom, many express more confidence in the knowl- edge and expertise of local colleagues than in information they might receive from university coursework and/or researchers. This result is consistent with research from other fields and suggests that much remains to be learned about the conditions under which practitioners utilize knowledge from research in decisions about practice (Aarons & Palinkas, 2007; Glasgow, Lichtenstein, & Marcus, 2003). In our review of the special education research on this topic, we noted that most researchers have framed their analysis of practitioner perspectives related to implementa- tion of EBP in essentially individualistic and personological terms—placing teachers (and, in some cases, administra- tors) in the center of their analysis of the implementation process. For example, as noted earlier, Jones (2009) parsed individual teachers into groups such as “the Critics” and “the Supporters.” Also focusing on individual practitioners, Landrum et al. (2002) argued,
  • 33. Only when we have confidence that teachers learn about empirically sound practice in both their initial preparation and ongoing professional development, and that their skills reflect this training, can we predict that students with disabilities will be afforded the most appropriate learning opportunities available. (p. 48) We do not entirely disagree with these conclusions, and others like them that underscore the importance of persono- logical variables (e.g., practitioner knowledge, prior train- ing, attitudes) affecting implementation of EBP. But we would also argue that in foregrounding characteristics of individual practitioners as a focus of analysis, these studies reflect a set of implicit assumptions about the nature of practice and how it is constructed that narrows our view of the problems of implementation, and the range of actions to be considered in engaging those problems. In the present study, we follow recent recommendations (Harn, Parisi, & Stoolmiller, 2013; Klingner, Boardman, & McMaster, 2013; Peck & McDonald, 2014) in undertaking a more holistic and contextual approach to understanding how practitioner perspectives on EBP are shaped by the conditions in which they work. Theoretical Framing In conceptualizing “a more contextual” approach to under- standing practitioner interpretation and implementation of EBP, we drew on some of the precepts of sociocultural the- ory as a general framework for investigating ways in which social and material conditions shape workplace learning and practice (Billett, 2003; Engeström, 2001; Scribner, 1997; Vygotsky, 1978). Our choice of a sociocultural per- spective was based on several of its key precepts that we believed would be useful in understanding practitioner per- spectives on implementation of EBP. First, sociocultural
  • 34. theory foregrounds analysis of relationships between indi- vidual and collective dimensions of social practice—in this case, the analysis of the transactions that take place between individual practitioners and the organizations in which they work (Engeström, 2001). Second, this view assumes that human thought processes (including, of course, one’s views about EBP) are shaped by the demands of the practical activities in which people are regularly engaged. A third assumption of this stream of sociocultural theory is that par- ticipation in social practice is affected by the affordances and constraints of the conceptual and material tools avail- able (e.g., the characteristics and representations of EBP available in local school districts and other professional resources; Falmagne, 1995; Leontev, 1975/1978; Scribner, 1997). Overall, the sociocultural perspective suggests the value of undertaking a more focused analysis of the social and organizational conditions in which decisions about practice are made than has been reflected in much of the extant research on the problem of implementation. We used the following research questions to guide our inquiry: Research Question 1: How do special education practi- tioners interpret the meaning of EBP in the context of decisions they make about curriculum and instruction? Hudson et al. 29 Research Question 2: What contextual factors are asso- ciated with practitioner interpretations of the role EBP can and should play in their decisions about instruction? Method We used a qualitative methodology (Merriam, 2009) to
  • 35. investigate the perspectives—that is, the values, beliefs, and attitudes—held by special education practitioners with regard to their views about EBP, and the role research played in their decisions about curriculum and instruction. We elected this methodological approach because of the hypothesis-generating, rather than hypothesis-testing, pur- poses of the study (Glaser & Strauss, 1967). Participants A total of 27 special education practitioners participated in our study. We contacted directors of special education via email and invited participation from four school districts in the Seattle/Puget Sound area. Demographics for these dis- tricts are presented in Table 1. Teacher participants were nominated by special educa- tion directors, who were asked to identify individuals they believed would be interested in being interviewed for the study. In each district, we requested nominations of teachers working in three types of programs or settings: resource rooms serving students with a wide range of disability labels placed primarily in general education classrooms, self-contained classes serving students with emotional/ behavioral disabilities (EBD), and self-contained class- rooms serving students with low-incidence developmental disabilities. Table 2 reports the number, working context, and experience level of study participants in each of the dis- tricts in which we collected data. Data Collection and Analysis Interviews. The primary data source for our study consisted of face-to-face interviews we conducted individually with the 27 special educators who agreed to participate in the study. We used semistructured interview protocols for each
  • 36. of the four types of practitioners we interviewed: special education directors, resource room teachers, EBD teachers, and teachers of students with low-incidence developmental disabilities. While the protocols for administrators and teachers varied in some ways, both were structured to pro- ceed from general, context-descriptive questions such as “Tell me about the work you do,” to more focused questions about daily practice (“Tell me about a typical day in your classroom”). We asked each informant to define the term EBP and tell us what it meant to them in terms of their deci- sions about curriculum and instruction. Interview protocols also included a series of questions about district policies related to EBP in both general and special education, and how these affected the decisions our informants made in the classroom. Interviews were generally between 45 min to an hour in length. Interviews were audio-recorded and subse- quently transcribed verbatim for analysis. Transcripts were entered into a web-based platform for qualitative and mixed-method data analysis (http://www.dedoose.com). Data analysis. We used the standard procedures for induc- tive data analysis described by Charmaz (2002), Strauss and Corbin (1997), and others. Thus, we began our analysis by having each of the 11 members of our research team read through the interview transcripts, identifying text segments of potential relevance to our research questions. Each of these segments was tagged using low inference descriptors, such as “classroom assessment” or “progress monitoring.” Members of the research team then met to discuss examples of the text segments they had tagged, identifying and defin- ing codes emerging from individual analysis to be formal- ized and used collectively. The remainder of the interviews were then coded, followed by an additional round of team meetings in which examples of each code were discussed, with some codes combined, others modified or deleted
  • 37. based on their perceived value relative to our research ques- tions. A set of interpretive categories were developed through this process which were used to aggregate coded data segments and which became the basis for further anal- ysis. These categories were then used as a basis for develop- ing a series of data displays (Miles & Huberman, 1994) organized by district and by each type of participant (i.e., resource room teachers, special education directors, etc.). Team members met to discuss the implications of these analyses and to develop a set of analytic memos which inte- grated the categorical data into larger and more interpretive case summaries. These summaries were used to develop the set of cross-case findings described below. Results Our findings suggest that personal characteristics (particu- larly values and beliefs about EBP), the features of organi- zations (particularly practitioner positionality within these organizations), and access to relevant tools all affected the Table 1. School District Characteristics. District Enrollment Special education enrollment (%) Students eligible for free or reduced-price meals (%) A 18,123 9.70 22.10 B 20,659 13.60 35.10 C 17,973 13.60 66.90 D 8,920 12.40 26.00
  • 38. http://www.dedoose.com 30 The Journal of Special Education 50(1) ways practitioners interpreted the relevance of the EBP to decisions they made about practice. We interpreted these as dimensions of practical activity that were inseparable and mutually constitutive (Billett, 2006). As depicted in Figure 1, our data suggest these factors operate in a highly interdependent manner. We use this conceptual model to understand both the points of the triangle and the interac- tions that take place between points as represented by the lines of the triangle. In the following sections, we will present findings both related to the points of the triangle and the intersections of elements. First, we use excerpts from our interviews to illustrate how the practitioners we interviewed interpreted the idea of EBP, the organizational contexts they worked in, and the tools and resources available to them. Second, we present findings that illuminate the connections and interac- tions between them. People: Practitioner Definitions of EBP We asked each of our informants how they defined EBP in the context of their work in special education. The predomi- nance of responses to this question reflected the notion that EBP meant that “someone” had researched a specific pro- gram or practice and found it to be effective: There’s obviously been research and studies so what I picture in my mind is that they have a curriculum and they conduct a study where they have kids who participate in the study and
  • 39. then they probably have some pre- and posttest to see if they’ve made gains. I’d say evidence-based would be like, that it’s been tried in lots of different settings, across you know lots of different populations and there’s been demonstrated success using that curriculum or whatever the thing is you’re talking about, you know, the social skills sheet or something. So it’s used with lots of people and over different settings. We noticed that our participants typically defined EBP in ways that emphasized its external origins, and its ostensive function as a “prescription” for their practice, rather than as a resource for their own decision making (Cook & Odom, 2013). In some cases, this interpretation was also congruent with the stance taken by district administrators: We have adults that want to continue to do what they’ve done in the past. And it is not research-based nor if you look from a data perspective has it been particularly effective and that’s not going to happen and we say, “This is the research, this is what you’re going to do.” (Special Education Director, District A) This strong ideological commitment to use of EBP in the classroom was shared by some teachers: I believe that by using research-based instruction, and teaching with fidelity, then you’re more likely to have an outcome that is specific to the research, as long as we use the curriculum as it’s designed. Um, I think it’s vital, I think it’s vital that we are not pulling things out of a hat, that we are using. (Resource Room Teacher, District A) More often, however, we found that practitioner views about research in general, and the value of EBP in decision
making about classroom practice in particular, were more ambivalent. Perhaps the most widely shared concern about EBP expressed by our informants had to do with the tensions they perceived between the "general case" and the specifics of local context, including the special needs of the children served in the schools and classrooms in which they worked (Cook, Tankersley, Cook, & Landrum, 2008). While the value of research and the relevance of EBP were often acknowledged in abstract terms, both teachers and administrators were quick to identify what they perceived to be limitations in the relevance of research for local decision making and special populations:

. . . well what makes me question it—I'm always curious about what the norm population is because it is talking about typically developing kids and research-based practices that are used for those types of kids. It's very different for my kids. So when I'm looking at an evidenced-based practice I want to be clear on what evidence [is about] Gen Ed versus the Special Ed population. (Self-Contained Classroom Teacher, District B)

Table 2. Participant Characteristics.

District   Special education director   EBD teacher   Resource room teacher   Self-contained teacher   Total participants   Median years in position
A          1                            1             2                       2                        6                    7
B          1                            1             2                       3                        7                    10
C          1                            2             2                       2                        7                    6
D          1                            2             2                       2                        7                    6

Note. EBD = emotional/behavioral disabilities.
For many teachers, ambivalence about EBP included particular tension about who makes decisions about the relevance of evidence to their classroom practice. These teachers often made reference to local perspectives as "forgotten" or "overlooked" in decisions about practice:

. . . evidence-based is very important because you do need to look at what you're doing but there is just the day-to-day knowledge that is overlooked in the evidence-based piece. (Self-Contained Classroom Teacher, District B)

Most of the teachers and many administrators we interviewed appeared to locate the authority for making evidence-based decisions about curriculum and instruction with the district central office or with "general education." For example, one director of special education reported, "for our resource room students . . . we always start with the
Gen Ed and then if we have curriculum, if a part of that curriculum has a supported intervention component to it we start with that." Many teachers similarly viewed the locus of decisions about curriculum and instruction as external to their classrooms. As a Resource Room Teacher in District D put it,

They tell us what to teach and when to teach it. I mean, we have a calendar and a pacing guide. We can't, we really don't make the decisions too much. I would hope . . . that it's supported and making sure that students learn but I don't really know.

In some cases, these teachers expressed confidence that the judgments of district curriculum decision makers were grounded in appropriate evidence:

I kind of just trust that the district is providing me with evidence-based stuff. So I'm trusting that the curriculum that they've chosen and that my colleagues have done test work on is really what they say it is. (EBD Teacher, District D)

However, in other cases, teachers expressed more skeptical views about the trustworthiness of the data district officials used to make decisions about curriculum:

. . . over the years we've had so many evidence-based, research based and so many changes, that . . . if you just want my honest [opinion] . . . I know that there's data behind it, but if it's evidence based or research based, why are we always changing? (EBD Teacher, District B)

Figure 1. Relationships between people, organizations, and tools. Adapted from McDiarmid & Peck (2012).
To summarize, similar to earlier studies (Boardman et al., 2005; Jones, 2009; Landrum et al., 2002), we found that the personal characteristics of practitioners—that is, their experiences, values, beliefs, and attitudes—functioned as a powerful filter through which they interpreted the meaning of EBP and evaluated the relevance of this construct for their decision making about curriculum and instruction. Practitioner definitions of EBP often reflected the assumption that the locus of authority regarding EBP lies outside the classroom and that the ostensive function of EBP was to provide prescriptions for classroom practice. In the following sections, we report findings related to our second research question, describing ways in which contextual features such as organization and access to tools and resources may influence the way practitioners interpret the value and relevance of EBP in their daily work.

Organizational Contexts of EBP

Our data suggest that our interviewees' views about the value and relevance of evidence in decision making about practice were often part of a larger process of coping with the organizational conditions of their work. Several specific issues were salient in the interviews we conducted. One of these, of course, had to do with district policies about evidence-based decision making (Honig, 2006). In some districts, special education administrators described strong district commitments related to the use of research evidence in decision making:

In this district, it's [EBP] becoming really big. You don't ever hear them talk about any initiative without looking at the
research and forming some sort of committee to look at what practices are out there and what does the research tell us about it. And then identifying what are the things we're after and how well does this research say they support those specific things we want to see happen. I would say that work has started, and that is the lens that comes from Day One of anything we do. (Special Education Administrator, District C)

However, strong district commitments to evidence-based curriculum decisions in general education were sometimes viewed as raising dilemmas for special education teachers. A teacher of students with emotional and behavioral problems in District B described the problem this way:

I try to make our classroom setting as much like a general Ed classroom as I can, because the goal is to give them strategies to work with behaviors so that they can function in Gen Ed classrooms. Which is a challenge, because they were in Gen Ed classrooms before they came here, so something wasn't working.

Some special education teachers described being caught between curriculum decisions made in general education and practices they saw as more beneficial for their students with disabilities:

. . . if (general education teachers) change their curriculum then I need to follow it so my kids can be a part of it. Especially with my kids being more part of the classroom. So you know 4 years ago I was not doing the Math Expressions, and now I am doing the Math Expressions and it's hard because I'm trying to follow the Gen Ed curriculum and there's times where the one lesson ends up being a 2- or 3-day lesson with some added worksheets because they just aren't getting the skill and, being a spiraling curriculum, it goes pretty fast sometimes too. (Self-Contained
Teacher, District B)

While the tensions between curriculum models in general education and special education (with each claiming its own evidentiary warrants) were problematic for many of the resource room teachers we interviewed, these dilemmas were less salient to teachers of children in self-contained classrooms, including those serving students with EBD and those serving students with low-incidence disabilities. Instead, the challenges that these teachers described had to do with isolation and disconnection from colleagues serving students like theirs. A Self-Contained Classroom Teacher in District B said, "Well, this job is very isolating. I'm the only one that does it in this building . . . so uh I'm kind of alone in the decisions I make." Another Self-Contained Classroom Teacher, in District D, said,

Sometimes it makes me feel like it's less than professional . . . I don't know, I just sometimes wish that, I feel like there's not always a lot of oversight as far as what am I doing. Is there a reason behind what I'm doing? And did it work? I wish there was more.

In cases where teachers felt isolated and disconnected from district colleagues, they often reported relying on other self-contained classroom teachers for support and consultation, rather than resources in their district or from the research literature:

When you're in an EBD class you can get pretty isolated . . . but the other beauty of being in an EBD room is you have other adults in the room that you can talk to, or they can have other ideas, or you can call other teachers. (EBD Teacher, District B)

Resources and Tools Related to EBP
District commitments to use of EBPs in the classroom were in many cases accompanied by allocation of both material and conceptual resources. The resources and supports most often cited by both administrators and teachers in this connection were focused on curriculum materials:

Susan, who is our Special Ed curriculum developer, and Louisa, who's our curriculum facilitator . . . they recognize that we've shifted practice to a really research-based practice . . . We never really did this [before] we just bought books and stuff. And I said, "Well, I don't operate that way. We're going to shift practice and I'm going to help you" . . . and she has actually been very, very successful at reaching out and capturing the attention of folks, authors that publish . . . and some other materials and some other research and then digging deep. (Special Education Director, District A)

I think there's something really powerful about having that scope and sequence and that repetition that gradually builds on itself. Yeah so instead of me trying to create it as I go, having that research-based program, and of course I'm going to see if it's not working, then I'm flexible to change it, but I'm going to have a good base to at least start with. (Resource Room Teacher, District B)

While district curriculum resources (which were often assumed to be evidence-based) were important tools for many resource room teachers …
Exceptional Children, Vol. 79, No. 2, pp. 135-144. ©2013 Council for Exceptional Children.

Evidence-Based Practices and Implementation Science in Special Education

BRYAN G. COOK
University of Hawaii

SAMUEL L. ODOM
University of North Carolina at Chapel Hill

ABSTRACT: Establishing a process for identifying evidence-based practices (EBPs) in special education has been a significant advance for the field because it has the potential for generating more effective educational programs and producing more positive outcomes for students with disabilities. However, the potential benefit of EBPs is bounded by the quality, reach, and maintenance of implementation. The cross-disciplinary field of implementation science has great relevance for translating the promise of EBPs into positive outcomes for children and youth with disabilities. This article examines the history, extent, and limitations of EBPs and describes the emergence and
current state of implementation science as applied in special education. Subsequent articles in this special issue of Exceptional Children address a range of issues related to implementation science in special education: the research-to-practice gap, dissemination and diffusion, adherence and sustainability, scaling up, a model for state-level implementation, and fostering implementation through professional development.

Educators generally agree that broad implementation of practices shown by scientific research to reliably cause increased student performance (i.e., evidence-based practices; EBPs) will result in increased student outcomes (Cook, Smith, & Tankersley, 2012; Slavin, 2008b). Despite special educators' general affinity for the concept of EBPs, as Odom and colleagues (2005) suggested, the devil of EBPs is in the details. Odom et al. were referring to the difficulties involved in identifying EBPs (e.g., How many studies must support an EBP? What research designs should be considered? What quality indicators are necessary for a trustworthy study? What effects must a practice have to be considered an EBP?). Although these issues continue to be debated (see Cook, Tankersley, & Landrum, 2009a; Slavin, 2008a), there has
been considerable progress in generating and applying guidelines for identifying EBPs in general (e.g., What Works Clearinghouse, WWC, 2011) and special (e.g., National Professional Development Center on Autism Spectrum Disorders, n.d.; National Secondary Transition Technical Assistance Center, n.d.) education. However, the EBP movement may have leapt from the frying pan into the fire: The progress made in identifying EBPs has highlighted the devilish details involved with implementation of EBPs, which now need to be addressed.

The gap—described by some as a chasm (e.g., Donovan & Cross, 2002)—between research and practice is a recurring theme in special education. Indeed, we suspect that the gap has been present in special education as long as research and practice have co-existed. Attempts to bridge the research-to-practice gap by identifying and implementing effective practices are a rich part of special education's history (Mostert & Crockett, 1999-2000). Despite considerable focus on the research-to-practice gap (e.g., Carnine, 1997; Greenwood & Abbott, 2001) and on identifying EBPs as means to bridge it (e.g., Cook et al., 2009b; Odom et al., 2005), there is little evidence suggesting that the gap has been meaningfully reduced. For example, a U.S. Department of Education report (Crosse et al., 2011) noted that only 7.8% of prevention programs related to substance
abuse and school crime used in over 5,300 schools met their standards for an EBP. And, in special education, practitioners have reported using instructional practices shown by research to be ineffective (e.g., learning styles) with similar or greater frequency than some research-based practices (e.g., mnemonics; Burns & Ysseldyke, 2009).

This special issue of Exceptional Children focuses on addressing some of the devilish details related to bridging the research-to-practice gap by achieving broad, sustained, and high-quality implementation of EBPs. There is an emerging field of implementation science (Eccles & Mittman, 2006) that can be applied in special education to enhance the utilization of EBPs. To contextualize consideration of implementation science related to EBPs in special education, it's important to define what an EBP is, as well as to be aware of critical caveats and controversies related to EBPs in the field of special education.

EVIDENCE-BASED PRACTICES

WHAT ARE EBPS?

Emerging from the field of medicine in the early 1990s (Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996), EBPs are practices and programs shown by high-quality research to have meaningful effects on student outcomes. The logic behind EBPs is simple: Identifying and using the most generally effective practices will increase consumer (e.g., student) outcomes. This logic rests on the assumptions that the most
effective practices were not previously identified, implemented, or both; and that certain types of research (i.e., high-quality studies using designs from which causality can be inferred) are the best tools to determine effectiveness. Although not without detractors (e.g., Gallagher, 2004; Hammersley, 2005), this logic has been generally accepted (Slavin, 2008b) and even written into law (i.e., the No Child Left Behind Act of 2001's emphasis on "scientifically based research").

Unlike previous approaches for identifying effective practices in education (e.g., best practices, research-based practices), supporting research for EBPs must meet prescribed, rigorous standards (Cook & Cook, 2011). Although specific standards for EBPs vary between and within fields, research support for EBPs generally must meet standards along several dimensions, including research design, quality, and quantity. Typical guidelines require that for a practice to be considered evidence-based it must be supported by multiple, high-quality, experimental or quasi-experimental (often including single-case research) studies demonstrating that the practice has a meaningful impact on consumer (e.g., student) outcomes.

Discussion and promotion of EBPs have become seemingly ubiquitous in education in recent years (Detrich, 2008)—EBPs are promoted in national, state, and local educational policies; in professional conferences, university courses, and professional development; in professional standards; and in informal discussions among educators. The federally funded WWC
(http://ies.ed.gov/ncee/wwc/), established in 2002, is perhaps the most comprehensive and well-known source of EBPs for education. Until recently, however, the WWC did not identify EBPs for students with disabilities, and now does so only for certain disability groups. (The WWC has begun reviewing the evidence base of practices for students with learning disabilities, in early childhood special education, and with emotional and behavioral disorders.)

To address the need for standards for EBPs designed for and by special educators, Gersten et al. (2005) and Horner et al. (2005) generated standards for identifying EBPs in special education using group experimental and single-subject research, respectively, in a special issue of Exceptional Children (Odom, 2005). Since that special issue, various organizations and teams of special education scholars have used the standards proposed by Gersten et al. and Horner et al. (2005; e.g., Cook et al., 2009a), used standards adapted from Gersten et al. and Horner et al. (Odom, Collet-Klingenberg, Rogers, & Hatton, 2010), and developed independent sets of standards (e.g., National Autism Center, 2009) to begin to identify a corpus of EBPs in special education. These and other ongoing efforts to establish EBPs in special education represent an important advance for the field. However, EBPs are not a panacea, and considerable and fundamental work remains to be done if they are to meaningfully improve
outcomes for children and youth with disabilities.

CAVEATS AND CONTROVERSIES

The introduction of EBPs in any field seems to be inexorably followed by a period of questioning and resistance, which certainly has occurred in education (e.g., Hammersley, 2007; Thomas & Pring, 2004). Although a complete discussion of caveats and controversies regarding EBPs in (special) education is beyond the scope of this article (see Cook et al., 2012, for an extended discussion), we focus our attention here on a few prominent issues of which special educators should be aware: EBPs are not guaranteed to work for everyone, identification of EBPs is incomplete and variable, and EBPs will not be implemented automatically or easily in the "real world" of schools and classrooms.

EBPs Are Not Guaranteed to Work for Everyone. No practice will work for every single student; this is a reality of education (indeed, of all social sciences) of which special educators are keenly aware. As such, when educational researchers speak of causality, they do so in a probabilistic rather than absolute sense. That is, saying that an instructional practice causes improved educational outcomes means that the practice reliably results in improved outcomes for the vast majority, but not all, students who receive the intervention. For example, Torgesen (2000) estimated that the most effective early reading interventions do not positively impact between 2% and 6% of children. Researchers typically refer to
students for whom effective practices do not cause meaningfully improved outcomes as treatment resisters or nonresponders. Although EBPs have relatively low rates of nonresponders, it is important to recognize that even when implemented with fidelity and over time, EBPs will not result in optimal outcomes for all students. Thus, when selecting practices to use in special education programs, EBPs are a good place to start; but the application of an EBP, like any other instructional practice, represents an experiment of sorts in which special educators must validate its effectiveness for each individual child.

Incomplete and Variable Identification of EBPs. Although more and more EBPs are being identified in both general and special education, because of the considerable time and expertise it takes to complete an evidence-based review (i.e., apply standards for EBPs to the body of research literature examining the effectiveness of a practice), many practices have not yet been reviewed. And because of the relative scarcity of high-quality, experimental research in the educational literature (Berliner, 2002; Seethaler & Fuchs, 2005), many evidence-based reviews result in the conclusion that there is simply not enough high-quality research utilizing appropriate designs to meaningfully determine whether a practice is evidence-based. In other words, just because a practice is not considered an EBP does not necessarily mean that it is ineffective. It is then important to distinguish between practices that are not
considered evidence-based because (a) they have been shown by multiple, high-quality research studies from which causality can be inferred to be ineffective and (b) an evidence-based review has not been conducted or there is insufficient research to conclusively determine whether the practice is effective (Cook & Smith, 2012). The former practices should rarely if ever be used, whereas the latter might be implemented when relevant EBPs have not been identified or a student has been shown to be a nonresponder to identified EBPs.

Special educators also should recognize that there are many different approaches for identifying and categorizing EBPs. For example, Horner et al. (2005) proposed dichotomously categorizing practices (i.e., evidence-based or not evidence-based), Gersten et al. (2005) proposed a three-tiered approach (i.e., evidence-based, promising, and not evidence-based), and the WWC (2011) uses six classifications (i.e., practices with positive, potentially positive, mixed, indeterminate, potentially negative, and negative effects) to categorize the evidence base of practices. Moreover, approaches for identifying EBPs in education vary on specific standards for research design, quality of research, quantity of research, and effect size (see Cook et al., 2012, for an extended discussion). Accordingly, the evidence-based status of some practices will likely vary across EBP sources (Cook & Cook, 2011). It is important, then, to consider EBPs within the context of the specific standards used to
identify them.

Implementation. The research-to-practice gap underlies what is probably the most vexing caveat related to EBPs: the difficulty in translating research findings to the everyday practices of teachers in typical classrooms. As EBPs in education began to be identified, relatively little attention was given to how to implement them, perhaps under the assumption that school personnel would eagerly and readily apply identified EBPs. However, as Fixsen, Blase, Horner, and Sugai (2009) noted, "choosing an evidence-based practice is one thing, implementation of that practice is another thing altogether" (p. 5). The problem of implementation is not unique to EBPs and likely underlies the generally disappointing outcomes associated with most school reform efforts (e.g., Sarason, 1993). Implementing and sustaining new practices involves a host of complex and interrelated problems, including issues related to the practice being promoted (e.g., relevance and fit to target environment, efficiency and practicality), users (e.g., available time, mistrust of research, knowledge of EBPs, skills), and the institutional context (e.g., available resources, organizational structures and culture, staffing, coaching, training, administrative support; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Nelson, Leffler, & Hansen, 2009; Tseng, 2012).

Implementation issues have been referred to as "wicked" problems (e.g., Fixsen, Blase, Duda, Naoom, & Van Dyke, 2009; Signal et al., 2012) because, among other characteristics, they are moving targets that fight back (Rittel & Webber,
1973). For example, Fixsen, Blase, Metz, and Van Dyke (this issue) noted that organizational systems work to sustain the status quo by "overwhelm[ing] virtually any attempt to use new evidence-based programs" (i.e., fight back). In contrast, tame issues may be complex, but they tend not to change or actively resist being solved. As difficult as it may be to address the tame issue of how to identify EBPs, it is a fixed, circumscribed issue that once solved, stays solved. It is hardly surprising, then, that typical, passive approaches for promoting the implementation of EBPs (e.g., "train and hope") that do not provide systematic and ongoing supports almost invariably fail to address the wicked problems of implementation and therefore seldom result in broad, sustained change (Fixsen et al., 2005).

Implementation is the critical link between research and practice. Fixsen et al. (this issue) proposed a simple formula to represent the critical interaction of research efficacy and practice (implementation) in generating outcomes:

effective interventions × effective implementation = improved outcomes

The implication of this formula is that in the absence of implementation, even the most effective intervention will not yield desired outcomes. Glasgow, Vogt, and Boles (1999) conceptualized the slightly more elaborate RE-AIM framework to represent the importance of multiple dimensions of implementation in determining a practice's real-world impact. The RE-AIM model considers four aspects of implementation in addition to a practice's efficacy in determining impact—R × E × A × I × M = impact, where:

• Reach: the proportion of the target population reached by a practice.
• Efficacy: the success rate of a practice when implemented appropriately.
• Adoption: the proportion of targeted settings that adopt the practice.
  • 58. practice's eFFicacy in determining impact—R X E X A X I X M = impact, where: • Reach: the proportion oF the target popula- tion reached by a practice. • EFficacy: the success rate oF a practice when implemented appropriately. • Adoption: the proportion oF targeted settings that adopt the practice. 1 3 8 Winter 2013 • Implementation: the proportion of interven- tionists who implement the practice with fidelity in real world settings. • Maintenance: proportion of organizations (e.g., schools) and interventionists (e.g., teachers) who maintain implementation of the practice over time. Imagine, for example, that a school district adopts an EBP for its students with learning dis- abilities in elementary schools. District personnel are understandably excited to begin the new year by rolling out a practice that has been shown by multiple, high-quality studies to meaningfully im- prove outcomes for, say, 95% of elementary chil- dren with learning disabilities. However, only
80% of elementary schools agree to participate in the project (reach). Further, given problems related to training, planning and instructional time, and reluctance to adopt new practices, only 70% of teachers within targeted schools end up using the practice at all (adoption). Due to sometimes ineffectual training and lack of ongoing support, perhaps only 60% of teachers who adopt the practice implement it with fidelity; and only 50% of those maintain their use of the practice over the entire school year. In this scenario, actual impact is calculated as

.95 (efficacy) × .80 (reach) × .70 (adoption) × .60 (implementation) × .50 (maintenance) = .16

In other words, due to problems at various levels of implementation, the EBP actually had the desired impact on slightly less than 16% of elementary students with learning disabilities—a far cry from the rosy 95% efficacy that district administrators found so attractive.
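The RE-AIM arithmetic is simple enough to make concrete in a few lines of code. The sketch below is ours, not the article's; the function name re_aim_impact and its signature are illustrative assumptions. It multiplies the five RE-AIM proportions and reproduces the hypothetical district calculation above.

# A minimal sketch (not from the article) of the RE-AIM impact calculation.
# The function name and signature are illustrative only.
def re_aim_impact(reach: float, efficacy: float, adoption: float,
                  implementation: float, maintenance: float) -> float:
    """Estimate real-world impact as the product of five proportions in [0, 1],
    following Glasgow, Vogt, and Boles (1999)."""
    return reach * efficacy * adoption * implementation * maintenance

# The hypothetical district example from the text:
impact = re_aim_impact(reach=0.80, efficacy=0.95, adoption=0.70,
                       implementation=0.60, maintenance=0.50)
print(f"Estimated impact: {impact:.2%}")  # prints "Estimated impact: 15.96%"

Because the proportions multiply, a shortfall at any single stage caps overall impact no matter how strong the other stages are, which is the ceiling effect described in the next paragraph.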
After considering these numbers, it may seem that special educators would be better served by pursuing practices that appeal to teachers and are easily implemented, but which are less effective (i.e., typical practice), than by chasing the large effects of EBPs that may be difficult to realize. However, special educators sell themselves short—and, more important, do a disservice to the students they serve—by settling for practices with limited effects. Efficacy and implementation both set a ceiling for real-world impact. Just as a highly efficacious intervention that is not implemented will have no real effect, an ineffective practice that is broadly implemented remains an ineffective practice that will, at best, have limited impact. When considering the importance of implementation, educators should not disregard the importance of efficacy, but rather realize the symbiotic relationship of efficacy and implementation in determining impact.

The recent emphasis on EBPs in special education is laudable, encouraging, and necessary, but identification of EBPs is insufficient without supporting their use in common practice (Odom, 2009). The challenge is how to achieve high levels of implementation of the most effective practices. Unfortunately, because sound research investigating implementation has been sparse, "we are faced with the paradox of non-evidence-based implementation of evidence-based programs" (Drake, Gorman, & Torrey; as cited in Fixsen et al., 2005, p. 35). Special educators do not yet have complete, empirically substantiated guidelines for supporting implementation of EBPs. The emerging field of implementation science has begun to address this issue by conducting research and generating theories regarding the implementation of EBPs.

IMPLEMENTATION SCIENCE

In the inaugural issue of Implementation Science, Eccles and Mittman (2006) defined implementation science as "the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice" (p. 1). A number of related terms
have been used to refer to this area of study (e.g., knowledge utilization, knowledge transfer, knowledge translation, implementation research, translational research, diffusion, uptake; Straus, Tetroe, & Graham, 2009). We use implementation science because, in our experience, it is the term most frequently used by contemporary education scholars. This is not meant to suggest that a definitive corpus of knowledge has been established in the area of implementation (i.e., a science of implementation); rather, it denotes a field of scientific inquiry in which issues related to implementation are investigated.

Implementation science, which draws on a rich history of foundational research investigating
  • 62. tation (or lack thereof), funding appears to be increasing. For example, the W. T. Grant founda- tion recently funded 15 research projects in gen- eral education designed to examine how research is used to inform policy and practice in local schools (Tseng, 2012). Essentially, the goal of inquiry in implemen- tation science is to research and understand how innovations are adopted and maintained, so that implementation moves from "letting it happen" to "making it happen" (Greenhalgh, Robert, Mac- Farlane, Bate, Si Kyriakidou, 2004). As has been the case with the vast majority of previous educa- tion reforms, letting EBPs happen (i.e., assuming that they will be implemented by virtue of their identification) has proven largely unsuccessful (Tseng, 2012). To bring about the broad and sus- tained implementation of EBPs, special educators need to (a) look to the lessons learned thus far from implementation science and (b) identify what is not known about making EBP implemen- tation happen and condtict research to systemati- cally fill those gaps in our knowledge base. Based on their comprehensive review of the literature in implementation science, Fixsen et al. (2005) concluded that the relatively sparse experi- mental research in implementation science indi- cates that providing guidelines, policies, information, and training are not enough to "make it happen." In contrast, long-term, multi- level strategies tend to result in successful imple- mentation. The authors gleaned seven core implementation components (or implementation drivers) that, when in place and functioning at a
high level, can routinely change and improve practitioner behavior related to the implementation of EBPs: staff selection, preservice and inservice training, ongoing consultation and coaching, staff evaluation, program evaluation, facilitative administrative support, and systems interventions (i.e., "strategies to work with external systems to ensure the availability of the financial, organizational, and human resources required to support the work of the practitioners," p. 29). They suggested that purveyors—change agents who are experts at identifying and addressing obstacles to implementation—are critical for utilizing core implementation components to achieve broad and sustained implementation of EBPs.

Schoolwide positive behavior support (SWPBS) is a good example of a program used in special education that incorporates lessons from implementation science into its design (see McIntosh, Filter, Bennett, Ryan, & Sugai, 2010). Indeed, SWPBS implementation is guided by a model incorporating five principles drawn from implementation science: contextual fit, priority, effectiveness, efficiency, and using data for continuous regeneration (McIntosh, Horner, & Sugai, 2009). For example, SWPBS practices are modified to maximize fit with the environment in which they will be implemented, although modifications are made with a strong understanding of SWPBS such that they do not violate the integrity of core components of the intervention (i.e., fidelity with flexibility; see Harn, Parisi, & Stoolmiller, this issue). Moreover, SWPBS frequently utilizes structures such as state leadership
teams that lead and coordinate training, coaching, and evaluation to systematically support and scale up SWPBS (see Fixsen et al., this issue; Sugai & Horner, 2006). Such attention to the principles of implementation science has, no doubt, contributed to SWPBS's extensive, sustained, and effective application (e.g., Horner, Sugai, & Anderson, 2010).

Fixsen et al. (2005) defined implementation broadly: "activities designed to put into practice an activity or program" (p. 5). Thus, virtually any activity involved in the implementation process might be considered under the purview of implementation science. The topics addressed in this special issue of Exceptional Children (i.e., a theoretical framework for linking research to practice, dissemination, balancing fidelity with flexibility
  • 65. among the most critical areas for improving the implementation of EBPs in special education. A R T I C L E S I N T H I S S P E C I A L I S S U E The purpose of this special issue, and each of the articles in it, is two-fold: (a) review emerging evi- dence in the area of implementation science that special education scholars, policy makers, admin- istrators, and other stakeholders can apply to advance the implementation of EBPs and (b) pro- vide a framework for identifying unanswered questions for future research to explore related to implementation of EBPs in special education. In the first article. Smith, Schmidt, Edelen-Smith, and Cook propose a conceptual framework for understanding and bridging the research-to-prac- tice gap. Drawing from Stoke's (1997) Pasteur's quadrant model, they posit that rather than di- chotomizing research as either rigorous or rele- vant, research must be both rigorous and relevant to be translated into practice and positively im- pact student outcomes. Smith et al. propose that educational design research conducted within communities of practices is a promising approach for conducting relevant and rigorous inquiry that will facilitate implementation of EBPs. One of the critical stages of translating re- search to practice is disseminating and diffusing EBPs. Unfortunately, EBPs are primarily dissemi- nated in traditional and passive ways (e.g., journal
articles, research briefs) that hold little sway with the practitioners who actually implement the practices. In the second article, Cook, Cook, and Landrum explore a variety of approaches for actively and effectively disseminating research-validated practices. They utilize Heath and Heath's (2008) SUCCESs model, which posits that dissemination efforts that "stick" are simple, unexpected, concrete, credible, emotional, and conveyed as stories. They provide theoretically and empirically validated dissemination approaches that might be utilized and researched further by special educators in each of these areas.

If practitioners do not implement EBPs with fidelity or as designed, the practices may not have the same positive effect demonstrated in research studies. However, in the third article, Harn, Parisi, and Stoolmiller note that demanding rigid adherence to predetermined procedures will decrease the likelihood that practitioners will adopt and sustain a practice. Moreover, practitioners being more concerned with adherence than meeting the needs of their students may actually decrease EBP effectiveness. Harn et al. discuss different aspects of the multifaceted construct of implementation fidelity and how programs and practices can be designed flexibly so that they can be implemented with fidelity but still meet the needs of different students in varying educational contexts.

In the fourth article, Klingner, Boardman, and McMaster discuss issues related to scaling up EBPs. The issue of scale is of critical importance in implementation science. Although implementing
an EBP in a single school will positively impact the outcomes of a limited number of students with disabilities, if implementation of EBPs is addressed one school at a time, the research-to-practice gap is likely to remain wide. Klingner et al. propose a model of scaling up at the district level that involves district–researcher partnerships, integrating new practices with other district initiatives, tailoring the EBP to the …