Participatory evaluation: A case study of involving stakeholders in the evaluation process
                     by Amy Grack Nelson & Robby Callahan Schreiber

Citation: Grack Nelson, A. & Callahan Schreiber, R. (2009). Participatory evaluation: A case
study of involving stakeholders in the evaluation process. Visitor Studies, 12(2), 199-213.

Abstract:
One way to ensure that an evaluation has “utility” is to use a participatory evaluation approach
where the evaluator partners with primary users to carry out various phases of an evaluation.
This article provides a case study of how participatory evaluation was used in an out-of-school
youth development and employment program at the Science Museum of Minnesota's Kitty
Andersen Youth Science Center. Youth staff participated in a series of evaluation workshops
where they learned about evaluation, made their own meaning of evaluation data, generated
recommendations to improve their museum work, and ultimately took more ownership and
control of their program. As a result of the workshops, youth also carried out their own
formative evaluation of four museum programs. The program manager and the youth staff reaped
a number of benefits from this process. The factors leading to its success are discussed.

Keywords: participatory evaluation, formative evaluation, program evaluation, youth
development, evaluation capacity building, museum

According to The Joint Committee on Standards for Educational Evaluation's Program
Evaluation Standards (1994), one of the four essential features of all evaluations is "utility." The
Joint Committee's rationale for the utility standard was, "An evaluation should not be done if
there is no prospect for its being useful to some audience" (Stufflebeam, 1980, p. 90). One way
to help ensure use is to increase the primary intended users’ level of participation in the
evaluation (Cousins & Earl, 1995a; Patton, 2008). One way to do so is through a participatory
evaluation process, whose core purpose is increasing evaluation use
(Cousins & Earl, 1995a). Participatory evaluation is "applied social research that involves a
partnership between trained evaluation personnel and practice-based decision-makers,
organizational members with program responsibility, or people with a vital interest in the
program" (Cousins & Earl, 1992, pg. 399-400). The process recognizes and draws upon the
evaluation expertise of the evaluator and the program expertise of the primary user. The
evaluator partners with primary intended users to carry out various phases of the evaluation
including constructing evaluation questions, designing instruments, collecting data, analyzing
results, generating recommendations, and disseminating findings (Cousins & Earl, 1995a).
Involving users in the evaluation enhances the relevance of the evaluation, understanding of the
data, and ownership of the findings, all important to increasing use (Cousins & Whitmore, 1998;
King & Stevahn, 2002; Patton, 2008). In addition to increasing use, participating in the
evaluation provides opportunities for users to develop the analytic and evaluative skills necessary
for meaningful participation in the evaluation process (Cousins & Earl, 1992).

This article provides a case study of how a participatory evaluation approach was used in an out-
of-school youth program at the Science Museum of Minnesota's Kitty Andersen Youth Science
Center (KAYSC). In the youth development field, participatory evaluation involving youth is
referred to as youth participatory evaluation (Sabo Flores, 2008). Youth participatory evaluation



actively involves youth in a meaningful way to improve their experience in a program,
positioning them as contributors rather than passive recipients of program services or subjects in
an evaluation (Checkoway,
Dobbie, & Richards-Schuster, 2003; Horsch, Little, Chase Smith, Goodyear, & Harris, 2002;
London, Zimmerman, & Erbstein, 2003). When adults treat youth as equals and acknowledge
and respect their voices in the evaluation process, youth gain confidence that they have valuable
assets, knowledge, and insights (Checkoway et al., 2003; Sabo Flores, 2008). This
article discusses the participatory process that was used with a group of youth, the factors that
led to a successful process, and the benefits of the process to the program manager and youth.

CASE STUDY
The KAYSC was created in 1996 to encourage young people to engage in experiential learning
that develops their confidence in and appreciation for STEM fields (Science, Technology,
Engineering and Mathematics). The KAYSC is built on the belief that youth are active
participants in their own learning, in which discovery, ownership, and responsibility foster a positive
attitude toward the sciences. KAYSC programs annually engage approximately 100 youth 12-18
years old in STEM educational and professional opportunities. Youth work as museum
volunteers or part-time, paid staff in small teams facilitated by adult program managers. They
educate museum visitors, build exhibits, develop community STEM projects/presentations, and
do scientific research, work that makes their experiences real and meaningful (Velure Roholt &
Steiner, 2005). Participants are empowered to take an active role in the planning, research,
design, implementation, and evaluation of their own program and projects.

The Park Crew is one of the senior level teams in the KAYSC. Throughout this article, Park
Crew youth staff will be referred to as “participants” while those who participated in the crew’s
programs will be referred to as “visitors.” The team is composed of eight to twelve junior and
senior high school students who are led by an adult program manager, Robby Callahan
Schreiber. The Park Crew began in 2003 through funding from the National Center for Earth-
surface Dynamics (NCED), a National Science Foundation Science and Technology Center. The
team provides public outreach activities based on the work of NCED and related topics. As a
result of participating in the Park Crew, participants learn about earth-surface processes,
specifically those related to water, and the role humans play in affecting the processes;
personally develop and strengthen teaching skills so they can educate the public (museum
visitors and school outreach groups) about earth-surface processes; and become aware of STEM
careers and explore educational and professional opportunities in these fields.

The primary activities of the Park Crew are developing and sharing water-based earth-surface
process activities with the public. The participants visit after-school program sites during the
winter to share activities with elementary students. During the summer, the Park Crew facilitates
activities in the museum’s Big Back Yard (BBY), a 1.75-acre outdoor science park. Over 2,000
museum visitors interact with the Park Crew’s activities each summer. The activities include
Watershed Walk, Fossil Making, Macroinvertebrate Investigation, and Restore the River.
     • Watershed Walk is a hands-on demonstration of non-point source pollution that includes
        helpful suggestions of how visitors can improve local water quality. Watershed Walk is
        the only activity with a predefined schedule of demonstrations every half hour.
     • Fossil Making is a make-and-take activity in which visitors learn about local fossils and
        create their own fossil to take home.



•   Macroinvertebrate Investigation is a free-choice activity where visitors lead their own up-
        close exploration of aquatic macroinvertebrates using microscopes and hand lenses.
    •   Restore the River is an interactive demonstration of what happens to a dammed river
        before and after the removal of the dam.

During the summer of 2007, Amy Grack Nelson, Evaluation & Research Associate at the
Science Museum of Minnesota, carried out a summative evaluation of the Park Crew program (see Note 1).
The purpose of the evaluation was to understand how the participants implemented the activities
and what they learned about earth-surface processes, teaching others, and STEM careers. A
mixed-methods design was used, which included observations and pre- and post-program
interviews of the participants. The evaluator observed two activities with varying presentation
styles, Watershed Walk and Macroinvertebrate Investigation. Observations focused on how
participants presented the activities and interacted with visitors. Pre-program interviews included
questions related to what participants wanted visitors to learn from each of the four activities,
while post-program interviews asked participants about their experience working in the BBY and
their comfort level presenting the activities.

The summative evaluation not only provided insight into the extent to which the program’s goals
and outcomes had been addressed, it also brought to light many areas of potential improvement.
Since the program was ongoing, the program manager wanted to find ways to address some of
the program’s shortcomings and make improvements before the following summer, thus using
the summative findings in a formative manner. The evaluator and program manager decided they
wanted to share the evaluation data with the participants as a way to help identify areas of
improvement. Since the evaluation findings were not entirely positive, it was important to share
the data in a way that allowed participants to make their own conclusions instead of having an
adult appear to point out what they did wrong. It was also important that the Park Crew's first
experience with evaluation data was positive and did not result in participants having negative
feelings about their work or evaluation. A situation had to be devised to allow participants to
personally identify their successes and take ownership of what improvements needed to be made.
For this reason, a participatory evaluation approach was used to share and make sense of the data
with the participants.

Youth participation in evaluation was not new to the KAYSC. In the past, external evaluators
used a participatory process with two KAYSC program evaluations. Unfortunately, a
participatory approach did not carry over into other evaluations. Staff turnover, lack of
evaluation capacity among staff, and the absence of an internal evaluator made it difficult to
establish a commitment to participatory evaluation processes across all KAYSC youth programs.
In order for participatory evaluation to become integrated and a routine part of an organization's
culture, repeated participatory evaluation processes are required (Cousins & Earl, 1995b).
Various factors have contributed to a recent commitment to youth participation in evaluation
within the KAYSC. In 2005, the Science Museum of Minnesota's Department of Evaluation &
Research in Learning was established and an internal evaluator, Amy Grack Nelson, was
designated to work closely with the KAYSC. The internal evaluator began to build the
evaluation capacity of the KAYSC staff in order to create an organizational culture of evaluation.
This work was supported by KAYSC's 2007 strategic plan that stressed a commitment to a
collaborative approach to evaluation and the need for evaluation to be a component of all new



programs. Participatory evaluation also aligns with KAYSC's philosophy of transforming youth
from being passive consumers of knowledge to creators and connectors of knowledge,
recognizing that youth can and should be in a constant state of learning and improvement,
thinking critically about and evaluating their experiences in and outside of the KAYSC.

Involving Participants in Interpreting Data and Generating Recommendations
Participants were brought into the evaluation process during the data analysis phase of the Park
Crew summative evaluation. Since they were not actively involved in the evaluation from the
beginning, it was not a full participatory evaluation process, but a youth participatory evaluation
approach was used to analyze data and generate recommendations. As a result of the experience,
the participants requested to engage in a full participatory evaluation process to formatively
evaluate their BBY activities with visitors.

In April 2008, both new Park Crew youth and veteran youth from the previous summer (who
were involved in the summative evaluation) participated in four evaluation workshops facilitated
by their program manager and internal evaluator (See Figure 1). There were four goals of the
workshops:
       1) Teach participants about evaluation;
       2) Give participants an opportunity to make their own meaning of evaluation data;
       3) Increase participants’ understanding of BBY activities; and
       4) Allow participants to take more ownership and control of their work.
Over the course of four three-hour workshops, participants learned about evaluation, reflected on
their activities' learning objectives, interpreted the previous summer's evaluation data and created
recommendations for the upcoming summer. At the end of each workshop, the program manager
led the participants through brief, facilitated discussions to provide an opportunity for them to
reflect on their experience participating in the evaluation. Through focused questioning from the
program manager, participants talked about parts of the participatory workshops that resonated
with them, shared feelings that came up throughout the workshops, connected the process to the
bigger picture of their work in the KAYSC, and identified ways to move forward. These
discussions allowed participants to continually check in with each other and the program
manager and keep the workshops moving in a direction and at a pace with which they were
comfortable. The success of the participatory workshops became evident when the Park Crew
requested to lead their own formative evaluation to improve visitors’ experiences with their
activities.




Figure 1: Evaluation workshop agenda

Workshops 1 & 2: Reflecting on Experience and Data
The first workshop began with an activity to help the participants gain a deeper understanding of
evaluation, why their activities were evaluated, and how the evaluation data were gathered. To
explain evaluation in a way that connected to their current experiences and interests, the
evaluator led the participants through an activity in which they evaluated a cell phone. The team
identified what they wanted teens to be able to do with a cell phone (the objectives). They talked
about what information they would need to gather and how to gather it to ensure teens were able
to successfully use the phone and were satisfied with the product. The evaluator then explained
how they could look at the data and create recommendations to communicate to the manufacturer
how to improve the phone. Parallels were then drawn between how participants said they would
evaluate a phone and how the evaluator evaluated the Park Crew program. The program manager
talked about the goals and objectives of the Park Crew, identified the goals of the evaluation,
explained the methods the evaluator used to gather data, and emphasized that the participants
were brought into the evaluation process to interpret the data and find ways to improve the work
of the Park Crew.

During the first two workshops, participants reflected on learning objectives for the activities,
discussed what they felt worked well the previous summer and what should be changed, and
reviewed the evaluation data. The participants began the workshop by reflecting on what they
did for each activity. Next they were asked to list what they wanted visitors to learn from the
activity. The evaluator then introduced similar data from the previous summer’s summative
evaluation, where participants were asked the same question related to learning objectives.



Participants added any additional objectives that were mentioned in the summative interviews to
their list. This led to a discussion about learning objectives and how activities can be taught to
ensure objectives are addressed.

After defining the learning objectives for an activity, participants reflected on their experience
leading that activity. The participants were asked to answer two questions for each of their
activities:
    • What would you keep and why? (e.g. things they enjoyed, visitors enjoyed, worked well,
         etc.)
    • What would you change and why? (e.g. things they did not enjoy, visitors were not
         interested in, did not work well, etc.)
The team worked in groups of three to come up with as many ideas as they could for each
activity. They were encouraged to brainstorm without limitations of cost or feasibility.
Participants considered all aspects of doing a program including the training they received to
learn the activities, their own process of learning the activities, setting up the activities in the
BBY, leading the activities with visitors, and cleaning up after the activities. The program
manager provided encouragement to participants as they worked through the process and
reminded them to elaborate on ideas when needed. When the participants finished coming up
with their ideas, they shared them with the larger group. Using a sticky wall, participants placed
their ideas under the corresponding keep or change column for the activity (See Figure 2). As a
large group, they identified any similarities among ideas and created clusters of things that
worked well or things they would change.




Figure 2: Photograph of sticky wall with keep and change data for Watershed Walk
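
The sticky-wall step is essentially an affinity-grouping exercise: free-form keep/change ideas are
sorted into columns and then clustered by shared meaning. As a minimal illustration of that
bookkeeping, the Python sketch below groups invented cards for one activity; the idea text and
cluster labels are hypothetical, not the crew's actual cards.

    from collections import defaultdict

    # Hypothetical brainstormed cards for one activity. Each card is
    # tagged "keep" or "change" and later assigned a cluster label by
    # the group; none of these strings are from the actual workshop.
    cards = [
        ("keep",   "visitors enjoyed making fossils",       "visitor engagement"),
        ("keep",   "half-hour demo schedule worked well",   "logistics"),
        ("change", "need more shade at the station",        "space and comfort"),
        ("change", "more practice time before summer",      "training"),
        ("change", "unsure how to answer fossil questions", "training"),
    ]

    # Sort cards into the keep/change columns of the sticky wall.
    columns = defaultdict(list)
    for decision, idea, cluster in cards:
        columns[decision].append((idea, cluster))

    # Within each column, gather cards that share meaning into clusters.
    for decision, items in columns.items():
        clusters = defaultdict(list)
        for idea, cluster in items:
            clusters[cluster].append(idea)
        print(decision.upper())
        for label, members in clusters.items():
            print(f"  {label}: {'; '.join(members)}")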

After participants reflected on the activity, they were presented with evaluation data for that
activity. Instead of seeing the interpretations of the data from the summative evaluation report,
the participants were presented with tables and graphs of the data. This allowed them to come to
their own conclusions about the data without the evaluator’s interpretations influencing their


thinking. The evaluator started the data discussion by leading the team through a brief overview
of reading graphs, sampling, means, and medians. This introduction also provided a learning
opportunity for participants to apply and improve their math skills. The participants were then
shown the data for each corresponding activity. To help them think about and discuss the data,
the following questions were posed:
    • What are your first impressions of the data?
    • Is there anything that surprises you?
    • Based on the data, what are some changes you would suggest?
    • What would you keep doing the same way?
Participants wrote down additional things they wanted to change or keep based on their
interpretation of the data and added them to the sticky wall. During the data discussions, the
evaluator and program manager did not interpret the data, but allowed each participant to come
up with his or her own interpretations. As needed, the evaluator asked probing questions to help
participants correctly read a graph, make comparisons between some of the data, and think about
why they were interpreting the data in certain ways.
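
To make the primer concrete, the short Python sketch below shows the kind of mean/median
comparison the evaluator walked through. The per-demonstration visitor counts are invented for
illustration and are not the evaluation data the participants saw.

    from statistics import mean, median

    # Hypothetical tallies: visitors joining each of ten observed
    # Watershed Walk demonstrations (invented numbers).
    visitors_per_demo = [3, 5, 2, 8, 4, 4, 21, 6, 3, 5]

    print(f"n = {len(visitors_per_demo)} demonstrations")
    print(f"mean = {mean(visitors_per_demo):.1f} visitors per demo")
    print(f"median = {median(visitors_per_demo)} visitors per demo")
    # One unusually busy demonstration (21 visitors) pulls the mean
    # above the median -- the same point the primer made about why
    # both statistics are worth reading off a graph.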

Participants’ interpretations of the data were often similar to those of the evaluator in the
summative evaluation report. One example was the interpretation of how effectively the content
was delivered for the Watershed Walk activity. While observing this activity, evaluators had
recorded the total number of times participants mentioned reasons why something was
considered a pollutant and what people could do to prevent that particular kind of pollution.
When participants looked at graphs of this data, they were surprised at how infrequently they had
talked about the reasons and actions for the various pollutants, because they felt they had covered
these topics in all of their demonstrations (See Figure 3). One participant said, “We should be
talking about these 100% of the time.” The evaluator pointed out that this feeling was also
reflected in the interviews from the previous year’s summative evaluation, when two-thirds of
the participants had said they wanted visitors to learn about these two topics. Participants had
identified that they wanted visitors to learn this information, but when they actually carried out
the activity, they did not follow through. This discrepancy led to a discussion about objectives
and the disconnect between what they saw as objectives and what they were actually teaching
visitors. Participants said in some cases they did not talk about certain pollutants because they
either did not know the information or did not feel confident enough to talk about it. In order to attain
these objectives, participants decided they needed to increase their knowledge about why certain
things are considered pollutants and what people can do to prevent them. They created cards for
the sticky wall that reflected their expectation of what content should be delivered during every
Watershed Walk and their desire for more specific content-related training. Participants ended up
creating their own expectations for their performance, expectations that the program manager
always had of them but now the participants were adopting for themselves.




Figure 3: How often participants talked about why something is considered a pollutant (n=27)
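
As a rough illustration of how a frequency like the one in Figure 3 is produced from observation
data, the sketch below tallies whether each observed demonstration included an explanation of
why something is a pollutant. Only the sample size (n = 27) comes from the article; the
True/False codes are invented.

    # Hypothetical coding sheet: one True/False entry per observed
    # demonstration, marking whether the presenter explained *why*
    # something is considered a pollutant. Values are invented; only
    # the sample size (n = 27) matches the evaluation.
    explained_why = [True, False, False, True, False, False, False,
                     True, False, False, True, False, False, False,
                     False, True, False, False, False, False, True,
                     False, False, False, False, True, False]

    n = len(explained_why)          # 27 observed demonstrations
    mentions = sum(explained_why)   # demos where the reason came up
    print(f"Explained why in {mentions} of {n} demos ({mentions / n:.0%})")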

There was one instance where participants interpreted the data quite differently than the
evaluator and program manager. For the Watershed Walk activity, participants were expected to
present the walk every half hour. Between walks, they cleaned up from the previous walk, set up
the next walk, or waited for visitors to stop by. One of the expected outcomes of participating in
the Park Crew is for participants to acquire skills and confidence interacting with museum
visitors. These skills were measured by observing participant interactions with visitors passing
by and/or stopping at the watershed model (the basis for the Watershed Walk activity).
Observations revealed that participants were not comfortable with the most basic levels of
interaction, greeting visitors and inviting them to participate in the activity. Most of the time
(86%), participants said nothing to visitors as they passed by, not even “hello.” When
participants were shown these data, they did not interpret them as a potential area for improvement.
Instead they talked about their need to set up or clean up for the activity. It did not occur to any
of the participants that they should stop for a moment to at least acknowledge the visitor, if not
invite her or him back for the Watershed Walk. This was eye-opening for the program manager. It
became clear that training for the BBY needed to include the same customer service training that
other museum staff receive, instead of solely focusing on presenting the four activities. This was
one case during the workshops where the program manager added an idea to the sticky wall to
make sure the necessity of customer service training did not get lost in the process.

Workshop 3: Generating Recommendations
Participants spent the third workshop drawing connections from the ideas generated during the
first two workshops to their future BBY work. The program manager facilitated this process
using a consensus-building workshop technique (Bucki, 2008). The primary goal of the
consensus-building workshop was to give the participants a voice in creating recommendations
for program improvement. While the evaluator and program manager could have developed
recommendations on their own, it was important to include participants’ thoughts and opinions
as part of the democratic process of learning and decision-making. The workshop began by


posing the question, “What do we need to do in order to reach our goals working in the Big Back
Yard?” Participants utilized their list of intended objectives for each activity and their
keep/change reflections from the first two workshops to answer the question. The participants
and program manager individually brainstormed answers to the question and shared their ideas
with a partner. The partners chose 8-10 of their best ideas and wrote them down on half sheets of
paper. Then they posted several of their ideas on the sticky wall and identified ones with shared
meaning. They clustered these into pairs and each became a column, or theme. They placed all
their remaining ideas under the column to which each best fit. By the end of the consensus-
building workshop, the team had identified and named seven themes of recommendations they
wanted to focus on in preparation for their upcoming BBY work: 1) things to change, 2) seeking
new information, 3) hands-on field trips, 4) visitor interaction, 5) activity preparation, 6)
advertising activities, and 7) space and comfort.

   1) Things to change. The team identified a number of large and small changes to each
      activity that would improve either their ability to teach the activity or the visitor's
      experience.
   2) Seeking new information. The team decided to seek new information on local watershed
      issues and local fossil history based on where they lacked or desired more content
      knowledge.
   3) Hands-on field trips. They also wanted to increase their content knowledge by going on
      more field trips.
   4) Visitor interaction. In order to improve visitor interactions, participants suggested
      creating cards for visitor questions they were unable to answer that could be sent back to
      the visitor after the answer was found. Participants also decided they wanted to plan
      multiple ways to talk to and interact with visitors, as well as survey visitors about their
      experience in the BBY.
   5) Activity preparation. The team desired time to revise the activities’ learning objectives
      and practice teaching the activities to each other.
   6) Advertising activities. Responding to occasional low attendance in the BBY, participants
proposed new advertising techniques, such as placing one Park Crew staff member inside the
      museum to recruit visitors and creating and posting additional signs throughout the
      museum.
   7) Space and comfort. Participants identified ways to improve their working conditions such
      as adding tents or awnings to their outdoor workspaces.

Workshop 4: Implementing Recommendations
After participants brainstormed, clustered, and named their ideas, the program manager
facilitated a final workshop to incorporate the team’s recommendations into training sessions for
the four BBY activities. He mapped out the remaining shifts until the start of summer and had
the participants think about how to best use the shifts to prepare for the BBY season. The team
decided to work in small groups to plan training sessions for the four activities. Each group
chose one of the activities and was responsible for planning two days of training for the rest of
the crew. Participants made sure their training days addressed each of the seven recommendation
themes as they related to their activity. This final workshop allowed participants to have a direct
say in how they were trained and what skills and knowledge they needed to feel comfortable
teaching their activities.



It was important that participants had the chance to make suggestions and offer feedback on their
experience throughout the four participatory evaluation workshops. Rather than simply receiving
a mixture of positive and negative feedback from the evaluation report via the program manager,
participants were given the opportunity to interpret and respond critically to the evaluation data.
The team worked together to make sense of the data in their own way and synthesized it into a
comprehensive list of training suggestions for the upcoming season, some of which the program
manager might not have considered had he not sought their input. Participants left the workshops
with a stronger sense of ownership and control of their work because they were given time to
express their opinions, provide their own interpretations of what was observed in the BBY, and
create recommendations for improvement, many of which were implemented in the upcoming
summer.

Involving Participants in the Entire Evaluation Process

The participatory evaluation workshops led to the team’s participation in a full participatory
evaluation process. After reviewing the Park Crew summative evaluation data, the team realized
there was an information gap. One participant said, "Well, we need to talk to visitors 'cuz we
want to find out what they think!" Shortly after the workshops the participants came to the
evaluator with their desire to formatively evaluate each of the four BBY activities with visitors
that summer. The participants had already identified the goals of their evaluation, to understand
what visitors learn from their activities and to improve visitors’ experiences. The program
manager helped participants identify questions they wanted to ask visitors, which they then
shared with the evaluator. The evaluator and participants discussed the questions and the types of
information the questions would elicit to ensure they gathered their desired information.
Participants also identified additional questions to include in the survey. The participants
stressed that they wanted two versions of the survey, one for adults and a “kid-friendly” version.
The evaluator formatted the surveys and had the participants review them. The evaluator trained
the participants to collect survey data from visitors participating in BBY activities. Evaluators
modeled the process and then shadowed participants during their first day of data collection to
offer pointers and be available for questions. After that point, the participants managed the
survey data collection. Due to scheduling constraints, the participants did not have time to enter
the data, so the evaluator completed that step. The survey data were then shared in a participatory
manner, similar to how the participants engaged with the summative data from the previous
summer. The team discussed graphs and tables of data, decided what they wanted to keep or
change based on the data, and generated recommendations for their future work in the BBY.
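
To show what the tallying behind those graphs and tables might look like, here is a minimal
Python sketch that summarizes responses by survey version. The question, response options, and
records are hypothetical stand-ins, not the Park Crew's actual survey data.

    from collections import Counter

    # Hypothetical responses to one closed-ended question, tagged with
    # which survey version (adult or kid-friendly) was used.
    responses = [
        ("adult", "learned something new"),
        ("kid",   "learned something new"),
        ("adult", "already knew this"),
        ("kid",   "learned something new"),
        ("kid",   "not sure"),
    ]

    # Tally answers within each survey version.
    tallies = {"adult": Counter(), "kid": Counter()}
    for version, answer in responses:
        tallies[version][answer] += 1

    for version, counts in tallies.items():
        total = sum(counts.values())
        print(f"{version} survey (n={total}):")
        for answer, n in counts.most_common():
            print(f"  {answer}: {n} ({n / total:.0%})")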

DISCUSSION
Factors That Led to a Successful Participatory Evaluation Experience
The participatory evaluation process was an overall success and a beneficial experience for all
involved. The literature identifies a variety of factors that lead to a successful participatory
evaluation process, including balanced partnerships, sufficient time allocated to the process, and
the program manager's regard for evaluation and interest in including participants in the process
(Cousins & Earl, 1992, 1995a; Cousins & Whitmore, 1998; King, 1995). These factors were
present in carrying out the participatory evaluation workshops, leading to the success of the
process.


Balanced partnerships are vital to participatory evaluation, with the evaluator and participants
bringing their own knowledge and expertise to the process (Cousins & Earl, 1995a; Cousins &
Whitmore, 1998). The program manager brought his expertise of working with the participants.
He identified strategies to engage participants with the data, kept them on task during the
workshops, and offered them encouragement when needed. The evaluator brought her evaluation
expertise. She taught participants about evaluation and helped them use their analytical skills to
make sense of the data. Participants brought their expertise as front-line staff working directly
with visitors. They were able to provide their unique perspective to the interpretation of the data
and generate recommendations based on their first-hand knowledge of the activities.


Sufficient time is necessary to engage in participatory evaluation without rushing the process
(King, 1995). Originally, the workshops were scheduled to cover two three-hour sessions.
However, participants became deeply engaged, often spending considerable amounts of time
reflecting on their prior experiences in the BBY. The participants’ positive energy and
engagement made it clear by the end of the first day that the process of interpreting data and
generating recommendations would have to be extended to allow enough time for a meaningful
experience. Rushing participants through the workshops could have negatively affected their first
participatory evaluation experience and diminished the use of the findings (King, 1998). Because
of the program manager's flexibility with the team’s schedule, two additional three-hour
workshop days were added.

Participants must value evaluation and have an interest in engaging in the evaluation process
(Cousins & Earl, 1992; King, 1995). Without the interest of those involved, the evaluator will
have a difficult time successfully carrying out a participatory process (King, 1995). The interest
of the program manager to engage participants in the evaluation process was vital to ensuring the
successful implementation of a participatory evaluation approach. The engagement of
participants during the workshops also contributed to the overall success. Their engagement was
evident in the lengthy discussions of the four activities. It was clear participants wanted to
improve their work in the BBY on both a personal and team level. The interest and enthusiasm
for the process were also evident in the participants’ desire to evaluate their own activities with
visitors, as well as in the program manager's advocacy of participatory evaluation to other KAYSC
staff.

Benefits of the Participatory Evaluation Process
The program manager and youth reaped a number of benefits from their involvement in the
participatory evaluation process, as evident from their reflections and behaviors after the
workshops. The youth reflected on their experience and the benefits after each workshop and
while planning for the upcoming summer. The evaluator and the program manager debriefed on
the evaluation process, discussing benefits of the process for them and what they had observed
and heard as benefits for the youth.

Benefits to the Program Manager
A major outcome of participatory evaluation is evaluation capacity building of those involved in
the process (King & Stevahn, 2002). The participatory evaluation process provided professional



development for the program manager, who had little evaluation experience going into the
workshops. The collaboration with the evaluator increased the program manager's understanding
of evaluation, his ability to interpret evaluation data and generate recommendations, and his
confidence in including youth as participants in future evaluations.


The program manager's increased evaluation capacity led to his desire to engage a different youth team, the
Water Crew, in a participatory evaluation process. The team was developing interactive quizzes
for an upcoming exhibition and wanted to find out if the quizzes successfully communicated
their desired messages. Having recently completed the participatory evaluation workshops, the
program manager suggested some possible evaluation methods and the team contacted the
internal evaluator to help carry out the evaluation. The participants had already identified some
questions they wanted to ask visitors, so the evaluator helped them refine their interview
questions and trained the participants in conducting visitor interviews. The Water Crew then
carried out cued interviews with visitors. The evaluator offered participants support and guidance
as needed. The evaluator and program manager then shared the data with the participants. The
participants interpreted the data and identified revisions to make to their quizzes. The evaluation
of the Water Crew quizzes is an example of a small-scale participatory evaluation that was
prompted by the program manager's evaluation knowledge and capacity.

In addition to increasing his evaluation capacity, the participatory evaluation process also
provided the program manager with important feedback about the Park Crew's work. Throughout
the workshops, participants reflected on their work, talking about what they personally felt
worked well and what they would change. Participants also gave thoughtful suggestions on how
to better prepare for their work in the BBY and stay motivated throughout the summer. They
shed light on aspects of their work the program manager had not considered, such as their lack of
confidence in delivering some of the information and their desire for more training opportunities.
As a result of the participatory process, the program manager had a deeper understanding of the
participants’ experience in the BBY and could proactively identify and respond to their needs.

Benefits to the Participants
The participatory evaluation process also increased participants’ understanding of evaluation.
Meaningful involvement in evaluation can help youth develop higher order thinking skills,
specifically analytic and evaluative skills (London et al., 2003). Participants also become more
reflective about themselves, their work, and their overall program, as was evident in the Park
Crew’s deep engagement in the workshops and the thoughtful recommendations they created
(Sabo Flores, 2008). Participants’ understanding of and comfort with evaluation were also evident
in their desire to engage in a formative evaluation process to further inform and improve their
BBY activities.

Involving participants in the evaluation process transforms their understanding of their program
(Checkoway et al., 2003). The workshops served the dual purpose of program improvement and
training for the upcoming year. Some of the participants were new to the Park Crew and had not
led activities in the BBY before. By taking part in such a thorough reflection of the team's
previous work and planning for the upcoming summer, new participants left the workshops well
poised to begin training. One participant shared his increased understanding, "I know more about



the things we do in the Big Back Yard." Participant comments also reflected their increased level
of knowledge about the activities and confidence in sharing that knowledge with visitors. One
participant said, "I feel very confident now," while another shared, "I feel we are going to do
really good this summer." The team's increased knowledge and confidence created a shared
positive attitude toward their BBY work, which provided a motivational springboard for the
training work.

Participants experience a sense of empowerment and pride when they have an influence on the
way programs are run and see their ideas acted upon (Checkoway et al., 2003; Horsch et al.,
2002; London et al., 2003). The participants generated thoughtful suggestions for how they could
improve their work experience, but they needed outlets to implement their ideas in order to feel
empowered by the process. The program manager, rather than planning and leading the BBY
training himself, had the participants use their suggestions to develop their own training sessions.
Participants also shared their ideas with a museum operations staff member who was responsible
for BBY operations. The team suggested ways to increase the number of BBY visitors and
amenities that would allow them to more effectively do their job. The museum operations staff
member implemented several ideas that were realistic given time and financial constraints.
Participants were able to see their suggestions put in place over the course of the summer. They
learned firsthand how to make changes within a large organization and realized they shared a
collective voice as valued staff members.

Participants also made connections that helped them understand the evaluation work that occurs
within the museum. Initially, some of the participants were a bit uncomfortable being observed
by the evaluation staff as part of the summative evaluation. After meeting the evaluator and
engaging with the evaluation data, they had a deeper understanding of what observations can
reveal about a program. One participant expressed a renewed sense of comfort with the
evaluation process, sharing, "I feel being watched paid off in the long run." The team learned the
importance of evaluation in a museum setting and the different ways evaluators help improve
museum programs and projects. It was empowering for participants to feel a deeper tie to the
museum by knowing more about the various types of work done in the institution.

CONCLUSION

As a result of using a participatory evaluation approach to involve youth in evaluation, the
KAYSC, and more broadly the Science Museum of Minnesota, now have a format for engaging
participants in evaluation practice and training them to be more self-reflective in their work. A
sustainable cycle of formative evaluation (evaluation, reflection, change, and more evaluation)
has been developed for a group of participants that previously had little or no knowledge of
evaluation practice. The internal evaluator and program manager’s partnership increased their
confidence and interest in participatory evaluation, which is evident in their use of the process
with other KAYSC evaluations. Additionally, the successful Park Crew participatory evaluation
experiences influenced the KAYSC's commitment to using a youth participatory evaluation
approach with future evaluations.

Although this article focused on youth participatory evaluation, participatory processes can be
used with all types of primary users. Participatory evaluation is also not limited to program



evaluation. Exhibit evaluation can also benefit from a participatory evaluation approach.
Partnering with primary intended users, building their evaluation capacity, and increasing their
level of participation in carrying out various phases of a formative evaluation enhances their
understanding and ownership of the findings, which in turn leads to increased use of the data to
improve their exhibit or program (Cousins & Earl, 1995a; Cousins & Whitmore, 1998; King &
Stevahn, 2002; Patton, 2008). Users also gain a deeper understanding of evaluation practice and
enhance their evaluative thinking skills through direct involvement in carrying out the
evaluation, offering input, and reflecting on the process (Cousins & Earl, 1992). Primary
intended users become more reflective of their work throughout the exhibit or program
development process and develop the capacity to become more active participants in future
evaluations (King, 1995).

Note
1. To obtain a copy of the KAYSC BBY Park Crew Summative Evaluation Report, which also
includes the participants’ recommendations, visit www.smm.org/researchandeval

Acknowledgments
The Park Crew and related evaluation work are funded by the National Center for Earth-surface
Dynamics, a National Science Foundation Science and Technology Center. We thank the
Science Museum of Minnesota's KAYSC and Department of Evaluation & Research in Learning
staff for their feedback and support throughout the project. We also express our gratitude to the
2007 and 2008 Park Crew youth staff for their energetic participation and openness to the youth
participatory evaluation process.

References
Bucki, J. S. (2008). Participatory leadership skills. St. Paul, MN: GoodWorkTools.com.
Checkoway, B., Dobbie, E., & Richards-Schuster, K. (2003). The Wingspread Symposium:
       Involving young people in community evaluation research. CYD Journal, 4(1), 7-11.
Cousins, J. B., & Earl, L. (1992). The case for participatory evaluation. Educational Evaluation
       and Policy Analysis, 14, 397-418.
Cousins, J. B., & Earl, L. (1995a). The case for participatory evaluation: Theory, research,
       practice. In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in education:
       Studies in evaluation use and organizational learning (pp. 3-18). London: Falmer.
Cousins, J. B., & Earl, L. (1995b). Participatory evaluation in education: What do we know?
       Where do we go? In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in
       education: Studies in evaluation use and organizational learning (pp. 159-180). London:
       Falmer.
Cousins, J. B., & Whitmore, E. (1998). Framing participatory evaluation. New Directions for
       Evaluation, 80, 5-23.
Horsch, K., Little, P., Chase Smith, J., Goodyear, L., & Harris, E. (2002). Youth involvement in
       evaluation and research (Issues and Opportunities in Out-of-School Time Evaluation
       No. 1). Retrieved December 14, 2008, from
       http://www.hfrp.org/content/download/1093/48598/file/issuebrief1.pdf
King, J. A. (1995). Involving practitioners in evaluation studies: How viable is collaborative
       evaluation in schools? In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in
       education: Studies in evaluation use and organizational learning (pp. 159-180). London:
       Falmer.
King, J. A., & Stevahn, L. (2002). Three frameworks for considering evaluator role. In K. E.
       Ryan & T. A. Schwandt (Eds.), Exploring evaluator role and identity (pp. 1-16).
       Greenwich, CT: Information Age Publishing.
London, J., Zimmerman, K., & Erbstein, N. (2003). Youth-led research and evaluation: Tools for
       youth, organizational, and community development. New Directions for Evaluation, 98,
       33-45.
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage.
Sabo Flores, K. (2008). Youth participatory evaluation: Strategies for engaging young people.
       San Francisco, CA: Jossey-Bass.
Stufflebeam, D. L. (1980). An interview with Daniel L. Stufflebeam. Educational Evaluation
       and Policy Analysis, 2(4), 90-92.
The Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation
       standards (2nd ed.). Thousand Oaks, CA: Sage.
Velure Roholt, R., & Steiner, M. (2005). "Not your average workplace": The Youth Science
       Center, Science Museum of Minnesota. Curator, 48(2), 141-157.

About the Authors
Amy Grack Nelson is an Evaluation and Research Associate in the Science Museum of
Minnesota's Department of Evaluation and Research in Learning. Email: agnelson@smm.org

Robby Callahan Schreiber is a Youth Program Manager in the Science Museum of Minnesota's
Kitty Andersen Youth Science Center. Email: rschreiber@smm.org.




                                                                                           15

Más contenido relacionado

Similar a Empowering youth to be evaluators: Involving Young People in Evaluating Informal Education Programs - Callahan Schreiber Article

Increasing students’ environmental attitude through Visual and Performance Ar...
Increasing students’ environmental attitude through Visual and Performance Ar...Increasing students’ environmental attitude through Visual and Performance Ar...
Increasing students’ environmental attitude through Visual and Performance Ar...INNS PUBNET
 
Analysis Of Learners Fieldtrip Talk During A Collaborative Inquiry Task
Analysis Of Learners  Fieldtrip Talk During A Collaborative Inquiry TaskAnalysis Of Learners  Fieldtrip Talk During A Collaborative Inquiry Task
Analysis Of Learners Fieldtrip Talk During A Collaborative Inquiry TaskLori Moore
 
Project co curriculam
Project  co curriculamProject  co curriculam
Project co curriculamaryapratheesh
 
Citizen scientist - Open w/ badges
Citizen scientist - Open w/ badgesCitizen scientist - Open w/ badges
Citizen scientist - Open w/ badgesEileen O'Connor
 
Authentic assessment_ An instructional tool to enhance students l.pdf
Authentic assessment_ An instructional tool to enhance students l.pdfAuthentic assessment_ An instructional tool to enhance students l.pdf
Authentic assessment_ An instructional tool to enhance students l.pdfFelizaGalleo1
 
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...ESD UNU-IAS
 
Amulya Project Briefs final
Amulya Project Briefs finalAmulya Project Briefs final
Amulya Project Briefs finalJoy Amulya
 
SL Newsletter 5:19:15
SL Newsletter 5:19:15SL Newsletter 5:19:15
SL Newsletter 5:19:15Eugene Sedita
 
Tina Phillips (Cornell Lab of Ornithology) - the DEVISE project
Tina Phillips (Cornell Lab of Ornithology) - the DEVISE projectTina Phillips (Cornell Lab of Ornithology) - the DEVISE project
Tina Phillips (Cornell Lab of Ornithology) - the DEVISE projectCitizenCyberlab
 
The power of cs in education moraitopoulou elina republica 2017
The power of cs in education moraitopoulou elina republica 2017The power of cs in education moraitopoulou elina republica 2017
The power of cs in education moraitopoulou elina republica 2017Elina MORAITOPOULOU
 
Action Research And Community Problem Solving Environmental Education In An ...
Action Research And Community Problem Solving  Environmental Education In An ...Action Research And Community Problem Solving  Environmental Education In An ...
Action Research And Community Problem Solving Environmental Education In An ...Justin Knight
 
SOWK 5339 Integration Paper
SOWK 5339 Integration PaperSOWK 5339 Integration Paper
SOWK 5339 Integration PaperSarah Walters
 
The guide of best practices on open knowledge activities.pptx.pdf
The guide of best practices on open knowledge activities.pptx.pdfThe guide of best practices on open knowledge activities.pptx.pdf
The guide of best practices on open knowledge activities.pptx.pdfKai Pata
 

Similar a Empowering youth to be evaluators: Involving Young People in Evaluating Informal Education Programs - Callahan Schreiber Article (20)

Engaging Next Generation WS - Justin Hougham
Engaging Next Generation WS - Justin HoughamEngaging Next Generation WS - Justin Hougham
Engaging Next Generation WS - Justin Hougham
 
Increasing students’ environmental attitude through Visual and Performance Ar...
Increasing students’ environmental attitude through Visual and Performance Ar...Increasing students’ environmental attitude through Visual and Performance Ar...
Increasing students’ environmental attitude through Visual and Performance Ar...
 
Analysis Of Learners Fieldtrip Talk During A Collaborative Inquiry Task
Analysis Of Learners  Fieldtrip Talk During A Collaborative Inquiry TaskAnalysis Of Learners  Fieldtrip Talk During A Collaborative Inquiry Task
Analysis Of Learners Fieldtrip Talk During A Collaborative Inquiry Task
 
172157
172157172157
172157
 
Project
ProjectProject
Project
 
JURNAL SAINS 6
JURNAL SAINS 6JURNAL SAINS 6
JURNAL SAINS 6
 
Project co curriculam
Project  co curriculamProject  co curriculam
Project co curriculam
 
AERA paperfinal
AERA paperfinalAERA paperfinal
AERA paperfinal
 
Citizen scientist - Open w/ badges
Citizen scientist - Open w/ badgesCitizen scientist - Open w/ badges
Citizen scientist - Open w/ badges
 
Authentic assessment_ An instructional tool to enhance students l.pdf
Authentic assessment_ An instructional tool to enhance students l.pdfAuthentic assessment_ An instructional tool to enhance students l.pdf
Authentic assessment_ An instructional tool to enhance students l.pdf
 
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...
From ‘Citizen to Civic Science’ – Linking Our Activities to Quality Education...
 
Amulya Project Briefs final
Amulya Project Briefs finalAmulya Project Briefs final
Amulya Project Briefs final
 
SL Newsletter 5:19:15
SL Newsletter 5:19:15SL Newsletter 5:19:15
SL Newsletter 5:19:15
 
Tina Phillips (Cornell Lab of Ornithology) - the DEVISE project
referred to as youth participatory evaluation (Sabo Flores, 2008).
Youth participatory evaluation actively involves youth in a meaningful way to improve the experience they have in a program, instead of treating them as passive recipients of program services or subjects in an evaluation (Checkoway, Dobbie, & Richards-Schuster, 2003; Horsch, Little, Chase Smith, Goodyear, & Harris, 2002; London, Zimmerman, & Erbstein, 2003). When adults treat youth as equals and acknowledge and respect their voices in the evaluation process, youth gain confidence that they have valuable assets, knowledge, and insights (Checkoway et al., 2003; Sabo Flores, 2008). This article discusses the participatory process that was used with a group of youth, the factors that led to a successful process, and the benefits of the process to the program manager and youth.

CASE STUDY

The KAYSC was created in 1996 to encourage young people to engage in experiential learning that develops their confidence in and appreciation for STEM fields (Science, Technology, Engineering, and Mathematics). The KAYSC is built on the belief that youth are active participants in their own learning, in which discovery, ownership, and responsibility incite a positive attitude toward the sciences. KAYSC programs annually engage approximately 100 youth, ages 12 to 18, in STEM educational and professional opportunities. Youth work as museum volunteers or part-time, paid staff in small teams facilitated by adult program managers. They educate museum visitors, build exhibits, develop community STEM projects and presentations, and do scientific research, all of which make their experiences real and meaningful (Velure Roholt & Steiner, 2005). Participants are empowered to take an active role in the planning, research, design, implementation, and evaluation of their own program and projects.

The Park Crew is one of the senior-level teams in the KAYSC. Throughout this article, Park Crew youth staff will be referred to as "participants," while those who participated in the crew's programs will be referred to as "visitors." The team is composed of eight to twelve junior and senior high school students who are led by an adult program manager, Robby Callahan Schreiber. The Park Crew began in 2003 through funding from the National Center for Earth-surface Dynamics (NCED), a National Science Foundation Science and Technology Center. The team provides public outreach activities based on the work of NCED and related topics. As a result of participating in the Park Crew, participants learn about earth-surface processes, specifically those related to water, and the role humans play in affecting the processes; personally develop and strengthen teaching skills so they can educate the public (museum visitors and school outreach groups) about earth-surface processes; and become aware of STEM careers and explore educational and professional opportunities in these fields.

The primary activities of the Park Crew are developing and sharing water-based earth-surface process activities with the public. The participants visit after-school program sites during the winter to share activities with elementary students. During the summer, the Park Crew facilitates activities in the museum's Big Back Yard (BBY), a 1.75-acre outdoor science park. Over 2,000 museum visitors interact with the Park Crew's activities each summer. The activities include Watershed Walk, Fossil Making, Macroinvertebrate Investigation, and Restore the River.
• Watershed Walk is a hands-on demonstration of non-point source pollution that includes helpful suggestions of how visitors can improve local water quality. Watershed Walk is the only activity with a predefined schedule of demonstrations every half hour.
• Fossil Making is a make-and-take activity in which visitors learn about local fossils and create their own fossil to take home.
• Macroinvertebrate Investigation is a free-choice activity where visitors lead their own up-close exploration of aquatic macroinvertebrates using microscopes and hand lenses.
• Restore the River is an interactive demonstration of what happens to a dammed river before and after the removal of the dam.

During the summer of 2007, Amy Grack Nelson, Evaluation & Research Associate at the Science Museum of Minnesota, carried out a summative evaluation of the Park Crew program.¹ The purpose of the evaluation was to understand how the participants implemented the activities and what they learned about earth-surface processes, teaching others, and STEM careers. A mixed-methods design was used, which included observations and pre- and post-program interviews of the participants. The evaluator observed two activities with varying presentation styles, Watershed Walk and Macroinvertebrate Investigation. Observations focused on how participants presented the activities and interacted with visitors. Pre-program interviews included questions related to what participants wanted visitors to learn from each of the four activities, while post-program interviews asked participants about their experience working in the BBY and their comfort level presenting the activities.

The summative evaluation not only provided insight into the extent to which the program's goals and outcomes had been addressed, but also brought to light many areas of potential improvement. Since the program was ongoing, the program manager wanted to find ways to address some of the program's shortcomings and make improvements before the following summer, thus using the summative findings in a formative manner. The evaluator and program manager decided they wanted to share the evaluation data with the participants as a way to help identify areas of improvement. Since the evaluation findings were not entirely positive, it was important to share the data in a way that allowed participants to draw their own conclusions instead of having an adult appear to point out what they did wrong. It was also important that the Park Crew's first experience with evaluation data was positive and did not leave participants with negative feelings about their work or about evaluation. A situation had to be devised that allowed participants to personally identify their successes and take ownership of what improvements needed to be made. For this reason, a participatory evaluation approach was used to share and make sense of the data with the participants.

Youth participation in evaluation was not new to the KAYSC. In the past, external evaluators used a participatory process with two KAYSC program evaluations. Unfortunately, a participatory approach did not carry over into other evaluations. Staff turnover, the staff's lack of evaluation capacity, and the absence of an internal evaluator made it difficult to establish a commitment to participatory evaluation processes across all KAYSC youth programs. In order for participatory evaluation to become integrated and a routine part of an organization's culture, repeated participatory evaluation processes are required (Cousins & Earl, 1995b).

Various factors have contributed to a recent commitment to youth participation in evaluation within the KAYSC. In 2005, the Science Museum of Minnesota's Department of Evaluation & Research in Learning was established and an internal evaluator, Amy Grack Nelson, was designated to work closely with the KAYSC.
The internal evaluator began to build the evaluation capacity of the KAYSC staff in order to create an organizational culture of evaluation. This work was supported by KAYSC's 2007 strategic plan that stressed a commitment to a collaborative approach to evaluation and the need for evaluation to be a component of all new programs.
Participatory evaluation also aligns with KAYSC's philosophy of transforming youth from passive consumers of knowledge into creators and connectors of knowledge, recognizing that youth can and should be in a constant state of learning and improvement, thinking critically about and evaluating their experiences in and outside of the KAYSC.

Involving Participants in Interpreting Data and Generating Recommendations

Participants were brought into the evaluation process during the data analysis phase of the Park Crew summative evaluation. Since they were not actively involved in the evaluation from the beginning, it was not a full participatory evaluation process, but a youth participatory evaluation approach was used to analyze data and generate recommendations. As a result of the experience, the participants requested to engage in a full participatory evaluation process to formatively evaluate their BBY activities with visitors.

In April 2008, both new Park Crew youth and veteran youth from the previous summer (who were involved in the summative evaluation) participated in four evaluation workshops facilitated by their program manager and internal evaluator (See Figure 1). The workshops had four goals: 1) teach participants about evaluation; 2) give participants an opportunity to make their own meaning of evaluation data; 3) increase participants' understanding of BBY activities; and 4) allow participants to take more ownership and control of their work. Over the course of four three-hour workshops, participants learned about evaluation, reflected on their activities' learning objectives, interpreted the previous summer's evaluation data, and created recommendations for the upcoming summer.

At the end of each workshop, the program manager led the participants through brief, facilitated discussions to provide an opportunity for them to reflect on their experience participating in the evaluation. Through focused questioning from the program manager, participants talked about parts of the participatory workshops that resonated with them, shared feelings that came up throughout the workshops, connected the process to the bigger picture of their work in the KAYSC, and identified ways to move forward. These discussions allowed participants to continually check in with each other and the program manager and to keep the workshops moving in a direction, and at a pace, with which they were comfortable. The success of the participatory workshops became evident when the Park Crew requested to lead their own formative evaluation to improve visitors' experiences with their activities.
Figure 1: Evaluation workshop agenda

Workshops 1 & 2: Reflecting on Experience and Data

The first workshop began with an activity to help the participants gain a deeper understanding of evaluation, why their activities were evaluated, and how the evaluation data were gathered. To explain evaluation in a way that connected to their current experiences and interests, the evaluator led the participants through an activity in which they evaluated a cell phone. The team identified what they wanted teens to be able to do with a cell phone (the objectives). They talked about what information they would need to gather, and how to gather it, to ensure teens were able to successfully use the phone and were satisfied with the product. The evaluator then explained how they could look at the data and create recommendations to communicate to the manufacturer how to improve the phone. Parallels were then drawn between how participants said they would evaluate a phone and how the evaluator evaluated the Park Crew program. The program manager talked about the goals and objectives of the Park Crew, identified the goals of the evaluation, explained the methods the evaluator used to gather data, and emphasized that the participants were brought into the evaluation process to interpret the data and find ways to improve the work of the Park Crew.

During the first two workshops, participants reflected on learning objectives for the activities, discussed what they felt worked well the previous summer and what should be changed, and reviewed the evaluation data. The participants began by reflecting on what they did for each activity. Next they were asked to list what they wanted visitors to learn from the activity. The evaluator then introduced similar data from the previous summer's summative evaluation, where participants had been asked the same question about learning objectives.
Participants added to their list any additional objectives that were mentioned in the summative interviews. This led to a discussion about learning objectives and how activities can be taught to ensure objectives are addressed.

After defining the learning objectives for an activity, participants reflected on their experience leading that activity. The participants were asked to answer two questions for each of their activities:
• What would you keep and why? (e.g., things they enjoyed, visitors enjoyed, worked well, etc.)
• What would you change and why? (e.g., things they did not enjoy, visitors were not interested in, did not work well, etc.)

The team worked in groups of three to come up with as many ideas as they could for each activity. They were encouraged to brainstorm without limitations of cost or feasibility. Participants considered all aspects of doing a program, including the training they received to learn the activities, their own process of learning the activities, setting up the activities in the BBY, leading the activities with visitors, and cleaning up after the activities. The program manager provided encouragement to participants as they worked through the process and reminded them to elaborate on ideas when needed. When the participants finished coming up with their ideas, they shared them with the larger group. Using a sticky wall, participants placed their ideas under the corresponding keep or change column for the activity (See Figure 2). As a large group, they identified any similarities among ideas and created clusters of things that worked well or things they would change.

Figure 2: Photograph of sticky wall with keep and change data for Watershed Walk

After participants reflected on the activity, they were presented with evaluation data for that activity. Instead of seeing the interpretations of the data from the summative evaluation report, the participants were presented with tables and graphs of the data. This allowed them to come to their own conclusions about the data without the evaluator's interpretations influencing their thinking.
The evaluator started the data discussion by leading the team through a brief overview of reading graphs, sampling, means, and medians. This introduction also provided a learning opportunity for participants to apply and improve their math skills. The participants were then shown the data for each corresponding activity. To help them think about and discuss the data, the following questions were posed:
• What are your first impressions of the data?
• Is there anything that surprises you?
• Based on the data, what are some changes you would suggest?
• What would you keep doing the same way?

Participants wrote down additional things they wanted to change or keep based on their interpretation of the data and added them to the sticky wall. During the data discussions, the evaluator and program manager did not interpret the data, but allowed each participant to come up with his or her own interpretations. As needed, the evaluator asked probing questions to help participants correctly read a graph, make comparisons between some of the data, and think about why they were interpreting the data in certain ways.

Participants' interpretations of the data were often similar to those of the evaluator in the summative evaluation report. One example was the interpretation of how effectively the content was delivered for the Watershed Walk activity. While observing this activity, evaluators had recorded the total number of times participants mentioned reasons why something was considered a pollutant and what people could do to prevent that particular kind of pollution. When participants looked at graphs of these data, they were surprised at how infrequently they had talked about the reasons and actions for the various pollutants, because they felt they had covered these topics in all of their demonstrations (See Figure 3). One participant said, "We should be talking about these 100% of the time." The evaluator pointed out that this feeling was also reflected in the interviews from the previous year's summative evaluation, when two-thirds of the participants had said they wanted visitors to learn about these two topics. Participants had identified that they wanted visitors to learn this information, but when they actually carried out the activity, they did not follow through. This discrepancy led to a discussion about objectives and the disconnect between what they saw as objectives and what they were actually teaching visitors. Participants said in some cases they did not talk about certain pollutants because they either did not know the information or did not feel confident enough to talk about it. In order to attain these objectives, participants decided they needed to increase their knowledge about why certain things are considered pollutants and what people can do to prevent them. They created cards for the sticky wall that reflected their expectation of what content should be delivered during every Watershed Walk and their desire for more specific content-related training. Participants ended up creating their own expectations for their performance, expectations the program manager had always had of them but that the participants were now adopting for themselves.
Figure 3: How often participants talked about why something is considered a pollutant (n=27)

There was one instance where participants interpreted the data quite differently from the evaluator and program manager. For the Watershed Walk activity, participants were expected to present the walk every half hour. Between walks, they cleaned up from the previous walk, set up the next walk, or waited for visitors to stop by. One of the expected outcomes of participating in the Park Crew is for participants to acquire skills and confidence interacting with museum visitors. These skills were measured by observing participant interactions with visitors passing by and/or stopping at the watershed model (the basis for the Watershed Walk activity). Observations revealed that participants were not comfortable with the most basic levels of interaction: greeting visitors and inviting them to participate in the activity. Most of the time (86%), participants said nothing to visitors as they passed by, not even "hello." When participants were shown these data, they did not interpret them as a potential area of improvement. Instead they talked about their need to set up or clean up for the activity. It did not occur to any of the participants that they should stop for a moment to at least acknowledge the visitor, if not invite her or him back for the Watershed Walk. This was eye-opening for the program manager. It became clear that training for the BBY needed to include the same customer service training that other museum staff receive, instead of solely focusing on presenting the four activities. This was one case during the workshops where the program manager added an idea to the sticky wall, to make sure the necessity of customer service training did not get lost in the process.

Workshop 3: Generating Recommendations

Participants spent the third workshop drawing connections from the ideas generated during the first two workshops to their future BBY work. The program manager facilitated this process using a consensus-building workshop technique (Bucki, 2008). The primary goal of the consensus-building workshop was to give the participants a voice in creating recommendations for program improvement. While the evaluator and program manager could have developed recommendations on their own, it was important to include participants' thoughts and opinions as part of the democratic process of learning and decision-making.
The workshop began by posing the question, "What do we need to do in order to reach our goals working in the Big Back Yard?" Participants utilized their list of intended objectives for each activity and their keep/change reflections from the first two workshops to answer the question. The participants and program manager individually brainstormed answers to the question and shared their ideas with a partner. The partners chose 8-10 of their best ideas and wrote them down on half sheets of paper. Then they posted several of their ideas on the sticky wall and identified ones with shared meaning. They clustered these into pairs, and each pair became a column, or theme. They placed all their remaining ideas under the column to which each best fit. By the end of the consensus-building workshop, the team had identified and named seven themes of recommendations they wanted to focus on in preparation for their upcoming BBY work: 1) things to change, 2) seeking new information, 3) hands-on field trips, 4) visitor interaction, 5) activity preparation, 6) advertising activities, and 7) space and comfort.

1) Things to change. The team identified a number of large and small changes to each activity that would improve either their ability to teach the activity or the visitor's experience.
2) Seeking new information. The team decided to seek new information on local watershed issues and local fossil history based on where they lacked or desired more content knowledge.
3) Hands-on field trips. They also wanted to increase their content knowledge by going on more field trips.
4) Visitor interaction. In order to improve visitor interactions, participants suggested creating cards for visitor questions they were unable to answer, so that an answer could be sent to the visitor after it was found. Participants also decided they wanted to plan multiple ways to talk to and interact with visitors, as well as survey visitors about their experience in the BBY.
5) Activity preparation. The team desired time to revise the activities' learning objectives and practice teaching the activities to each other.
6) Advertising activities. Responding to occasional low attendance in the BBY, participants proposed new advertising techniques, such as placing one Park Crew staff member inside the museum to recruit visitors and creating and posting additional signs throughout the museum.
7) Space and comfort. Participants identified ways to improve their working conditions, such as adding tents or awnings to their outdoor workspaces.

Workshop 4: Implementing Recommendations

After participants brainstormed, clustered, and named their ideas, the program manager facilitated a final workshop to incorporate the team's recommendations into training sessions for the four BBY activities. He mapped out the remaining shifts until the start of summer and had the participants think about how to best use the shifts to prepare for the BBY season. The team decided to work in small groups to plan training sessions for the four activities. Each group chose one of the activities and was responsible for planning two days of training for the rest of the crew. Participants made sure their training days addressed each of the seven recommendation themes as they related to their activity. This final workshop allowed participants to have a direct say in how they were trained and in what skills and knowledge they needed to feel comfortable teaching their activities.
It was important that participants had the chance to make suggestions and offer feedback on their experience throughout the four participatory evaluation workshops. Rather than simply receiving a mixture of positive and negative feedback from the evaluation report via the program manager, participants were given the opportunity to interpret and respond critically to the evaluation data. The team worked together to make sense of the data in their own way and synthesized it into a comprehensive list of training suggestions for the upcoming season, some of which the program manager might not have considered had he not sought their input. Participants left the workshops with a stronger sense of ownership and control of their work because they were given time to express their opinions, provide their own interpretations of what was observed in the BBY, and create recommendations for improvement, many of which were implemented the following summer.

Involving Participants in the Entire Evaluation Process

The participatory evaluation workshops led to the team's participation in a full participatory evaluation process. After reviewing the Park Crew summative evaluation data, the team realized there was an information gap. One participant said, "Well, we need to talk to visitors 'cuz we want to find out what they think!" Shortly after the workshops, the participants came to the evaluator with their desire to formatively evaluate each of the four BBY activities with visitors that summer. The participants had already identified the goals of their evaluation: to understand what visitors learn from their activities and to improve visitors' experiences. The program manager helped participants identify questions they wanted to ask visitors, which they then shared with the evaluator. The evaluator and participants discussed the questions and the types of information the questions would elicit to ensure they gathered their desired information. Participants also identified additional questions to include in the survey. The participants stressed that they wanted two versions of the survey, one for adults and a "kid-friendly" version. The evaluator formatted the surveys and had the participants review them.

The evaluator trained the participants to collect survey data from visitors participating in BBY activities. Evaluators modeled the process and then shadowed participants during their first day of data collection to offer pointers and be available for questions. After that point, the participants managed the survey data collection. Due to scheduling constraints, the participants did not have time to enter the data, so the evaluator completed that step. The survey data were then shared in a participatory manner, similar to how the participants had engaged with the summative data from the previous summer. The team discussed graphs and tables of data, decided what they wanted to keep or change based on the data, and generated recommendations for their future work in the BBY.

DISCUSSION

Factors That Led to a Successful Participatory Evaluation Experience

The participatory evaluation process was an overall success and a beneficial experience for all involved. The literature identifies a variety of factors that lead to a successful participatory evaluation process, including balanced partnerships, sufficient time allocated to the process, and the program manager's regard for evaluation and interest in including participants in the process (Cousins & Earl, 1992, 1995a; Cousins & Whitmore, 1998; King, 1995).
These factors were present in carrying out the participatory evaluation workshops, leading to the success of the process.
Balanced partnerships are vital to participatory evaluation, with the evaluator and participants bringing their own knowledge and expertise to the process (Cousins & Earl, 1995a; Cousins & Whitmore, 1998). The program manager brought his expertise of working with the participants. He identified strategies to engage participants with the data, kept them on task during the workshops, and offered them encouragement when needed. The evaluator brought her evaluation expertise. She taught participants about evaluation and helped them use their analytical skills to make sense of the data. Participants brought their expertise as front-line staff working directly with visitors. They were able to provide their unique perspective to the interpretation of the data and generate recommendations based on their first-hand knowledge of the activities.

Sufficient time is necessary to engage in participatory evaluation without rushing the process (King, 1995). Originally, the workshops were scheduled to cover two three-hour sessions. However, participants became deeply engaged, often spending considerable amounts of time reflecting on their prior experiences in the BBY. The participants' positive energy and engagement made it clear by the end of the first day that the process of interpreting data and generating recommendations would have to be extended to allow enough time for a meaningful experience. Rushing participants through the workshops could have negatively affected their first participatory evaluation experience and diminished the use of the findings (King, 1998). Because of the program manager's flexibility with the team's schedule, two additional three-hour workshop days were added.

Participants must value evaluation and have an interest in engaging in the evaluation process (Cousins & Earl, 1992; King, 1995). Without the interest of those involved, the evaluator will have a difficult time successfully carrying out a participatory process (King, 1995). The program manager's interest in engaging participants in the evaluation process was vital to ensuring the successful implementation of a participatory evaluation approach. The engagement of the participants during the workshops also contributed to the overall success of the process. Participant engagement was evident in the lengthy discussions of the four activities. It was clear participants wanted to improve their work in the BBY on both a personal and a team level. The interest in and enthusiasm for the process were also evident in the participants' desire to evaluate their own activities with visitors, as well as in the program manager's advocacy of participatory evaluation to other KAYSC staff.

Benefits of the Participatory Evaluation Process

The program manager and youth reaped a number of benefits from their involvement in the participatory evaluation process, as evident from their reflections and behaviors after the workshops. The youth reflected on their experience and the benefits after each workshop and while planning for the upcoming summer. The evaluator and the program manager debriefed on the evaluation process, discussing benefits of the process for them and what they had observed and heard as benefits for the youth.

Benefits to the Program Manager

A major outcome of participatory evaluation is evaluation capacity building of those involved in the process (King & Stevahn, 2002).
The participatory evaluation process provided professional development for the program manager, who had little evaluation experience going into the workshops. The collaboration with the evaluator increased the program manager's understanding of evaluation, his ability to interpret evaluation data and generate recommendations, and his confidence in including youth as participants in future evaluations.

The program manager's increased evaluation capacity led to his desire to engage a different youth team, the Water Crew, in a participatory evaluation process. The team was developing interactive quizzes for an upcoming exhibition and wanted to find out whether the quizzes successfully communicated their desired messages. Having recently completed the participatory evaluation workshops, the program manager suggested some possible evaluation methods, and the team contacted the internal evaluator to help carry out the evaluation. The participants had already identified some questions they wanted to ask visitors, so the evaluator helped them refine their interview questions and trained them in conducting visitor interviews. The Water Crew then carried out cued interviews with visitors. The evaluator offered participants support and guidance as needed. The evaluator and program manager then shared the data with the participants. The participants interpreted the data and identified revisions to make to their quizzes. The evaluation of the Water Crew quizzes is an example of a small-scale participatory evaluation that was prompted by the program manager's evaluation knowledge and capacity.

In addition to increasing his evaluation capacity, the participatory evaluation process provided the program manager with important feedback about the Park Crew's work. Throughout the workshops, participants reflected on their work, talking about what they personally felt worked well and what they would change. Participants also gave thoughtful suggestions on how to better prepare for their work in the BBY and stay motivated throughout the summer. They shed light on aspects of their work the program manager had not considered, such as their lack of confidence in delivering some of the information and their desire for more training opportunities. As a result of the participatory process, the program manager had a deeper understanding of the participants' experience in the BBY and could proactively identify and respond to their needs.

Benefits to the Participants

The participatory evaluation process also increased participants' understanding of evaluation. Meaningful involvement in evaluation can help youth develop higher-order thinking skills, specifically analytic and evaluative skills (London et al., 2003). Participants also become more reflective about themselves, their work, and their overall program (Sabo Flores, 2008), as was evident in the Park Crew's deep engagement in the workshops and the thoughtful recommendations they created. Participants' understanding of and comfort with evaluation were also evident in their desire to engage in a formative evaluation process to further inform and improve their BBY activities.

Involving participants in the evaluation process transforms their understanding of their program (Checkoway et al., 2003). The workshops served the dual purpose of program improvement and training for the upcoming year. Some of the participants were new to the Park Crew and had not led activities in the BBY before.
By taking part in such a thorough reflection on the team's previous work and in planning for the upcoming summer, new participants left the workshops well poised to begin training. One participant shared his increased understanding: "I know more about the things we do in the Big Back Yard."
Participant comments also reflected their increased level of knowledge about the activities and their confidence in sharing that knowledge with visitors. One participant said, "I feel very confident now," while another shared, "I feel we are going to do really good this summer." The team's increased knowledge and confidence created a shared positive attitude toward their BBY work, which provided a motivational springboard for the training work.

Participants experience a sense of empowerment and pride when they have an influence on the way programs are run and see their ideas acted upon (Checkoway et al., 2003; Horsch et al., 2002; London et al., 2003). The participants generated thoughtful suggestions for how they could improve their work experience, but they needed outlets to implement their ideas in order to feel empowered by the process. The program manager, rather than planning and leading the BBY training himself, had the participants use their suggestions to develop their own training sessions. Participants also shared their ideas with a museum operations staff member who was responsible for BBY operations. The team suggested ways to increase the number of BBY visitors, as well as amenities that would allow them to do their job more effectively. The museum operations staff member implemented several ideas that were realistic given time and financial constraints. Participants were able to see their suggestions put in place over the course of the summer. They learned firsthand how to make changes within a large organization and realized they shared a collective voice as valued staff members.

Participants also made connections that helped them understand the evaluation work that occurs within the museum. Initially, some of the participants were a bit uncomfortable being observed by the evaluation staff as part of the summative evaluation. After meeting the evaluator and engaging with the evaluation data, they had a deeper understanding of what observations can reveal about a program. One participant expressed a renewed sense of comfort with the evaluation process, sharing, "I feel being watched paid off in the long run." The team learned the importance of evaluation in a museum setting and the different ways evaluators help improve museum programs and projects. It was empowering for participants to feel a deeper tie to the museum by knowing more about the various types of work done in the institution.

CONCLUSION

As a result of using a participatory evaluation approach to involve youth in evaluation, the KAYSC, and more broadly the Science Museum of Minnesota, now have a format for engaging participants in evaluation practice and training them to be more self-reflective in their work. A sustainable cycle of formative evaluation (evaluation, reflection, change, and more evaluation) has been developed for a group of participants who previously had little or no knowledge of evaluation practice. The internal evaluator and program manager's partnership increased their confidence and interest in participatory evaluation, which is evident in their use of the process with other KAYSC evaluations. Additionally, the successful Park Crew participatory evaluation experiences influenced the KAYSC's commitment to using a youth participatory evaluation approach with future evaluations.

Although this article focused on youth participatory evaluation, participatory processes can be used with all types of primary users.
Participatory evaluation is also not limited to program evaluation.
Exhibit evaluation can also benefit from a participatory evaluation approach. Partnering with primary intended users, building their evaluation capacity, and increasing their level of participation in carrying out various phases of a formative evaluation enhances their understanding and ownership of the findings, which in turn leads to increased use of the data to improve their exhibit or program (Cousins & Earl, 1995a; Cousins & Whitmore, 1998; King & Stevahn, 2002; Patton, 2008). Users also gain a deeper understanding of evaluation practice and enhance their evaluative thinking skills through direct involvement in carrying out the evaluation, offering input, and reflecting on the process (Cousins & Earl, 1992). Primary intended users become more reflective of their work throughout the exhibit or program development process and develop the capacity to become more active participants in future evaluations (King, 1995).

Note

1. To obtain a copy of the KAYSC BBY Park Crew Summative Evaluation Report, which also includes the participants' recommendations, visit www.smm.org/researchandeval

Acknowledgments

The Park Crew and related evaluation work are funded by the National Center for Earth-surface Dynamics, a National Science Foundation Science and Technology Center. We thank the Science Museum of Minnesota's KAYSC and Department of Evaluation & Research in Learning staff for their feedback and support throughout the project. We also express our gratitude to the 2007 and 2008 Park Crew youth staff for their energetic participation and openness to the youth participatory evaluation process.

References

Bucki, J. S. (2008). Participatory leadership skills. St. Paul, MN: GoodWorkTools.com.
Checkoway, B., Dobbie, E., & Richards-Schuster, K. (2003). The Wingspread Symposium: Involving young people in community evaluation research. CYD Journal, 4(1), 7-11.
Cousins, J. B., & Earl, L. (1992). The case for participatory evaluation. Educational Evaluation and Policy Analysis, 14, 397-418.
Cousins, J. B., & Earl, L. (1995a). The case for participatory evaluation: Theory, research, practice. In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in education: Studies in evaluation use and organizational learning (pp. 3-18). London: Falmer.
Cousins, J. B., & Earl, L. (1995b). Participatory evaluation in education: What do we know? Where do we go? In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in education: Studies in evaluation use and organizational learning (pp. 159-180). London: Falmer.
Cousins, J. B., & Whitmore, E. (1998). Framing participatory evaluation. New Directions for Evaluation, 80, 5-23.
Horsch, K., Little, P., Chase Smith, J., Goodyear, L., & Harris, E. (2002). Youth involvement in evaluation and research (Issues and Opportunities in Out-of-School Time Evaluation No. 1). Retrieved December 14, 2008, from http://www.hfrp.org/content/download/1093/48598/file/issuebrief1.pdf
King, J. A. (1995). Involving practitioners in evaluation studies: How viable is collaborative evaluation in schools? In J. B. Cousins & L. Earl (Eds.), Participatory evaluation in education: Studies in evaluation use and organizational learning (pp. 159-180). London: Falmer.
King, J. A., & Stevahn, L. (2002). Three frameworks for considering evaluator role. In K. E. Ryan & T. A. Schwandt (Eds.), Exploring evaluator role and identity (pp. 1-16). Greenwich, CT: Information Age Publishing.
London, J., Zimmerman, K., & Erbstein, N. (2003). Youth-led research and evaluation: Tools for youth, organizational, and community development. New Directions for Evaluation, 98, 33-45.
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage.
Sabo Flores, K. (2008). Youth participatory evaluation: Strategies for engaging young people. San Francisco, CA: Jossey-Bass.
Stufflebeam, D. L. (1980). An interview with Daniel L. Stufflebeam. Educational Evaluation and Policy Analysis, 2(4), 90-92.
The Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage.
Velure Roholt, R., & Steiner, M. (2005). "Not your average workplace": The Youth Science Center, Science Museum of Minnesota. Curator, 48(2), 141-157.

About the Authors

Amy Grack Nelson is an Evaluation and Research Associate in the Science Museum of Minnesota's Department of Evaluation and Research in Learning. Email: agnelson@smm.org

Robby Callahan Schreiber is a Youth Program Manager in the Science Museum of Minnesota's Kitty Andersen Youth Science Center. Email: rschreiber@smm.org