Learning Design and Formative Evaluation
A Paper Written for Week 7 of the OLDS MOOC 2013

Thomas C. Reeves, Ph.D., The University of Georgia
Yishay Mor, Ph.D., Open University of the UK

Some aspects of “evaluation,” the focus of Week 7, have been woven into the fabric of the first six weeks of the OLDS MOOC: 1) Initiate, 2) Inquire, 3) Ideate, 4) Connect, 5) Prototype, and 6) Curate. For example, as a learning designer, you would normally review existing educational resources before setting off to develop your own during the Initiate and Inquire phases of a learning design project. The success of the Prototype phase of a learning design initiative is heavily dependent upon evaluative activities. However, regardless of how much upfront analysis and investigation you do, your initial design concepts and prototypes will almost always be subject to improvement. This is where formative evaluation comes in.
Formative evaluation is the essential “life-force” of the learning design process. It is “the systematic collection of information for the purpose of informing decisions to design and improve the product.”1 Virtually everything about a learning design can be enhanced at some stage of its development. Sometimes all that is needed for improvement is a flash of creative insight, but more often than not learning designers require specific information to guide decisions about how to improve the learning design as it is being designed, developed, and implemented. As described below, you can collect this information using many different methods from a variety of people, ranging from subject matter experts to members of the intended learning population. Within the context of the OLDS MOOC itself, peer evaluation is especially relevant, given that you have been asked to review the learning design artifacts of others at virtually every step of the process.


Objectives
A careful reading of this paper will enable you to:

    •   identify decisions involved in the formative evaluation of a learning design;
    •   specify questions that should be answered before making these decisions about improving a learning design;
    •   identify the information needed to answer these questions; and
    •   decide how to collect and report the required information so that a learning design can be improved in a timely manner.




Formative Evaluation Strategies
As illustrated in Figure 1, different types of decisions must be made when you attempt to
improve a learning design, each of which is tied to one or more specific questions that can
be addressed by formative evaluation activities, such as observations and interviews.


Decisions                                   Example Questions

Should the interface proposed for an        Is navigation clear to learners?
Open Educational Resource (OER) be          Are the meanings of icons clear?
redesigned?                                 Do learners get lost in navigating through the OER?

Should the number and length of             Which activities do learners engage in most frequently?
activities in a learning design be          To what extent do learners exhibit a lack of engagement
decreased?                                  in specific activities?

Should more practice opportunities          Do learners pass quizzes?
be added to the learning design?            Do learners exhibit mastery of intended objectives?
                                            Do learners rate interactive components highly?

Should the learning design scope            Is the learning design aligned with curricular guidelines?
be expanded?                                Do content experts rate the OER as comprehensive?

                Figure 1. Typical decisions and questions in a formative evaluation.
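
If you keep track of your evaluation work in a spreadsheet or a small script, the mapping in Figure 1 can be held as a simple data structure so that every piece of evidence is attached to the decision it informs. The following Python sketch is purely illustrative and is not part of the paper or of any particular toolkit; the decisions, questions, and findings shown are hypothetical.

    # Illustrative sketch: the decision/question pairs of Figure 1 as a dictionary,
    # with a place to record findings from observations and interviews.
    evaluation_plan = {
        "Should the OER interface be redesigned?": {
            "questions": ["Is navigation clear to learners?",
                          "Are the meanings of icons clear?",
                          "Do learners get lost in navigating through the OER?"],
            "findings": [],
        },
        "Should more practice opportunities be added?": {
            "questions": ["Do learners pass quizzes?",
                          "Do learners exhibit mastery of intended objectives?"],
            "findings": [],
        },
    }

    # Record an observation against the decision it informs.
    evaluation_plan["Should the OER interface be redesigned?"]["findings"].append(
        "3 of 5 learners could not locate the navigation arrows."
    )

    for decision, entry in evaluation_plan.items():
        print(f"{decision}: {len(entry['findings'])} finding(s) recorded")

Even a structure this small makes it obvious when a decision is being made without any supporting evidence.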

Some people resist evaluation. However, there is an ethical imperative to evaluate a learning design while it is being developed. After all, learning designs are intended to change people, to modify their knowledge, skills, attitudes, and intentions as well as to influence them to behave differently. The risk of misguiding learners with an untested learning design is too great, and therefore formative evaluation can be viewed as a morally responsible activity.
In her classic book Formative Evaluation for Educational Technologies,1 Barbara Flagg identified six reasons why people resist formative evaluation:
    •    Time – In the rush to meet project deadlines, reducing or eliminating formative evaluation activities is perceived as an easy way to save time.
    •    Money – Most learning design budgets, if there even is a budget, fail to provide sufficient funding for rigorous formative evaluation. (At least 10% of a project budget should be allocated to formative evaluation.)
    •    Human Nature – Many learning designers are reluctant to subject their innovations to potential criticism, especially from learners they may view as uninformed or from experts they may view as threatening.
    •    Unrealistic Expectations – Although formative evaluation can provide information to guide decision-making, it cannot substitute for the expertise and creativity of a qualified learning designer. In short, you cannot just toss together a rough prototype of a design and expect formative evaluation alone to turn it into a winner.




    •    Measurement Difficulties – Although some aspects of formative evaluation are relatively easy to determine (e.g., investigating whether learners think various parts of a learning design are engaging), there is a lack of reliable, valid, and feasible methods for evaluating certain kinds of learning outcomes that a particular design may address, e.g., problem-solving.
    •    Knowledge – Formative evaluation expertise is still not widely available within the learning design community. Many designers lack the knowledge and skills to conduct systematic formative evaluation in an efficient and effective manner.
Engaging in formative evaluation should result in an overall reduction in development and
implementation costs over the lifespan of a learning design initiative. (“Costs” encompass far
more than just currency, and within the context of learning design often are best calculated
in terms of the “sweat equity” and time that you put into a design effort.) Therefore, you
should evaluate “early and often.” The sooner formative evaluation is conducted during a
learning design initiative, the more likely that substantive improvements will be made and
errors avoided.
Decisions and Questions
As a learning designer, you will produce draft design documents and prototypes of various types. Each of these represents an opportunity for making important decisions about enhancing the effectiveness of the final learning design. Should you increase the level of the program’s objectives? How can the design be made more engaging for learners? Should more assessment be incorporated into the design? These and other decisions will be faced by you and other members of your design team.
The impetus to make decisions about improving a learning design will come from many directions. You may see an interactive program developed by someone else that inspires a new interface idea. You may find out new information about the interests and knowledge of your prospective learners. Your budget, if any, may be cut, thereby requiring you to reduce the more expensive elements of an interactive program, such as animation. These and other factors will signal that there is room (and often a need) for improvement in a prototype design.
Of course, formative evaluation is not something to be initiated only when there is a crisis such as a budget cut. Instead, it is a professional practice integral to the overall learning design process. What’s more, a formative evaluation perspective is no less important for those involved in implementing a learning design. The bottom line is that all of us are human, and our first efforts to create or implement learning designs are bound to be somewhat flawed. Formative evaluation is essential to detecting and reducing these flaws and eventually attaining the high quality we all desire.
Each decision can inspire different types of questions about improving the learning design. Do learners understand the structure of the design and what their options are at any given moment? Does the learning design maintain the learners’ attention? Do they accomplish the learning objectives? Is it feasible to implement the program as designed? It is too late to wait until you have completed a learning design to ask these types of questions. Instead, they must be addressed throughout the creation and development of the design.
There are no universal criteria for the formative evaluation of learning designs, but some of the most commonly considered factors are functionality (To what extent does the program work as designed?), usability (To what degree is the program user-friendly for the intended learners?), appeal (How much do learners like it and why?), accessibility (To what degree can the learning design be used fully by all potential learners?), and effectiveness (What do learners actually learn?). Different criteria entail many different types of questions. For example, usability implies criteria that can be broken down into smaller issues such as the quality of the user interface or the meaningfulness of icons. The user interface can be further divided into factors such as navigation, mapping, aesthetics, and control. Finally, a factor like navigation can be examined via several different questions: How do learners navigate through a computer-based learning design? How does their navigation relate to the underlying pedagogy of the design? What parts of the interactive learning resource are underutilized? Where would learners like to go, but don’t seem to know how? Answering these and other questions will provide you with the information you need to enhance the navigational aspects of an interactive learning resource and ultimately improve its usability.
Specific Formative Evaluation Methods
The key to sound formative evaluation is to collect data systematically at critical stages of the learning design’s development and to utilize the findings of each formative evaluation strategy as much as your time and resources allow. The following formative evaluation methods are essential within the context of most learning design initiatives (a brief planning sketch follows the list):
    •   peer review,
    •   expert review,
    •   learner review,
    •   usability testing, and
    •   alpha, beta, and field tests of prototype learning designs.
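
One simple way to plan these methods is to note, for each, when in the project it typically occurs and who takes part. The Python sketch below is only an illustration of such a plan; the stage and participant descriptions paraphrase the explanations given later in this section and should be adapted to your own project.

    # Illustrative planning sketch (not part of the paper): which formative evaluation
    # method happens when, and with whom.
    methods = {
        "peer review":       {"when": "earliest stages onward",        "with": "colleagues and fellow designers"},
        "expert review":     {"when": "draft designs and prototypes",  "with": "content, pedagogy, HCI, and teaching experts"},
        "learner review":    {"when": "working prototypes",            "with": "a sample of the intended learner population"},
        "usability testing": {"when": "earliest design phases onward", "with": "learners and/or trained evaluators"},
        "alpha test":        {"when": "early prototype",               "with": "in-house personnel"},
        "beta test":         {"when": "near-complete version",         "with": "selected samples of learners"},
        "field test":        {"when": "released version",              "with": "learners in real settings"},
    }

    for name, info in methods.items():
        print(f"{name:18s} | {info['when']:32s} | {info['with']}")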
Peer review is most often done during the earliest stages of learning design, but it can continue
through the entire lifespan of a project. It can range from informal activities (asking a colleague to
react to your design ideas over coffee) to more formal activities (providing formative feedback via
a tool like Cloudworks2 to the prototypes developed by others).

Expert review may be the most frequently used formative evaluation strategy. It is important to remember that there are several different kinds of “experts,” and that each type of expert can add unique kinds of information to the review and enhancement process. Content or subject matter experts can help you improve the scope, sequence, and accuracy of a learning design’s content. Learning theory experts can assist by critiquing the potential effectiveness of the pedagogical dimensions of a design. Graphic designers can suggest how to enhance the aesthetics of the look and feel of an interactive program. Expert teachers can help you anticipate the logistical requirements for successful implementation of a learning design in schools or universities.



With respect to formative evaluation, an expert is anyone with specialized knowledge that is relevant to the design and implementation of your learning design. Experts can provide different perspectives on the critical aspects of your program, e.g., its accuracy, completeness, user-friendliness, motivational strategies, aesthetics, instructional validity, effectiveness, efficiency, and feasibility. You should seek to utilize both internal (members of the design team) and external (non-members of the design team) experts to the degree that your resources allow. It is often useful to structure an expert’s review so that you are assured of getting the types and depth of information you desire. Figure 2 presents an expert review form to guide design experts with experience in interactive multimedia when they critique a prototype OER.

If you must limit expert review, content experts are probably the most important expert sources
of formative information in education settings. After all, if you do not get the content right, the
eventual learners will be misled. One of the problems with many learning designs is that they have
poor subject matter integrity because of a lack of expert review. This is a major challenge because
so much material can be incorporated into a single learning design.

Reviewer: Dr. Ima Gladtohelp                                                  Due Date: June 10

Please circle your rating and write comments on each aspect of the Open Educational Resource (OER).

1 represents the lowest and most negative impression on the scale, 3 represents an adequate impression,
and 5 represents the highest and most positive impression. Choose N/A if the item is not appropriate or not
applicable to this OER. Use additional sheets to write comments.

N/A=Not applicable  1=Strongly disagree  2=Disagree  3=Neither agree nor disagree  4=Agree  5=Strongly agree

AREA 1 – LEARNING DESIGN REVIEW
1. This OER design provides learners with a clear knowledge
   of the program objectives.                                           N/A   1   2   3   4   5
2. The learning interactions in this prototype OER design are
   appropriate for the objectives.                                      N/A   1   2   3   4   5
3. The pedagogical design of the OER is based
   on sound learning theory and principles.                             N/A   1   2   3   4   5
4. The feedback in the OER learning design is clear.                    N/A   1   2   3   4   5
5. The pace of the learning interactions is appropriate.                N/A   1   2   3   4   5
6. The difficulty level of the learning design is appropriate.          N/A   1   2   3   4   5

Comments:

AREA 2 – COSMETIC DESIGN REVIEW
7. The screen design of the OER follows sound principles.               N/A   1   2   3   4   5
8. Media are appropriately integrated in this OER.                      N/A   1   2   3   4   5
9. The screen displays are easy to understand.                          N/A   1   2   3   4   5

Comments:

AREA 3 – FUNCTIONALITY REVIEW
10. This OER operated flawlessly.                                       N/A   1   2   3   4   5

Comments:

         Figure 2. Sample expert review form for a prototype learning design or functional OER.
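
Once several experts have returned a form like Figure 2, it helps to tabulate the ratings per item so that low-rated items stand out. The Python sketch below is a minimal, hypothetical illustration of such a tabulation; the reviewer names and scores are invented, and None stands for an N/A response.

    # Illustrative sketch: averaging expert ratings per item, ignoring N/A responses.
    reviews = {
        "Gladtohelp": {1: 4, 2: 3, 3: 5, 4: 2, 5: 4, 6: None},
        "Chen":       {1: 5, 2: 4, 3: 4, 4: 2, 5: 3, 6: 4},
    }

    def item_means(reviews):
        """Average each item across reviewers, skipping N/A (None) responses."""
        totals, counts = {}, {}
        for ratings in reviews.values():
            for item, score in ratings.items():
                if score is None:
                    continue
                totals[item] = totals.get(item, 0) + score
                counts[item] = counts.get(item, 0) + 1
        return {item: totals[item] / counts[item] for item in totals}

    for item, mean in sorted(item_means(reviews).items()):
        flag = "  <-- follow up" if mean < 3 else ""
        print(f"Item {item}: mean rating {mean:.1f}{flag}")

Items flagged this way become candidates for the “action taken” log discussed later in connection with Figure 5.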




In addition to learning design and content experts, people with special expertise in human-computer interface (HCI) design and the aesthetics of online learning programs can provide useful expert reviews for interactive learning resources. For example, if your team doesn’t include an actual art director who is responsible for the look and feel of the design, it may be useful to ask other graphic artists to critique your prototype. Of course, you would not decide to make major changes in the design elements of a learning program based on the opinions of just one or two experts, because aesthetic appeal is more subjective than many other criteria to be reviewed.

Experienced learning designers are often the best experts for reviewing user interface issues, but
there are people who specialize in HCI issues per se. Figure 3 presents a user interface instrument
that can be used to guide reviews provided by expert designers or experienced users of OERs.




                   Figure 3. Sample user interface review form for an OER.

Enlisting the help of the “right” experts for review services is a crucial step in setting up a formative evaluation. If subject matter experts (commonly called SMEs) are already part of your team, one of their primary responsibilities will be checking the accuracy and currency of your content. However, even when working with qualified SMEs, it is a good idea to have the content reviewed by other, external content experts. The costs for SMEs to review learning designs can vary widely depending on the field. Recruiting graduate students and academic staff can be a much less expensive source of expert review.




Learner review is based on the analysis of learner behavior during the use of a learning design. The perspectives of “experts” are valuable, but the opinions of the target audience for your learning design are equally important. It is critical to ensure that learner differences are accommodated to the extent possible, and thus it is important that learner review be done with a sample of people whose background knowledge and expectations approximate those of the final intended learners. During learner review, learners should be allowed to work in realistic conditions, with minimal interruption from an observer, in order to accurately replicate the intended context of use.

Suppose you are designing an interactive educational resource such as a game for use in schools. In that case, valuable information for making decisions about improving the game can be derived from systematic observations of learners while they use it. Observations can be conducted at your development site or on-site at a school. Learner opinions, actions, responses, and suggestions will provide you with practical guidance in making decisions about improving the game. Of course, you would also want the teachers who must implement the learning game to review it, especially with respect to seeking their ideas about how they could integrate it into their existing practices. Widening the review process to include parents, administrators, and other school specialists is also advised in this context. Few innovative game designs have been successfully integrated into schools, a problem that might have been reduced by more inclusive formative evaluation.

Observations of learners engaging with your prototype learning design at various stages of its development can be a valuable, if somewhat humbling, experience. You may be surprised at how frequently what seemed to be the most user-friendly aspects of your program befuddle would-be learners. Alternatively, what you view as motivating may bore the intended audience. Fortunately, you will often find that your creative design ideas are validated by learners. A sample protocol from the Apple Interface Laboratory is given in Figure 4. During your observations, you will see learners doing things you never expected. When you see learners making mistakes, your first instinct may be to blame the mistakes on their inexperience or lack of intelligence. This is wrong. The purpose of observing learners is to see what parts of your program might be difficult or ineffective. Therefore, if you observe someone struggling or making mistakes, attribute the difficulties to faulty design, not to the learner.

Observing learners can be a time-intensive and exhausting process. It can range from a very simple one-on-one observation protocol to a complex arrangement wherein several observers, video cameras, and computers are used to record learners’ reactions. Whatever type of procedure is followed, it is important that you record information carefully and that you later deal with each issue that arises during the observations.

Figure 5 presents a simple formative evaluation review form with three columns: one for indicating what section of an interactive program is being reviewed, one for recording observations, and the last for recording the actions taken in response to the issues raised by the observations. The last column is very important because it provides the evidence that the evaluation data collected have actually had an impact on design decisions.




User Observation (Based upon Apple HCI Group Protocol)

The following instructions guide you through a simple user observation. With this protocol, you will see where people have difficulty using your interactive program, and you will be able to use that information to improve it.

1 – Introduce yourself. Make the session and task as welcoming as possible. Remember, this user observation is not designed as a controlled experiment, so keep the environment friendly.

2 – Describe the general purpose of the observation. Try to make the participant feel at ease by stressing that you’re trying to find problems in the program. For example, you could say:

    You’re helping us by trying out this e-learning program. We’re looking for places where the program may be difficult to use. If you have trouble with some tasks, it’s the program’s fault, not yours. Don’t feel bad; that’s exactly what we’re looking for. If we can locate the trouble spots, then we can go back and improve the program. Remember, we’re testing the program, not you.

3 – Tell the participant that it’s OK to quit at any time. Make sure you inform participants that they can quit at any time if they find themselves becoming uncomfortable. Participants shouldn’t feel like they’re locked into completing tasks. Say something like this:

    Although I don’t know of any reason for this to happen, if you should become uncomfortable in any way, you are free to quit at any time.

4 – Talk about the equipment. Explain the purpose of each piece of equipment in the observation room and how it will be used in the test (hardware, software, video camera, microphones, etc.).

5 – Explain how to “think aloud.” Ask participants to think aloud during the observation, saying what comes to mind as they work. You’ll find that listening to learners as they engage with your e-learning program provides you with useful information that you can get in no other way. Unfortunately, most people feel awkward or self-conscious about thinking aloud. Explain why you want participants to think aloud, and demonstrate how to do it. You could say:

    We have found that we get lots of information from these informal tests if we ask people to think aloud as they work through the e-learning program. It may be a bit awkward at first, but it’s really very easy once you get used to it. All you have to do is speak your thoughts as you go through the program. If you forget to think aloud, I’ll remind you to keep talking. Would you like me to demonstrate?

6 – Describe why you will not be able to help. It is very important that you allow participants to work with your product without any interference or extra help. If a participant begins having difficulty and you immediately provide help, you may lose the most valuable information you can gain from user observation: where users have trouble, and how they figure out what to do. Of course, there may be situations where you must step in and provide assistance, but you should decide what those situations will be before you begin testing. You may decide that you will allow someone to flounder in the e-learning program for at least 3 minutes before providing assistance. Or you may identify distinct problems you will provide help on. As a rule of thumb, try not to give your test participants any more information than the true users of your product will have. Here are some things you can say to the participant:

    As you’re working through the e-learning program, I won’t be able to provide help or answer questions. This is because we want to create the most realistic situation possible. Even though I won’t be able to answer your questions, please ask them anyway. It’s very important that I capture all your questions and comments on the recording equipment. When you’ve finished all the program, I’ll answer any questions you still have.




7 – Describe the tasks and introduce the program. Explain what the participant should do first, second, third, etc. Give the participant written instructions for the tasks.

    Important: If you need to demonstrate your program before the user observation begins, be sure you don’t demonstrate something you’re trying to test. (For example, if you want to know whether users can figure out how to use certain interactions, don’t show them how to engage with these particular interactions before the test.)

8 – Ask if there are questions. Before you start, make sure the respondent knows your expectations, then begin the observation.

9 – Conclude the observation. When the test is over:

    •    Explain what you were trying to find out during the test.
    •    Answer any remaining questions the participant may have.
    •    Discuss any interesting behaviors you would like the participant to explain.

10 – Use the results. To get the most out of your test results, review all your data carefully and thoroughly (your notes, the video recording, the tasks, etc.). Look for places where participants had trouble, and see if you can determine how your program could be changed to alleviate the problems. Look for patterns in the participants’ behavior that might tell you whether the program is understood correctly.

It’s a good idea to keep a record of what you found out during the test. That way, you’ll have documentation to support your design decisions and you’ll be able to see trends in users’ behavior. After you’ve examined the results and summarized the important findings, fix the problems you found and test the product again. By testing your product more than once, you’ll see how your changes affect users’ performance.

                          Figure 4. Sample User Observation Protocol.
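
One practical way to act on Step 10 is to timestamp your observation notes and tally them by program section, so the trouble spots surface quickly. The Python sketch below is a hypothetical illustration only; the helper name and the sample notes are invented, not part of the Apple protocol.

    # Illustrative sketch: timestamped observation notes, tallied per program section.
    from collections import Counter
    from datetime import datetime

    session_notes = []  # each note: (timestamp, screen_id, note)

    def log_note(screen_id, note):
        """Record a timestamped observation made while watching a participant."""
        session_notes.append((datetime.now(), screen_id, note))

    log_note("C-17", "Could not place cursor on the small navigation arrow.")
    log_note("C-17", "Clicked the arrow three times before it registered.")
    log_note("C-23", "Expected the question mark icon to open help.")

    trouble_spots = Counter(screen for _, screen, _ in session_notes)
    for screen, count in trouble_spots.most_common():
        print(f"{screen}: {count} issue(s) noted")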




                                       Formative Review Log
Program: Learn or else!                    Reviewer: Smyth                           Date: May 15

Screen    Comments, Questions, Suggestions                    Action Taken
C-17      The forward and back navigation arrows are so       Enlarge the navigation arrows by 50%
          small that learners seem to have trouble placing    and repeat observations.
          the mouse cursor on them.

C-23      The learners think that the “Question Mark” icon    Use the “Question Mark” icon for help,
          will take them to help, but it takes them to a      and find a different icon for the
          list of frequently asked questions instead.         frequently asked questions.

                             Figure 5. Formative evaluation review log for an OER.
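
The same three-column log can be kept electronically, which makes it easy to check that every recorded issue eventually receives an “action taken” entry. The Python sketch below is only a minimal illustration of such a log; the entries mirror Figure 5 and the unresolved item is hypothetical.

    # Illustrative sketch: the Figure 5 review log as a list of records, with a check
    # for issues that still lack an "action taken" entry.
    review_log = [
        {"screen": "C-17",
         "comment": "Navigation arrows are so small that learners have trouble hitting them.",
         "action":  "Enlarge the navigation arrows by 50% and repeat observations."},
        {"screen": "C-23",
         "comment": "Learners expect the question mark icon to open help, not the FAQ.",
         "action":  None},   # not yet resolved
    ]

    open_issues = [entry for entry in review_log if not entry["action"]]
    print(f"{len(open_issues)} issue(s) still awaiting an action:")
    for entry in open_issues:
        print(f"  {entry['screen']}: {entry['comment']}")

The “action taken” column, whether on paper or in code, is the evidence that the formative evaluation actually shaped the design.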

Usability testing is an important method in the design of interactive programs. User interface
expert Ben Shneiderman3 maintains that usability is a combination of the following characteristics:

     •    ease of learning,
     •    high speed of task performance,
     •    low error rate,
     •    subjective satisfaction, and
     •    learner retention over time.
Too many formative evaluations focus only on whether users like a program or not, but usability is a much deeper subject. There are instances when you might evaluate usability without learners. Time with learners is often limited; it is not a free resource. In addition, learners may find it difficult to visualize how a prototype learning design such as an OER could be designed differently, and they therefore may tend to evaluate according to what already exists rather than to what is possible. Some usability criteria will only be reliably identified or articulated by trained evaluators using protocols such as heuristic evaluation. At least three evaluators with a mix of experience and expertise are required for heuristic evaluation because fewer will not be able to identify all the usability problems. Other types of usability evaluation will require more participants.4
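
The reason for using several heuristic evaluators is that each one typically finds only a subset of the usability problems, so pooling their reports gives much broader coverage. The Python sketch below illustrates this pooling with invented evaluator names and problem labels; it is not a prescribed procedure, just a way of seeing the overlap.

    # Illustrative sketch: pooling the problems reported by three heuristic evaluators.
    evaluator_findings = {
        "evaluator_a": {"icons unclear", "no exit from quiz", "low-contrast text"},
        "evaluator_b": {"icons unclear", "feedback too slow"},
        "evaluator_c": {"no exit from quiz", "feedback too slow", "inconsistent navigation"},
    }

    all_problems = set().union(*evaluator_findings.values())
    best_single = max(len(problems) for problems in evaluator_findings.values())
    print(f"Problems found by the best single evaluator: {best_single}")
    print(f"Problems found by pooling all {len(evaluator_findings)} evaluators: {len(all_problems)}")
    for problem in sorted(all_problems):
        found_by = [name for name, probs in evaluator_findings.items() if problem in probs]
        print(f"  {problem}: reported by {len(found_by)} of {len(evaluator_findings)} evaluators")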

As part of the learning design process, prototypes are commonly developed and tested. When developing any type of digital learning resource, it is important to begin usability evaluation at the earliest phases of design because, if it is left until just before an interactive program is released, there will be little chance to make any significant design changes.

An alpha version of a design is used in the earliest tests, conducted when the product is still at the early prototype stage. Alpha testing is generally done with in-house personnel. Beta versions are released to selected samples of learners for testing. The product is more refined at this point, nearing completion, but still needing to be tested and refined. The field version of a program is supposed to be flawless, although anyone who has bought version 1.0 of a new piece of software realizes that this is rarely the case. The persistence of problems (bugs) is why testing even field versions of an interactive program is important.

Finally, although this brief paper is focused on formative evaluation in the context of learning design, it is worth noting that formative evaluation is also a major component of the more formal genre of R&D activity known as educational design research5 (also referred to as design-based research or simply design research). Educational design research (EDR) entails the investigation of learning in practical educational settings by addressing a significant learning problem through the close collaboration of teachers and researchers. EDR projects usually last two or more years. EDR has two primary goals: first, to create an effective intervention that directly addresses an important problem, e.g., a serious learning game intended to increase the interest of young minority students in careers in science and engineering; and second, to extend theoretical knowledge related to learning design, e.g., the pedagogical principles underlying the design and use of serious games. The second outcome is usually represented in the form of design principles that others can use when they engage in learning design and/or EDR. It is this second goal of knowledge production that distinguishes EDR from learning design.


Conclusion
This paper has introduced you to the concept of formative evaluation. There is a lot more to it
than can be included in this brief paper. You may wish to consult the references6 listed below.


References
1. Flagg, B. N. (1990). Formative evaluation for educational technologies. Hillsdale, NJ: Lawrence Erlbaum Associates.

2. Conole, G., Culver, J., Williams, P., Cross, S., Clark, P., & Brasher, A. (2009). Cloudworks: Social networking for learning design. Australasian Journal of Educational Technology, 25(5), 763-782.

3. Shneiderman, B., & Plaisant, C. (2010). Designing the user interface: Strategies for effective human-computer interaction (5th ed.). Reading, MA: Addison-Wesley.

4. Hwang, W., & Salvendy, G. (2010). Number of people required for usability evaluation: The 10±2 rule. Communications of the ACM, 53(5), 130-133.

5. McKenney, S. E., & Reeves, T. C. (2012). Conducting educational design research. New York: Routledge.

6. Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.





More Related Content

What's hot

What does good course design look like to you - Alex Wu, Blackboard
What does good course design look like to you - Alex Wu, BlackboardWhat does good course design look like to you - Alex Wu, Blackboard
What does good course design look like to you - Alex Wu, BlackboardBlackboard APAC
 
Instructional design
Instructional designInstructional design
Instructional designMandira Shahi
 
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...SURF Events
 
Revisiting Backwards Design Table
Revisiting Backwards Design TableRevisiting Backwards Design Table
Revisiting Backwards Design Tablesandrasawaya
 
Blackboard Analytics for Learn: A recipe for success
Blackboard Analytics for Learn: A recipe for successBlackboard Analytics for Learn: A recipe for success
Blackboard Analytics for Learn: A recipe for successRichard Stals
 
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...eMadrid network
 
Using Blackboard Mobile Learn to develop research skills through authentic le...
Using Blackboard Mobile Learn to develop research skills through authentic le...Using Blackboard Mobile Learn to develop research skills through authentic le...
Using Blackboard Mobile Learn to develop research skills through authentic le...Blackboard APAC
 
Implementing an Online Learning Initiative
Implementing an Online Learning InitiativeImplementing an Online Learning Initiative
Implementing an Online Learning InitiativeAndy Petroski
 
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...Ioana Jivet
 
Mobile Learning Keynote
Mobile Learning KeynoteMobile Learning Keynote
Mobile Learning Keynotesandrasawaya
 
Preso slidesiste2012web
Preso slidesiste2012webPreso slidesiste2012web
Preso slidesiste2012webBill Dolton
 
Micro Instructional Design for Problem-Based and Game-Based Learning
Micro Instructional Design for Problem-Based and Game-Based LearningMicro Instructional Design for Problem-Based and Game-Based Learning
Micro Instructional Design for Problem-Based and Game-Based LearningAndy Petroski
 
Media business education project
Media business education projectMedia business education project
Media business education projectVictoria Grace Walden
 
eportfolios as catalyst: NLI Workshop Spring 2015
eportfolios as catalyst: NLI Workshop Spring 2015eportfolios as catalyst: NLI Workshop Spring 2015
eportfolios as catalyst: NLI Workshop Spring 2015Marc Zaldivar
 
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3Flexibel leren op locatie met apps, mobile devices en weararables_deel 3
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3SURF Events
 
The First Principles of Instruction
The First Principles of InstructionThe First Principles of Instruction
The First Principles of InstructionAndy Petroski
 
higher thinking skills-ed.tech2
higher thinking skills-ed.tech2higher thinking skills-ed.tech2
higher thinking skills-ed.tech2Reynold Bangalisan
 

What's hot (20)

What does good course design look like to you - Alex Wu, Blackboard
What does good course design look like to you - Alex Wu, BlackboardWhat does good course design look like to you - Alex Wu, Blackboard
What does good course design look like to you - Alex Wu, Blackboard
 
Instructional design
Instructional designInstructional design
Instructional design
 
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...
'Een praktische toolkit voor blended learning' - Chris Rouwenhorst & Martine ...
 
Revisiting Backwards Design Table
Revisiting Backwards Design TableRevisiting Backwards Design Table
Revisiting Backwards Design Table
 
Blackboard Analytics for Learn: A recipe for success
Blackboard Analytics for Learn: A recipe for successBlackboard Analytics for Learn: A recipe for success
Blackboard Analytics for Learn: A recipe for success
 
Addie
AddieAddie
Addie
 
Gps mod 4 v2.1
Gps mod 4 v2.1Gps mod 4 v2.1
Gps mod 4 v2.1
 
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...
2021_06_30 «Increasing Student Interaction with Formal Languages using Progra...
 
Using Blackboard Mobile Learn to develop research skills through authentic le...
Using Blackboard Mobile Learn to develop research skills through authentic le...Using Blackboard Mobile Learn to develop research skills through authentic le...
Using Blackboard Mobile Learn to develop research skills through authentic le...
 
01 welcome
01 welcome01 welcome
01 welcome
 
Implementing an Online Learning Initiative
Implementing an Online Learning InitiativeImplementing an Online Learning Initiative
Implementing an Online Learning Initiative
 
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu...
 
Mobile Learning Keynote
Mobile Learning KeynoteMobile Learning Keynote
Mobile Learning Keynote
 
Preso slidesiste2012web
Preso slidesiste2012webPreso slidesiste2012web
Preso slidesiste2012web
 
Micro Instructional Design for Problem-Based and Game-Based Learning
Micro Instructional Design for Problem-Based and Game-Based LearningMicro Instructional Design for Problem-Based and Game-Based Learning
Micro Instructional Design for Problem-Based and Game-Based Learning
 
Media business education project
Media business education projectMedia business education project
Media business education project
 
eportfolios as catalyst: NLI Workshop Spring 2015
eportfolios as catalyst: NLI Workshop Spring 2015eportfolios as catalyst: NLI Workshop Spring 2015
eportfolios as catalyst: NLI Workshop Spring 2015
 
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3Flexibel leren op locatie met apps, mobile devices en weararables_deel 3
Flexibel leren op locatie met apps, mobile devices en weararables_deel 3
 
The First Principles of Instruction
The First Principles of InstructionThe First Principles of Instruction
The First Principles of Instruction
 
higher thinking skills-ed.tech2
higher thinking skills-ed.tech2higher thinking skills-ed.tech2
higher thinking skills-ed.tech2
 

Similar to OLDS MOOC Week 7: Formative evaluation paper

Project Based Learning- Ashish K Chaurdia
Project Based Learning- Ashish K ChaurdiaProject Based Learning- Ashish K Chaurdia
Project Based Learning- Ashish K ChaurdiaDipayan Sarkar
 
Ch. 11 designing and conducting formative evaluations
Ch. 11 designing and conducting formative evaluationsCh. 11 designing and conducting formative evaluations
Ch. 11 designing and conducting formative evaluationsEzraGray1
 
E Learning Course Design
E Learning Course DesignE Learning Course Design
E Learning Course DesignHidayathulla NS
 
Addie instructional design model
Addie instructional design modelAddie instructional design model
Addie instructional design modelMichael Payne
 
addiemodel-2444430216152445-202f3279.pdf
addiemodel-2444430216152445-202f3279.pdfaddiemodel-2444430216152445-202f3279.pdf
addiemodel-2444430216152445-202f3279.pdfGamal Mansour
 
Addie Model .pptx
Addie Model .pptxAddie Model .pptx
Addie Model .pptxsonuodm
 
Design Thinking
Design ThinkingDesign Thinking
Design ThinkingKim Moore
 
Instructional design
Instructional designInstructional design
Instructional designVICTOR EKPO
 
Www whidbey com_frodo_isd_htm
Www whidbey com_frodo_isd_htmWww whidbey com_frodo_isd_htm
Www whidbey com_frodo_isd_htmElsa von Licy
 
ADDIE- An Instructional Systems Design Model
ADDIE- An Instructional Systems Design ModelADDIE- An Instructional Systems Design Model
ADDIE- An Instructional Systems Design Modeleshikachattopadhyay
 
Chapter 4Conducting a Needs AssessmentKnowing what is
Chapter 4Conducting a Needs AssessmentKnowing what is Chapter 4Conducting a Needs AssessmentKnowing what is
Chapter 4Conducting a Needs AssessmentKnowing what is WilheminaRossi174
 
Instructional Design
Instructional DesignInstructional Design
Instructional DesignIain Doherty
 
Design journal
Design journalDesign journal
Design journalR. Sosa
 
Isd basics stc
Isd basics stcIsd basics stc
Isd basics stcRobert McWatt
 
Addie model
Addie modelAddie model
Addie modelSDM1946
 
Design Thinking Workshop STLinSTL
Design Thinking Workshop STLinSTLDesign Thinking Workshop STLinSTL
Design Thinking Workshop STLinSTLlmittler
 

Similar to OLDS MOOC Week 7: Formative evaluation paper (20)

M1-Designing-SLOs-13NOV13
M1-Designing-SLOs-13NOV13M1-Designing-SLOs-13NOV13
M1-Designing-SLOs-13NOV13
 
M1 Designing Trainer Notes
M1 Designing Trainer NotesM1 Designing Trainer Notes
M1 Designing Trainer Notes
 
Instructional models
Instructional modelsInstructional models
Instructional models
 
Project Based Learning- Ashish K Chaurdia
Project Based Learning- Ashish K ChaurdiaProject Based Learning- Ashish K Chaurdia
Project Based Learning- Ashish K Chaurdia
 
Ch. 11 designing and conducting formative evaluations
Ch. 11 designing and conducting formative evaluationsCh. 11 designing and conducting formative evaluations
Ch. 11 designing and conducting formative evaluations
 
E Learning Course Design
E Learning Course DesignE Learning Course Design
E Learning Course Design
 
Addie instructional design model
Addie instructional design modelAddie instructional design model
Addie instructional design model
 
The ADDIE Model
The ADDIE ModelThe ADDIE Model
The ADDIE Model
 
addiemodel-2444430216152445-202f3279.pdf
addiemodel-2444430216152445-202f3279.pdfaddiemodel-2444430216152445-202f3279.pdf
addiemodel-2444430216152445-202f3279.pdf
 
Addie Model .pptx
Addie Model .pptxAddie Model .pptx
Addie Model .pptx
 
Design Thinking
Design ThinkingDesign Thinking
Design Thinking
 
Instructional design
Instructional designInstructional design
Instructional design
 
Www whidbey com_frodo_isd_htm
Www whidbey com_frodo_isd_htmWww whidbey com_frodo_isd_htm
Www whidbey com_frodo_isd_htm
 
ADDIE- An Instructional Systems Design Model
ADDIE- An Instructional Systems Design ModelADDIE- An Instructional Systems Design Model
ADDIE- An Instructional Systems Design Model
 
Chapter 4Conducting a Needs AssessmentKnowing what is
Chapter 4Conducting a Needs AssessmentKnowing what is Chapter 4Conducting a Needs AssessmentKnowing what is
Chapter 4Conducting a Needs AssessmentKnowing what is
 
Instructional Design
Instructional DesignInstructional Design
Instructional Design
 
Design journal
Design journalDesign journal
Design journal
 
Isd basics stc
Isd basics stcIsd basics stc
Isd basics stc
 
Addie model
Addie modelAddie model
Addie model
 
Design Thinking Workshop STLinSTL
Design Thinking Workshop STLinSTLDesign Thinking Workshop STLinSTL
Design Thinking Workshop STLinSTL
 

More from Yishay Mor

Education as a design practice and a design science
Education as a design practice and a design scienceEducation as a design practice and a design science
Education as a design practice and a design scienceYishay Mor
 
Simon Nelson: FutureLearn
Simon Nelson: FutureLearnSimon Nelson: FutureLearn
Simon Nelson: FutureLearnYishay Mor
 
Paul Hunter: why MOOCs and Executives Don't Mix
Paul Hunter: why MOOCs and Executives Don't MixPaul Hunter: why MOOCs and Executives Don't Mix
Paul Hunter: why MOOCs and Executives Don't MixYishay Mor
 
Sanna Ruhalahti: Wanted - MOOC Pedagogy
Sanna Ruhalahti: Wanted - MOOC PedagogySanna Ruhalahti: Wanted - MOOC Pedagogy
Sanna Ruhalahti: Wanted - MOOC PedagogyYishay Mor
 
OEC Paris Residential: scenarios workshop
OEC Paris Residential: scenarios workshopOEC Paris Residential: scenarios workshop
OEC Paris Residential: scenarios workshopYishay Mor
 
MOOCs for Web Talent
MOOCs for Web TalentMOOCs for Web Talent
MOOCs for Web TalentYishay Mor
 
OpenEducation Challenge Finalists' Workshop: Design Thinking Session
OpenEducation Challenge Finalists' Workshop: Design Thinking SessionOpenEducation Challenge Finalists' Workshop: Design Thinking Session
OpenEducation Challenge Finalists' Workshop: Design Thinking SessionYishay Mor
 
EEE Project Meeting, June 2014
EEE Project Meeting, June 2014EEE Project Meeting, June 2014
EEE Project Meeting, June 2014Yishay Mor
 
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013Yishay Mor
 
How to ruin a mooc
How to ruin a moocHow to ruin a mooc
How to ruin a moocYishay Mor
 
Metis project worskhop design
Metis project worskhop designMetis project worskhop design
Metis project worskhop designYishay Mor
 
Metis project deliverable D3.2: Draft of pilot workshop
Metis project deliverable D3.2: Draft of pilot workshopMetis project deliverable D3.2: Draft of pilot workshop
Metis project deliverable D3.2: Draft of pilot workshopYishay Mor
 
Design narratives
Design narrativesDesign narratives
Design narrativesYishay Mor
 
The Pedagogical Patterns Collector User Guide
The Pedagogical Patterns Collector User GuideThe Pedagogical Patterns Collector User Guide
The Pedagogical Patterns Collector User GuideYishay Mor
 
Learning Design: mapping the landscape
Learning Design: mapping the landscapeLearning Design: mapping the landscape
Learning Design: mapping the landscapeYishay Mor
 
Fels survey
Fels surveyFels survey
Fels surveyYishay Mor
 
Fels mobile learningday-diymobileapps
Fels mobile learningday-diymobileappsFels mobile learningday-diymobileapps
Fels mobile learningday-diymobileappsYishay Mor
 
Local learnscenarios
Local learnscenariosLocal learnscenarios
Local learnscenariosYishay Mor
 
Mobile Learning Day - LocalLearn Scenarios
Mobile Learning Day - LocalLearn ScenariosMobile Learning Day - LocalLearn Scenarios
Mobile Learning Day - LocalLearn ScenariosYishay Mor
 
Social Sciences Mobile Learning Day - the Survey
Social Sciences Mobile Learning Day - the SurveySocial Sciences Mobile Learning Day - the Survey
Social Sciences Mobile Learning Day - the SurveyYishay Mor
 

More from Yishay Mor (20)

Education as a design practice and a design science
Education as a design practice and a design scienceEducation as a design practice and a design science
Education as a design practice and a design science
 
Simon Nelson: FutureLearn
Simon Nelson: FutureLearnSimon Nelson: FutureLearn
Simon Nelson: FutureLearn
 
Paul Hunter: why MOOCs and Executives Don't Mix
Paul Hunter: why MOOCs and Executives Don't MixPaul Hunter: why MOOCs and Executives Don't Mix
Paul Hunter: why MOOCs and Executives Don't Mix
 
Sanna Ruhalahti: Wanted - MOOC Pedagogy
Sanna Ruhalahti: Wanted - MOOC PedagogySanna Ruhalahti: Wanted - MOOC Pedagogy
Sanna Ruhalahti: Wanted - MOOC Pedagogy
 
OEC Paris Residential: scenarios workshop
OEC Paris Residential: scenarios workshopOEC Paris Residential: scenarios workshop
OEC Paris Residential: scenarios workshop
 
MOOCs for Web Talent
MOOCs for Web TalentMOOCs for Web Talent
MOOCs for Web Talent
 
OpenEducation Challenge Finalists' Workshop: Design Thinking Session
OpenEducation Challenge Finalists' Workshop: Design Thinking SessionOpenEducation Challenge Finalists' Workshop: Design Thinking Session
OpenEducation Challenge Finalists' Workshop: Design Thinking Session
 
EEE Project Meeting, June 2014
EEE Project Meeting, June 2014EEE Project Meeting, June 2014
EEE Project Meeting, June 2014
 
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013
How to ruin a MOOC? JISC RSC Yorkshire & the Humber Online Conference 2013
 
How to ruin a mooc
How to ruin a moocHow to ruin a mooc
How to ruin a mooc
 
Metis project worskhop design
Metis project worskhop designMetis project worskhop design
Metis project worskhop design
 
Metis project deliverable D3.2: Draft of pilot workshop
Metis project deliverable D3.2: Draft of pilot workshopMetis project deliverable D3.2: Draft of pilot workshop
Metis project deliverable D3.2: Draft of pilot workshop
 
Design narratives
Design narrativesDesign narratives
Design narratives
 
The Pedagogical Patterns Collector User Guide
The Pedagogical Patterns Collector User GuideThe Pedagogical Patterns Collector User Guide
The Pedagogical Patterns Collector User Guide
 
Learning Design: mapping the landscape
Learning Design: mapping the landscapeLearning Design: mapping the landscape
Learning Design: mapping the landscape
 
Fels survey
Fels surveyFels survey
Fels survey
 
Fels mobile learningday-diymobileapps
Fels mobile learningday-diymobileappsFels mobile learningday-diymobileapps
Fels mobile learningday-diymobileapps
 
Local learnscenarios
Local learnscenariosLocal learnscenarios
Local learnscenarios
 
Mobile Learning Day - LocalLearn Scenarios
Mobile Learning Day - LocalLearn ScenariosMobile Learning Day - LocalLearn Scenarios
Mobile Learning Day - LocalLearn Scenarios
 
Social Sciences Mobile Learning Day - the Survey
Social Sciences Mobile Learning Day - the SurveySocial Sciences Mobile Learning Day - the Survey
Social Sciences Mobile Learning Day - the Survey
 

Recently uploaded

Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomnelietumpap1
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Should more practice opportunities        Do learners pass quizzes?
be added to the learning design?          Do learners exhibit mastery of intended objectives?
                                          Do learners rate interactive components highly?

Should the learning design scope          Is the learning design aligned with curricular guidelines?
be expanded?                              Do content experts rate the OER as comprehensive?

Figure 1. Typical decisions and questions in a formative evaluation.
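For teams that want to track such decisions systematically, the decision-question-method mapping behind Figure 1 can also be kept in a lightweight, machine-readable form. The following Python sketch is purely illustrative; the plan structure, the variable names, and the choice of data-collection methods are assumptions rather than anything prescribed in this paper.

    # Illustrative formative evaluation plan: each pending design decision is
    # linked to the questions that inform it and to the methods that will be
    # used to collect the evidence (all entries are examples only).
    evaluation_plan = [
        {
            "decision": "Should more practice opportunities be added?",
            "questions": [
                "Do learners exhibit mastery of intended objectives?",
                "Do learners rate interactive components highly?",
            ],
            "methods": ["learner review", "usability testing"],
        },
        {
            "decision": "Should the learning design scope be expanded?",
            "questions": ["Do content experts rate the OER as comprehensive?"],
            "methods": ["expert review"],
        },
    ]

    # Print a simple checklist of the evidence still to be collected.
    for item in evaluation_plan:
        print(item["decision"])
        for question in item["questions"]:
            print("  question:", question)
        print("  collect via:", ", ".join(item["methods"]))

Kept in this form, the plan makes it easy to check at each stage of development which questions still lack evidence before a decision is made.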
Some people resist evaluation. However, there is an ethical imperative to evaluate a learning design while it is being developed. After all, learning designs are intended to change people, to modify their knowledge, skills, attitudes, and intentions as well as to influence them to behave differently. The risk of misguiding learners with an untested learning design is too great, and therefore formative evaluation can be viewed as a morally responsible activity.

In her classic book Formative Evaluation for Educational Technologies,1 Barbara Flagg identified six reasons why people resist formative evaluation:

• Time – In the rush to meet project deadlines, reducing or eliminating formative evaluation activities is perceived as an easy way to save time.
• Money – Most learning design budgets, if there even is a budget, fail to provide sufficient funding for rigorous formative evaluation. (At least 10% of a project budget should be allocated to formative evaluation.)
• Human Nature – Many learning designers are reluctant to subject their innovations to potential criticism, especially from learners they may view as uninformed or from experts they may view as threatening.
• Unrealistic Expectations – Although formative evaluation can provide information to guide decision-making, it cannot substitute for the expertise and creativity of a qualified learning designer. In short, you cannot just toss together a rough prototype of a design and expect formative evaluation alone to turn it into a winner.
• Measurement Difficulties – Although some aspects of formative evaluation are relatively easy to determine (e.g., investigating whether learners think various parts of a learning design are engaging), there is a lack of reliable, valid, and feasible methods of evaluating certain kinds of outcomes of learning that a particular design may address, e.g., problem-solving.
• Knowledge – Formative evaluation expertise is still not widely available within the learning design community. Many designers lack the knowledge and skills to conduct systematic formative evaluation in an efficient and effective manner.

Engaging in formative evaluation should result in an overall reduction in development and implementation costs over the lifespan of a learning design initiative. ("Costs" encompass far more than just currency, and within the context of learning design often are best calculated in terms of the "sweat equity" and time that you put into a design effort.) Therefore, you should evaluate "early and often." The sooner formative evaluation is conducted during a learning design initiative, the more likely that substantive improvements will be made and errors avoided.

Decisions and Questions

As a learning designer, you will produce draft design documents and prototypes of various types. Each of these represents an opportunity for making important decisions about enhancing the effectiveness of the final learning design. Should you increase the level of the program's objectives? How can the design be more engaging for learners? Should more assessment be incorporated into the design? These and other decisions will be faced by you and other members of your design team.

The impetus to make decisions about improving a learning design will come from many directions. You may see an interactive program developed by someone else that inspires a new interface idea. You may find out new information about the interests and knowledge of your prospective learners. Your budget, if any, may be cut, thereby requiring you to reduce the more expensive elements of an interactive program such as animation. These and other factors will be signals that there is room (and often a need) for improvements in a prototype design.

Of course, formative evaluation is not something that is initiated when there is a crisis such as a budget cut. Instead, it is a professional practice integral to the overall learning design process. What's more, a formative evaluation perspective is no less important for those involved in implementing a learning design. The bottom line is that all of us are human and our first efforts to create or implement learning designs are bound to be somewhat flawed. Formative evaluation is essential to detecting and reducing these flaws and eventually attaining the high quality we all desire.

Each decision can inspire different types of questions about improving the learning design. Do learners understand the structure of the design and what their options are at any given moment? Does the learning design maintain the learners' attention? Do they accomplish the learning objectives? Is it feasible to implement the program as designed? It is too late to wait until you have completed a learning design to ask these types of questions. Instead, they must be addressed throughout the creation and development of the design.
There are no universal criteria established for formative evaluation of learning designs, but some of the most commonly considered factors are functionality (To what extent does the program work as designed?), usability (To what degree is the program user-friendly for the intended learners?), appeal (How much do learners like it and why?), accessibility (To what degree is the learning design able to be used fully by all potential learners?), and effectiveness (What do they learn?). Different criteria entail many different types of questions. For example, usability implies criteria that can be broken down into smaller issues such as the quality of the user interface or the meaningfulness of icons. User interface can be further divided into factors such as navigation, mapping, aesthetics, and control. Finally, a factor like navigation can be examined via several different questions: How do learners navigate through a computer-based learning design? How does their navigation relate to the underlying pedagogy of the design? What parts of the interactive learning resource are underutilized? Where would learners like to go, but don't seem to know how? Answering these and other questions will provide you with the information you need to enhance the navigational aspects of an interactive learning resource and ultimately improve its usability.

Specific Formative Evaluation Methods

The key to sound formative evaluation is to collect data systematically at critical stages of the learning design's development and to utilize the findings of each formative evaluation strategy as much as your time and resources allow. The following formative evaluation methods are essential within the context of most learning design initiatives:

• peer review,
• expert review,
• learner review,
• usability testing, and
• alpha, beta, and field tests of prototype learning designs.

Peer review is most often done during the earliest stages of learning design, but it can continue through the entire lifespan of a project. It can range from informal activities (asking a colleague to react to your design ideas over coffee) to more formal activities (providing formative feedback via a tool like Cloudworks2 to the prototypes developed by others).

Expert review may be the most frequently used formative evaluation strategy. It is important to remember that there are several different kinds of "experts," and that each type of expert can add unique kinds of information to the review and enhancement process. Content or subject matter experts can help you improve the scope, sequence, and accuracy of a learning design's content. Learning theory experts can assist by critiquing the potential effectiveness of the pedagogical dimensions of a design. Graphic designers can suggest how to enhance the aesthetics of the look and feel of an interactive program. Expert teachers can help you anticipate the logistical requirements for successful implementation of a learning design in schools or universities.
With respect to formative evaluation, an expert is anyone with specialized knowledge that is relevant to the design and implementation of your learning design. Experts can provide different perspectives on the critical aspects of your program, e.g., its accuracy, completeness, user-friendliness, motivational strategies, aesthetics, instructional validity, effectiveness, efficiency, and feasibility. You should seek to utilize both internal (members of the design team) and external (non-members of the design team) experts to the degree that your resources allow.

It is often useful to structure an expert's review so that you are assured of getting the types and depth of information you desire. Figure 2 presents an expert review form to guide design experts with experience in interactive multimedia when they critique a prototype OER.

If you must limit expert review, content experts are probably the most important expert sources of formative information in education settings. After all, if you do not get the content right, the eventual learners will be misled. One of the problems with many learning designs is that they have poor subject matter integrity because of a lack of expert review. This is a major challenge because so much material can be incorporated into a single learning design.

Reviewer: Dr. Ima Gladtohelp                              Due Date: June 10

Please circle your rating and write comments on each aspect of the Open Educational Resource (OER). 1 represents the lowest and most negative impression on the scale, 3 represents an adequate impression, and 5 represents the highest and most positive impression. Choose N/A if the item is not appropriate or not applicable to this OER. Use additional sheets to write comments.

N/A = Not applicable   1 = Strongly disagree   2 = Disagree   3 = Neither agree nor disagree   4 = Agree   5 = Strongly agree

AREA 1 – LEARNING DESIGN REVIEW
1. This OER design provides learners with a clear knowledge of the program objectives.         N/A 1 2 3 4 5
2. The learning interactions in this prototype OER design are appropriate for the objectives.  N/A 1 2 3 4 5
3. The pedagogical design of the OER is based on sound learning theory and principles.         N/A 1 2 3 4 5
4. The feedback in the OER learning design is clear.                                           N/A 1 2 3 4 5
5. The pace of the learning interactions is appropriate.                                       N/A 1 2 3 4 5
6. The difficulty level of the learning design is appropriate.                                 N/A 1 2 3 4 5
Comments:

AREA 2 – COSMETIC DESIGN REVIEW
7. The screen design of the OER follows sound principles.                                      N/A 1 2 3 4 5
8. Media is appropriately integrated in this OER.                                              N/A 1 2 3 4 5
9. The screen displays are easy to understand.                                                 N/A 1 2 3 4 5
Comments:

AREA 3 – FUNCTIONALITY REVIEW
10. This OER operated flawlessly.                                                              N/A 1 2 3 4 5
Comments:

Figure 2. Sample expert review form for a prototype learning design or functional OER.
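Once several experts have returned a form like the one in Figure 2, their ratings need to be summarized before they can inform design decisions. The following sketch is illustrative only; it assumes each completed form is captured as a mapping from item number to a rating of 1-5, with None standing in for N/A, and the scores shown are invented for the example.

    # Hypothetical completed forms from three expert reviewers.
    # Keys are the item numbers from the review form; None means "Not applicable."
    completed_forms = [
        {1: 4, 2: 5, 3: 4, 4: 3, 5: 4, 6: 4, 7: 5, 8: 4, 9: 5, 10: None},
        {1: 3, 2: 4, 3: 4, 4: 4, 5: 3, 6: 5, 7: 4, 8: 4, 9: 4, 10: 2},
        {1: 5, 2: 4, 3: 3, 4: 4, 5: 4, 6: 4, 7: 5, 8: 5, 9: 4, 10: 3},
    ]

    # Items grouped by review area, following the layout of Figure 2.
    areas = {
        "Learning design review": [1, 2, 3, 4, 5, 6],
        "Cosmetic design review": [7, 8, 9],
        "Functionality review": [10],
    }

    # Average each area across reviewers, skipping N/A responses.
    for area, items in areas.items():
        scores = [form[i] for form in completed_forms
                  for i in items if form[i] is not None]
        mean = sum(scores) / len(scores) if scores else float("nan")
        print(f"{area}: mean rating {mean:.2f} from {len(scores)} responses")

Numeric summaries such as these only show where to look first; the experts' written comments usually carry the information that actually drives revisions.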
In addition to learning design and content experts, people with special expertise in human-computer interface (HCI) design and the aesthetics of online learning programs can provide useful expert reviews for interactive learning resources. For example, if your team doesn't include an actual art director who is responsible for the look and feel of the design, it may be useful to ask other graphic artists to critique your prototype. Of course, you would not decide to make major changes in the design elements of a learning program based on the opinions of just one or two experts, because aesthetic appeal is more subjective than many other criteria to be reviewed.

Experienced learning designers are often the best experts for reviewing user interface issues, but there are people who specialize in HCI issues per se. Figure 3 presents a user interface instrument that can be used to guide reviews provided by expert designers or experienced users of OERs.

Figure 3. Sample user interface review form for an OER.

Enlisting the help of the "right" experts for review services is a crucial step in setting up a formative evaluation. If subject matter experts (commonly called SMEs) are already part of your team, one of their primary responsibilities will be checking the accuracy and currency of your content. However, even when working with qualified SMEs, it is a good idea to have the content reviewed by other external content experts. The costs for SMEs to review learning designs can vary widely depending on the field. Recruiting graduate students and academic staff can be a much less expensive source of expert review.
Learner review is based on the analysis of learner behavior during the use of a learning design. The perspectives of "experts" are valuable, but the opinions of the target audience for your learning design are equally important. It is critical to ensure that learner differences are accommodated to the extent possible, and thus it is important that learner review be done with a sample of people whose background knowledge and expectations approximate those of the final intended learners. During learner review, learners should be allowed to work in realistic conditions, with minimal interruption from an observer, in order to accurately replicate the intended context of use.

Suppose you are designing an interactive educational resource such as a game for use in schools. In that case, valuable information for making decisions about improving the game can be derived from systematic observations of learners while they use it. Observations can be conducted at your development site or on-site at a school. Learner opinions, actions, responses, and suggestions will provide you with practical guidance in making decisions about improving the game. Of course, you would also want the teachers who must implement the learning game to review it, especially with respect to seeking their ideas about how they could integrate it into their existing practices. Widening the review process to include parents, administrators, and other school specialists is also advised in this context. Few innovative game designs have been successfully integrated into schools, a problem that might have been reduced by more inclusive formative evaluation.

Observations of learners engaging with your prototype learning design at various stages of its development can be a valuable, if somewhat humbling, experience. You may be surprised at how frequently what seemed to be the most user-friendly aspects of your program befuddle would-be learners. Alternatively, what you view as motivating may bore the intended audience. Fortunately, you will often find that your creative design ideas are validated by learners. A sample protocol from the Apple Interface Laboratory is given in Figure 4. During your observations, you will see learners doing things you never expected. When you see learners making mistakes, your first instinct may be to blame the mistakes on their inexperience or lack of intelligence. This is wrong. The purpose of observing learners is to see what parts of your program might be difficult or ineffective. Therefore, if you observe someone struggling or making mistakes, attribute the difficulties to faulty design, not to the learner.

Observing learners can be a time-intensive and exhausting process. It can range from a very simple one-on-one observation protocol to a complex arrangement wherein several observers, video cameras, and computers are used to record learners' reactions. Whatever type of procedure is followed, it is important that you record information carefully and that you later deal with each issue that arises during the observations.

Figure 5 presents a simple formative evaluation review form with three columns: one for indicating what section of an interactive program is being reviewed, one for recording observations, and the last for recording the actions taken in response to the issues raised by the observations. The last column is very important because it provides the evidence that the evaluation data collected has actually had an impact on design decisions.
User Observation (Based upon Apple HCI Group Protocol)

The following instructions guide you through a simple user observation. With this protocol, you will see where people have difficulty using your interactive program, and you will be able to use that information to improve it.

1 – Introduce yourself. Make the session and task as welcoming as possible. Remember, this user observation is not designed as a controlled experiment, so keep the environment friendly.

2 – Describe the general purpose of the observation. Try to make the participant feel at ease by stressing that you're trying to find problems in the program. For example, you could say: "You're helping us by trying out this e-learning program. We're looking for places where the program may be difficult to use. If you have trouble with some tasks, it's the program's fault, not yours. Don't feel bad; that's exactly what we're looking for. If we can locate the trouble spots, then we can go back and improve the program. Remember, we're testing the program, not you."

3 – Tell the participant that it's OK to quit at any time. Make sure you inform participants that they can quit at any time if they find themselves becoming uncomfortable. Participants shouldn't feel like they're locked into completing tasks. Say something like this: "Although I don't know of any reason for this to happen, if you should become uncomfortable in any way, you are free to quit at any time."

4 – Talk about the equipment. Explain the purpose of each piece of equipment in the observation room and how it will be used in the test (hardware, software, video camera, microphones, etc.).

5 – Explain how to "think aloud." Ask participants to think aloud during the observation, saying what comes to mind as they work. You'll find that listening to learners as they engage with your e-learning program provides you with useful information that you can get in no other way. Unfortunately, most people feel awkward or self-conscious about thinking aloud. Explain why you want participants to think aloud, and demonstrate how to do it. You could say: "We have found that we get lots of information from these informal tests if we ask people to think aloud as they work through the e-learning program. It may be a bit awkward at first, but it's really very easy once you get used to it. All you have to do is speak your thoughts as you go through the program. If you forget to think aloud, I'll remind you to keep talking. Would you like me to demonstrate?"

6 – Describe why you will not be able to help. It is very important that you allow participants to work with your product without any interference or extra help. If a participant begins having difficulty and you immediately provide help, you may lose the most valuable information you can gain from user observation: where users have trouble, and how they figure out what to do. Of course, there may be situations where you must step in and provide assistance, but you should decide what those situations will be before you begin testing. You may decide that you will allow someone to flounder in the e-learning program for at least 3 minutes before providing assistance. Or you may identify distinct problems you will provide help on. As a rule of thumb, try not to give your test participants any more information than the true users of your product will have. Here are some things you can say to the participant: "As you're working through the e-learning program, I won't be able to provide help or answer questions. This is because we want to create the most realistic situation possible. Even though I won't be able to answer your questions, please ask them anyway. It's very important that I capture all your questions and comments on the recording equipment. When you've finished the program, I'll answer any questions you still have."
7 – Describe the tasks and introduce the program. Explain what the participant should do first, second, third, etc. Give the participant written instructions for the tasks. Important: if you need to demonstrate your program before the user observation begins, be sure you don't demonstrate something you're trying to test. (For example, if you want to know whether users can figure out how to use certain interactions, don't show them how to engage with these particular interactions before the test.)

8 – Ask if there are questions. Before you start, make sure the participant knows your expectations, then begin the observation.

9 – Conclude the observation. When the test is over:
• Explain what you were trying to find out during the test.
• Answer any remaining questions the participant may have.
• Discuss any interesting behaviors you would like the participant to explain.

10 – Use the results. To get the most out of your test results, review all your data carefully and thoroughly (your notes, the video recording, the tasks, etc.). Look for places where participants had trouble, and see if you can determine how your program could be changed to alleviate the problems. Look for patterns in the participants' behavior that might tell you whether the program is understood correctly. It's a good idea to keep a record of what you found out during the test. That way, you'll have documentation to support your design decisions and you'll be able to see trends in users' behavior. After you've examined the results and summarized the important findings, fix the problems you found and test the product again. By testing your product more than once, you'll see how your changes affect users' performance.

Figure 4. Sample user observation protocol.

Formative Review Log

Program: Learn or else!        Reviewer: Smyth        Date: May 15

Screen   Comments, Questions, Suggestions              Action Taken
C-17     The forward and back navigation arrows are    Enlarge the navigation arrows by 50%
         so small that learners seem to have trouble   and repeat observations.
         placing the mouse cursor on them.
C-23     The learners think that the "Question Mark"   Use the "Question Mark" icon for help,
         icon will take them to help, but it takes     and find a different icon for the
         them to a list of frequently asked            frequently asked questions.
         questions instead.

Figure 5. Formative evaluation review log for an OER.
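A log like the one in Figure 5 can also be kept as structured records, so that open issues remain easy to track across review cycles. The following sketch is illustrative only; the field names, the example entries, and the CSV file name are assumptions, not part of the original log.

    import csv
    from dataclasses import dataclass, asdict

    @dataclass
    class ReviewLogEntry:
        """One row of a formative review log, mirroring the columns of Figure 5."""
        screen: str
        comment: str
        action_taken: str = ""  # left empty until the design team responds to the issue

    entries = [
        ReviewLogEntry("C-17", "Navigation arrows are too small to click reliably.",
                       "Enlarge the arrows by 50% and repeat observations."),
        ReviewLogEntry("C-23", "Help icon actually opens the list of frequently asked questions.", ""),
    ]

    # Persist the log so later review sessions can append to it.
    with open("formative_review_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["screen", "comment", "action_taken"])
        writer.writeheader()
        writer.writerows(asdict(e) for e in entries)

    # Flag entries that still need a design decision.
    open_issues = [e.screen for e in entries if not e.action_taken]
    print("Screens awaiting action:", ", ".join(open_issues) or "none")

Recording the action taken alongside each observation preserves the evidence trail described above: proof that the evaluation data actually influenced design decisions.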
Usability testing is an important method in the design of interactive programs. User interface expert Ben Shneiderman3 maintains that usability is a combination of the following characteristics:

• ease of learning,
• high speed of task performance,
• low error rate,
• subjective satisfaction, and
• learner retention over time.

Too many formative evaluations focus only on whether users like a program or not, but usability is a much deeper subject. There are instances when you might evaluate usability without learners. Time with learners is often limited; it is not a free resource. In addition, learners may find it difficult to visualize how a prototype learning design such as an OER could be designed differently, and they may therefore tend to evaluate according to what already exists rather than to what is possible. Some usability criteria will only be reliably identified or articulated by trained evaluators using protocols such as heuristic evaluation. At least three evaluators with a mix of experience and expertise are required for heuristic evaluation because fewer will not be able to identify all the usability problems. Other types of usability evaluation will require more participants.4

As part of the learning design process, prototypes are commonly developed and tested. When developing any type of digital learning resource, it is important to begin usability evaluation at the earliest phases of design because, if left until just before an interactive program is released, there will be little chance to make any significant design changes. An alpha version of a design is used in the earliest tests, conducted when the product is still at the early prototype stage. Alpha testing is generally done with in-house personnel. Beta versions are released to selected samples of learners for testing. The product is more refined at this point, nearing completion, but still needing to be tested and refined. The field version of a program is supposed to be flawless, although anyone who has bought version 1.0 of a new piece of software realizes that this is rarely the case. The persistence of problems (bugs) is why testing even field versions of an interactive program is important.

Finally, although this brief paper is focused on formative evaluation in the context of learning design, it is worth noting that formative evaluation is also a major component of the more formal genre of R&D activity known as educational design research5 (also referred to as design-based research or simply design research). Educational design research (EDR) entails the investigation of learning in practical educational settings by addressing a significant learning problem through the close collaboration of teachers and researchers. EDR projects usually last two or more years. EDR has two primary goals: first, to create an effective intervention that directly addresses an important problem, e.g., a serious learning game that is intended to increase the interest of young minority students in careers in science and engineering, and second, to extend theoretical knowledge related to learning design, e.g., the pedagogical principles underlying the design and use of serious games. The second outcome is usually represented in the form of design principles that others can use when they engage in learning design and/or EDR. It is this second goal of knowledge production that distinguishes EDR from learning design.

Conclusion

This paper has introduced you to the concept of formative evaluation. There is a lot more to it than can be included in this brief paper. You may wish to consult the references6 listed below.
References

1 Flagg, B. N. (1990). Formative evaluation for educational technologies. Hillsdale, NJ: Lawrence Erlbaum Associates.

2 Conole, G., Culver, J., Williams, P., Cross, S., Clark, P., & Brasher, A. (2009). Cloudworks: Social networking for learning design. Australasian Journal of Educational Technology, 25(5), 763-782.

3 Shneiderman, B., & Plaisant, C. (2010). Designing the user interface: Strategies for effective human-computer interaction (5th ed.). Reading, MA: Addison-Wesley.

4 Hwang, W., & Salvendy, G. (2010). Number of people required for usability evaluation: The 10±2 rule. Communications of the ACM, 53(5), 130-133.

5 McKenney, S. E., & Reeves, T. C. (2012). Conducting educational design research. New York: Routledge.

6 Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.