Formative Evaluation Report
This report will describe and analyze the formative evaluation process for faculty training
on the smart classroom in room 202 of the Education building on the UND campus. This
report is divided into four main sections: Plan, Description, Outcomes, and Interpretation.
The Plan provides an outline of how the formative evaluation was designed and includes
one-to-one and small group evaluations. The Description explains how the evaluations
progressed. The Outcomes section contains the data obtained from both evaluations and
the Interpretation analyzes this data.

I. The Plan

Control of Variables

                                        Aptitude
Aptitude will be controlled by including learners of below average, average and above
average aptitude. Aptitude will be measured using an Experience Assessment, a tool
which measures experience with computers, smart classrooms and technical ability (see
Appendix A). Questions on the instrument are weighted equally, with one to five points
assigned to each response.

The Experience Assessment contains a total of ten questions. Question 1 (1-5 pts)
measures the learner’s experience with computers. Question 2 (1-5 pts) measures how
frequently the learner uses computers to teach. This question was included because
intermediate computer knowledge is a required prerequisite for the course. Question 3
(1-5 pts) measures how often the instructor uses PowerPoint and an LCD projector to
teach classes. This question acknowledges a learner’s experience with low-level
technology in the classroom. Question 4 (1-5 pts) measures whether a learner has
experienced the smart classroom setting as a student. Familiarity with the smart
classroom environment as a student can provide an instructor valuable insight into good
and bad instructional strategies with the equipment. Question 5 (1-5 pts) measures
whether the student has used a smart classroom to teach. Previous experience with the
smart classroom could impact the pace of the instruction for this student. Both questions
4 and 5 have yes/no responses. Positive responses will receive the full value of five points
and negative answers will receive only one point. Question 6 (1-5 pts) measures
frequency of experience with smart classrooms. This question was included because even
if the learner’s experience is minimal, prior exposure increases the probability that the
learner will succeed and may impact the pace of the instruction as well. Question 7 (1-5
pts) measures the student’s ability to learn new technical skills. Question 8 (1-5 pts)
measures whether the
student has taken a course using distance education and question 9 (1-5 pts) measures
whether the student has taught online using distance education. Both questions 8 and 9
have yes/no responses. Positive responses will receive the full value of five points and
negative answers will receive only one point. Question 10 is an open ended question.
Answers will be transcribed, reported and categorized. A chart will be created to illustrate
the results. Possible total scores from nine to forty-five will indicate aptitude ranking.
Participants will be faculty members at UND or people who have the potential to teach or
who are teachers.
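
To make the scoring concrete, here is a minimal sketch in Python. The yes/no mapping and the example responses are illustrative, and the cutoffs that split the 9-45 range into thirds are an assumption; the report itself specifies only the per-question point values and the total range.

```python
# A minimal sketch of the Experience Assessment scoring, assuming hypothetical
# response data. The report specifies 1-5 points per question, yes/no items
# scored 5 (yes) / 1 (no), and totals ranging from 9 to 45; the cutoffs that
# split that range into thirds are an assumption for illustration.

YES_NO_QUESTIONS = {4, 5, 8, 9}  # scored 5 for "yes" and 1 for "no"

def score_response(question: int, answer) -> int:
    """Convert a single answer into its 1-5 point value."""
    if question in YES_NO_QUESTIONS:
        return 5 if answer == "yes" else 1
    return int(answer)  # Likert-style items already carry a 1-5 value

def aptitude_rank(responses: dict) -> str:
    """Sum questions 1-9 (question 10 is open ended) and rank the total."""
    total = sum(score_response(q, a) for q, a in responses.items() if q <= 9)
    if total <= 21:          # assumed cutoff: lower third of 9-45
        return "below average"
    if total <= 33:          # assumed cutoff: middle third of 9-45
        return "average"
    return "above average"

# Hypothetical participant: experienced with computers, new to the room.
example = {1: 5, 2: 4, 3: 3, 4: "no", 5: "no", 6: 1, 7: 4, 8: "no", 9: "no"}
print(aptitude_rank(example))  # -> "below average" (total = 21)
```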

                                          Process
Due to the hands-on nature of the smart classroom, all instruction will be conducted face-
to-face. The one-to-one sessions will be conducted by the same trainer and the group
sessions will be conducted by a different facilitator. A checklist derived from the
objectives will also help to control for process and ensure that each item is addressed.

                                        Support
All training will be conducted in the smart classroom located in the Education building,
room 202. Support will be provided by the Center for Instructional Learning Technology
(CILT). The director of this department designed the room and CILT is responsible for its
maintenance. A phone number is posted for anyone experiencing technical problems to
call when in need of assistance.

One-to-One
Three subjects of varying ability, below average, average and above average, will be
chosen to participate in this stage of the evaluation. Each training session will be
conducted on a one-to-one basis with the trainer. The trainer will proceed to teach the
learner about the smart classroom by demonstrating each aspect and asking the learner to
demonstrate as well by using a guided practice approach. Performance will be measured
using the Performance Checklist (Appendix B). The instructor will stop as questions or
problems arise and get feedback from the learner as to how different aspects of the
instruction could be improved. An Attitudinal Assessment (Appendix C) will be
administered following the training. All responses to the Experience Assessment and the
Attitudinal Assessment will be recorded using a matrix of questions and learners (Tables
1&2). Instruction will be revised based on learner feedback and the results of the post-
training Attitudinal Assessment. One-to-one instruction will be completed within a two-
week period.

Small Group
Eight learners of below average, average and above average aptitude will participate in
the training after the one-to-one evaluations are finished and revisions have been made.
These eight learners may or may not be taught simultaneously. Training will be
conducted in a manner similar to the one-to-one phase: face-to-face instruction in the
smart classroom. There will be no discussion of instruction or learner feedback during the
instruction stage of the small group phase. The Performance Checklist (Appendix B) will
be used during instruction to monitor learner progress. This stage takes place during the
two weeks following one-to-one instruction.

Evaluation
Performance will be evaluated based on the stated objectives. The Performance Checklist
and three paper-based quizzes will be used to evaluate each objective. Failure of the
learner to complete any objective accurately will lead to analysis of the instruction and
related instruments. A table will indicate the results for each objective and evaluation
question. Items that are below 100% accurate will be analyzed and modified as
necessary.

All information will be analyzed for consistency and patterns by objective, assessment
and learner both individually and across type. Matrix tables (Tables 3&4) containing
objectives for each item will be used to perform item and performance analyses. Items
within each objective will be evaluated, and any inconsistencies will be considered
indications that modifications to the instruction or to the instruments may be needed.
Problems with the instrument may include items that do not measure what they are
supposed to measure according to the corresponding objective, or questions that are
unclear or too difficult. Success for each objective will be measured and compared to
success rates of other objectives to examine overall consistency. Student performance
across all checklist items and assessment items will be evaluated for consistency and
patterns. Deficient items and unsuccessful objectives will be analyzed by learner. All
information will be analyzed across objectives, items and learners and any
inconsistencies and patterns will be studied for indication of need for modification.
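
A brief sketch of how this item and performance analysis could be carried out follows, assuming the checklist results are recorded as a simple learner-by-objective matrix of pass/fail values. The matrix layout and the sample data are hypothetical; only the percent-correct computation and the below-100% flagging follow the plan above.

```python
# A sketch of the planned item/performance analysis (see Tables 3-6), assuming
# results are recorded as a learner-by-objective matrix of pass/fail values.
# The matrix layout and the sample data below are hypothetical.

from collections import defaultdict

# Hypothetical checklist results: results[learner][objective] = passed?
results = {
    "A": {"1": True, "1.1": True,  "1.1.1": False},
    "B": {"1": True, "1.1": False, "1.1.1": False},
    "C": {"1": True, "1.1": True,  "1.1.1": True},
}

def percent_correct_by_objective(results):
    """Success rate for each objective across all learners."""
    attempts, passes = defaultdict(int), defaultdict(int)
    for scores in results.values():
        for objective, passed in scores.items():
            attempts[objective] += 1
            passes[objective] += passed
    return {obj: 100.0 * passes[obj] / attempts[obj] for obj in attempts}

def percent_correct_by_learner(results):
    """Success rate for each learner across all objectives."""
    return {learner: 100.0 * sum(scores.values()) / len(scores)
            for learner, scores in results.items()}

by_objective = percent_correct_by_objective(results)

# Any item below 100% accuracy is flagged for analysis and possible revision
# of the instruction or the instrument.
flagged = sorted(obj for obj, rate in by_objective.items() if rate < 100)
print(by_objective)   # e.g. {'1': 100.0, '1.1': 66.7, '1.1.1': 33.3} (rounded)
print(flagged)        # ['1.1', '1.1.1']
print(percent_correct_by_learner(results))
```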


II. Description
                                       One-to-One

At this point it is important to note that the smart classroom in Education Room 202 was
not complete until a few weeks prior to the commencement of training. Much of the
material was compiled and written prior to completion of the system in order to prepare
as well as possible for the implementation phase. During much of the training many of
the glitches were discovered and worked through, which took a lot of time and made it
difficult to get through the material. The goal was to have as much worked out as possible
before the group training. The learning task analysis was changed a few times throughout
the process, which also affected the assessment and the results.

Since this is instructor-led training, a facilitator was chosen to conduct the training while
the designer watched, took notes, and answered questions as needed. The facilitator is
also an IDT graduate student who had a great deal of technology experience (especially
audio/visual) and some experience with room 202 working on a different project. The
facilitator was asked to read the green text from the instructor manual verbatim in order
for the designer to evaluate the instruction that had been written.

The placement of practice items was something we struggled with. During the first one-
to-one we turned everything off after the first demonstration and asked the participant to
go through the process of turning it all back on. This was repeated only a few minutes
later as an assessment item while the assessment checklist was utilized. This seemed to
take up a great deal of time and the format was changed so that the learner practiced
many of the activities while the instructor described them. The student manual
came to be seen as more of a tool than a guide and the learners were encouraged to listen
to the facilitator’s instructions while following the manual and practicing the activities
when asked. This was effective in the one-to-one sessions but less so in the group
sessions where learners were taking turns practicing. Those items they practiced
themselves seemed to be better understood than those that were practiced by others.
However, they did learn to turn to the student manual when questions arose during the
assessment.

Participant A

Participant A is a graduate student and teaching assistant in the Communications
department. Her score on the Experience Assessment was average. She was chosen
because of her experience with distance education and her desire to learn the system. She
is also an IDT student. Her experience and knowledge were helpful in identifying areas
that needed improvement. Participant A spent 2 hours on the instruction and was still
unable to complete the training. This was due to a combination of reasons, including the
frequent discussion that occurred throughout the session regarding the system and
clarifying how things worked, the materials and how to improve them, and her need to
leave at a certain time. Participant A referred to the student manual quite frequently
during the session.

The first thing noted was that the information reviewed in the student manual was not
specific enough. I added page numbers and more specific verbiage to help with that. In
the “Tour of the Room” section it was noted that there were eight speakers in the ceiling
rather than four. This was edited for the next version of the manual. After turning on the
main power the facilitator mentioned that the user should wait a few seconds before
turning on the Gateway power switch. This was also added to the verbiage. The facilitator
also noticed that the touch panel does not, in fact, default to the computer interface as
indicated by the manual. This section was deemed irrelevant and removed from the text
to save time in the training. The sequence and length of time needed for turning the
projector on and off became a question which was investigated and edited in the text of
the manuals. In the camera section, we added a note to watch the camera for movement to
indicate whether it had been turned on. The note on the camera presets was edited and a
separate screen shot was added for the pan, tilt and zoom section. In the microphone
section there was some question as to how the volume worked. This issue was not
resolved until later in the evaluation process. A photo of the microphone clip was taken
and inserted into this section as well.

After reviewing the “Recording Process” diagram, we reconfigured it to more accurately
reflect how recording is accomplished with the system. Details of the process of
recording to mini-DV tape were carefully reviewed and refined. For example, directions
were modified to include a reminder to look at the touch panel to determine if a tape
starts playing when it is inserted into the console. The Note in the “Recording to
DVD” section and the Tech Tip in “Finalizing a DVD” were edited for accuracy. On the
Job Aid for finalizing a DVD, a tech tip was added. Participant A and the facilitator went
very carefully through this job aid and many changes were identified, including some of
the verbiage taken directly from the component manual. Participant A suggested that a
break be added to the manual at this point.

In Module III, we noted a need to make sure the learner kept the microphone on during
the audio conference demonstration. It was also mentioned that there should be a note in
the manuals about the FireWire cable being difficult to plug in. In the section on Breeze
the facilitator suggested that the Breeze link be made a shortcut on the desktop. There
were technical issues with making Breeze detect the camera. We spent about 20
minutes trying to figure it out until we had to move on to the assessment without
finishing the training.


Participant B

Participant B is the Dean of the College of Education and Human Development. His
score on the Experience Assessment was below average. He was chosen because of his
role as the Dean and his desire to become familiar with the technology used by his
faculty. His lack of experience helped identify several areas that were unclear and/or too
long-winded.

The first area of improvement was the objectives. They had not been re-written from the
five-component format and needed to change. They were revised after this session. In the
“Getting Started” section, it was decided that the words, “green text to indicate when the
instructor should be speaking” should be in green text to draw the facilitator’s eye to it.
Also in this section, under Job Aids, the words, “This comprises the majority of the
student manual” were deleted. In the part of this section which was spoken by the
facilitator, some of the verbiage and order of information was changed and some was
removed for flow and to save time.

In Part 2, some of the text was seen as repetitive and removed. Instructions were also
added for the facilitator to refer to the room diagram while pointing out the various
components. The diagram entitled “Tour of the System” was confusing and missing some
components – changes were made upon revision. In the “Tour of the Room” section, a
sentence about acoustic materials being added to the walls and floors was added. In the
section on powering up, a practice item was added. The section on Shutting Down was
moved to later in the training as was the practice item for turning this system back on.
This practice item was moved because we felt that it was taking too much time and it was
too simple to come so early in the training.

In the “Projector” section we added a practice item for the learners to try turning the
projector on and off. When we had problems with it, we realized that it seemed to be
going into sleep mode. A tech tip on the projector was written and added to the section. A
practice item was also added to the “Camera” section so that the learner could try turning
the camera on and watch it respond. To save time, the camera preset section was deleted.
In the “Instructor Microphone” section duplication was removed and a tech tip was added
regarding the mute button.

We decided that it would save time to move the Module I checklist to the end of the
training with the Module III checklist. In the Introduction to Module II we added a
summary of “The Recording Process” diagram. In Part 6, some of the wording in the
Tech Tip was changed for accuracy. In the practice section we decided a photo of the
correct type of DVDs for the system would be helpful and we discovered a circle on the
photo had been moved to an incorrect spot. In Part 7 we changed the
“Discussion/Demonstration” section into a Practice session in which the facilitator will
talk the learner through the steps of recording to a DVD. This was a section in which a
few more idiosyncrasies with the system were identified and addressed by removing a
note and going straight to the “Finalize a DVD” section. A few steps in the
“Finalizing a DVD” job aid needed to be edited or added. Also, a sentence about
how to use the remote control with the touch panel was added. A few small changes were
made to the audio conference section to shorten and simplify the process.

Due to time constraints we were unable to finish the last section on Breeze.

Participant C

Participant C is an adjunct faculty member of the Department of Teaching and Learning.
She is an advanced user and already uses the system to record her lecture sessions on
DVD. She had many good suggestions which were incorporated into the manuals.

The Getting Started section still seemed to drag on for a long time, with a lot of
lecturing. Too much information of little relevance was being given to the learner, and
we decided it would be better to simply cut a lot of it out. Other minor edits were
made to the text spoken by the facilitator to make it flow better.

Participant C suggested that the phrase “To use when streaming” be added to the “Tour
of the system” diagram for clarity. It was added. Minor directions were added to the
“Tour of the Room” section for the facilitator.

In Part 3 a Note was removed to save time. Due to the participant’s previous experience
with the system, we asked her if she had any trouble with the projector. She said that she
watches the touch panel to see if the OFF button turns red or the ON button turns blue.
This tip was added to the manuals. In addition, a practice was moved from after the
facilitator discussion to before so that the learner could play with the buttons while the
facilitator talked about them.

A tech tip was added to tell learners that the longer the camera control buttons on the
touch panel are held down, the quicker the camera moves. Practice sections were added
here as well.

Participant C brought up a question regarding switching back and forth between media
and microphone volume. After trying to find an answer for a while, we
decided it was a question we would have to ask CILT.

Participant C suggested putting a section on the USB extension after the photo of the
USB ports. This suggestion was not taken because the “Plugging in your laptop” section
was removed from the instructor manual; we came to consider it a prerequisite. It was
left in the student manual, placed at the back in the appendix.

Participant C felt that the word “simple” should be removed from the summary because it
might make someone who does not find it simple feel stupid. This was changed.

A few minor wording changes were made to the “Recording to mini-DV tape” section.
The text on the JVC diagram was changed to more accurately match that which is on the
actual component. The facilitator had made the suggestion in each prior one-to-one to
“wave to yourself to make sure the DV tape is not playing back.” Participant C pointed
out that the red light on the device stays on when it is recording. Both of these tips were
added to the instruction. The wrong photos were cut and pasted into the “Recording to
DVD” section from another section. This was corrected.

In the “Finalizing a DVD” segment, a clarification was made about finalizing and un-
finalizing DVDs. In the Job Aid, a tech tip was added about timing out and a few more
clarifications were made.

Participant C suggested we explain more about Voice over IP (VoIP), so a brief statement
was added to the “audio conference” section. A practice item was also added and the
information on VoIP was moved to another section. The volume section was circled on
the photo for clarification.

Since the Breeze section was not completely finished by either of the previous
participants, we found several areas for improvement as we worked through it. A
“something more” section was removed due to time constraints and a screen shot was
deleted. A few screen shots were cropped to focus the learner on the relevant content.
In this segment, we decided to switch the camera and audio conference sections so that
we could ensure that the camera is set up before the audio conference begins. This is
helpful if there are camera initialization problems. Clarifications and corrections were
made to the phone dialing section.

After running through this instruction with three participants it was clear that it was
originally written with far too much information to cover in the allotted time. It was also
a challenge to continue finding glitches within the system that affected how the
instruction was written. Ideally this instruction would have been better prepared had we
performed at least one more one-to-one so that we could test the Breeze section more
thoroughly. Still, there were far fewer changes in the next round of revisions, so some
progress was achieved.

Small Group

First Session

The first small group session was scheduled with two people, one male and one female,
but only the female attended. Participant 1 is the Technology Coordinator for the
department of Teaching and Learning. She was chosen because her position in the
department requires her to learn the system. She has worked with many other smart
classrooms so her score on the Experience Assessment was above average.

There were a few formatting issues with the table of contents, which were changed, and a
few of the page numbers referred to in the student manual were in need of correction.

In the “Power Up” section, two Tech Tips and some of the instructor dialog were removed
to save time. The Summary/Transition was also deleted for time. More minor changes
were made to the Projector section and a Practice Item was removed.

In the Camera section the word “camera” was replaced with the phrase, “cam control” to
align with how it appears on the touch panel. The words, “further from” were added to the
description of zoom for clarity.

The types of batteries used for the Instructor Microphone were added to that section as
well as a photo of correct mic placement.

In the “Recording to mini-DV tape” section, part of the introduction was removed
because it was a duplication and a Tech Tip was removed from the instructor’s manual.
This tip was left in the student manual for reference. Some of the wording was changed
in the instruction and a Tech Tip on the HD/DVD switch was added. Edits were made to
the graphic as well.

In Part 7, the information on DVDs and DVD-RWs was removed from the instructor
presentation to save time but it remains in the student manual for reference. Some of the
graphics here were edited as well and the order was changed for clarity.

A tip for finalizing was added for clarity to the Finalize a DVD job aid and other minor
edits to the text and graphics were made as well. The Tech Tip at the end of the section
was removed but remains in the student manual.

In the section on Breeze several nonessential items were removed from the Instructor
dialog to save time. Text and photos that had been added received minor edits and a
“more reading” section was removed as well. Clarifications were made in the text on
using audio conference in Breeze and minor but important instructions were added.

Second Session

The second session of the small group was composed of participants 2, 3 and 4, who
completed the training in room 202 of the Education building. All participants are UND
employees, but only Participant 3 is a faculty member, in the Teaching and
Learning Department. She was above average on the Experience Assessment due to her
frequent use of smart classrooms. For this reason she entered the training thinking that
she would not be learning anything new but ended up being enthusiastic about utilizing
the distance features of the room for recording her class sessions. Participant 2 is the
doctoral program coordinator for Teaching and Learning and took part in the training
only in the interest of helping a student and had a below average score on the Experience
Assessment. Participant 4 is an instructional designer for the UND Aerospace program.
Her responsibilities include training faculty and students in Breeze; her Experience
Assessment score was above average. Her participation in the training was to help a
fellow IDT student and learn more about using smart classrooms for instruction.

The facilitator for this session was the director of the Center for Instructional Learning
Technology (CILT) and one of the people who designed the smart classroom. Since this was the
first time she had reviewed the training, she took extra time discussing and in some cases
correcting the instruction. Even so, the instruction was much more coherent in this
session than in previous sessions thanks to the feedback of previous participants and
much revision. The training did go over two hours with the assessment.

Many of the problems found during this session were typographical or formatting issues.
The few substantial issues were related to system design problems that the facilitator
clarified. Several of the arrows in the diagram labeled “The Recording Process” had to be
removed or changed.

III. Outcomes
                                      Practice Items

The original version had practice items interspersed throughout the instruction but we
found that the way they were presented interrupted the flow of learning and took far too
much time. After experimenting with placement of practice items during the one-to-one
sessions and the first group session, we found it to work best when we walked learners
through the steps while they performed tasks. We also found that the student manual
should be used during the instruction more as a tool than as a reference. Many times
we found ourselves reminding learners to use it while performing tasks, even tasks that
we talked them through, so that they would feel comfortable doing so when they were on
their own in the assessment and in practical use. In addition, participants in the second
group session were not all able to perform practice items because there was not enough
time or space to allow each learner to perform each task as it was taught. Each learner did
get a chance to perform some tasks with the instructor, but all had to watch others
perform as well. It was noted that those who performed the practice items seemed to do
better on those particular items in the assessment than those who did not.
However, the trial-and-error changes between sessions made it difficult to accurately
measure the use of practice items and their effectiveness over time.

                                       Final Test
               This is where I discuss the data from the post-training quiz

IV. Interpretation
                                      Practice items
 This is where I will discuss what I think the data from the pre-test questionnaire and the
                                 practice checklist means

                                        Final Test
   This is where I will discuss what I think the data from the post-training quiz means


Participant key:
A – Adonica
B – Dan
C – Laurie
1 – Joneen
2 – Cynthia
3 – Rachael
4 – Adrienne
Conclusions and Future Revisions

V. Appendices
This is where I will enter charts and tables to document the data discussed above.

Table 1. Responses to experience questions on the Experience Assessment.

Participant    Q1   Q2   Q3   Q4   Q5   Q6   Q7   Q8   Q9   Average
A               5    5    3    1    1    1    4    1    1      2.44
B               5    5    2    1    1    1    3    1    2      2.33
C               5    4    4    1    2    5    4    2    1      3.11
1
2               5    4    4    1    1    1    4    1    2      2.56
3               5    4    4    1    2    4    4    2    1      3.00
4               5    5    3    2    2    1    4    2    1      2.78
Average
One-to-One Composite
Small Group Composite
Average Ratings
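
The composite rows above are left to be filled in; the following sketch shows how they could be computed from the participant averages already recorded. Participant 1's row is blank above, so she is omitted here.

```python
# A sketch of how the empty composite rows in Table 1 could be filled in from
# the participant averages already recorded; Participant 1's row is blank
# above, so she is omitted.

one_to_one  = {"A": 2.44, "B": 2.33, "C": 3.11}
small_group = {"2": 2.56, "3": 3.00, "4": 2.78}

def composite(group: dict) -> float:
    """Mean of the recorded participant averages, rounded to two places."""
    return round(sum(group.values()) / len(group), 2)

print(composite(one_to_one))                     # 2.63  (one-to-one composite)
print(composite(small_group))                    # 2.78  (small group composite)
print(composite({**one_to_one, **small_group}))  # 2.7   (overall average rating)
```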

Answers to Question 10

Participant A: Technical difficulties in sound, video or connection.

Participant B: The time it takes to develop the course and then the new skills one might
need to learn.

Participant C: 1) The fact that the work is all safely hidden away in my computer 2) not
knowing the people face to face.

Participant 1:

Participant 2: My experience is in correspondence – which is not the same as distance
education. I never have a time when I can connect with a student except over e-mail. I
think the biggest drawback is that sometimes when students are writing culturally
insensitive or factually untrue responses, I believe that a face-to-face conversation would
be beneficial. Or, a redirection to a reading that might help them reframe their thinking.
This is hard to do in correspondence. I can certainly suggest a link or an article to read,
but I have no way of knowing if they follow through with it.

Participant 3: One barrier is that there is limited personal interaction with peers. The
course I took was a biology class so it wasn’t as critical there, but in some education
courses, we are teaching and modeling effective personal interactions. On the flip side, I
found that I needed to know my content very well, as the grade was 100% objective
based on my test scores.

One further disadvantage with the class I took related to this was that the tests were open
book, as they were taken at home. In that regard, I could see a potential for academic
dishonesty.

Participant 4: Trying to get student skills and technology up to speed so that everyone can
focus on the subject matter.
Practice

Table 3. One-to-One Practice Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student A
Student B
Student C
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student A
Student B
Student C
# Correct
% Correct
% Mastery

Table 4. Small Group Practice Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Table 5. One-to-One Evaluation Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student A                 X
Student B
Student C
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student A
Student B
Student C
# Correct
% Correct
% Mastery

Table 6. Small Group Evaluation Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Más contenido relacionado

La actualidad más candente

CBT Computer Based Training
CBT Computer Based TrainingCBT Computer Based Training
CBT Computer Based Training
gueste1bd13
 
Icube_working_paper
Icube_working_paperIcube_working_paper
Icube_working_paper
najmulq
 

La actualidad más candente (18)

Topic
TopicTopic
Topic
 
CBT Computer Based Training
CBT Computer Based TrainingCBT Computer Based Training
CBT Computer Based Training
 
E-Learning Student Assistance Model for the First Computer Programming Course
 E-Learning Student Assistance Model for the First Computer Programming Course E-Learning Student Assistance Model for the First Computer Programming Course
E-Learning Student Assistance Model for the First Computer Programming Course
 
Technology and Assessment
Technology and AssessmentTechnology and Assessment
Technology and Assessment
 
Programmed learning
Programmed learningProgrammed learning
Programmed learning
 
04 course design development phase
04 course design   development phase04 course design   development phase
04 course design development phase
 
CAI & CMI
CAI & CMICAI & CMI
CAI & CMI
 
Programed instructional material
Programed instructional materialProgramed instructional material
Programed instructional material
 
Cbt
CbtCbt
Cbt
 
The Application of Game-Like Learning Design to Real-World Settings: a Holist...
The Application of Game-Like Learning Design to Real-World Settings: a Holist...The Application of Game-Like Learning Design to Real-World Settings: a Holist...
The Application of Game-Like Learning Design to Real-World Settings: a Holist...
 
Icube_working_paper
Icube_working_paperIcube_working_paper
Icube_working_paper
 
Programmed instruction(mithaa)
Programmed instruction(mithaa)Programmed instruction(mithaa)
Programmed instruction(mithaa)
 
03 course design design phase
03 course design   design phase03 course design   design phase
03 course design design phase
 
Standard i
Standard iStandard i
Standard i
 
Computer manged learning and problems code 8620 bed
Computer manged learning and problems code 8620 bedComputer manged learning and problems code 8620 bed
Computer manged learning and problems code 8620 bed
 
Use of ICT in Language Testing
Use of ICT in Language TestingUse of ICT in Language Testing
Use of ICT in Language Testing
 
Field Study 3 Episode 7
Field Study 3 Episode 7Field Study 3 Episode 7
Field Study 3 Episode 7
 
Programmed Instruction
Programmed InstructionProgrammed Instruction
Programmed Instruction
 

Destacado (8)

Smart Classroom instruction Design Document
Smart Classroom instruction Design DocumentSmart Classroom instruction Design Document
Smart Classroom instruction Design Document
 
Instructor Manual
Instructor ManualInstructor Manual
Instructor Manual
 
Smart Classroom Student Manual
Smart Classroom Student ManualSmart Classroom Student Manual
Smart Classroom Student Manual
 
Smart Classroom Design Principles
Smart Classroom Design PrinciplesSmart Classroom Design Principles
Smart Classroom Design Principles
 
Smart Classroom Equipments for Digital Teaching & Learnng
Smart Classroom Equipments for Digital Teaching & LearnngSmart Classroom Equipments for Digital Teaching & Learnng
Smart Classroom Equipments for Digital Teaching & Learnng
 
Smart class education
Smart class educationSmart class education
Smart class education
 
Smart Classroom
Smart ClassroomSmart Classroom
Smart Classroom
 
Ppt smartclass
Ppt  smartclassPpt  smartclass
Ppt smartclass
 

Similar a Smart Classroom instruction Formative Evaluation Report

Direct Instruction: Methods for Closure and Evaluation
Direct Instruction: Methods for Closure and EvaluationDirect Instruction: Methods for Closure and Evaluation
Direct Instruction: Methods for Closure and Evaluation
mlegan31
 
Tig 3 piloting curriculum(1)
Tig 3 piloting curriculum(1)Tig 3 piloting curriculum(1)
Tig 3 piloting curriculum(1)
geraldine mendoza
 
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
IJITE
 
Purpose This assignment will enable the student to identify .docx
Purpose  This assignment will enable the student to identify .docxPurpose  This assignment will enable the student to identify .docx
Purpose This assignment will enable the student to identify .docx
woodruffeloisa
 
6b. sample of a study guide
6b. sample of a study guide6b. sample of a study guide
6b. sample of a study guide
Gambari Isiaka
 

Similar a Smart Classroom instruction Formative Evaluation Report (20)

Evaluating Educational Technology
Evaluating Educational TechnologyEvaluating Educational Technology
Evaluating Educational Technology
 
Direct Instruction: Methods for Closure and Evaluation
Direct Instruction: Methods for Closure and EvaluationDirect Instruction: Methods for Closure and Evaluation
Direct Instruction: Methods for Closure and Evaluation
 
K to 12 classroom assessment (revised)
K to 12 classroom assessment (revised)K to 12 classroom assessment (revised)
K to 12 classroom assessment (revised)
 
Curriculum Evaluation BY Ahmet YUSUF
Curriculum  Evaluation BY Ahmet YUSUFCurriculum  Evaluation BY Ahmet YUSUF
Curriculum Evaluation BY Ahmet YUSUF
 
Tig 3 piloting curriculum(1)
Tig 3 piloting curriculum(1)Tig 3 piloting curriculum(1)
Tig 3 piloting curriculum(1)
 
Types of Evaluation prior to Instructional Act
Types of Evaluation prior to Instructional ActTypes of Evaluation prior to Instructional Act
Types of Evaluation prior to Instructional Act
 
Revising instructional material
Revising instructional materialRevising instructional material
Revising instructional material
 
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
Enhancing Student Learning through Proactive Feedback Based Adaptive Teaching...
 
A Mastery Learning Approach To Engineering Homework Assignments
A Mastery Learning Approach To Engineering Homework AssignmentsA Mastery Learning Approach To Engineering Homework Assignments
A Mastery Learning Approach To Engineering Homework Assignments
 
Educational assessment
Educational assessment Educational assessment
Educational assessment
 
Purpose This assignment will enable the student to identify .docx
Purpose  This assignment will enable the student to identify .docxPurpose  This assignment will enable the student to identify .docx
Purpose This assignment will enable the student to identify .docx
 
Leadership and Management CIPD UK Assignment Sheet
Leadership and Management CIPD UK Assignment SheetLeadership and Management CIPD UK Assignment Sheet
Leadership and Management CIPD UK Assignment Sheet
 
Pencil andpapertest
Pencil andpapertestPencil andpapertest
Pencil andpapertest
 
6b. sample of a study guide
6b. sample of a study guide6b. sample of a study guide
6b. sample of a study guide
 
Continuous Assessment
Continuous AssessmentContinuous Assessment
Continuous Assessment
 
Unit 301 Essay
Unit 301 EssayUnit 301 Essay
Unit 301 Essay
 
Construction of Test
Construction of TestConstruction of Test
Construction of Test
 
FINAL PRESENTATION on OBE using TDD.pptx
FINAL PRESENTATION on OBE using TDD.pptxFINAL PRESENTATION on OBE using TDD.pptx
FINAL PRESENTATION on OBE using TDD.pptx
 
Ch11 power point apt 501
Ch11 power point apt 501Ch11 power point apt 501
Ch11 power point apt 501
 
reflection.docx
reflection.docxreflection.docx
reflection.docx
 

Más de Christine Gonnella

Más de Christine Gonnella (19)

Designing Instructional Discussion Forums
Designing Instructional Discussion ForumsDesigning Instructional Discussion Forums
Designing Instructional Discussion Forums
 
Universal Design at Mayville State University
Universal Design at Mayville State UniversityUniversal Design at Mayville State University
Universal Design at Mayville State University
 
Smart Classroom Assessment
Smart Classroom AssessmentSmart Classroom Assessment
Smart Classroom Assessment
 
Performance Checklist
Performance ChecklistPerformance Checklist
Performance Checklist
 
Final Evaluation Report Charts and Graphs
Final Evaluation Report Charts and GraphsFinal Evaluation Report Charts and Graphs
Final Evaluation Report Charts and Graphs
 
Higher Education: Where We've Gone Wrong
Higher Education: Where We've Gone WrongHigher Education: Where We've Gone Wrong
Higher Education: Where We've Gone Wrong
 
Alice Tutorial
Alice TutorialAlice Tutorial
Alice Tutorial
 
Second Life Whatisitand Why
Second  Life  Whatisitand WhySecond  Life  Whatisitand Why
Second Life Whatisitand Why
 
Teaching and Learning in SL-Desserts and Demos
Teaching and Learning in SL-Desserts and DemosTeaching and Learning in SL-Desserts and Demos
Teaching and Learning in SL-Desserts and Demos
 
Introduction to SL for Graphic Design students
Introduction to SL for Graphic Design studentsIntroduction to SL for Graphic Design students
Introduction to SL for Graphic Design students
 
Educational Uses of SL
Educational Uses of SLEducational Uses of SL
Educational Uses of SL
 
Introduction To Second Life
Introduction To Second LifeIntroduction To Second Life
Introduction To Second Life
 
Lifelong Learning In The Digital Age
Lifelong Learning In The Digital AgeLifelong Learning In The Digital Age
Lifelong Learning In The Digital Age
 
Smart Classrooms in Distance Ed
Smart Classrooms in Distance EdSmart Classrooms in Distance Ed
Smart Classrooms in Distance Ed
 
Shaffer
ShafferShaffer
Shaffer
 
Interactives In E Learning
Interactives In E LearningInteractives In E Learning
Interactives In E Learning
 
Blended Hybrid
Blended HybridBlended Hybrid
Blended Hybrid
 
Who The Heck Is ADDIE Anyway
Who The Heck Is ADDIE AnywayWho The Heck Is ADDIE Anyway
Who The Heck Is ADDIE Anyway
 
Finding Nemo Demo
Finding Nemo DemoFinding Nemo Demo
Finding Nemo Demo
 

Último

1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
QucHHunhnh
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
AnaAcapella
 

Último (20)

Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
Google Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptxGoogle Gemini An AI Revolution in Education.pptx
Google Gemini An AI Revolution in Education.pptx
 
FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
 
Spatium Project Simulation student brief
Spatium Project Simulation student briefSpatium Project Simulation student brief
Spatium Project Simulation student brief
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the Classroom
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdf
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan Fellows
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
SOC 101 Demonstration of Learning Presentation
SOC 101 Demonstration of Learning PresentationSOC 101 Demonstration of Learning Presentation
SOC 101 Demonstration of Learning Presentation
 

Smart Classroom instruction Formative Evaluation Report

  • 2. Formative Evaluation Report This report will describe and analyze the formative evaluation process for faculty training on the smart classroom in room 202 of the Education building on the UND campus. This report is divided into four main sections; Plan, Description, Outcomes and Interpretation. The Plan provides an outline of how the formative evaluation was designed and includes one-to-one and small group evaluations. The Description explains how the evaluations progressed. The Outcomes section contains the data obtained from both evaluations and the Interpretation analyzes this data. I. The Plan Control of Variables Aptitude Aptitude will be controlled by including learners of below average, average and above average aptitude. Aptitude will be measured using an Experience Assessment, a tool which measures experience with computers, smart classrooms and technical ability (see Appendix A). Questions on the instrument will be assigned equal weights and one through five points for each response. The Experience Assessment contains a total of ten questions. Question 1 (1-5 pts) measures the learner’s experience with computers. Question 2 (1-5 pts) measures how frequently the learner uses computers to teach. This question was included because intermediate computer knowledge is a required prerequisite for the course. Question 3 (1-5 pts) measures how often the instructor uses PowerPoint and an LCD projector to teach classes. This question acknowledges a learner’s experience with low level technology in the classroom. Question 4 (1-5 pts) measures whether a learner has experienced the smart classroom setting as a student. Familiarity with the smart classroom environment as a student can provide an instructor valuable insight into good and bad instructional strategies with the equipment. Question 5 (1-5 pts) measures whether the student has used a smart classroom to teach. Previous experience with the smart classroom could impact the pace of the instruction for this student. Both questions 4 and 5 have yes/no responses. Positive responses will receive the full value of five points and negative answers will receive only one point. Question 6 (1-5 pts) measures frequency of experience with smart classrooms. This question was included because even if the learner’s experience is minimal, prior increases the probability that the learner will succeed and may impact the pace of the instruction as well. Question 7 measures the student’s ability to learn new technical skills. Question 8 (1-5 pts) measures whether the student has taken a course using distance education and question 9 (1-5 pts) measures whether the student has taught online using distance education. Both questions 8 and 9 have yes/no responses. Positive responses will receive the full value of five points and negative answers will receive only one point. Question 10 is an open ended question. Answers will be transcribed, reported and categorized. A chart will be created to illustrate the results. Possible total scores from nine to forty-five will indicate aptitude ranking.
  • 3. Participants will be faculty members at UND or people who have the potential to teach or who are teachers. Process Due to the hands-on nature of the smart classroom, all instruction will be conducted face- to-face. The one-to-one sessions will be conducted by the same trainer and the group sessions will be conducted by a different facilitator. A checklist derived from the objectives will also help to control for process and insure that each item is addressed. Support All training will be conducted in the smart classroom located in the Education building, room 202. Support will be provided by the Center for Instructional Learning Technology (CILT). The director of this department designed the room and CILT is responsible for its maintenance. A phone number is posted for anyone experiencing technical problems to call when in need of assistance. One-to-One Three subjects of varying ability, below average, average and above average, will be chosen to participate in this stage of the evaluation. Each training session will be conducted on a one-to-one basis with the trainer. The trainer will proceed to teach the learner about the smart classroom by demonstrating each aspect and asking the learner to demonstrate as well by using a guided practice approach. Performance will be measured using the Performance Checklist (Appendix B). The instructor will stop as questions or problems arise and get feedback from the learner as to how different aspects of the instruction could be improved. An Attitudinal Assessment (Appendix C) will be administered following the training. All responses to the Experience Assessment and the Attitudinal Assessment will be recorded using a matrix of questions and learners (Tables 1&2). Instruction will be revised based on learner feedback and the results of the post- training Attitudinal Assessment. One-to-one instruction will be completed within a two- week period. Small Group Eight learners of below average, average and above average aptitude will participate in the training after the one-to-one evaluations are finished and revisions have been made. These eight learners may or may not be taught simultaneously. Training will be conducted in a similar manner as in the one-to-one phase; face-to-face instruction in the smart classroom. There will be no discussion of instruction or learner feedback during the instruction stage of the small group phase. The Performance Checklist (Appendix B) will be used during instruction to monitor learner progress. This stage takes place during the two weeks following one-to-one instruction. Evaluation Performance will be evaluated based on the stated objectives. The Performance Checklist and three paper-based quizzes will be used to evaluate each objective. Failure of the learner to complete any objective accurately will lead to analysis of the instruction and related instruments. A table will indicate the results for each objective and evaluation
  • 4. questions. Items that are below 100% accurate will be analyzed and modified as necessary. All information will be analyzed for consistency and patterns by objective, assessment and learner both individually and across type. Matrix tables (Tables 3&4) containing objectives for each item will be used to perform item and performance analyses. Items within each objective will be evaluated, and any inconsistencies will be considered indications that modifications to the instruction or to the instruments may be needed. Problems with the instrument may include items that may not measure what they are supposed to measure according to the corresponding objective, the question may be unclear or too difficult. Success for each objective will be measured and compared to success rates of other objectives to examine overall consistency. Student performance across all checklist items and assessment items will be evaluated for consistency and patterns. Deficient items and unsuccessful objectives will be analyzed by learner. All information will be analyzed across objectives, items and learners and any inconsistencies and patterns will be studied for indication of need for modification. II. Description One-to-One At this point it is important to note that the smart classroom in Education Room 202 was not complete until a few weeks prior to the commencement of training. Much of the material was compiled and written prior to completion of the system in order to prepare as well as possible for the implementation phase. During much of the training many of the glitches were discovered and worked through which took a lot of time and made it difficult to get though the material. The goal was to have as much worked out as possible before the group training. The learning task analysis was changed a few times throughout the process which also effected the assessment and the results. Since this is instructor-led training, a facilitator was chosen to conduct the training while the designer watched, took notes, and answered questions as needed. The facilitator is also and IDT graduate student who had a great deal of technology experience (especially audio/visual) and some experience with room 202 working on a different project. The facilitator was asked to read the green text from the instructor manual verbatim in order for the designer to evaluate the instruction that had been written. The placement of practice items was something we struggled with. During the first one- to-one we turned everything off after the first demonstration and asked the participant to go through the process of turning it all back on. This was repeated only a few minutes later as an assessment item while the assessment checklist was utilized. This seemed to take up a great deal of time and the format was changed so that the learner practiced many of the activities simultaneous to the instructor describing them. The student manual came to be seen as more of a tool than a guide and the learners were encouraged to listen to the facilitator’s instructions while watching the manual and practicing the activities when asked. This was effective in the one-to-one sessions but less so in the group
sessions, where learners took turns practicing. The items learners practiced themselves seemed to be better understood than those that were practiced by others. However, learners did learn to turn to the student manual when questions arose during the assessment.

Participant A
Participant A is a graduate student and teaching assistant in the Communications department. Her score on the Experience Assessment was average. She was chosen because of her experience with distance education and her desire to learn the system. She is also an IDT student, and her experience and knowledge were helpful in identifying areas that needed improvement. Participant A spent two hours on the instruction and was still unable to complete the training. This was due to a combination of reasons: the frequent discussion that occurred throughout the session, clarifying how the system worked and how the materials could be improved, and her need to leave at a certain time. Participant A referred to the student manual quite frequently during the session.

The first thing noted was that the information reviewed in the student manual was not specific enough; page numbers and more specific verbiage were added to help with that. In the "Tour of the Room" section it was noted that there are eight speakers in the ceiling rather than four. This was corrected in the next version of the manual. After turning on the main power, the facilitator mentioned that the user should wait a few seconds before turning on the Gateway power switch; this was also added to the verbiage. The facilitator also noticed that the touch panel does not, in fact, default to the computer interface as the manual indicated. This section was deemed irrelevant and removed from the text to save time in the training. The sequence and length of time needed for turning the projector on and off became a question, which was investigated and the text of the manuals edited accordingly.

In the camera section, we added a note to watch the camera for movement to indicate whether it had been turned on. The note on the camera presets was edited, and a separate screen shot was added for the pan, tilt and zoom section. In the microphone section there was some question as to how the volume worked; this issue was not resolved until later in the evaluation process. A photo of the microphone clip was taken and inserted into this section as well.

After reviewing the "Recording Process" diagram, it was reconfigured to more accurately reflect how recording is accomplished with the system. Details of the process of recording to mini-DV tape were carefully reviewed and refined. For example, directions were modified to include a reminder to look at the touch panel to determine whether a tape starts playing when it is inserted into the console. The note in the "Recording to DVD" section and the tech tip in "Finalizing a DVD" were edited for accuracy, and a tech tip was added to the job aid for finalizing a DVD. Participant A and the facilitator went very carefully through this job aid, and many changes were identified, including some to verbiage taken directly from the component manual. Participant A suggested that a break be added to the manual at this point.
In Module III, we noted a need to make sure the learner kept the microphone on during the audio conference demonstration. It was also mentioned that there should be a note in the manuals about the FireWire cable being difficult to plug in. In the section on Breeze, the facilitator suggested that the Breeze link be made a shortcut on the desktop. There were technical issues with making Breeze detect the camera; we spent about 20 minutes trying to figure it out before we had to move on to the assessment without finishing the training.

Participant B
Participant B is the Dean of the College of Education and Human Development. His score on the Experience Assessment was below average. He was chosen because of his role as the Dean and his desire to become familiar with the technology used by his faculty. His lack of experience helped identify several areas that were unclear and/or too long-winded.

The first area of improvement was the objectives. They had not been rewritten from the five-component format and needed to change; they were revised after this session. In the "Getting Started" section, it was decided that the words "green text to indicate when the instructor should be speaking" should themselves be in green text to draw the facilitator's eye. Also in this section, under Job Aids, the words "This comprises the majority of the student manual" were deleted. In the part of this section spoken by the facilitator, some of the verbiage and the order of information were changed, and some text was removed for flow and to save time. In Part 2, some of the text was seen as repetitive and removed, and instructions were added for the facilitator to refer to the room diagram while pointing out the various components. The diagram entitled "Tour of the System" was confusing and missing some components; changes were made upon revision. In the "Tour of the Room" section, a sentence was added about acoustic materials having been added to the walls and floors.

In the section on powering up, a practice item was added. The section on shutting down was moved to later in the training, as was the practice item for turning the system back on. This practice item was moved because we felt it was taking too much time and was too simple to do so early in the training. In the "Projector" section we added a practice item for the learners to try turning the projector on and off. When we had problems with it, we realized that the projector seemed to be going into sleep mode; a tech tip on the projector was written and added to the section. A practice item was also added to the "Camera" section so that the learner could try turning the camera on and watching it respond. To save time, the camera preset section was deleted. In the "Instructor Microphone" section, duplication was removed and a tech tip was added regarding the mute button.

We decided that it would save time to move the Module I checklist to the end of the training with the Module III checklist. In the introduction to Module II we added a
summary of "The Recording Process" diagram. In Part 6, some of the wording in the tech tip was changed for accuracy. In the practice section we decided a photo of the correct type of DVDs for the system would be helpful, and we discovered that a circle on the photo had been moved to an incorrect spot. In Part 7 we changed the "Discussion/Demonstration" section into a practice session in which the facilitator talks the learner through the steps of recording to a DVD. This was a section in which a few more idiosyncrasies of the system were identified; they were addressed by removing a note and going straight to the "Finalize a DVD" section. A few steps in the "Finalizing a DVD" job aid needed to be edited or added, and a sentence about how to use the remote control with the touch panel was added. A few small changes were made to the audio conference section to shorten and simplify the process. Due to time constraints we were unable to finish the last section, on Breeze.

Participant C
Participant C is an adjunct faculty member of the Department of Teaching and Learning. She is an advanced user and already uses the system to record her lecture sessions on DVD. She had many good suggestions, which were incorporated into the manuals.

The "Getting Started" section still seemed to drag on for a long time, with a great deal of lecturing. Too much information of little relevance was being given to the learner, and we decided it would be better to simply cut much of it out. Other minor edits were made to the text spoken by the facilitator to make it flow better. Participant C suggested that the phrase "To use when streaming" be added to the "Tour of the System" diagram for clarity; it was added. Minor directions for the facilitator were added to the "Tour of the Room" section. In Part 3, a note was removed to save time.

Because of the participant's previous experience with the system, we asked her if she had any trouble with the projector. She said that she watches the touch panel to see if the OFF button turns red or the ON button turns blue. This tip was added to the manuals. In addition, a practice item was moved from after the facilitator discussion to before it, so that the learner could try the buttons while the facilitator talked about them. A tech tip was added to tell learners that the longer the camera control buttons on the touch panel are held down, the faster the camera moves. Practice sections were added here as well. Participant C brought up a question about switching back and forth between media and microphone volume. After trying to find an answer for a while, we decided it was a question we would have to ask CILT.

Participant C suggested putting a section on the USB extension after the photo of the USB ports. This suggestion was not taken because the "Plugging in your laptop" section
was removed from the instructor manual; we came to consider it a prerequisite. It was left in the student manual, placed at the back in the appendix. Participant C felt that the word "simple" should be removed from the summary because it might make someone who does not find the system simple feel stupid. This was changed.

A few minor wording changes were made to the "Recording to mini-DV tape" section. The text on the JVC diagram was changed to more accurately match what appears on the actual component. In each prior one-to-one the facilitator had suggested, "wave to yourself to make sure the DV tape is not playing back." Participant C pointed out that the red light on the device stays on while it is recording. Both of these tips were added to the instruction. The wrong photos had been cut and pasted into the "Recording to DVD" section from another section; this was corrected. In the "Finalizing a DVD" segment, a clarification was made about finalizing and un-finalizing DVDs. In the job aid, a tech tip was added about timing out, and a few more clarifications were made.

Participant C suggested we explain more about Voice over IP, so a brief statement was added to the "Audio Conference" section. A practice item was also added, and the information on VoIP was moved to another section. The volume section was circled on the photo for clarification. Since this section was not completely finished by either of the previous participants, we found several areas for improvement as we worked through it. A "something more" section was removed due to time constraints, and a screen shot was deleted. A few screen shots were cropped to focus the learner on the relevant content. In this segment, we decided to switch the camera and audio conference sections so that we could ensure that the camera is set up before the audio conference begins; this is helpful if there are camera initialization problems. Clarifications and corrections were made to the phone dialing section.

After running through this instruction with three participants, it was clear that it had originally been written with far too much information to cover in the allotted time. It was also a challenge to keep finding glitches within the system that affected how the instruction was written. Ideally, this instruction would have been better prepared had we performed at least one more one-to-one, so that we could test the Breeze section more thoroughly. Still, there were far fewer changes in the next round of revisions, so some progress was achieved.
Small Group

First Session
The first small group session was scheduled with two people, one male and one female, but only the female attended. Participant 1 is the Technology Coordinator for the Department of Teaching and Learning. She was chosen because her position in the department requires her to learn the system. She has worked with many other smart classrooms, so her score on the Experience Assessment was above average.

There were a few formatting issues with the table of contents, which were changed, and a few of the page numbers referred to in the student manual needed correction. In the "Power Up" section, two tech tips and some of the instructor dialog were removed to save time. The Summary/Transition was also deleted for time. A few more minor changes were made to the Projector section, and a practice item was removed. In the Camera section, the word "camera" was replaced with the phrase "cam control" to align with how it appears on the touch panel. The words "further from" were added to the description of zoom for clarity. The types of batteries used for the instructor microphone were added to that section, along with a photo of correct mic placement.

In the "Recording to mini-DV tape" section, part of the introduction was removed because it was a duplication, and a tech tip was removed from the instructor's manual. This tip was left in the student manual for reference. Some of the wording of the instruction was changed, and a tech tip on the HD/DVD switch was added. Edits were made to the graphic as well. In Part 7, the information on DVDs and DVD-RWs was removed from the instructor presentation to save time, but it remains in the student manual for reference. Some of the graphics here were edited as well, and the order was changed for clarity. A tip for finalizing was added to the "Finalize a DVD" job aid for clarity, and other minor edits to the text and graphics were made as well. The tech tip at the end of the section was removed but remains in the student manual.

In the section on Breeze, several nonessential items were removed from the instructor dialog to save time. Text and photos that had been added received minor edits, and a "more reading" section was removed as well. Clarifications were made in the text on using audio conferencing in Breeze, and minor but important instructions were added.
Second Session
The second session of the small group comprised participants 2, 3 and 4, who completed the training in room 202 of the Education building. All participants are UND employees, but only Participant 3 is a faculty member; she is in the Teaching and Learning Department. She scored above average on the Experience Assessment due to her frequent use of smart classrooms. For this reason she entered the training thinking that she would not be learning anything new, but she ended up being enthusiastic about using the distance features of the room for recording her class sessions. Participant 2 is the doctoral program coordinator for Teaching and Learning; she took part in the training only in the interest of helping a student and had a below average score on the Experience Assessment. Participant 4 is an instructional designer for the UND Aerospace program. Her responsibilities include training faculty and students in Breeze, and her Experience Assessment score was above average. She participated in the training to help a fellow IDT student and to learn more about using smart classrooms for instruction.

The facilitator for this session was the director of the Center for Instructional Learning Technology and one of the people who designed the smart classroom. Since this was the first time she had reviewed the training, she took extra time discussing, and in some cases correcting, the instruction. Even so, the instruction was much more coherent in this session than in previous sessions, thanks to the feedback of previous participants and much revision. The training still ran over two hours, including the assessment. Many of the problems found during this session were typographical or formatting issues; the few substantial issues were related to system design problems that the facilitator clarified. Several of the arrows in the diagram labeled "The Recording Process" had to be removed or changed.

III. Outcomes

Practice Items
The original version had practice items interspersed throughout the instruction, but we found that the way they were presented interrupted the flow of learning and took far too much time. After experimenting with the placement of practice items during the one-to-one sessions and the first group session, we found that it worked best when we walked learners through the steps while they performed the tasks. We also found that the student manual should be used as a tool during the instruction rather than a reference. Many times we found ourselves reminding learners to use it while performing tasks, even tasks that we talked them through, so that they would feel comfortable doing so when they were on their own in the assessment and in practical use.

In addition, participants in the second group session were not all able to perform the practice items because there was not enough time or space to allow each learner to perform each task as it was taught. Each learner did get a chance to perform some tasks with the instructor, but all had to watch others perform as well. It was noted that those who performed the practice items seemed to do better on the assessment on those particular items than those who did not. However, the ongoing trial-and-error changes made it difficult to accurately measure the use of practice items and their effectiveness over time.
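Once the assessment data in Tables 3 through 6 are tabulated, this observation could be checked directly by comparing accuracy on practiced versus observed-only items. The sketch below (Python) illustrates one way to do so; the learners, item identifiers, "practiced" sets and scores are all hypothetical placeholders, not actual assessment results.

# Hypothetical data: which items each learner personally practiced,
# and how each learner scored (1 = correct, 0 = incorrect) per item.
practiced = {
    "Student 2": {"2.1", "3.1"},
    "Student 3": {"6.1", "7.1"},
    "Student 4": {"9.1", "10.1"},
}
scores = {
    "Student 2": {"2.1": 1, "3.1": 1, "6.1": 0, "7.1": 1, "9.1": 0, "10.1": 1},
    "Student 3": {"2.1": 1, "3.1": 0, "6.1": 1, "7.1": 1, "9.1": 1, "10.1": 0},
    "Student 4": {"2.1": 0, "3.1": 1, "6.1": 1, "7.1": 0, "9.1": 1, "10.1": 1},
}

# Split each learner's results into items they practiced themselves
# and items they only watched others perform.
practiced_scores, observed_scores = [], []
for learner, row in scores.items():
    for item, score in row.items():
        if item in practiced[learner]:
            practiced_scores.append(score)
        else:
            observed_scores.append(score)

print(f"Practiced items: {100 * sum(practiced_scores) / len(practiced_scores):.0f}% correct")
print(f"Observed-only items: {100 * sum(observed_scores) / len(observed_scores):.0f}% correct")

With only a handful of learners and the instruction still changing between sessions, such a comparison would be suggestive at best, which is consistent with the measurement difficulty noted above.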
Final Test
This is where I will discuss the data from the post-training quiz.

IV. Interpretation

Practice Items
This is where I will discuss what I think the data from the pre-test questionnaire and the practice checklist mean.

Final Test
This is where I will discuss what I think the data from the post-training quiz means.

Participant Key
A – Adonica
B – Dan
C – Laurie
1 – Joneen
2 – Cynthia
3 – Rachael
4 – Adrienne
Conclusions and Future Revisions

V. Appendices
This is where I will enter charts and tables to document the data discussed above.

Table 1. Responses to experience questions on the Experience Assessment.

Question                      1    2    3    4    5    6    7    8    9    Average
Participant A                 5    5    3    1    1    1    4    1    1    2.44
Participant B                 5    5    2    1    1    1    3    1    2    2.33
Participant C                 5    4    4    1    2    5    4    2    1    3.11
Participant 1
Participant 2                 5    4    4    1    1    1    4    1    2    2.56
Participant 3                 5    4    4    1    2    4    4    2    1    3.00
Participant 4                 5    5    3    2    2    1    4    2    1    2.78
One-to-One Composite Average
Small Group Composite Average
Average Ratings

Answers to Question 10

Participant A: Technical difficulties in sound, video or connection.

Participant B: The time it takes to develop the course and then the new skills one might need to learn.

Participant C: 1) The fact that the work is all safely hidden away in my computer; 2) not knowing the people face to face.

Participant 1:

Participant 2: My experience is in correspondence – which is not the same as distance education. I never have a time when I can connect with a student except over e-mail. I think the biggest drawback is that sometimes when students are writing culturally insensitive or factually untrue responses, I believe that a face-to-face conversation would be beneficial. Or, a redirection to a reading that might help them reframe their thinking. This is hard to do in correspondence. I can certainly suggest a link or an article to read, but I have no way of knowing if they follow through with it.

Participant 3: One barrier is that there is limited personal interaction with peers. The course I took was a biology class, so it wasn't as critical there, but in some education courses we are teaching and modeling effective personal interactions. On the flip side, I
found that I needed to know my content very well, as the grade was 100% objective, based on my test scores. One further disadvantage with the class I took was that the tests were open book, as they were taken at home. In that regard, I could see a potential for academic dishonesty.

Participant 4: Trying to get student skills and technology up to speed so that everyone can focus on the subject matter.
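For reference, the per-participant averages reported in Table 1, as well as the composite averages left blank above, can be recomputed from the nine scored questions. The sketch below does so in Python using the scores from Table 1; treating each composite as the mean of its participants' averages is an assumption, since the report does not define the composite.

# Scores on the nine scored questions of the Experience Assessment (Table 1).
# Question 10 is open ended and is excluded from the numeric average.
# Participant 1's scores were not recorded in the table and are omitted.
scores = {
    "A": [5, 5, 3, 1, 1, 1, 4, 1, 1],
    "B": [5, 5, 2, 1, 1, 1, 3, 1, 2],
    "C": [5, 4, 4, 1, 2, 5, 4, 2, 1],
    "2": [5, 4, 4, 1, 1, 1, 4, 1, 2],
    "3": [5, 4, 4, 1, 2, 4, 4, 2, 1],
    "4": [5, 5, 3, 2, 2, 1, 4, 2, 1],
}

averages = {p: sum(vals) / len(vals) for p, vals in scores.items()}
for p, avg in averages.items():
    print(f"Participant {p}: {avg:.2f}")  # e.g., A -> 2.44

# Composite averages, assumed to be the mean of the participant averages:
one_to_one = [averages[p] for p in ("A", "B", "C")]
small_group = [averages[p] for p in ("2", "3", "4")]
print(f"One-to-One composite: {sum(one_to_one) / 3:.2f}")    # 2.63
print(f"Small Group composite: {sum(small_group) / 3:.2f}")  # 2.78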
Practice

Table 3. One-to-One Practice Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student A
Student B
Student C
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student A
Student B
Student C
# Correct
% Correct
% Mastery
Table 4. Small Group Practice Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery
Table 5. One-to-One Evaluation Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student A   X
Student B
Student C
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student A
Student B
Student C
# Correct
% Correct
% Mastery
Table 6. Small Group Evaluation Items

Objective   1   1.1   1.1.1   1.1.2   1.2   2   2.1   2.2   2.3   3   3.1   3.2   3.3   3.4   3.4.1   4   4.1   4.1.1   5   5.1   5.1.1
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery

Objective   6   6.1   6.2   7   7.1   8   8.1   9   9.1   9.1.1   9.1.2   9.2   9.3   10   10.1   11   11.1   11.2   11.3   12   #   %
Student 1
Student 2
Student 3
Student 4
# Correct
% Correct
% Mastery