1. Web-based Self- and Peer-assessment of
Teachers’ Educational Technology
Competencies
Hans Põldoja, Terje Väljataga, Kairit Tammets, Mart Laanpere
Tallinn University, Estonia
2. License
This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.
3. Outline
• Problem statement
• Related works: existing competency frameworks
• Design challenges
• Design methodology
• Current prototypes
• Conclusions and future work
4. Research context
• Importance of educational technology
competencies
• Generic ICT competency frameworks (e.g. ICDL)
do not cover all the competencies needed for
educational use of ICT
• Educational Technology Competency Model
(ETCM) for Estonian teachers
• DigiMina project for assessing teachers’
educational technology competencies
5. Research problem
To what extent, and how, could teachers' educational technology competencies be assessed using a Web-based tool?
6. Existing competency
frameworks
• International Computer Driving License
• UNESCO ICT Competency Framework for
Teachers
• ISTE National Educational Technology Standards
for Teachers (NETS-T)
7. ECDL / ICDL
• Used in 148 countries
• Focused on ICT usage
• Standardized testing
8. UNESCO ICT Competency Framework for Teachers
• Launched 2008, revised 2011
• Guidelines for creating national competency models
• 6 subdomains
9. ISTE NETS-T 2008
• 20 competencies in 5 competency groups
• Used in Norway, the Netherlands, Finland, etc.
11. Design challenges
• How to define performance indicators of all
competencies in ETCM?
• How to select appropriate methods and
instruments for assessing competencies?
• How to implement selected assessment methods
in a Web-based tool?
12. Educational Technology Competency
Model for Estonian Teachers
• Based on ISTE NETS-T 2008
• 5-level assessment rubric developed by local
expert group
13. Assessment rubric example
3.1. Demonstrate fluency in technology systems and the transfer of current knowledge to new technologies and situations
• Level 1: Creates a user account in a web-based system and creates/uploads resources; uses common software/web environments/hardware with the help of a user manual; uses presentation tools and a printer; saves/copies files to an external drive.
• Level 2: Manages access rights to the resources published on the web.
• Level 3: Solves independently the problems that occur during the use of ICT tools (using help, a manual, FAQ or forums when needed); combines different tools; changes the settings of a web-based system.
• Level 4: Transfers working methods from a known web environment/software to an unknown environment.
• Level 5: Chooses (compares, evaluates) the most suitable tool for a given task.
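For a Web-based tool, a rubric like this has to live as data rather than a table. A minimal sketch of how one competency's level descriptors might be stored and looked up; the dictionary shape and abbreviated descriptors are hypothetical, not DigiMina's actual schema:

```python
# Hypothetical in-memory representation of one ETCM rubric (competency 3.1).
# Descriptors are abbreviated from the slide; the real tool's data model is not shown.
RUBRIC_3_1 = {
    1: "Uses common software, web environments and hardware with the help of a user manual",
    2: "Manages access rights to resources published on the web",
    3: "Solves ICT problems independently; combines tools; changes settings",
    4: "Transfers working methods from a known environment to an unknown one",
    5: "Chooses (compares, evaluates) the most suitable tool for a given task",
}

def descriptor(level: int) -> str:
    """Return the rubric descriptor for a level on the 1-5 scale."""
    if level not in RUBRIC_3_1:
        raise ValueError("level must be between 1 and 5")
    return RUBRIC_3_1[level]
```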
14. Measuring Educational
Technology Competencies
• Assessment methodology and instruments must be
reliable, valid, flexible, but also affordable with
respect to time and costs.
• Four levels of measuring competencies (Miller, 1990):
• knows
• knows how
• shows how
• does
15. Web-based Assessment of
Competencies
• Five-dimensional framework for authentic assessment (Gulikers et al., 2004):
• tasks: meaningful, relevant, typical, complex, ownership of problem
and solution space;
• physical context: similar to professional work space and time
frame, professional tools;
• social context: similar to social context of professional practice
(incl. decision making);
• form: demonstration and presentation of professionally relevant
results, multiple indicators;
• criteria: used in professional practice, related to realistic process/
product, explicit
17. Research-based design methodology
Phases: Contextual Inquiry → Participatory Design → Product Design → Software Prototype as Hypothesis
Methods across the phases: concept mapping, personas, participatory design sessions, user stories, scenario-based design, paper prototyping, information architecture, high-fidelity prototyping, agile sprints
Adapted from (Leinonen et al., 2008)
18. Personas
• Teacher training master student
• Novice teacher
• Experienced teacher
• Educational technologist of a school
• Training manager (in a national organization)
(Cooper et al., 2007)
19. Scenarios
• Master student is evaluating her educational
technology competencies
• Peer assessment of problem solving tasks
• Educational technologist of a school is getting an
overview of teachers’ educational technology
competencies
• Training manager is compiling a training group with
sufficient level of competencies
(Carroll, 2000)
23. Competency Test
• The competency test can be taken several times to measure progress
• Usability issue: large number of tasks (20
competencies, 5 levels)
• Solutions:
• Can be saved and continued later
• Setting the starting level with self-evaluation
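The usability problem above is arithmetic: 20 competencies at 5 levels each means up to 100 tasks. A sketch of how a self-evaluation could cut that down by setting the starting level; the "start one level below the self-rating" policy is a hypothetical assumption, since the slide only says self-evaluation sets the starting level:

```python
def starting_level(self_rating: int) -> int:
    """Map a 1-5 self-rating to the level where the test begins.
    Starting one level below the rating is an assumed policy, not DigiMina's."""
    if not 1 <= self_rating <= 5:
        raise ValueError("self-rating must be 1-5")
    return max(1, self_rating - 1)

def tasks_to_take(num_competencies: int, start: int, max_level: int = 5) -> int:
    """Tasks remaining when every competency's test starts at level `start`."""
    return num_competencies * (max_level - start + 1)
```

With no self-evaluation the full test is `tasks_to_take(20, 1) == 100` tasks; a teacher who rates herself at level 4 would start at level 3 and face 60.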
24. Tasks
• Three types of tasks:
• automatically assessed self-test items (29)
• peer assessment tasks (23)
• self-reflection tasks (41)
• Need to increase the number of competencies that can be assessed with a self-test
• Peer assessment requires blind review from 2 users at the same or a higher competency level
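The peer-assessment rule lends itself to a simple eligibility check. A minimal sketch, assuming a hypothetical data shape (user id mapped to per-competency levels); only the "two blind reviewers at the same or a higher level" rule comes from the slide:

```python
def eligible_reviewers(users, task_owner, competency, required=2):
    """Pick `required` blind reviewers whose level in `competency` is at least
    the task owner's. `users` maps user id -> {competency id: level}; this
    shape and the function name are illustrative, not DigiMina's API."""
    owner_level = users[task_owner][competency]
    candidates = [
        uid for uid, levels in users.items()
        if uid != task_owner and levels.get(competency, 0) >= owner_level
    ]
    if len(candidates) < required:
        return []  # not enough qualified peers; assessment must wait
    return candidates[:required]
```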
25. Tasks
• Tasks are authored using IMS QTI compatible test
authoring tool TATS (Tomberg & Laanpere, 2011)
• 5 IMS QTI question types are used:
• choiceInteraction (multiple choice)
• choiceInteraction (multiple response)
• orderInteraction
• associateInteraction
• extendedTextInteraction
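For orientation, this is roughly what a single-choice item of the first type looks like in IMS QTI 2.1; the identifiers, prompt, and choices here are invented for illustration, not taken from DigiMina or TATS:

```xml
<!-- Minimal QTI 2.1 item using choiceInteraction (single choice).
     All identifiers and content are illustrative. -->
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="etcm-example-item" title="Example item"
    adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which file format preserves document layout for printing?</prompt>
      <simpleChoice identifier="A">TXT</simpleChoice>
      <simpleChoice identifier="B">PDF</simpleChoice>
      <simpleChoice identifier="C">CSV</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <responseProcessing
      template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>
```

A multiple-response item uses the same choiceInteraction element with `cardinality="multiple"` and a higher `maxChoices`.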
26. Competency Profile
• Level of competencies is displayed as a diagram
• User can compare her competency level with the
average level of various groups
• Privacy settings (private, group members, public)
• Can be linked to or embedded in external web pages
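The group comparison on this slide reduces to averaging levels per competency over the profiles a user is allowed to see. A sketch under assumed data shapes (lists of per-competency level dicts); function names are invented:

```python
def group_average(profiles, competency):
    """Average level for one competency across visible profiles.
    `profiles` is a list of {competency id: level} dicts (assumed shape)."""
    levels = [p[competency] for p in profiles if competency in p]
    return sum(levels) / len(levels) if levels else None

def comparison(own, profiles):
    """Pair the user's own level with the group average for each competency,
    feeding the kind of diagram the slide describes."""
    return {c: (level, group_average(profiles, c)) for c, level in own.items()}
```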
27. Group
• Typically created for a school or a group of
teacher training students
• Group owner can see competency profiles of all
members
• Anybody can create a group
• Groups can be set up as private or public
28. Competency Requirements
• A large number of competency profiles would make DigiMina a valuable planning tool
• Competency requirements can be created by the
training manager, teacher trainer and group owner
• Will be implemented in a later phase
34. Conclusions and future work
• DigiMina as a component of a larger digital
ecosystem
• Involving expert teachers in creating and evaluating
assessment tasks
• Integrating DigiMina with the national teachers’
portal
• DigiMina as a framework for assessing various
competency models
35. References
• Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A Five-Dimensional Framework for
Authentic Assessment. Educational Technology Research & Development, 52(3), 67–86.
• Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine,
65(9), S63–S67.
• Leinonen, T., Toikkanen, T., & Silfvast, K. (2008). Software as Hypothesis: Research-Based Design
Methodology. In Proceedings of the Tenth Anniversary Conference on Participatory Design 2008 (pp.
61–70). Indianapolis, IN: Indiana University.
• Cooper, A., Reimann, R., & Cronin, D. (2007). About Face 3: The Essentials of Interaction Design.
Indianapolis, IN: Wiley Publishing, Inc.
• Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions.
Cambridge, MA: The MIT Press.
• Tomberg, V., & Laanpere, M. (2011). Implementing Distributed Architecture of Online Assessment
Tools Based on IMS QTI ver.2. In F. Lazarinis, S. Green, & E. Pearson (Eds.), Handbook of Research
on E-Learning Standards and Interoperability: Frameworks and Issues (pp. 41–58). IGI Global.
36. Acknowledgements
This research was supported by
• EDUKO program of Archimedes Foundation
• Estonian Ministry of Education and Research
targeted research grant No. 0130159s08
• Tiger University Program of the Estonian
Information Technology Foundation
37. Thank You!
Hans Põldoja http://trac.htk.tlu.ee/digimina/
Research Associate
Tallinn University, Estonia
hans.poldoja@tlu.ee
http://www.hanspoldoja.net
http://www.slideshare.net/hanspoldoja