ICCE21: Systematizing Game Learning Analytics for Improving Serious Games Lifecycle

  1. Systematizing Game Learning Analytics for Improving Serious Games Lifecycle. Baltasar Fernandez-Manjon, @BaltaFM, e-UCM Research Group, Telefónica-Complutense Chair on Digital Education and Serious Games
  2. Serious Games • Any use of digital games with purposes other than entertainment (Michael & Chen, 2006) • Applied successfully in many domains (medicine, business) with different purposes (knowledge, awareness, behavioural change) • But adoption of serious games in mainstream education is still low • More evidence supporting SG efficacy is needed. Clark Abt, 1970
  3. (image-only slide)
  4. Do serious games actually work? - Very few SGs have a formal evaluation (e.g., pre-post) - Usually tested with a very limited number of users - Formal evaluation can be as expensive as creating the game (or even more expensive) - Evaluation is not yet considered a strong requirement - Difficult to deploy games in the classroom - Teachers have very little information about what is happening when a game is being used - What has the student learned from playing the game?
  5. Serious Games for bullying & cyberbullying. A. Calvo-Morata, C. Alonso-Fernández, M. Freire-Morán, I. Martínez-Ortiz and B. Fernández-Manjón, Serious games to prevent and detect bullying and cyberbullying: a systematic serious games and literature review. Computers & Education, 8 July 2020
  6. Serious Games for bullying & cyberbullying • Only 14/32 games with user validation • Validation: average < 300 users per game • Very few collect user interactions • Various target groups • They address the problem in a variety of ways • 20 bullying, 7 cyberbullying, 5 both problems; work on empathy, raise awareness, show prevention strategies, safe internet use, report harassment, identify the problem, change behavior, develop emotional/social skills • No open games • Lack of free access to the game resources and its code
  7. Formal validation of serious games. The most common methodology is pre-post questionnaires in experiments: Is there a significant difference between pre-questionnaire and post-questionnaire results? Pre and post questionnaires should have been previously validated.
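The pre-post comparison above can be sketched in a few lines. The deck's later Conectado validation uses a Wilcoxon paired test; a paired t-test is shown here only because it is easy to compute without extra libraries, and the scores below are made-up toy data, not from any study in the talk.

```python
import math

def paired_t_test(pre, post):
    """Paired t-test on pre/post questionnaire scores.
    Returns the t statistic and degrees of freedom; a large |t|
    suggests a significant pre-post difference."""
    assert len(pre) == len(post)
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the per-student gains
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Toy scores out of 15 (the First Aid Game questionnaires had 15 questions)
pre  = [6, 7, 5, 8, 6, 7, 5, 6]
post = [9, 10, 8, 11, 9, 9, 8, 9]
t, df = paired_t_test(pre, post)
print(f"t = {t:.2f} with {df} degrees of freedom")
```

With real data one would look the t statistic up against the t distribution (or use `scipy.stats`) to obtain the p-value.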
  8. Learning Analytics & Game Analytics • Learning analytics: improving education based on analysis of actual data • Data-driven • From only theory-driven to evidence-based • Game analytics: application of analytics to game development and research (telemetry) • Game metrics • Interpretable measures of data related to games • Player behavior • Mainly used for commercial purposes • monetization, churn, user funnels
  9. Game Learning Analytics (GLA): breaking the game black-box model to obtain information while students play. GLA is learning analytics applied to serious games • collect, analyze and visualize data from learners' interactions with SGs. Manuel Freire, Ángel Serrano-Laguna, Borja Manero, Iván Martínez-Ortiz, Pablo Moreno-Ger, Baltasar Fernández-Manjón (2016): Game Learning Analytics: Learning Analytics for Serious Games. In Learning, Design, and Technology (pp. 1–29). Cham: Springer International Publishing.
  10. Uses of Game Learning Analytics in the Serious Games Lifecycle • Game testing – game analytics • Player focus – user experience • Average playing time, completion rate • Game deployment and student evaluation • Real-time information for supporting the teacher • Knowing what is happening when the game is deployed in the class • “Stealth” student evaluation (Valerie Shute) • Formal game evaluation – game effectiveness • From pre-post questionnaires to evaluation based on game learning analytics?
  11. From Formal Game Validation to Game Learning Analytics PRE POST Experimental group Real-time analysis Off-line analysis User control Session control Game efficacy User acceptance Design validation
  12. Minimum Game Requirements for GLA • Most games are black boxes • No access to what is going on during gameplay • We need access to the game “guts” • User interactions • Changes of the game state or game variables • Or the game must communicate with the outside world • Using some logging framework • What is the meaning of that data? • Ethics: adequate experimental design and setting • Are users informed? • Anonymization of data could be required • Fair data exploitation for all stakeholders?
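The "logging framework" requirement above can be sketched as a minimal hook the game calls on every interaction or state change. All names here are illustrative, not any real framework from the talk; note the actor is an anonymous identifier, in line with the slide's ethics point.

```python
import json
import time

class GameLogger:
    """Hypothetical in-game logging hook: the game calls log() on every
    user interaction or state change, so analytics can see inside the
    'black box' instead of only observing the final outcome."""

    def __init__(self):
        self.events = []

    def log(self, actor, verb, obj, **result):
        # actor should be an anonymized identifier, never a real name
        self.events.append({
            "actor": actor,
            "verb": verb,
            "object": obj,
            "timestamp": time.time(),
            "result": result,
        })

logger = GameLogger()
logger.log("student-42", "initialized", "level/1")
logger.log("student-42", "progressed", "level/1", progress=0.5)
print(json.dumps(logger.events[-1]["result"]))
```

In practice the events would be flushed to an external store rather than kept in memory, which is exactly what the xAPI trackers described later in the deck do.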
  13. Game Learning Analytics (GLA) or Informagic? • Informagic • False expectations of gaining full insight into the game educational experience based only on very shallow game interaction data • Set more realistic expectations about learning analytics with serious games • Requirements • Outcomes • Uses • Cost/Complexity. Perez-Colado, I. J., Alonso-Fernández, C., Freire-Moran, M., Martinez-Ortiz, I., & Fernández-Manjón, B. (2018). Game Learning Analytics is not informagic! In IEEE Global Engineering Education Conference (EDUCON).
  14. Game Learning Analytics
  15. Game Learning Analytics: What data is to be collected from the game and how it relates to learning goals; which specific statements (e.g. in xAPI format) are to be tracked from the game containing that information; how the collected statements are to be analyzed and what information is to be reported and/or visualized
  16. Results: RQ1 GLA purposes ➔ Main focus: assess learning & predict performance ➔ Games are indeed useful for purposes beyond entertainment ➔ Interest now in analyzing interaction data to measure impact on players and relation to players’ in-game behaviors
  17. Results: RQ2 data science techniques ➔ Linear models and cluster techniques commonly applied ➔ Classical techniques ➔ More powerful techniques (e.g. neural networks) not broadly applied yet ➔ Need for xAI ➔ Explainable AI ➔ Human-understandable decisions
  18. Results: RQ3 main stakeholders ➔ Purposes that cover interests of many stakeholders ➔ Much research done in this area ➔ Students/learners are indirect recipients of all results
  19. Results: RQ4 conclusions and results. Results on assessment & student profiling: ➔ GLA data can accurately predict games’ impact ➔ Performance is related to players’ characteristics. Results on SG design: ➔ GLA data can validate SG design ➔ Assessment can & should be integrated in SG design ➔ Importance of SG characteristics ➔ Identified challenges when designing SGs ➔ Proposed frameworks to simplify design
  20. Results: Additional information. Serious games used: ➔ Main focus to teach ➔ Main domains: maths and science-related topics. Participants in the validation studies: ➔ Small sample sizes used (<100) ➔ Primary & secondary education. Interaction data: ➔ Completion times, actions & scores commonly tracked ➔ Format not reported
  21. Requirements to Systematize GLA in SGs • Applying GLA to serious games is complex, error-prone, and fragile • Any small glitch can cause the whole process to fail • GLA is still a complex process that is not affordable for most small game producers or game research teams • Systematizing GLA in SGs requires better models, standards and tools • Game learning analytics models • Standard formats for collecting GLA data • Tools that simplify the GLA implementation • Authoring • Tracking • Analysis • Orchestration / Management
  22. GLA framework
  23. Experience API for Serious Games: xAPI-SG Profile. The Experience API (xAPI) is a de facto standard that enables the capture of data about human performance and its context; it is now becoming an IEEE standard. The e-UCM Research Group, in collaboration with ADL, created the Experience API for Serious Games Profile (xAPI-SG), an xAPI profile for the specific domain of serious games. The xAPI-SG Profile defines a set of verbs, activity types and extensions that allows tracking of all in-game interactions as xAPI traces (e.g. level started or completed). Ángel Serrano-Laguna, Iván Martínez-Ortiz, Jason Haag, Damon Regan, Andy Johnson, Baltasar Fernández-Manjón (2017): Applying standards to systematize learning analytics in serious games. Computer Standards & Interfaces 50 (2017) 116–123
  24. xAPI-SG Profile. The xAPI-SG Profile is the result of implementing an interaction model for serious games in xAPI. The types of interactions that can be performed in a serious game, and are included in the profile, can be grouped based on the type of interaction and the game objects the interaction is performed over. The following slides present some of these common interactions and the game objects related to them, with example xAPI-SG statements. ● completables ● accessibles ● alternatives ● GameObjects
  25. xAPI-SG: Completables. A completable is something a player can start, progress and complete in a game, maybe several times. ● Verbs: initialized, progressed, completed ● Types: game, session, level, quest, stage, combat, storynode, race, completable. Example statement: “John Smith progressed on Level 1” with progress 0.5
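The slide's example ("John Smith progressed on Level 1, progress 0.5") would look roughly like the JSON below as an xAPI statement. The IRIs follow the profile's w3id.org naming scheme but should be checked against the published xAPI-SG vocabulary; the game and level IDs are made up.

```python
import json

# Illustrative xAPI-SG statement for a "progressed" trace on a level.
# IRIs are sketched from the profile's naming convention; verify them
# against the published xAPI-SG profile before relying on them.
statement = {
    "actor": {"name": "John Smith", "mbox": "mailto:john@example.com"},
    "verb": {"id": "https://w3id.org/xapi/seriousgames/verbs/progressed"},
    "object": {
        "id": "https://example.com/games/mygame/level/1",  # hypothetical
        "definition": {
            "type": "https://w3id.org/xapi/seriousgames/activity-types/level"
        },
    },
    "result": {
        "extensions": {
            "https://w3id.org/xapi/seriousgames/extensions/progress": 0.5
        }
    },
}
print(json.dumps(statement, indent=2))
```

A tracker (next slide) builds and sends such statements to a Learning Record Store so the game code never deals with raw JSON.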
  26. xAPI game trackers as open code: Java xAPI Tracker, Unity xAPI Tracker, C# xAPI Tracker
  27. SIMVA: SG Simple Validator • The Simva tool aims to simplify all aspects of validation • Before the experiments: • Managing users & surveys • Providing anonymous identifiers to users • During the experiments: • Pre-questionnaire – game analytics – post-questionnaire • Collecting and storing questionnaires (surveys) and trace data (xAPI-SG) • Relating different data from users (GLA, questionnaires) • After the experiments: • Simplifying downloading and analysis of all data collected. Ivan Perez-Colado, Antonio Calvo-Morata, Cristina Alonso-Fernández, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2019): Simva: Simplifying the scientific validation of serious games. 19th IEEE International Conference on Advanced Learning Technologies (ICALT), 15-18 July 2019, Maceió-AL, Brazil.
  28. SIMVA orchestrates all the processes: unified cloud storage, analytics and control, survey management, data science tools, user & group management, gameplay management
  29. T-mon: Monitoring xAPI-SG traces in Python. T-mon (Traces Monitor) is an xAPI-SG processor built on Jupyter Notebooks that provides a default set of analyses and visualizations of Experience API Profile for Serious Games (xAPI-SG) traces
  30. Default analyses and visualizations • Serious game completion: initialized and completed traces with object-type serious-game • Serious game progress: initialized, progressed and completed traces with object-type serious-game, result.progress and timestamp • Choices in alternatives: selected traces with object-type alternative, result.response and result.success • Completable progress: progressed traces in any completable object type, with result.progress
  31. Default analyses and visualizations • Interactions: interacted traces with any object type; bar chart per item, and each bar per player • Completable results (scores): completed trace of any completable with result.score • Completable results (max and min times): difference in timestamp of initialized and completed traces of each completable • Interactions (heatmap): interacted traces grouped by item (object) and player
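The "serious game completion" analysis from the previous slide can be sketched by filtering traces by verb and object type. The trace layout below is a simplified stand-in for full xAPI-SG statements, and the data is invented for illustration.

```python
# Simplified traces: real xAPI-SG statements carry full IRIs for
# verbs and activity types; here we keep only what the analysis needs.
traces = [
    {"actor": "p1", "verb": "initialized", "type": "serious-game"},
    {"actor": "p1", "verb": "completed",   "type": "serious-game"},
    {"actor": "p2", "verb": "initialized", "type": "serious-game"},
    {"actor": "p2", "verb": "progressed",  "type": "serious-game"},
]

def completion_rate(traces):
    """Fraction of players with a 'completed' serious-game trace
    among those with an 'initialized' one."""
    started = {t["actor"] for t in traces
               if t["verb"] == "initialized" and t["type"] == "serious-game"}
    finished = {t["actor"] for t in traces
                if t["verb"] == "completed" and t["type"] == "serious-game"}
    return len(finished) / len(started) if started else 0.0

print(completion_rate(traces))  # 1 of 2 players completed -> 0.5
```

The other default visualizations (scores, times, heatmaps) follow the same pattern: filter by verb/object type, then aggregate per player or per item.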
  32. GLA Methodology 1. Game validation phase: ○ validate the serious game against actual results (post-test) ○ collect all the game analytics data to improve the game 2. Use GLA interaction data to predict knowledge after playing: ○ create prediction models taking the interaction data as input 3. Game deployment phase: ○ students play and are automatically assessed based on their interactions (used as input for the prediction models) ○ pre-post tests are no longer required. Cristina Alonso-Fernández, Ana Rus Cano, Antonio Calvo-Morata, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2019): Lessons learned applying learning analytics to assess serious games. Computers in Human Behavior, Volume 99, October 2019, Pages 301-309
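Step 2 of the methodology can be sketched with a deliberately tiny model: a one-feature linear regression predicting a post-test score from a single interaction feature. The cited studies used richer feature sets and models; this only shows the shape of the idea, on invented numbers.

```python
def fit_line(x, y):
    """Ordinary least squares for one feature: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Validation phase: one interaction feature (e.g. normalized in-game
# progress) paired with the measured post-test score of each student.
in_game_progress = [0.2, 0.4, 0.6, 0.8]
post_test        = [5.0, 7.0, 9.0, 11.0]
w, b = fit_line(in_game_progress, post_test)

# Deployment phase: assess a new student from gameplay alone,
# without administering a post-test.
predicted = w * 0.5 + b
print(round(predicted, 2))  # 8.0
```

Once such a model is validated, step 3 follows: new students are assessed from their traces alone, which is what removes the need for pre-post questionnaires at deployment time.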
  33. The game: First Aid Game. A game to teach first aid techniques to 12-16 year-old players. Three initial situations: ● chest pain ● unconsciousness ● choking. Game previously validated with pre-post and control group: Video-game instruction in basic life support maneuvers. Marchiori EJ, Ferrer G, Fernandez-Manjon B, Povar Marco J, Suberviola González JF, Gimenez Valverde A. (2012)
  34. Pre-post questionnaires + GLA data. N = 227 students from a high school in Madrid (Spain). Each student completed: ● pre-test: 15 questions assessing previous knowledge about first aid techniques ● gameplay of First Aid Game ● post-test: 15 questions assessing knowledge about first aid techniques after playing. Collection of both pre-post test results and GLA interaction data from the game (following the xAPI-SG Profile). Cristina Alonso-Fernández, Iván Martínez-Ortiz, Rafael Caballero, Manuel Freire, Baltasar Fernández-Manjón (2020): Predicting students’ knowledge after playing a serious game based on learning analytics data: A case study. Journal of Computer Assisted Learning, vol. 36, no. 3, pp. 350-358, June 2020.
  35. New uses of games based on GLA - Avoiding the pre-test: games for evaluation - Avoiding the post-test: games for teaching and measuring learning, with or without a pre-test
  36. Cyberbullying: Conectado game
  37. Serious Game → prevent bullying and cyberbullying ● Increase awareness and empathy ● Youngsters (12 to 17 years old) ● Used at school as a tool for teachers. Conectado: ● Point & click game ● Player in the role of victim ● Choices and minigames that the player cannot win ● Free and open-code video game
  38. 38. 1300 12
  39. Significant increase in cyberbullying awareness (Wilcoxon paired test, p<0.001; 5.72 → 6.38). Antonio Calvo-Morata, Dan-Cristian Rotaru, Cristina Alonso-Fernández, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2018): Validation of a Cyberbullying Serious Game Using Game Analytics. IEEE Transactions on Learning Technologies (early access)
  40. Our xAPI tool-based approach for GLA. uAdventure authoring tool ● Narrative and GPS games ● Easy to use by non-experts ● GLA out of the box ● Extensions for custom GLA. e-UCM xAPI tracker ● Compatible with xAPI for serious games ● High-level interface to create traces ● Local and online modes. Simva management tool ● Integrated with uAdventure ● Automatically collects traces in xAPI format (simple LRS) ● Manages experiments and users ● Integrates GLA and pre-post questionnaires. T-mon analysis tool ● Analyzes xAPI traces ● Default visualizations ● xAPI for SGs as LAM ● Can connect with Simva
  41. Interoperable ecosystem for the complete GLA lifecycle. This set of tools provides an interoperable sandbox system for supporting the xAPI GLA lifecycle: ● Simplifies SG creation ● Reduces the technical knowledge needed for setting up and collecting analytics ● Simplifies experiment management ● Provides default analytics so: ○ Students can see how they did ○ Teachers can monitor and evaluate ○ Developers and teachers can review and improve their games
  42. Conclusions • Game Learning Analytics has great potential for improving SGs • Evidence-based serious games • Games as assessments (better “stealth” student evaluation) • Games as powerful research environments • Still complex to implement GLA in SGs • Increases the (already high) cost of the games • Requires expertise not always present in game developers, SMEs or research groups • Real-time GLA is still complex and fragile (e.g. deployment in schools) • New standard specifications (e.g. xAPI) and open software tools could greatly simplify GLA implementation and adoption • Ethics should drive the GLA process
  43. Thanks! Contact: @baltafm. This work has been partially funded by the Regional Government of Madrid (eMadrid S2018/TCS4307, co-funded by the European Structural Funds FSE and FEDER), by the Ministry of Education (TIN2017-89238-R, PID2020-119620RB-I00), by MIT-La Caixa (MISTI program, LCF/PR/MIT19/5184001) and by the Telefonica-Complutense Chair on Digital Education and Serious Games. Our publications: Our open code:

Editor's notes

  • It automates significant parts of the
    experimental setup, participant management and grouping,
    analytics, questionnaires, and experimental design; together
    with activity deployment and monitoring. Finally, data
    collected by SIMVA is designed to be easily accessible from
    external analysis tools.
  • remove it
  • We developed a serious game, named Conectado, to help teachers prevent bullying and cyberbullying in class.
    Conectado was created to increase awareness about bullying and to create empathy.
    The students play the game in class, and the teacher can then run a reflection session around the common experience the players share.

    Conectado is a point & click serious game.
    In the game, the player is a victim of bullying and cyberbullying during five in-game days.
    During gameplay the player visits the school and home scenarios, can talk with different NPCs, and interacts with a social network on mobile and PC. At the end of the in-game days there are minigames that the player cannot win; these minigames create negative feelings such as impotence and frustration.
    The game is free and has open code.
  • Tests with 1300 users (teachers and students)