4. Students · Teachers · Requirements · Useful Tool · Action Research · Diverse Scenarios · Non-experts · Different Questions · University · Related Indicators
LAK13 "Learning Analytics and Action Research" Dyckhoff, Lukarov
5. Action Research and Learning Analytics
Summary of Prior Research
Research Questions and Method
Discussion of Findings
Recap
6. Active improvement of teaching  |  Feedback for awareness & reflection
   Human-driven                    |  Data-driven
   Manually                        |  Automatically
   Individual, perfectly fitting   |  Standardized, approved by experts
   Limited by time constraints ... |  Limited by missing data ...
7. The Action Research cycle: Formulate Question → Develop Research Plan → Collect Data → Record in Writing → Analyze & Reflect → Develop Action Plan → Share (and back to Formulate Question). Learning Analytics supports the Collect Data and Analyze & Reflect steps.
8. Example questions (Dyckhoff, 2010):
- How do students value specific learning offerings?
- Are there specific learning offerings NOT used at all?
- Do native speakers have fewer problems with learning offerings than non-native speakers?
- Are students using specific learning materials (e.g., lecture recordings) in addition to attendance or alternatively?
- Which didactical activities, e.g. email notifications, facilitate continuous learning?
- Is the performance in e-tests somehow related to exam grades?
9. Q1 - Indicator-Question-Mapping:
Which questions cannot be mapped to the available (sets of) indicators?
Which indicators could deliver what kind of enlightenment?
Q2 - Teacher-Data-Indicators:
Are there tools that explicitly correlate teacher data with student data?
How should teacher data be correlated?
Q3 - Missing Impact Analysis:
How could learning analytics impact teaching?
How could this impact be evaluated?
10. Study is based on
- results of a qualitative meta-analysis (Dyckhoff, 2011)
- a collection of publications on 27 learning analytics tools
Process: Collected 198 Indicators → Categorized → Question Mapping → Indicator Analysis and Discussion
13. Questions that can be answered
EASY: quantitative measures
NOT so EASY: qualitative measures, effects on performance, complex correlations, different data sources
Missing: Teacher Data Indicators
14. • Current situation
• Possible scenarios
  - Sociogram (Dawson, 2010)
  - Forum Participation
  - Correspondence Indicator
  - Average assignment grading time
15. Tool Availability → Classroom State → Impact Prediction → Exit Interviews → Conclusions
16. International Context
Within the set of all indicators: German teachers' questions overlapping with the international indicators of the 27 available tools.
17. Supporting Action Research with Learning Analytics
Several Questions NOT yet answered by indicators
Teacher Data Indicators should be researched
What’s the Impact of Learning Analytics?
Participatory Design for LA Tools/Indicators
Inform Tool Design/Learning Technologies Design
Dyckhoff, A.L. 2011. Implications for Learning Analytics Tools: A Meta-Analysis of Applied Research Questions. IJCISIM 3 (2011), 594–601.
Dawson, S. 2010. “Seeing” the learning community: An exploration of the development of a resource for monitoring online student networking. BRIT J EDUC TECHNOL 41, 5 (Sep. 2010), 736–752.
Dyckhoff, A.L., Lukarov, V., Muslim, A., Chatti, M.A. and Schroeder, U. 2013. Supporting Action Research with Learning Analytics. Proceedings of LAK ’13, April 8–12, 2013, Leuven, Belgium, 220–229.
So what’s the impact of Learning Analytics? This question was actually raised at yesterday’s workshop on teaching analytics.
From the literature, I extracted the goals mentioned in the papers and sorted them: goals concerning tools, i.e., what tools should do (for example, track users, track interactions, etc.); goals for educators; and goals for students. From this table you can see how Learning Analytics is supposed to impact the teaching and learning processes. That is how it is supposed to be, but the papers give no information about the actual situation; these are only the goals collected from the literature about how Learning Analytics should impact learning and teaching.
We had problems identifying which questions exactly these indicators answer or represent; only a few indicators were well documented. It was also unclear to us what data exactly these indicators need in order to calculate and present information to the teachers, and only a few of them provided information about limitations concerning usage and interpretation. This led us to think that we need to figure out a way to present and structure these pieces of information, which is crucial and necessary. If we, as researchers and experts in the field, have problems identifying these issues, then I am positively inclined to say that educators and teachers will definitely have problems understanding the purpose of the indicators.
This is a sample snapshot of our mapping of indicators to questions.
Diverse indicators can answer the quantitative questions, because these are concerned with quantitative measures on the basis of student-generated usage data. Qualitative measures, which are concerned with the satisfaction and preferences of students, still cannot be answered sufficiently. Performance indicators are also present in the research communities, but not in the tools we analyzed.

There are questions that cannot be mapped or answered by single indicators, or from a single data source. This raises the question of combining different data sources and making complex correlations. Don’t get me wrong: there are indicators that calculate complex correlations (especially in EDM), but we have seen only a few in the tools we analyzed. Furthermore, our analysis and discussion revealed that something is missing: we don’t really have indicators that represent teacher data.
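The mapping analysis described above can be sketched as a small set operation: collect the questions each indicator can answer, then subtract that coverage from the teachers' questions to find the unmapped ones. All indicator names and question strings below are invented for illustration; they are not the paper's actual data.

```python
# Hypothetical sketch of the Q1 indicator-question mapping.
# Every indicator name and question string is an assumption for illustration.

# Which teachers' questions each indicator can (help to) answer:
indicator_map = {
    "forum_post_count":        {"Which offerings are NOT used at all?"},
    "lecture_recording_views": {"Are recordings used in addition to attendance?"},
    "etest_score_trend":       {"Is e-test performance related to exam grades?"},
}

teacher_questions = {
    "Which offerings are NOT used at all?",
    "Are recordings used in addition to attendance?",
    "Is e-test performance related to exam grades?",
    "How do students value specific offerings?",  # qualitative -- no indicator yet
}

# Union of everything the available indicators cover
covered = set().union(*indicator_map.values())

# Questions that no available indicator answers (the "NOT so EASY" residue)
unmapped = teacher_questions - covered
```

In this toy setup the qualitative question is the one left unmapped, mirroring the finding that quantitative usage questions are easy while satisfaction and preference questions are not.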
The closest thing we have to using teacher data, or having teachers’ indicators, are sociograms from social network analysis, which correlate student and teacher activities. They correlate data of the course phase and course events and show the interaction between teachers and students.

Of course, we need to collect teacher data in order to build these indicators. Activity data from teachers already exists in log files, but we also need to think about ways to collect the missing data, such as data about the quality of the learning materials, metadata about course events, and activities outside the classroom or lecture.

Possible scenarios: we think it would be good for teachers to see the patterns of their participation and online presence in the forum discussions. It would also be interesting to see whether, in periods when the teacher was more active and present on the forum, students’ online presence and activities also increased. For the other two scenarios, you are more than welcome to read our paper. :-)

Missing indicators does not mean they don’t exist in the research world: there is EDM research that does this, and there are tools and indicators that exist but might not be usable for the normal user. These ideas led us to our third research question, about how we can analyze the impact of indicators and LA tools.
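A forum-participation sociogram of the kind mentioned above (in the spirit of Dawson, 2010) can be sketched from reply logs with nothing but weighted edge counts. The log format, actor names, and the teacher-presence ratio below are assumptions for illustration, not the tools' actual data model.

```python
# Toy sociogram sketch: count weighted teacher-student interaction edges
# from forum reply logs. Log entries and names are invented for illustration.
from collections import Counter

# (author, replied_to) pairs extracted from forum threads
forum_log = [
    ("teacher", "student_a"), ("student_a", "teacher"),
    ("student_b", "student_a"), ("teacher", "student_b"),
    ("student_a", "student_b"),
]

# Undirected edge weights: how often two actors exchanged replies
edges = Counter(tuple(sorted(pair)) for pair in forum_log)

# A simple teacher-presence indicator: share of all interactions
# that involve the teacher
teacher_share = sum(w for e, w in edges.items() if "teacher" in e) / sum(edges.values())
```

Plotting `edges` as a weighted graph gives the sociogram; tracking `teacher_share` per week would be one way to check whether higher teacher presence coincides with higher student activity.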
This is not THE method for measuring impact. It is just a method we want to discuss, and we want to know what you think and how it can be improved to measure the impact of the tool. It is very similar to design-based research, but here we try to evaluate, measure, and describe more closely the impact of these tools.

First, make the tool available to a diverse group of users (non-experts) for an extensive amount of time. We have an LA tool ourselves and started its impact analysis and evaluation last semester. It was available to the teachers for about 10 weeks, and it turned out that this is not enough: it shows some impact, but we need longer studies of at least a whole semester.

Next, we suggest discerning and understanding the current state of the classroom, so we have a reference point for comparison and measurement later in the study. In our study, we did this with questionnaires. The questionnaire should not reveal or provide information that might bias the activities that come after; this is why we try to find out what teachers are doing now (preferably not related to LA), to see whether they are already reflecting on their teaching and what they think about their students.

Once the current state is well established, we identify which activities are likely to improve through usage of the LA tool and its indicators. This is where we pose hypotheses about how using the tool will improve (have impact on) the behavior of students and teachers, their activities, and the learning process. With our tool, the goal is to initialize action research, or to initiate awareness on a lower level and reflection on a higher level.

Finally, conduct interviews, hand out questionnaires, and collect evaluation forms for feedback from both teachers and students. These can help capture their personal feelings and opinions after using the tool in the given time period. Did a specific indicator puzzle them?
Did they reflect upon the data presented to them? Did they revise the action plan? In the exit interviews, we try to find any signs of action research being initiated. The final step is to compare the results in a “before/after” fashion to draw conclusions on how the hypothesized impact relates to the actual impact of the learning analytics tool.
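The final "before/after" comparison step can be sketched as paired per-teacher differences between the initial questionnaire and the exit-interview phase. The scale, teacher IDs, and scores below are invented for illustration; the study's actual instruments are not specified here.

```python
# Minimal before/after comparison sketch for the impact analysis.
# Teacher IDs, the 1-5 scale, and all scores are invented for illustration.

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

# Self-reported "reflection on teaching" scores (1-5) per teacher,
# from the initial questionnaire and from the exit-interview phase
before = {"t1": 2, "t2": 3, "t3": 2}
after  = {"t1": 4, "t2": 3, "t3": 3}

# Paired per-teacher change, then the average shift over the study period
deltas = {t: after[t] - before[t] for t in before}
avg_shift = mean(deltas.values())
```

Pairing the scores per teacher (rather than comparing group averages only) keeps the comparison anchored to the reference point established at the start of the study; with more participants, a paired significance test could replace the plain average.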
The point is that in every context you have specific questions. If the international tools and indicators don’t fit our specific context, we need to create our own tool. The idea is to acknowledge that there are questions in the context that cannot yet be answered, and these questions should be addressed by the LA tool. We are aware of the limitations, and you should be aware of them too: there will be new questions and new data.