Deutsches Institut für Internationale Pädagogische Forschung (DIPF)
This draft code of conduct is addressed to higher education institutions that want to use learning analytics to improve the quality of learning and teaching. The code can serve as a template for creating organisation-specific codes of conduct. At institutions planning to introduce learning analytics, it should be reviewed in consultation with all stakeholder groups and adapted to the goals and existing practices of the respective institution. The code was developed by the Innovationsforum Trusted Learning Analytics of the Hesse-wide project „Digital gestütztes Lehren und Lernen in Hessen“ on the basis of an analysis of existing European codes (Engelfriet, Manderveld & Jeunink, 2017; Westerlaken, Manderveld & Jorna, 2019; Sclater & Bailey, 2015; Open University UK, 2014; University of Edinburgh, 2018) and the legal framework applicable in Germany.
Learning to solve graph tasks is a key prerequisite for acquiring domain-specific knowledge in most study domains. Analyses of graph understanding often use eye-tracking and focus on how much time students spend gazing at particular areas of a graph—Areas of Interest (AOIs). To gain deeper insight into students' task-solving process, we argue that the gaze shifts between students' fixations on different AOIs (so-called transitions) also need to be included in holistic analyses of graph understanding that consider the importance of transitions for the task-solving process. We therefore introduced Epistemic Network Analysis (ENA) as a novel approach to analyze eye-tracking data of 23 university students who solved eight multiple-choice graph tasks in physics and economics. ENA is a method for quantifying, visualizing, and interpreting network data that allows a weighted analysis of the gaze patterns of both correct and incorrect graph task solvers while considering the interrelations between fixations and transitions. After analyzing the differences in the number of fixations and the number of single transitions between correct and incorrect solvers, we conducted an ENA for each task. We demonstrate that an isolated analysis of fixations and transitions provides only limited insight into graph-solving behavior. In contrast, ENA identifies differences between the gaze patterns of students who solved the graph tasks correctly and incorrectly across the multiple graph tasks. For instance, incorrect solvers shifted their gaze from the graph to the x-axis and from the question to the graph comparatively more often than correct solvers. The results indicate that incorrect solvers often have problems transferring textual information into graphical information and rely more on partly irrelevant parts of a graph. Finally, we discuss how the findings can be used to design experimental studies and innovative instructional procedures in higher education.
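The transition measure described above—counting gaze shifts between consecutive fixations on different AOIs—can be sketched as follows. This is a minimal illustration only, not the authors' actual analysis pipeline; the AOI labels and the example fixation sequence are hypothetical.

```python
from collections import Counter

def transition_counts(fixation_sequence):
    """Count transitions between consecutive fixations on *different*
    AOIs; repeated fixations on the same AOI are not transitions."""
    return Counter(
        (a, b)
        for a, b in zip(fixation_sequence, fixation_sequence[1:])
        if a != b
    )

# Hypothetical fixation sequence of one solver on one graph task.
fixations = ["question", "graph", "x-axis", "graph",
             "graph", "question", "graph"]
counts = transition_counts(fixations)
# e.g. counts[("question", "graph")] == 2 — the solver returned
# from the question text to the graph twice.
```

Per-task matrices of such counts (one row/column per AOI) are the kind of co-occurrence data that ENA can then weight, model, and compare between correct and incorrect solvers.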
Highlights
• Transparency of design, reference frames and support for action were found to support students' sense-making of LA dashboards.
• The higher the overall SRL score, the more relevant the three factors were perceived by learners.
• Learner goals affect how relevant students find reference frames.
• The SRL effect on the perceived relevance of transparency depends on learner goals.
Abstract
Unequal stakeholder engagement is a common pitfall in the adoption of learning analytics in higher education, leading to lower buy-in and flawed tools that fail to meet the needs of their target groups. With each design decision, we make assumptions about how learners will make sense of the visualisations, but we know very little about how students make sense of dashboards and which aspects influence their sense-making. We investigated how learner goals and self-regulated learning (SRL) skills influence dashboard sense-making, following a mixed-methods research methodology: a qualitative pre-study followed by an extensive quantitative study with 247 university students. We uncovered three latent variables for sense-making: transparency of design, reference frames, and support for action. SRL skills predict how relevant students find these constructs. Learner goals have a significant effect only on the perceived relevance of reference frames. Knowing which factors influence students' sense-making will lead to more inclusive and flexible designs that cater to the needs of both novice and expert learners.