A promising contribution of Learning Analytics is the presentation of a learner's own learning behaviour and achievements via dashboards, often in comparison to peers, with the goal of improving self-regulated learning. However, there is little empirical evidence on the impact of these dashboards, and few designs are informed by theory. Many dashboard designs struggle to translate awareness of learning processes into actual self-regulated learning. In this study we investigate a Learning Analytics dashboard that builds on existing evidence on social comparison to support motivation, metacognition and academic achievement. Motivation plays a key role in whether learners will engage in self-regulated learning in the first place, and social comparison can be a significant driver in increasing it. We performed two randomised controlled interventions in different higher-education courses, one of which took place online due to the COVID-19 pandemic. Students were shown their current and predicted performance in a course alongside that of peers with similar goal grades, with the peer sample selected to elicit slight upward comparison. We found that the dashboard promotes extrinsic motivation and leads to higher academic achievement, indicating an effect of dashboard exposure on learning behaviour, despite an absence of effects on metacognition. These results provide evidence that carefully designed social comparison, rooted in theory and empirical evidence, can be used to boost motivation and performance. Our dashboard is a successful example of how social comparison can be implemented in Learning Analytics Dashboards.
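The abstract does not publish the peer-selection procedure itself; the sketch below is only an illustration, in Python, of how a peer sample with similar goal grades could be drawn so that the comparison is slightly upward. The names (Student, goal_grade, current_score), the tolerances and the sample size are hypothetical and not taken from the study.

```python
import random
from dataclasses import dataclass

@dataclass
class Student:
    id: str
    goal_grade: float     # grade the student aims for (hypothetical field)
    current_score: float  # current weighted course score (hypothetical field)

def select_comparison_peers(target, cohort, n_peers=5,
                            goal_tolerance=0.5, upward_band=1.0):
    """Draw peers with a similar goal grade whose current score lies at or
    slightly above the target student's, to elicit mild upward comparison."""
    similar_goal = [s for s in cohort
                    if s.id != target.id
                    and abs(s.goal_grade - target.goal_grade) <= goal_tolerance]
    # Keep peers performing at or slightly above the target student.
    slightly_ahead = [s for s in similar_goal
                      if 0.0 <= s.current_score - target.current_score <= upward_band]
    # Fall back to the closest-performing similar-goal peers if the band is empty.
    pool = slightly_ahead or sorted(
        similar_goal,
        key=lambda s: abs(s.current_score - target.current_score))[:n_peers]
    return random.sample(pool, min(n_peers, len(pool)))
```

Under these assumptions, the dashboard would call such a selection once per student and display the chosen peers' current and predicted scores next to the student's own.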
From the article: "The educational domain is currently witnessing the emergence of learning analytics – a form of data analytics within educational institutes. Implementation of learning analytics tools, however, is not a trivial process. This research-in-progress focuses on the experimental implementation of a learning analytics tool in the virtual learning environment and educational processes of a case organization – a major Dutch university of applied sciences. The experiment is performed in two phases: the first phase led to insights into the dynamics associated with implementing such a tool in a practical setting. The second – yet to be conducted – phase will provide insights into the use of pedagogical interventions based on learning analytics. In the first phase, several technical issues emerged, as well as the need to include more data (sources) in order to get a more complete picture of actual learning behavior. Moreover, self-selection bias is identified as a potential threat to future learning analytics endeavors when data collection and analysis require learners to opt in."
Although learning analytics benefits learning, its uptake by higher education institutions remains low. Adopting learning analytics is a complex undertaking, and higher education institutions lack insight into how to build the organizational capabilities needed to adopt learning analytics successfully at scale. This paper describes the ex-post evaluation of a capability model for learning analytics via a mixed-method approach. The model is intended to help practitioners such as program managers, policymakers, and senior management by providing them with a comprehensive overview of the necessary capabilities and their operationalization. Qualitative data were collected during pluralistic walk-throughs with 26 participants at five educational institutions and a group discussion with seven learning analytics experts. Quantitative data about the model's perceived usefulness and ease of use were collected via a survey (n = 23). The study's outcomes show that the model helps practitioners plan learning analytics adoption at their higher education institutions. The study also demonstrates the applicability of pluralistic walk-throughs as a method for the ex-post evaluation of Design Science Research artefacts.
In order to stay competitive and respond to the increasing demand for steady and predictable aircraft turnaround times, Maintenance, Repair and Overhaul (MRO) SMEs in the aviation industry have identified process optimization as their key element for innovation. Indeed, MRO SMEs have always looked for ways to organize their work as efficiently as possible, which often resulted in applying lean business organization solutions. However, their aircraft maintenance processes remain characterized by unpredictable process times and material requirements, and lean business methodologies are unable to change this. The problem is often compensated for by large buffers in terms of time, personnel and parts, leading to a relatively expensive and inefficient process. To tackle this unpredictability, MRO SMEs want to explore the possibilities of data mining: the exploration and analysis of large quantities of their own historical maintenance data, with the aim of discovering useful knowledge in seemingly unrelated data. Ideally, it will help predict failures in the maintenance process and thus better anticipate repair times and material requirements. With this, MRO SMEs face two challenges. First, the data they have available are often fragmented and non-transparent, while standardized data availability is a basic requirement for successful data analysis. Second, it is difficult to find meaningful patterns within these data sets because no operative system for data mining exists in the industry.

This RAAK MKB project is initiated by the Aviation Academy of the Amsterdam University of Applied Sciences (Hogeschool van Amsterdam, hereinafter: HvA), in direct cooperation with the industry, to help MRO SMEs improve their maintenance process. Its main aim is to develop new knowledge of, and a method for, data mining. To do so, the current state of data presence within MRO SMEs is explored, mapped, categorized, cleaned and prepared. This will result in readable data sets that have predictive value for key elements of the maintenance process. Secondly, analysis principles are developed to interpret these data. These principles are translated into an easy-to-use data mining (IT) tool, helping MRO SMEs to predict their maintenance requirements in terms of costs and time and allowing them to adapt their maintenance process accordingly. In several case studies these products are tested and further improved.

This is a resubmission of an earlier proposal dated October 2015 (3rd round), entitled ‘Data mining for MRO process optimization’ (number 2015-03-23M). We believe the merits of the proposal are substantial and sufficient to be awarded a grant. The text of this submission is essentially unchanged from the previous proposal. Where text has been added for clarification, this has been marked in yellow. Almost all of these new text parts are taken from our rebuttal (hoor en wederhoor), submitted in January 2016.
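The proposal does not prescribe a particular mining technique. As a minimal sketch of the kind of prediction step envisaged, the Python fragment below fits a regression model on a cleaned table of historical maintenance records to estimate repair hours per task. The file name, the column names and the choice of a random forest are assumptions made for illustration only, not part of the project.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical cleaned maintenance records; file and column names are illustrative.
records = pd.read_csv("maintenance_history.csv")

# One-hot encode categorical task descriptors, keep numeric features as-is.
features = pd.get_dummies(
    records[["aircraft_type", "task_code", "component_age_hours", "prior_findings"]],
    columns=["aircraft_type", "task_code"],
)
target = records["repair_hours"]  # actual labour hours spent per task

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Mean absolute error in hours gives a first indication of predictive value.
print("MAE (hours):", mean_absolute_error(y_test, model.predict(X_test)))
```

A comparable model for material requirements could reuse the same prepared data sets, which is why the project puts the emphasis on data exploration, cleaning and standardization before any tooling is built.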