Welcome to the Globethics.net Library!

  • Learning Analytics: Pathways to Impact

    Corrin, Linda; Scheffel, Maren; Gašević, Dragan (ASCILITE, 2020-12-31)
    The field of learning analytics has evolved over the past decade to provide new ways to view, understand and enhance learning activities and environments in higher education. It brings together research and practice traditions from multiple disciplines to provide an evidence base to inform student support and effective design for learning. This has resulted in a plethora of ideas and research exploring how data can be analysed and utilised not only to inform educators but also to drive online learning systems that offer personalised learning experiences and/or feedback for students. However, a core challenge that the learning analytics community continues to face is how the impact of these innovations can be demonstrated. Where impact is positive, there is a case for continuing or increasing the use of learning analytics; however, there is also the potential for negative impact, which needs to be identified quickly and managed. As more institutions implement strategies to take advantage of learning analytics as part of core business, it is important that impact can be evaluated and addressed to ensure effectiveness and sustainability. In this editorial of the AJET special issue dedicated to the impact of learning analytics in higher education, we consider what impact can mean in the context of learning analytics and what the field needs to do to ensure that there are clear pathways to impact that result in the development of systems, analyses and interventions that improve the educational environment.
  • Combining self-reported and observational measures to assess university student academic performance in blended course designs

    Han, Feifei; Ellis, Robert (ASCILITE, 2020-12-22)
    This study combined methods from student approaches to learning and learning analytics research, using both self-reported and observational measures to examine the student learning experience. It investigated the extent to which reported approaches and perceptions and observed online interactions are related to each other, and how they contribute to variation in academic performance in a blended course design. Correlation analyses showed significant pairwise associations between approaches to learning and the frequency of online interactions. A cluster analysis identified two groupings of students with different reported learning orientations. Based on these reported learning orientations, one-way ANOVAs showed that students with an understanding orientation reported deep approaches to, and positive perceptions of, learning. Students with an understanding orientation also interacted more frequently with the online learning tasks and achieved higher marks than those with a reproducing orientation, who reported surface approaches and negative perceptions. Regression analyses found that adding the observational measures explained an additional 36% of the variance in academic performance compared with using self-reported measures alone (6%). The findings suggest that using the combined methods to explain students’ academic performance in blended course designs not only triangulates the results but also strengthens the acuity of the analysis.
    Implications for practice or policy:
    • Using combined methods of measuring the learning experience offers a more comprehensive understanding of learning.
    • Combining self-reported and observational measures to explain students’ academic performance not only enables the results to be triangulated but also strengthens the acuity of the analysis.
    • To improve student learning in blended course designs, teachers should use strategies that move students from a reproducing learning orientation towards an understanding orientation, and should encourage active online participation by highlighting the importance of learning online.
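    The nested-regression comparison reported above (the change in explained variance, ΔR², when observational measures are added to self-reported ones) can be sketched in a few lines of Python. This is a minimal illustration under assumed names, not the study’s analysis: the columns deep_approach, positive_perception, online_interactions and mark, and the synthetic data, are hypothetical.

      # Minimal sketch of a nested (hierarchical) regression comparison.
      # Column names and data are hypothetical, not the study's dataset.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 300
      df = pd.DataFrame({
          "deep_approach": rng.normal(size=n),             # self-reported
          "positive_perception": rng.normal(size=n),       # self-reported
          "online_interactions": rng.poisson(20, size=n),  # observed counts
      })
      # Synthetic outcome: marks driven partly by each kind of measure.
      df["mark"] = (60 + 2 * df["deep_approach"]
                    + 0.8 * df["online_interactions"]
                    + rng.normal(scale=5.0, size=n))

      # Model 1: self-reported measures only.
      X1 = sm.add_constant(df[["deep_approach", "positive_perception"]])
      m1 = sm.OLS(df["mark"], X1).fit()

      # Model 2: self-reported plus observational measures.
      X2 = sm.add_constant(df[["deep_approach", "positive_perception",
                               "online_interactions"]])
      m2 = sm.OLS(df["mark"], X2).fit()

      print(f"R2 self-report only: {m1.rsquared:.3f}")
      print(f"R2 combined:         {m2.rsquared:.3f}")
      print(f"Delta R2:            {m2.rsquared - m1.rsquared:.3f}")

    Because the two models are nested, m2.compare_f_test(m1) in statsmodels would additionally test whether the extra variance explained by the observational measure is statistically significant.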
  • Students’ sense-making of personalised feedback based on learning analytics

    Lim, Lisa-Angelique; Dawson, Shane; Gašević, Dragan; Joksimović, Srećko; Fudge, Anthea; Pardo, Abelardo; Gentili, Sheridan (ASCILITE, 2020-12-23)
    Although technological advances have brought about new opportunities for scaling feedback to students, there remain challenges in how such feedback is presented and interpreted. There is a need to better understand how students make sense of such feedback to adapt their self-regulated learning processes. This study examined students’ sense-making of learning analytics–based personalised feedback across four courses. Results from a combination of thematic analysis and epistemic network analysis show an association between students’ perceptions of their personalised feedback and how these map to subsequent self-described self-regulated learning processes. Most notably, the results indicate that personalised feedback, elaborated by personal messages from course instructors, helps students refine or strengthen important forethought processes of goal-setting, as well as reduce procrastination. The results highlight the need for instructors to increase the dialogic element in personalised feedback in order to reduce defensive reactions from students who hold to their own learning strategies. This approach may prompt reflection on the suitability of students’ current learning strategies and the achievement of associated learning goals.
    Implications for practice or policy:
    • Personalised feedback based on learning analytics should be informed by an understanding of students’ self-regulated learning.
    • Instructors implementing personalised feedback should align it closely with the course curriculum.
    • Instructors implementing personalised feedback in their courses should consider the relational element of feedback by using a positive tone.
    • Personalised feedback can be further enhanced by increasing the dialogic element and by including more information about learning strategies.
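    Epistemic network analysis, used above alongside thematic analysis, is built on accumulating co-occurrences of qualitative codes within a moving window over a participant’s discourse. The sketch below shows only that accumulation step, with hypothetical self-regulated-learning codes and coded utterances; the full method also normalises the resulting co-occurrence vectors and projects them into a shared low-dimensional space so that networks can be compared across students.

      # Minimal sketch of the code co-occurrence accumulation at the
      # core of epistemic network analysis (ENA). Codes and utterances
      # are hypothetical; the normalisation and dimensional reduction
      # that complete the ENA method are omitted.
      from collections import Counter
      from itertools import combinations

      # Self-regulated-learning codes assigned to consecutive utterances
      # in one student's written reflection (invented example data).
      coded_utterances = [
          ["goal_setting"],
          ["goal_setting", "strategic_planning"],
          ["procrastination"],
          ["strategic_planning", "self_evaluation"],
          ["goal_setting", "self_evaluation"],
      ]

      WINDOW = 2  # codes co-occur if seen within 2 consecutive utterances

      cooccurrence = Counter()
      for i in range(len(coded_utterances)):
          window = coded_utterances[max(0, i - WINDOW + 1): i + 1]
          codes_in_window = {c for utterance in window for c in utterance}
          for pair in combinations(sorted(codes_in_window), 2):
              cooccurrence[pair] += 1

      # Edge weights of the student's epistemic network.
      for (a, b), weight in cooccurrence.most_common():
          print(f"{a} -- {b}: {weight}")

    Comparing such weighted networks between groups of students is what lets the method link perceptions of feedback to subsequent self-regulated learning processes.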
  • Perspectives from the stakeholder: Students’ views regarding learning analytics and data collection

    West, Deborah; Luzeckyj, Ann; Searle, Bill; Toohey, Danny; Vanderlelie, Jessica; Bell, Kevin R. (ASCILITE, 2020-12-23)
    This article reports on a study exploring student perspectives on the collection and use of student data for learning analytics. With data collected via a mixed methods approach from 2,051 students across six Australian universities, it provides critical insights from students as a key stakeholder group. Findings indicate that while students are generally comfortable with the use of data to support their learning, they do have concerns, particularly in relation to the use of demographic data, location data and data collected from wireless networks, social media and mobile applications. Two key themes emerged: the need for transparency to support informed consent, and the critical importance of the personal-professional boundary. This supports findings from other research reflecting the need for a nuanced approach when providing information to students about the data we collect, including what we are collecting, why, and how it is being used.
    Implications for practice or policy:
    • When implementing dashboards, institutions should ideally include opportunities for students to opt in and out, rather than setting these by default, so that students have agency over their data and learning.
    • When undertaking work in relation to learning analytics, staff need to ensure the focus of their work relates to student learning rather than academic research.
    • When institutions and academic staff collect and use student data (regardless of the purpose for doing so), all aspects of these processes need to be transparent to students.
  • Data in practice: A participatory approach to understanding pre-service teachers’ perspectives

    Prestigiacomo, Rita; Hunter, Jane; Knight, Simon; Martinez-Maldonado, Roberto; Lockyer, Lori (ASCILITE, 2020-12-28)
    Data about learning can support teachers in their decision-making processes as they design tasks aimed at improving student educational outcomes. However, to achieve systemic impact, a deeper understanding of teachers’ perspectives on, and expectations for, data as evidence is required. It is critical to understand how teachers’ actions align with emerging learning analytics technologies, including the practices of pre-service teachers who are developing their perspectives on data use in the classroom during their initial teacher education programmes. A misalignment may lead to an integration gap in which technology and data literacy align poorly with expectations of the role of data and enabling technologies. This paper describes two participatory workshops that illustrate the value of human-centred approaches to understanding teachers’ perspectives on, and expectations for, data as evidence. The workshops involved pre-service teachers enrolled in teacher education programmes (N = 21) at two Australian universities. The approach points to the significance of (a) pre-service teachers’ intentions to track their students’ dispositions to learning and their ability to learn effectively, (b) the materiality of learning analytics as an enabling technology and (c) the alignment of learning analytics with learning design, including the human-centred, ethical and inclusive use of educational data in teaching practice.
    Implications for practice or policy:
    • Pre-service teachers ought to be given opportunities to engage with and understand more about learning design, learning analytics and the use of data in classrooms.
    • Professional experience placements for pre-service teachers should include participatory data sessions or learning design workshops.
    • Teacher education academics in universities must be provided with ongoing professional development to support their work preparing pre-service teachers for data literacy, learning analytics and the increasing presence of data.
