Welcome to the Globethics.net Library!


  • Data in practice: A participatory approach to understanding pre-service teachers’ perspectives

    Prestigiacomo, Rita; Hunter, Jane; Knight, Simon; Martinez-Maldonado, Roberto; Lockyer, Lori (ASCILITE, 2020-12-28)
    Data about learning can support teachers in their decision-making as they design tasks aimed at improving student educational outcomes. However, to achieve systemic impact, a deeper understanding of teachers’ perspectives on, and expectations for, data as evidence is required. It is critical to understand how teachers’ actions align with emerging learning analytics technologies, including the practices of pre-service teachers who are developing their perspectives on data use in the classroom during their initial teacher education programmes. Without this understanding, an integration gap may emerge in which technology and data literacy align poorly with expectations of the role of data and enabling technologies. This paper describes two participatory workshops that illustrate the value of human-centred approaches to understanding teachers’ perspectives on, and expectations for, data as evidence. The workshops focused on the design work of pre-service teachers enrolled in teacher education programmes (N = 21) at two Australian universities. The approach points to the significance of (a) pre-service teachers’ intentions to track their students’ dispositions to learning and their ability to learn effectively, (b) the materiality of learning analytics as an enabling technology and (c) the alignment of learning analytics with learning design, including the human-centred, ethical and inclusive use of educational data in teaching practice.
    Implications for practice or policy: Pre-service teachers ought to be given opportunities to engage with and understand more about learning design, learning analytics and the use of data in classrooms. Professional experience placements for pre-service teachers should include participatory data sessions or learning design workshops. Teacher education academics in universities must be provided with ongoing professional development to support their preparation of pre-service teachers in data literacy, learning analytics and the increasing presence of data.
  • Mobile-assisted language learning through learning analytics for self-regulated learning (MALLAS): A conceptual framework

    Viberg, Olga; Wasson, Barbara; Kukulska-Hulme, Agnes (ASCILITE, 2020-12-23)
    Many adult second and foreign language learners have insufficient opportunities to engage in language learning. However, their successful acquisition of a target language is critical for various reasons, including their rapid integration into a host country and their smooth adaptation to new work or educational settings. This suggests that they need additional support to succeed in their second language acquisition. We argue that such support would benefit from recent advances in the fields of mobile-assisted language learning, self-regulated language learning and learning analytics. In particular, this paper offers a conceptual framework, mobile-assisted language learning through learning analytics for self-regulated learning (MALLAS), to help learning designers support second language learners through the use of learning analytics to enable self-regulated learning. Although the MALLAS framework is presented here as an analytical tool that can be used to operationalise the support of mobile-assisted language learning in a specific exemplary learning context, it will also be of interest to researchers who wish to better understand and support self-regulated language learning in mobile contexts.
    Implications for practice or policy: MALLAS is a conceptual framework that captures the dimensions of self-regulated language learning and learning analytics that are required to support mobile-assisted language learning. Designers of mobile-assisted language learning solutions who use MALLAS will have a theoretically well-grounded solution. Learning designers can use MALLAS as a guide to direct their design choices when developing mobile-assisted language learning apps and services.
  • Deep neural networks for collaborative learning analytics: Evaluating team collaborations using students’ gaze point prediction

    Barmaki, Roghayeh; Guo, Zhang (ASCILITE, 2020-12-28)
    Automatic assessment and evaluation of team performance during collaborative tasks is key to research on learning analytics and computer-supported cooperative work. There is growing interest in the use of gaze-oriented cues for evaluating the collaboration and cooperativeness of teams. However, collecting gaze data using eye-trackers is not always feasible due to time and cost constraints. In this paper, we introduce an automated team assessment tool based on gaze points and joint visual attention (JVA) information drawn from computer vision solutions. We evaluated team collaborations in an undergraduate anatomy learning activity (N = 60, 30 teams) as a test user study. The results indicate that higher JVA was positively associated with student learning outcomes (r(30) = 0.50, p < 0.005). Moreover, teams in the two experimental groups, which used interactive 3D anatomy models, had higher JVA (F(1,28) = 6.65, p < 0.05) and better knowledge retention (F(1,28) = 7.56, p < 0.05) than teams in the control group. No significant difference in JVA was observed across different gender compositions of teams. The findings from this work have implications for the learning sciences and collaborative computing by providing a novel joint attention-based measure for objectively evaluating team collaboration dynamics (an illustrative sketch of this kind of JVA analysis follows this list).
    Implications for practice or policy: Student learning outcomes can be improved by providing constructive feedback about team performance using our gaze-based collaborative learning method. Underrepresented and underserved minorities in science, technology, engineering and mathematics disciplines can be engaged in more collaborative problem-solving and team-based learning activities, since our method offers broader reach by automating the collaboration assessment process. Course leaders can assess the quality of attention and engagement among students and can monitor or assist larger numbers of students simultaneously.
  • Critical success factors for implementing learning analytics in higher education: A mixed-method inquiry

    Clark, Jo-Anne; Liu, Yulin; Isaias, Pedro (ASCILITE, 2020-12-23)
    Critical success factors (CSFs) have been around since the late 1970s and have been used extensively in information systems implementations. CSFs provide a comprehensive understanding of the multiple layers and dimensions of implementation success. In the specific context of learning analytics (LA), identifying CSFs can maximise the chances of an effective implementation and harness the value of converting data into actionable information. This paper proposes a framework that aims to identify and explore the CSFs for the implementation of LA within the higher education sector by examining the viewpoints of higher education professionals. To obtain a rounded insight into stakeholders’ perceptions, we conducted a mixed-method inquiry using factor analysis, profile analysis and thematic analysis of quantitative and qualitative data collected with an online questionnaire from an international sample. The responses validate five CSFs of LA implementation: strategy and policy at organisational level, information technological readiness, performance and impact evaluation, people’s skills and expertise, and data quality. The results also reveal diverse views about the CSFs’ priorities and the associated difficulties and achievements.
    Implications for practice or policy: Higher education practitioners should consider CSFs when implementing an LA initiative. This study validates five dimensions of the CSFs of implementing LA in higher education. The validated framework enumerates several factors in each of the main dimensions for achieving optimum results. Stakeholders have diverse opinions about the priorities of CSFs, particularly regarding organisational commitment, data quality and human capital.
  • Engaging the control-value theory: A new era of student response systems and formative assessment to improve student achievement

    Mary W. Paul; Colleen Torgerson; Susan Tracz; Kimberly Coy; Juliet Wahleithner (Association for Learning Technology, 2020-12-01)
    The use of student response systems (SRS) in the form of polling and quizzing via multiple-choice questions has been well documented in the literature (Caldwell 2007). This study addressed a gap in that literature by considering content-generating SRS, such as Socrative and Google Slides, used during formative assessment activities in college composition courses. Content-generating SRS display student responses to formative assessment questions, and instructors are able to evaluate and adjust course material and feedback in real time. Quantitative data measuring student perception (via Likert-scale surveys) and student achievement (via essay scores) were collected. The statistically significant differences in essay scores between the treatment and control groups provide an objective measure of student achievement and have implications for how to support both students and faculty in innovative curriculum design (a minimal sketch of this kind of between-group comparison is shown below). Content-generating SRS allow for a more robust illustration of student understanding and can be adopted for larger lecture classes.
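The sketch below is a minimal, hypothetical illustration of the kind of treatment-versus-control comparison of essay scores described in the Paul et al. entry above. The group sizes, score distributions and the choice of Welch's t-test are assumptions made for illustration only, not details taken from the study.

```python
# Hypothetical sketch: comparing essay scores between a content-generating SRS
# treatment group and a control group (illustrative data, not the study's data).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
control = rng.normal(loc=74, scale=8, size=40)    # essay scores, control sections
treatment = rng.normal(loc=79, scale=8, size=40)  # essay scores, SRS-supported sections

t_stat, p_value = ttest_ind(treatment, control, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```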
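The next sketch illustrates, in Python, the general style of joint visual attention (JVA) analysis described in the Barmaki and Guo entry above. It is not the authors' implementation: the distance-threshold definition of JVA, the synthetic gaze and outcome data, and the use of a Pearson correlation are assumptions chosen purely for illustration.

```python
# Illustrative sketch: a simple JVA proxy from per-frame gaze points of a two-person
# team, and a correlation between team-level JVA and learning outcomes.
import numpy as np
from scipy.stats import pearsonr

def jva_ratio(gaze_a, gaze_b, radius=100.0):
    """Fraction of frames in which two teammates' gaze points fall within
    `radius` pixels of each other (a simple joint visual attention proxy)."""
    gaze_a = np.asarray(gaze_a, dtype=float)  # shape (n_frames, 2): x, y per frame
    gaze_b = np.asarray(gaze_b, dtype=float)
    dists = np.linalg.norm(gaze_a - gaze_b, axis=1)
    return float(np.mean(dists <= radius))

rng = np.random.default_rng(0)

# Example: JVA for one team over 500 frames of synthetic gaze data.
gaze_a = rng.uniform(0, 1920, size=(500, 2))
gaze_b = gaze_a + rng.normal(0, 120, size=(500, 2))  # teammate B looks near teammate A
print(f"Team JVA ratio: {jva_ratio(gaze_a, gaze_b):.2f}")

# Hypothetical team-level data: JVA ratios and post-test scores for 30 teams.
team_jva = rng.uniform(0.1, 0.9, size=30)
post_test = 50 + 40 * team_jva + rng.normal(0, 8, size=30)  # scores loosely tied to JVA

r, p = pearsonr(team_jva, post_test)  # association between JVA and learning outcomes
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```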
