7 Educator Perceptions of Learning Analytics

A number of learning management systems (LMS), such as Blackboard, Brightspace/D2L, Instructure Canvas, Moodle, and Sakai, are currently available to educators to structure and develop the asynchronous aspects of their online courses. In addition, web-based video conferencing tools such as Adobe Connect and Blackboard Collaborate Ultra enable educators to enhance their courses with interactive synchronous sessions. Yet educators are less familiar with the features of the learning analytics tools that are available for their LMS. This section of the literature review explores educators’ perceptions with regard to the adoption of learning analytics (LA) tools, their degree of engagement with a tool, the interpretability of visualizations, and ethical considerations in using student data.

Educators’ Adoption of LA

Ali, Asadi, Gasevic, Jovanovic, and Hatala (2013) proposed and empirically validated the Learning Analytics Acceptance Model (LAAM) to examine how the analytics offered in a learning analytics tool affect educators’ adoption beliefs. In particular, the LAAM attempts to explain how factors such as usage beliefs (perceived usefulness and ease of use) from the Technology Acceptance Model (Davis, 1989) are associated with the intention to adopt a learning analytics tool. The authors targeted LOCO-Analyst, a learning analytics tool that offers the following types of learning analytics: single lesson analytics, composite lesson analytics, module analytics, quiz analytics, social interaction analytics, lesson interaction analytics, and comprehension analytics. They administered a self-report survey to six instructors, eight teaching assistants, and eight researchers/learning analysts (N = 22), mostly from computer science or information science backgrounds. The survey questions on usage beliefs of the tool’s features used a 5-point Likert scale, and participants were asked to explain their responses in 1-2 sentences, so the survey collected both quantitative and qualitative data. Participants answered some of the questions while working through activities in LOCO-Analyst and the remainder after completing them.

The authors found that while most participants indicated favourable usage beliefs, these factors were not sufficient to predict the behavioural intention to adopt the tool. Instead, the participants’ pedagogical role was a significant predictor, with those in direct online teaching positions (instructors) more likely to adopt the tool. Furthermore, the perceived usefulness of the analytics for identifying areas of content that need improvement was also significantly related to the intention to adopt. Ali et al.’s (2013) study differs from previous related studies (Kosba et al., 2007; Mazza & Dimitrova, 2007) in its focus on educators and on the quantitative analysis of evaluation data from a learning analytics tool. They recommended that future studies be conducted in other disciplines and with other learning analytics tools.
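To make this kind of analysis more concrete, the sketch below relates Likert-scale usage beliefs and pedagogical role to reported intention to adopt a tool through an ordinary least squares regression. It is a minimal, hypothetical illustration: the variable names, data values, and choice of regression model are assumptions for the example, not Ali et al.’s (2013) instrument, analysis, or results.

```python
# Hypothetical LAAM-style analysis: relate Likert-scale usage beliefs
# (perceived usefulness, perceived ease of use) and pedagogical role to
# reported intention to adopt a tool. All data and column names are invented.
import pandas as pd
import statsmodels.formula.api as smf

responses = pd.DataFrame({
    "usefulness":      [4, 5, 3, 2, 5, 4],   # 1-5 Likert ratings
    "ease_of_use":     [4, 4, 3, 3, 5, 4],
    "is_instructor":   [1, 1, 0, 0, 1, 0],   # 1 = direct online teaching role
    "intent_to_adopt": [4, 5, 2, 2, 5, 3],
})

# Ordinary least squares regression of adoption intention on the belief factors
# and pedagogical role, analogous in spirit to testing which factors predict adoption.
model = smf.ols("intent_to_adopt ~ usefulness + ease_of_use + is_instructor",
                data=responses).fit()
print(model.summary())
```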

Teachers’ Favourable Perceptions of Learning Analytics Tools

After adopting a learning analytics tool, there can still be great variation in educators’ degree of engagement with LA. Herodotou, Rienties, Boroowa, Zdrahal, Hlosta, and Naydenova (2017) investigated educators’ uses and practices of predictive data to support students at risk of not completing or failing a module. Data were collected from a sample of 240 teachers across 10 modules at the Open University, a UK distance-learning higher education institution. Data from 17,033 students and five individual interviews with teachers were also analyzed. The authors focused on a learning analytics model, OU Analyse (OUA), which uses advanced statistics and machine learning approaches to predict at-risk students based on static data (e.g., demographics) and fluid data (e.g., students’ interactions within the LMS).
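As a rough sketch of how such a prediction might be assembled (and not a description of OU Analyse’s actual implementation), the example below combines static demographic features with fluid LMS interaction features in a simple logistic regression classifier. All feature names, data values, and the choice of model are assumptions made for illustration.

```python
# Illustrative at-risk prediction combining static (demographic) and fluid
# (LMS interaction) features. Not the OU Analyse model; data are invented.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

students = pd.DataFrame({
    "age_band":        ["0-35", "35-55", "0-35", "55+"],   # static data
    "prior_education": ["HE", "A-level", "A-level", "HE"],
    "lms_clicks_wk":   [120, 15, 60, 5],                    # fluid data
    "forum_posts_wk":  [3, 0, 1, 0],
    "failed_next_assignment": [0, 1, 0, 1],                 # training label
})

features = students.drop(columns="failed_next_assignment")
labels = students["failed_next_assignment"]

model = Pipeline([
    ("prep", ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["age_band", "prior_education"]),
        ("num", StandardScaler(), ["lms_clicks_wk", "forum_posts_wk"]),
    ])),
    ("clf", LogisticRegression()),
])
model.fit(features, labels)

# Estimated probability of being at risk for the next assignment, per student.
print(model.predict_proba(features)[:, 1])
```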

The OUA dashboard provides teachers with information about students’ performance, such as the average performance of the whole cohort, a list of all students with predictions of their performance on the next teacher-marked assignment, and the similarity between an individual student and their nearest neighbours in terms of LMS activity and demographic parameters. Seventy teachers received a weekly email reminding them that OUA predictions were available through a dashboard; the remaining 170 teachers, who did not have virtual private network connections to the dashboard, received predictions in the form of Excel sheets. The authors compared the performance of students whose teachers had access to predictive data with that of students whose teachers did not. They found that although 70 teachers had access to the predictive analytics on the OUA dashboard, they did not access them systematically: some teachers logged in on a weekly basis, whereas others had substantial gaps between logins.
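The nearest-neighbours display can be pictured with the short sketch below, which finds the students most similar to a given student on a few activity and demographic features. The feature set and the use of scikit-learn’s NearestNeighbors are illustrative assumptions rather than a description of how OU Analyse computes similarity.

```python
# Illustrative "nearest neighbours" lookup over LMS activity and demographic
# features. Feature choices and data are invented for the example.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Rows: students; columns: weekly LMS clicks, forum posts, encoded age band.
activity = np.array([
    [120, 3, 0],
    [ 15, 0, 1],
    [ 60, 1, 0],
    [  5, 0, 2],
    [110, 2, 0],
])

scaled = StandardScaler().fit_transform(activity)
nn = NearestNeighbors(n_neighbors=3).fit(scaled)

# Indices of the three students most similar to student 0 (student 0 included).
distances, indices = nn.kneighbors(scaled[[0]])
print(indices[0])
```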

Semi-structured interviews were conducted with five teachers and analyzed as individual case studies. Because participants self-selected into the interviews, the data may have been biased towards favourable perceptions. These teachers had existing positive attitudes towards pedagogical innovations. They also had established teaching practices for intervening with and supporting students at risk, which may have aligned well with the tool. For example, teachers were already monitoring student progress, and the tool encouraged them to become more proactive in emailing students, calling them by phone, or referring them to student support services in between teacher-marked assignments.

Educators’ Perceptions of Visualization Tools

Learning analytics usually focuses on collecting the traces that learners leave behind in learning management systems and analyzing those traces to improve learning (Duval & Verbert, 2012). There are different approaches to analysis: some apply educational data mining methods to discover patterns (Romero & Ventura, 2007), while others use information visualization techniques to capture users’ attention and provide a dashboard through which learners and educators can take advantage of the traces in more manageable ways (Duval, 2011). Visualizations filter raw data and metrics to make them more accessible in tables, graphs, and other graphical representations (Sclater, 2017). Dashboards consolidate on a single computer screen a “visual display of the most important information needed to achieve one or more objectives” (Few, 2013, p. 26) to facilitate effective communication and decision making. For example, visualizations in dashboards may enable teachers to identify problems of understanding that students have and make appropriate pedagogical decisions in real time and in actual classrooms (Vatrapu, Teplovs, Fujita, & Bull, 2011).
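As a simple, hypothetical example of this idea, the sketch below turns raw weekly LMS activity counts into a dashboard-style chart that overlays one student’s trace on the cohort average, so a teacher can spot disengagement at a glance. The data and layout are invented for illustration.

```python
# Minimal dashboard-style view: one student's weekly LMS activity plotted
# against the cohort average. All values are illustrative.
import matplotlib.pyplot as plt

weeks = [1, 2, 3, 4, 5, 6]
cohort_avg_clicks = [85, 90, 70, 75, 80, 78]   # average LMS clicks per week
student_clicks    = [80, 60, 40, 20, 10, 5]    # one student's trace

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(weeks, cohort_avg_clicks, marker="o", label="Cohort average")
ax.plot(weeks, student_clicks, marker="o", label="Student A")
ax.set_xlabel("Week of module")
ax.set_ylabel("LMS clicks")
ax.set_title("Weekly LMS activity")
ax.legend()
plt.tight_layout()
plt.show()
```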

When educators attempt novel pedagogical approaches in distance education, such as the application of serious games, being able to track their students’ progress to assist them in achieving learning outcomes becomes especially important. Minović, Milanović, Sosevic, and Conde-González (2015) employed an empirical design to evaluate an LA visualization tool in an educational game session. They first conducted a two-phase experimental design in which 20 students engaged in a 15-minute role-playing adventure game; the students’ data were logged and reinterpreted to create a scenario for evaluating the visualization tool. Then, six educators monitored the scenario and were asked to use the visualization to identify problems that had been pre-identified by the research team. Afterwards, guided interviews elicited educator opinions and feedback regarding the visualization tool. The authors found that, on average, educators were able to identify 79% of the pre-identified problems in the game play, but this was based on the use of a tabular report rather than the visualization tool. However, the educators had an average success rate of 71% in identifying the cause of a problem using the visualization tool. Interviews indicated that educators had very positive perceptions of the visualization tool, noting that it was easy to interpret, made it easier to trace an individual student’s progress through the game than a tabular report, and facilitated detection of an imbalance in the difficulty of course concepts. The educators suggested design improvements for future tool development, such as incorporating comparative views of progress for groups of students as well as individuals, and a clearer representation of the fulfilment of learning outcomes according to Bloom’s taxonomy. Thus, this study provides a good basis for further development of real-time learning analytics in an educational gaming context.

Educator Perspectives on Ethical Considerations in the Use of Student Data for Learning Analytics

The research on educator perceptions of learning analytics tends to focus on the adoption or evaluation of particular analytical or visualization tools. Nevertheless, as more student data are used as evidence of learning and interventions are taken, there is also growing concern at national and international levels about the ethical use of student data for learning analytics (Pardo & Siemens, 2014; Willis, Slade, & Prinsloo, 2016). Jones (2016) argues that there is a disconnect between institutional policies and guidelines regarding the ethical use of student data and academics’ perceptions of these guidelines. Jones surveyed the perceptions of 68 academic staff at an Australian university and found that institutional policy was the least frequently reported concern impacting current knowledge and use of learning analytics. The majority (60%) of the participants indicated time constraints as the main factor affecting their current learning analytics use. Thus, while much has been written about the implementation of learning analytics and the ethical use of student data at the institutional level, there is a gap in practical solutions for how to engage educators in adopting ethical practices in the use of student data (Jones, 2016).

Summary

Through a review of this and other LA literature, it is evident that the Learning Analytics Acceptance Model (LAAM) is targeted toward individual adopters of specific LA tools, while the Learning Analytics Readiness Inventory (LARI) is a reflective measure aimed at providing insight into actionable steps toward institutional adoption of LA. Both have their place, and both may be useful tools for future stages of research on implementing LA at higher education institutions.
