9 Student Perceptions of Learning Analytics

Much of the existing research on student perceptions of learning analytics has emphasized the ethical and privacy issues related to learning analytics (Arnold & Sclater, 2017; Ifenthaler & Schumacher, 2016). As ethical and privacy concerns can often act as a barrier to LA implementation for institutions, this section begins by reviewing a selection of recent studies using different methods in an effort to include student voices in conversations around the use of learning analytics. Second, we turn to the literature on student-facing dashboards and visualizations. Finally, we consider quality assurance of LA as a “service” for students during their learning.

Ethical and privacy concerns

Arnold and Sclater (2017) adopted a basic survey methodology that differed slightly across two data collection sites: JISC (UK) and the University of Wisconsin (USA). JISC is a not-for-profit organization providing digital services and solutions in the UK that has invested considerably in building capacity for LA since 2014; the University of Wisconsin comprises 26 campuses with more than 150,000 students and has piloted a variety of LA tools in diverse contexts. Students at both JISC and Wisconsin were asked three yes/no questions in an online survey: whether they would be happy for their data to be used if it kept them from dropping out or helped them get personalized interventions, whether they would be happy for their data to be used if it helped improve their grades, and whether they would like their data to be visualized so they could compare themselves with classmates. Overall, American students tended to be more accepting of data use. Across both institutions, the use of student data to improve grades was the most acceptable purpose. Fewer UK students were interested in social comparison via an LA app, and American students were also hesitant about it. While the questions the authors used do not provide information about how the data are actually used and fail to provide sufficient qualitative context for the students’ responses given the complexity of privacy issues, their study provides a starting point for engaging students in the discussion.

An exploratory study using a narrative approach sought five students’ impressions of how their instructor used data from their interactions with their LMS, Blackboard (Kammer, 2015). First, the participants were asked to provide a written response to a scenario, then to write a description of the data that they thought was being collected on them, how it might be used, and the associated risks. Second, the participants answered follow-up questions about their writing in semi-structured interviews. Content analysis was used to code the interviews for instances of deontological and consequentialist ethical perspectives. Deontological approaches support the right to information and truth; consequentialist approaches recognize that use of information can be well-intentioned but can lead to harmful results. Generally, students felt complacent about their privacy, indicating that they already have little control over their data and that it was useless to worry about it. Students also believed that instructors use data for legitimate educational purposes, but some worried about the effect on their grades. Overall, students were open to instructors using their data, but also indicated that they wanted to be more informed about the use of their data, and that they would probably change their behaviours if they knew the instructor was looking.

Roberts, Howell, Seaman, & Gibson (2016) conducted four focus groups with a total of 41 students, mostly from psychology, to explore students’ knowledge, attitudes, and concerns about big data and learning analytics.  Six key themes emerged from their analysis. The first theme, “Uninformed and Uncertain,” represents the students’ views at the beginning of the focus groups. The students who participated in the study had little, if any, knowledge of LA. The remaining five themes emerged after students were provided with information on LA, viewed videos, and discussed learning analytics scenarios.  The second theme, “more than a number,” captures students reflections on the potential for LA to create personalized experiences, as they currently felt anonymous in their courses. The third theme, “help or hindrance to learning”  reveals students’ anticipated effects on their learning. Positive attitudes related to identification of students at risk, increased support, enhanced motivation—“the fitbit version of the learning world”—for a more directed learning experience. The potential for LA to provide data for students to compare their performance with peers was more contentious, with especially first-year students relating it to high school experiences of being ranked and competition among students. Furthermore, students voiced possible negative consequences when dashboards display information or send alerts that a student is doing poorly, especially when they were working hard in their studies, or when students are doing very well and might start to “slack off.” The fourth theme, “impending independence,” amounts to the tension between students’ appreciation for the additional support and the desire to be more self-directed in their learning. Students were wary that an overdependence on such systems would hinder their ability to succeed in the workforce, where similar systems would not be in place. The fifth theme, “driving inequality,” outlines issues of inequity and bias. For example, students saw extra guidance might unfairly affect grades. The students’ greatest concern was the potential for identifiable information to bias how educators interacted with them and affect the course of future studies. Finally, the theme “where will it stop” reflect students’ concerns of privacy and consent. It was very important for all students to be educated on LA and to be given the option to provide consent (or not).

Using a quasi-experimental design, Ifenthaler and Schumacher (2016) examined student perceptions of privacy principles related to learning analytics at a German university. A total of 330 university students were familiarized with three different LA systems and asked to rate and compare each system for acceptance and expected use for learning. The participants also completed measures of perceived control over data and willingness to share data. Findings suggest that students are not willing to share all of their data, specifically their personal information and traces of online behaviour (e.g., user paths, online times, download frequencies). Students’ perceived control over their data was positively related to acceptance and expected use of LA tools, as was their willingness to share data. In terms of preferences for LA systems, students preferred systems that provided a wide variety of support for learning (e.g., self-assessment, predicted mastery, personalized feedback) and prompted them to engage actively in specific strategies (e.g., recommending relevant chapters or self-tests on assigned readings). Overall, the relationship between LA acceptance and privacy principles supports the need to include students as key stakeholders in LA research and implementation. The authors also suggest that there should be policies in place to obtain students’ informed consent prior to the use of their data in LA.


Design of visualizations/Student-facing dashboards

Although much of the existing literature on learning analytics focuses on presenting data to educators (e.g., instructors, advisors) so that they can intervene or re-design curricula, there is a growing interest in opening up students’ access to their own analytics. Viewing their own analytics, for example in a student-facing dashboard, benefits students by increasing awareness, responsibility, and independence in learning, as well as promoting metacognitive processes such as self-assessment, planning, and reflection (Johnson, Reimann, Bull, & Fujita, 2011). Some posit that student use of analytics can empower students to take agency over their learning instead of relying on diagnoses and interventions from their instructor (Kruse & Pongsapan, 2012; Wise, 2014).

Arnold, Karcher, Wright, and McKay (2017) posit that technology that raises awareness of learning behaviours and personal goals can improve self-directed learning, much like fitness trackers increase fitness behaviours. They explored students’ experiences of using Pattern, a web and mobile application developed by Purdue University to empower students to track their study habits and to provide relevant recommendations and peer comparisons in real time. Interestingly, as Pattern is designed to be a student-centric tool, students control the app’s access to their identifiable data. Pattern was pilot tested at the course level at Purdue and the University of Wisconsin-Madison. Students were able to view their activities on a dashboard; instructors were able to view a dashboard of aggregate and anonymized data for students in their courses; and academic advisors were allowed access to the curated student data at an identifiable level only if the student opted to share it. Results from early testing indicate that most students (71% at Purdue and 74% at UW-Madison) found the app’s recommendations helpful. Further, most UW-Madison students found it easy to use, believed that it allowed them access to otherwise ‘hidden’ information, reported a good understanding of the dashboards and visualizations, and indicated that they would use the app again for other courses. One student-reported concern, which recurs throughout the literature, relates to the anonymity of the visualizations.

Graphic Literacy

From reviewing the literature on student-facing dashboards, we turn to post-secondary students’ understandings of visualizations. A common educator concern is that students will misconstrue the information represented graphically and numerically in visualizations and dashboards. Individuals with low graphic literacy, or graphicacy, are more likely to be misled by framing, or the way the information is presented, and to disregard the relationships depicted in the graph (Okan, Garcia-Retamero, Galesic, & Cokely, 2012). If learning analytics visualizations are to promote self-directed or self-regulated learning, it is important that students can comprehend the visualized information to make appropriate behaviour changes.

Park & Jo (2015) reviewed the features of existing learning analytics dashboards such as LOCO Analyst (Ali et al., 2015) and Course Signals (Pistilli & Arnold, 2010) to develop an early version of the learning analytics dashboard, called Learning Analytics for Prediction and Action (LAPA), to support students’ learning performance. To design and develop the dashboard, they first completed a needs assessment through interviews with eight university students. Perceptions regarding needs for a were mixed. In terms of the analysis of the learning patterns, some students reflected on the potential helpfulness of the data, while others reported concerns about privacy. On the other had, all students anticipated that the information about their performances would be useful, accurate, and objective; some specifically liked the potential for comparison. Second, the authors prototyped the first version of LAPA dashboard and conducted a usability test with six students. They determined that students often lacked graphicacy required for making sense of the graphs, which underscores the importance of support and explanations  for students using learning analytics tools. Finally, after correcting major issues, the researchers completed a pilot test of LAPA in a real classroom setting with 73 students (37 students  in the ’treatment’ group). Participants indicated greater understanding of the graphs, but did not perceive them as useful for learning. Specifically, students viewed log-in regularity as unhelfpul, but reported the potential for data on visits to repositories and discussion boards. Overall, student indicated a moderate level of satisfaction with LAPA, and that it helped them to reflect on their learning behavior in the LMS—but the dashboard did not otherwise contribute to behavioural changes.

Alabi and Hatala (2017) conducted a mixed-methods study to examine whether students’ perceptions of two learning analytics visualizations during online discussion activities matched the pedagogical intentions of the tools. The two visualizations represented either a comparison to the highest-performing peers or the quality of discussion posts. Findings suggested that students motivated by competition were more likely to engage with the visualizations, and that the social comparison visualization resulted in more discussion posts than the visualization of discussion post quality. Through interviews with 12 students, the authors identified that around two-thirds of participants misunderstood the data and/or visualizations, which they explained was in part due to a number of students not reading the instructions for how to access or interpret the visualizations. Further, some interviewees conveyed concern about trusting the visualizations and their underlying algorithms. Other interviewees chose to largely ignore the visualizations if they did not match their own opinions of their performance, possibly exhibiting the cognitive bias of anchoring. Finally, almost all of the interviewees indicated that the visualizations should be “visually stimulating” or make them “feel good,” emphasizing the need for affective considerations to be taken into account in the design of learning analytics visualizations for students, alongside numeracy, graphic literacy, and motivational factors.

From the studies reviewed above, it is clear that providing scaffolding to help students develop the graphic literacy and numeracy needed to interpret visualizations is important if we want to foster the use of such learning analytics tools. Promisingly, these studies report qualitative, interview data that contextualize the iterative nature of developing visualizations and dashboards to incorporate feedback from the key stakeholders: the students.

Quality assurance of LA as a “service”

Whitelock-Wainwright, Gasevic, & Tejeiro (2017) posit that we  conceptualize LA as a “service” that provides the most important stakeholders in  higher education—students—with support during their learning. The authors take an action-research approach to assuring LA quality, such that iterative rounds of student feedback inform practitioner reflection and changes to LA services. They draw attention to the top-down approach taken in many LA service implementations that prioritize needs of researchers, managers, and policy makers and perpetuate an ideological gap between what practitioners think the LA service should be and what students actually expect from such a service. The authors develped a scale to elicit student expectations and perceptions of LA as a service based on the literature on legal and ethical issues as well as frameworks for LA implementation. The authors identified four main themes as most important for measuring service quality: ethics and privacy, meaningfulness, agency, and intervention. The scale items consider both the desired and predictive expectations, since satisfaction often comes from desired expectations being met and dissatisfaction from predictive expectations being unmet. The authors have refined the scale using expert input, and plan to run a pilot study with students in the next phase of their research.
