6 General User Perceptions of Learning Analytics

Learning analytics (LA) has attracted much attention in higher education, and a growing number of studies demonstrate its significance in informing teaching and learning. However, there is also much uncertainty and hesitation about its practical application (Drachsler & Greller, 2012), perhaps because LA projects typically require significant investment from institutions (Arnold, Lonn, & Pistilli, 2014).

As a common understanding of, and vision for, the practical application of LA in education has not yet formed, Drachsler and Greller (2012) developed a survey to collect perceptions about LA from individuals working in multiple education sectors. The survey was based on a framework of six critical dimensions they consider necessary to take advantage of LA: (1) Stakeholders, (2) Objectives, (3) Data, (4) Method, (5) Constraints, and (6) Competences. The survey was distributed internationally, and the largest share of respondents came from post-secondary institutions (74%, n = 116). Interestingly, respondents indicated that students, followed by teachers and then institutions, would be the primary beneficiaries of LA, with the highest impact on teacher-student relationships. They also indicated that objectives should focus on stimulating reflection in stakeholders and unveiling new information about students, rather than on predicting their performance. Findings also indicated stakeholder concerns that LA will affect data ownership, openness, and the transparency of education. Finally, the majority of respondents stressed that users of LA will require skills in self-directedness, critical reflection, evaluation, and analysis, and that scaffolding and additional support will need to be provided for LA to be implemented successfully at institutions.

To identify relevant institutional strengths and potential areas for development prior to the implementation of LA initiatives, Arnold, Lonn, and Pistilli (2014) developed and pilot-tested the Learning Analytics Readiness Instrument (LARI). The LARI is a survey instrument to be completed by people in different roles in an institution as they reflect on the merits of engaging with an LA project. The LARI situates LA "at the intersection of 'big data' and student success" by integrating definitions of LA with how it should be used to support teachers and learners in taking actionable steps toward increasing success. Through exploratory factor analysis, the authors identified five factors for institutional reflection:

  1. Ability, which encourages exploration of the skill sets necessary for LA implementation;
  2. Data, addressing the need for certain types of valid and reliable data, as well as how the data are stored and accessed;
  3. Culture & Process, measuring perceptions of organizational norms for data use, sharing, and security;
  4. Governance & Infrastructure, related to institutional investment in LA, among other considerations; and
  5. Overall Readiness Perceptions.

Respondents in their study (n = 33, from nine institutions in the US and Canada) generally indicated high scores in Data and Governance & Infrastructure, suggesting perceptions of institutional investment and of data availability. However, lower reported levels of Ability and Culture & Process indicate a lack of institutional objectives for LA, as well as concerns regarding internal policies and the time and resources needed for LA initiatives. Arnold et al. explain that the LARI is intended to be a reflective tool rather than simply providing a numerical "position" for an institution, and suggest that institutions use the LARI to gain further insight into actionable steps toward the implementation of LA.
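
To give a concrete sense of the kind of analysis behind instruments such as the LARI, the sketch below fits an exploratory factor analysis to simulated survey responses using scikit-learn. It is not Arnold et al.'s procedure; the sample size, item count, and data are invented purely for illustration, and the five-factor solution is assumed to mirror the LARI's structure.

```python
# Illustrative only: exploratory factor analysis on simulated readiness-survey
# responses (Likert-scale items), assuming a five-factor solution as in the LARI.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)

# Hypothetical data: 200 respondents answering 25 Likert items (1-5).
n_respondents, n_items = 200, 25
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

# Standardize each item before factoring.
responses -= responses.mean(axis=0)
responses /= responses.std(axis=0)

# Fit a five-factor model (Ability, Data, Culture & Process,
# Governance & Infrastructure, and Overall Readiness, in the LARI's terms).
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(responses)

# Loadings: how strongly each survey item relates to each factor.
loadings = fa.components_.T  # shape: (n_items, n_factors)
for item_idx, row in enumerate(loadings):
    top_factor = int(np.argmax(np.abs(row)))
    print(f"Item {item_idx + 1:2d} loads most strongly on factor "
          f"{top_factor + 1} ({row[top_factor]:+.2f})")
```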

Whereas Drachsler and Greller (2012) conceptualized dimensions that need to be covered to implement LA in a beneficial way, Arnold et al.'s (2014) study empirically outlines factors to consider before institutional adoption of LA. The factors identified in each report differ slightly. For example, the earlier study by Drachsler and Greller does not address cultural influences on data use, sharing, and security. Such organizational norms are important factors in the deployment of learning analytics systems: the studies of early warning systems that follow rely on institutional data use and sharing, with appropriate data security measures in place.

Early Warning Systems

The early LA literature focuses on institutional implementations of early warning systems that enable instructors and academic advisors to intervene in support of the success and retention of students considered "at risk."

Perhaps the most prominent of these early warning systems is Course Signals, piloted in 2007 at Purdue University in Indiana, USA. Purdue partnered with SunGard Higher Education in 2010, and more recently with Ellucian, to help other institutions harness the power of learning analytics. Arnold and Pistilli (2012) explain that Course Signals allows instructors to provide real-time feedback to students. Its predictive model integrates data from multiple institutional sources, including students' grades, interactions with the LMS, prior academic history (e.g., high school GPA, standardized test scores), and student characteristics (e.g., residency, age, or credits attempted), to predict which students may be falling behind. It generates a 'risk level' that is then presented to the student in the form of a traffic signal (a red, yellow, or green light) on their LMS course home page. This allows instructors to intervene early and gives students an opportunity to adjust their behaviour in specific courses. This intervention has improved academic performance as well as student retention, especially when students encounter Course Signals earlier in their post-secondary education.
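
Arnold and Pistilli do not disclose the model's exact features, weights, or thresholds. Purely as a sketch of the general idea, the example below assumes a simple weighted score over the data categories described above and maps it to a traffic-signal level; every field name, weight, and cut-off is hypothetical.

```python
# Hypothetical sketch of a Course Signals-style risk indicator.
# The real system's features, weights, and thresholds are not public;
# everything below is invented for illustration.
from dataclasses import dataclass

@dataclass
class StudentSnapshot:
    grade_pct: float        # current course grade, 0-100
    lms_activity: float     # LMS activity relative to the course average, e.g. 0.6
    prior_gpa: float        # prior academic history, 0.0-4.0
    credits_attempted: int  # course load this term

def risk_signal(s: StudentSnapshot) -> str:
    """Return 'green', 'yellow', or 'red' from a simple weighted score."""
    score = (
        0.5 * (s.grade_pct / 100)
        + 0.2 * min(s.lms_activity, 1.5) / 1.5
        + 0.2 * (s.prior_gpa / 4.0)
        + 0.1 * (1.0 if s.credits_attempted <= 18 else 0.5)  # heavy loads add risk
    )
    if score >= 0.7:
        return "green"
    if score >= 0.5:
        return "yellow"
    return "red"

# Example: a student with middling grades and low LMS activity.
print(risk_signal(StudentSnapshot(grade_pct=68, lms_activity=0.4,
                                  prior_gpa=2.9, credits_attempted=15)))  # yellow
```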

The authors administered user surveys to students and instructors at the end of each semester, and have also held focus groups and interviews. Most (89%) of the students reported positive experiences with Course Signals, and 58% indicated that they would like every course to employ the system. In particular, students perceived the automated emails and warnings as personal communication between themselves and the instructor, which mitigated the feelings of anonymity common among first-year students. Negative feedback related to faculty use of the tool, such as concurrently delivering the same message via email, text message, and LMS message; stale traffic signals that had not been updated on the course home page; and a desire to receive more specific information. Instructors tended to view the system as beneficial in that it allows them to identify students who may need help and provide them with earlier guidance. Instructors also indicated that students tend to be more proactive as a result of Course Signals interventions, but reported concerns regarding the large number of emails from worried students, students developing a dependency on the system, and the lack of 'best practices' to guide instructors' use of the tool.

As Course Signals emphasizes analytics at the institutional level rather than at the course level, the system might, following Long and Siemens's (2011) definitions, be more accurately classified as academic analytics. The following study by Aguilar, Lonn, and Teasley (2014) focuses on learning analytics at the program and course levels.

Another notable early warning system is Student Explorer, developed as a design-based research implementation project led by Stephanie Teasley and Steve Lonn at the University of Michigan. Student Explorer leverages LMS data using LA techniques to help academic advisors identify students who are academically at risk. It was initially developed to track student effort and performance and to support advisors in an integrated student development program for at-risk students in their first and second years of undergraduate STEM programs. Aguilar et al. (2014) investigated the perceptions of academic advisors (n = 9) of Student Explorer in the context of a much shorter, 7-week Summer Bridge Program (referred to as "Bridge") designed to help non-traditional students transition from high school to college by providing highly structured coursework in a mathematics course, an English composition course, and an introductory social science course. Student Explorer is an advisor-facing tool intended to allow advisors to prepare for meetings with students. In practice, however, most advisors used the tool during student meetings, with the student present. Since the learning analytics interventions are likely to be viewed by students in the advising context, this had implications for future design and development. For example, once the researchers discovered this pattern of usage, they added an option to hide information about other students. They suggested that future design efforts should take into account the possibility of unintended audiences viewing the data and interfaces. Further, most advisors perceived Student Explorer to be valuable, regardless of their actual usage. The authors, engaging in design-based research, continue to refine the design of the Student Explorer system in response to user feedback. Overall, this paper favours the implementation of LA-based interventions in learning management systems.

Delving deeper into learning processes, other LA studies have investigated "micro-level" processes of complex learning such as self-regulated learning (Siadaty, Gasevic, & Hatala, 2016) and knowledge creation (Chiu & Fujita, 2014). The former study was conducted in a workplace context at two European organizations; however, there are parallels that are useful for consideration in higher education contexts. Below, we emphasize the aspects most relevant to this review of general user perceptions.

Case studies were undertaken at a car manufacturing business (n = 33) and a teacher professional association (n = 20), where participants used a learning software tool, Learn-B, during regular work activities for two months. The researchers had developed and incorporated seven different technological scaffolding interventions into Learn-B to enhance users' self-regulated learning. They then analyzed trace data along with self-report surveys to determine the use and perceived usefulness of each intervention. Interestingly, participants' self-reported effects of the LA interventions on their self-regulated learning were often incongruous with the trace data, indicating a discrepancy between users' perceptions and their actual behaviour. Users also indicated that usage information and user-recommended learning goals complemented the functions of system-recommended competences, learning paths, learning activities, and knowledge assets. These findings suggest that users of LA tools prefer to understand institutional expectations before setting personal learning goals themselves. Overall, this research supports the inclusion of "recommender" system technology in workplace learning environments, which also has implications for LA use in higher education.
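
A minimal sketch of the kind of perception-behaviour comparison described above is shown below, assuming hypothetical per-feature self-report ratings and logged interaction counts; the rank-correlation check is only one simple way to surface such a discrepancy and is not the authors' actual analysis.

```python
# Hypothetical comparison of self-reported use vs. logged (trace) use of
# seven scaffolding features; the data and the analysis are illustrative only.
import numpy as np

features = ["competences", "learning_paths", "activities", "assets",
            "usage_info", "peer_goals", "progress"]

# Self-reported use on a 1-5 scale and logged interaction counts per feature.
self_report = np.array([4.2, 3.8, 4.0, 3.5, 4.5, 4.1, 3.9])
trace_counts = np.array([12, 30, 8, 25, 5, 7, 40], dtype=float)

def rank(x):
    """Rank values from smallest (0) to largest (n-1)."""
    return np.argsort(np.argsort(x))

# Rank-order agreement between what users say they used and what the logs show.
r = np.corrcoef(rank(self_report), rank(trace_counts))[0, 1]
print(f"Rank correlation between self-report and trace data: {r:.2f}")
for name, said, did in zip(features, self_report, trace_counts):
    print(f"{name:14s} self-report={said:.1f}  logged={did:.0f}")
```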

As Sclater (2017) explains, recommender systems are widely used to provide personalized recommendations in Internet applications. For example, Amazon recommends new or featured products using data from users' previous searches and purchases. Once a product is selected, recommendations for other products "frequently bought together" are displayed to the customer. In educational contexts, the most notable examples are course recommender systems that suggest which courses students should take and in what order (e.g., Stanford University's Course Rank, Austin Peay State University's Degree Compass).
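
As a rough sketch of the "frequently bought together" idea, and of simple course recommendation more generally, the example below counts item co-occurrences across hypothetical enrolment histories and recommends the items most often taken alongside a given course. Real systems such as Degree Compass use considerably richer data and models; the course codes and logic here are invented for illustration.

```python
# Minimal co-occurrence recommender: "students who took X also took Y".
# Data and logic are illustrative; production recommenders are far richer.
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical enrolment histories (one set of courses per student).
histories = [
    {"CALC1", "PHYS1", "CHEM1"},
    {"CALC1", "PHYS1", "CS1"},
    {"CALC1", "CS1", "STAT1"},
    {"PHYS1", "CHEM1"},
]

# Count how often each pair of courses appears together.
co_counts = defaultdict(Counter)
for courses in histories:
    for a, b in combinations(sorted(courses), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(course, k=3):
    """Return up to k courses most frequently taken alongside `course`."""
    return [c for c, _ in co_counts[course].most_common(k)]

print(recommend("CALC1"))  # e.g. ['CS1', 'PHYS1', ...]
```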
