Key Findings: Virtual Lab Simulations

Virtual Lab Simulations Overview

Virtual labs and gamified simulations expose learners to exercises and practice opportunities that, in “wet labs,” would normally involve costly equipment, hazardous materials, and techniques that are difficult to provide equitably across students, programs, and geographical locations. Exposure to these methods of investigation is critical to meeting curriculum learning outcomes. Within the STEM (science, technology, engineering, math) disciplines in particular, “simulations and games have great potential to improve science learning in… undergraduate science classrooms” (Honey and Hilton, 2011).

Implementing virtual labs in post-secondary teaching is gaining traction because of the increased demand on physical lab spaces and the barriers of cost, safety, and the time required to ensure student preparation. Virtual labs can avoid these barriers and better prepare students for wet labs. A virtual lab can function as a supplement to a wet lab or a pre-lab requirement, or can replace the wet lab experience (Bak et al., 2013).

In studies measuring the benefits of virtual labs, students report that they are an engaging alternative to a lecture-based introduction to wet labs, saying that the self-paced completion and the ability to pause and review material contribute to better individual and collective achievement of learning outcomes (Bak, Dandanell, and Sichlau-Brunn, 2013). There is also evidence of higher learning outcome achievement when virtual labs are used in tandem with traditional teaching methods (Bonde et al., 2014).

In addition, gamification elements contained in virtual simulations have been shown to increase motivation and interest in pursuing education in STEM fields (Bonde et al., 2014). Another interesting finding is that students feel less anxiety when introduced to lab practices virtually than in the wet labs, which eased their concerns about entering a wet lab for the first time (Bak et al., 2013).

The eCampusOntario Educational Technology Sandbox case studies involved integrating virtual labs into a variety of STEM courses (e.g., engineering technology, general chemistry, human biology). Most implementations accompanied face-to-face classes, but a couple were part of an online cohort. The project at the University of Toronto was coupled with a previously funded research and innovation grant, and thereby offered a longer and more intensive evaluation opportunity. All institutions that participated indicated interest in virtual lab simulations as a possible solution to limited availability of physical lab resources as well as a means to deliver the content to more students more often.

Virtual Lab Simulation Teams

The virtual lab simulation teams comprised faculty, staff from teaching and learning centres, continuing education staff, program coordinators, advisory committee members, and, in some cases, senior administration/leadership team members. Identifying a project lead was part of the expression of interest process, and defining the responsibilities of this role at the outset was critical.

Those teams that planned and held regular communication and project check-in meetings, either face-to-face or virtually, appeared to be more successful. Centennial College was of particular note: their meetings included not only institutional team members but also members of Labster’s support team. Frequent meetings helped them navigate implementation challenges, such as mapping simulations to curriculum and responding quickly to support issues.

Integration

The labs were integrated in a variety of ways:

  • As a supplementary study aid.
  • As a low-stakes assessment.
  • As a required assessment.
  • Before class (flipped).
  • In-class (active learning) or as an assignment.
  • As a bonus activity/study resource.
  • As a reinforcement of challenging topics raised in class.
  • In tandem with instructor-created lab reports.
  • As a pre-tutorial assignment.
  • In face-to-face, blended, and online classes.

Assessments ranged from 5% to 15% of the course grade; the weighting largely depended on the number of labs students were expected to complete during the course.

The University of Ottawa integrated the tool into the same course twice under different delivery formats: face-to-face in the fall term and online in the winter term. They also translated the labs into French through a separately funded project.

For lab-based courses, completing a relevant lab was a prerequisite to entering the physical lab. For theory-based courses, the simulations were used to help solidify concepts introduced in class. For the most part, learners completed the labs individually, on their own time. However, in some cases, they worked through the labs in small groups.

The University of Toronto project allowed students the freedom to choose five out of 23 selected labs. Students also completed an exit survey to earn a lab skills workshop credit/certificate. This approach was intentionally not tied to curriculum in order to give students an opportunity to deepen their learning through labs that were otherwise unavailable to them.

The labs that were integrated as a bonus mark or as an optional study aid showed a very low uptake by students, with an average of 20% participating. There was much greater engagement and feedback when the labs were a mandatory part of the course.

Benefits

The value of the labs was most apparent for online learners, for whom they provided a lab experience as close to the real thing as possible. But both students and faculty reported several other benefits of participating in virtual labs.

Students:

  • Felt more engaged in the material because of the interactivity.
  • Liked the flexibility to learn according to their own schedule.
  • Improved their critical-thinking capabilities and mastery of skills because they could test and retest their understanding of concepts, troubleshoot various experimentation processes, and interpret the results.
  • Reported that the labs were effective in teaching the concepts behind the experiments.
  • Appreciated being able to follow up on the resources that accompanied the labs.
  • Liked getting immediate feedback through the assessment upon completion of an assignment.

Faculty:

  • Appreciated having the dashboard to quickly assess individual student performance within each of the labs used.
  • Liked having the ability to replicate situations and present material that may not be possible, or that would be more challenging, in real-world scenarios. For example, Centennial College reported the advantage of giving students the opportunity “to observe otherwise unobservable biochemical phenomena,” introducing them to concepts and processes in a unique and visual manner.

Challenges

The following specific challenges were reported:

  • Some content was too advanced for certain curricula and was difficult to integrate into courses; educators would have preferred to remove certain elements.
  • For some, it seemed that the labs were designed for more advanced, upper-level university courses rather than introductory levels.
  • Alignment with learning outcomes was difficult, especially with well-designed, established courses; in these cases, students were exposed to content that seemed to have no relevance to course outcomes.
  • The labs were lengthy, and some students found progressing through them tedious.

Generally, there were strong indications that however beneficial virtual lab simulations are, they cannot be used in isolation. Time spent in a physical lab with hands-on, practical activities is still required, particularly when it comes to meeting curriculum learning outcomes and student learning needs. One faculty member noted that actually collecting and interpreting data, a critical outcome for her course, was not possible in the virtual labs. Many students also reported preferring the tactile physical lab experience, saying that the virtual labs offered an inferior learning experience.

In response to these challenges, it is important to note that simulations can be used effectively in tandem with real-world applications.

User Experience

The labs were reported as easy to use; setup and student registration were seamless with the support of the Labster team. The resources and reference materials provided supported mapping the labs to curriculum learning outcomes.

Anecdotal evidence from students was positive, indicating that the experience left them feeling better prepared for real labs. Some reported that the animations were realistic and that the lab was a well-designed user experience. Users liked being able to control the pace and progression, and the ability to repeat assessments to achieve better scores and understanding of the concepts being tested. Having immediate feedback on the quizzes was effective.

Not all comments were positive. Some said the interface was sensitive to cursor placement and did not display properly at all monitor widths (e.g., screen buttons and tabs were incorrectly positioned). In addition, the simulations sometimes lagged or froze midway, depending on the computer operating system and connection speed. Freezing was particularly problematic when it meant students had to start a lab over. Students also found it inconvenient that the simulations worked better on desktop computers than on mobile devices. Users also noted a need for single sign-on with all institutional learning management systems and for tying performance to grade books; at the time of this evaluation, only Blackboard’s grade book linked to Labster.

There were a few other negative remarks about the user experience: not having a full transcription of the instructions was inconvenient and did not promote accessibility; the computer-generated voice and its odd pronunciation of words and terms were off-putting; and the length of the labs was challenging for both learners and educators.

The University of Windsor reported another specific challenge: the institution is developing new policies on the use of educational technologies such as Labster, which include more rigorous scrutiny of tools, particularly for privacy and risk abatement. Institutions should be encouraged to evaluate educational technology in this way, and this process and its accompanying timelines should be factored into any future proposals.

Future Plans

Cambrian College, Canadore College, Lambton College, and the University of Guelph indicated that they were not planning to use Labster’s virtual lab simulations in the near future, mainly because they were not able to customize the labs to program and course learning outcomes. Uncertainty about the cost and how it would be sustained was also a concern.

Institutions that reported they would likely continue were Centennial College, Durham College, Mohawk College, Sault College, University of Ottawa, University of Toronto, and the University of Windsor.

The institutions that reported they were more likely to continue integrating the labs into their technology road map were able to integrate the project with institutional priorities. Those who were keen said they were looking forward to the release of additional labs in anatomy and physiology. One institution surveyed their students, who indicated a willingness to pay for the labs if the cost was manageable and replaced textbook costs.

Lessons Learned

A number of the lessons learned reported by institutions were common across projects and seen as critical.

  • In the evaluation of any learning technology, it is important to engage with students on the rationale and get their buy-in as assessors. Their active evaluation and reflection on the usefulness of the labs as a learning tool is critical.
  • Giving students a specific time frame to complete the labs is beneficial, rather than presenting them as an open supplement.
  • Adequate planning time is needed. Some institutions noted that it would be ideal to build a course around the labs rather than trying to squeeze labs into existing curriculum.
  • With this project, the licences were available at the start of the fall term; for an effective evaluation, a planning period of six months prior to integration would be optimal.

References

Bak, L. K., Dandanell, G., & Sichlau-Brunn, C. V. (2013). Implementation of Labster at Copenhagen University – A case study. Retrieved from http://www.elearningmedia.es/sites/default/files/pdfs/The_implementation_of_Labster_at_Copenhagen_University_A_case_study_final.pdf

Bonde, M. T., Makransky, G., Wandall, J., Larsen, M. V., Morsing, M., Jarmer, H., & Sommer, M. (2014). Improving biotech education through gamified laboratory simulations. Nature Biotechnology, 32. http://doi.org/10.1038/nbt.2955

Honey, M. A., & Hilton, M. L. (Eds.) (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.
