Evaluation

Educators routinely evaluate their teaching and learning activities, and virtual gaming simulation should be no exception. Responsive evaluation, in which educators evaluate their practice and apply what they learn, improves the learner experience and promotes learning and learner satisfaction (Stake, 1975).

Evaluation questions may include:

  • Did the learners learn?
  • What actions or activities contributed to learning?
  • Was my facilitation technique or strategy effective? What could I do better?

An important principle of evaluation is not to try to answer every question in a single evaluation. It is a good idea to clarify the scope of the evaluation in the initial planning stage. A helpful way to get started is to develop a list of questions that the teaching team most wants answered and that can realistically be answered. Once those questions and outcomes have been identified and prioritized, educators should decide which group will be the focus of the evaluation, what data collection method will be used, how the analysis will be conducted, and who will conduct it.

Just as specific learning outcomes drive an educator’s choice of virtual gaming simulation, they also drive the evaluation strategy. Several evaluation methods can be used: learner testing, focus groups, surveys, facilitator self-reflection and peer feedback. Sample outcomes and evaluation strategies are outlined in Table 7.1.


Table 7.1. Learner Outcomes and Evaluation Strategies

Learner knowledge gains

  • Pre-post multiple choice knowledge test
  • Survey with open-ended items
  • Reflective practice activities
  • Analytics

Learner virtual gaming simulation satisfaction (including the debrief)

  • Informal discussions
  • Surveys
  • Focus group interviews

Impact on practice

  • Survey with open-ended items
  • Reflective practice activities
  • Learner feedback

Learner team building skills

  • Informal discussions
  • Reflective practice activities
  • Surveys

Learner self-efficacy

  • Informal discussions
  • Surveys

Facilitator skills

  • Learner feedback
  • Peer mentoring/review
  • Co-debrief the virtual gaming simulation

Be sure your evaluation plan is feasible; avoid developing a plan that requires more time or resources than are available. Consider what, if any, specialized data collection or analysis skills will be needed to conduct the evaluation. For example, measuring pre- and post-activity knowledge gains requires a basic understanding of how to run and interpret statistical tests. Educators should think through their choice of evaluation method and ensure they have the necessary resources.

Director's slate

Examples in Action: Assessing Knowledge

In response to the question, “What do we really want to know?” we wanted to know whether learners gained knowledge by playing the virtual gaming simulation on neonatal care. We asked learners to complete a 10-item multiple choice quiz online, based on the neonatal care virtual gaming simulation learning outcomes, before playing the simulation and again one week after playing it. The data were collected in Qualtrics (an online survey platform), and a member of our team used the statistics generated by Qualtrics to see whether learners’ scores increased, providing evidence of knowledge gains.
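
The same kind of pre-post comparison can also be run outside the survey platform. The sketch below is a minimal, illustrative example of the statistical test mentioned earlier, assuming the quiz scores have been exported to a CSV file with hypothetical columns learner_id, pre_score and post_score (the file name and column names are ours, not part of the example above); it computes mean scores and a paired t-test.

```python
# Minimal sketch: paired pre-post comparison of quiz scores.
# Assumes a hypothetical exported CSV with columns: learner_id, pre_score, post_score.
import pandas as pd
from scipy import stats

scores = pd.read_csv("neonatal_vgs_quiz_scores.csv")  # hypothetical export file

pre = scores["pre_score"]
post = scores["post_score"]

# Descriptive statistics: did the average score move?
print(f"Mean pre-quiz score:  {pre.mean():.2f} / 10")
print(f"Mean post-quiz score: {post.mean():.2f} / 10")

# Paired t-test: the same learners are measured twice, so pair the observations.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value together with a higher post-quiz mean suggests knowledge gains;
# interpret it alongside the effect size and the number of learners.
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)
print(f"Cohen's d (paired) = {cohens_d:.2f}")
```

Whether the analysis is done in the survey platform or in a script like this, the interpretation is the same: the team is asking whether the post-activity scores are meaningfully higher than the pre-activity scores.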

Another important point to consider is how the evaluation results will be used. There are two main types of evaluation that apply to virtual simulation: formative and summative.
