Challenging the Dystopian Future of the AI-infused Classroom
Heather Leatham and Ilan Danjoux
Themes: Assessment, Ethical challenges in using AI, Teaching Strategies
Audience & Subject: Grades 9-12; General
Introduction
The rapid adoption and prevalent use of Artificial Intelligence (AI) technology among students and teachers have led to calls for policies to mitigate and limit the use of AI technology in the classroom. While acknowledging the risks to student privacy and learning, this brief outlines a three-step model for developing a meaningful approach to using AI within the classroom. The overall goal of this model is to assist teachers in designing and communicating guidelines for the appropriate use of AI in the classroom.
Activity: Meaningful AI in the Classroom
Step 1: The Classification of AI
Overview
The first step consists of establishing a consensus over the meaning of artificial intelligence technology by familiarising oneself with its different forms and applications. While much of the conversation around AI has focussed heavily on OpenAI and its flagship technology, ChatGPT, one need only look at the plethora of available AI tools that act as stand-alone products, online services or extensions of existing software packages. These tools range from those aimed at educators, such as MagicSchool.ai, to those geared towards general users, such as Pictory.
Description
Classifying AI technologies by their function is necessary and helpful. We propose grouping them into three categories:
- Reactive: Think of reactive AI as your Alexa or Siri, a tool that executes commands and retrieves information (e.g., “Siri, what is the weather for today?”).
- Predictive: AI that anticipates the information you seek (e.g., filling in text in a Google search bar).
- Generative (GenAI): The AI most people are currently worried about and talking about. This tool creates content based on prompts (e.g., asking ChatGPT to draft a paragraph). It should not be confused with artificial general intelligence (AGI, sometimes associated with projects such as Q*), which would fuse robotics with thinking machines able to handle unfamiliar situations. Generative AI is often treated as an unknown entity in education, the dystopian future that will change how we teach. We challenge this notion, as we see AI as a tool, like any other digital technology, that has its place in education when understood and used when and where needed. In a 2023 survey, many students acknowledged that they use generative AI (52%) and want courses on how to use it (72%; KPMG, 2023). In the same survey, most students reported using GenAI to generate ideas and to research.
Step 2: Curricular Goals
Overview
The second step involves articulating pedagogical goals and determining where AI technology might support or undermine these objectives. Curriculum goals must inform AI policy if its use in education is to be meaningful. The process begins by asking whether specific applications (reactive, predictive or generative) offer a net benefit or pose a threat.
Description
A classification of potential benefits and threats is a helpful starting point; we see GenAI aligning mainly with the benefits below, while predictive AI poses more of a challenge. For example:
- Teacher:
  - Benefits: content creation and differentiation;
  - Threats: assessment and skills loss.
- Student:
  - Benefits: comprehension and formative assessment opportunities;
  - Threats: privacy and misinformation issues.
When choosing an AI tool, teachers must be mindful of privacy concerns, as the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA, 1990) protects a student’s personally identifiable information (PII). Teachers need to work with their school districts on the ethical use of AI to protect student PII. Students must understand the privacy paradox of voluntarily giving up their PII to use a digital tool and how that might impact them (Cavoukian, 2019).
While AI can and should push teachers to change their pedagogical practices, assessment is the area where it poses the greatest challenge. Traditionally, the triangulation of assessment data has skewed towards products precisely because they were more scalable and objective, while conversations and observations were deemed time-consuming and subjective. The efficiency with which generative AI can create products demands a reconsideration of this cost-benefit analysis. Despite the difficulty of implementation, observations and conversations may soon be considered more reliable assessment measures, although they alone are not enough to negate the use of AI. Reactive AI may even enhance the use of conversation, as in the case of Otter.ai, which provides real-time transcription and summarization of conversations, freeing teachers to engage dialogically in learning.
Step 3: A Framework
Overview
The third step involves developing practical approaches to leveraging or minimizing the use of AI in instruction and assessment. The real challenge that AI has forced upon education is the need for clear, safe and pedagogically sound policy around AI technologies. However, only two public school boards in Ontario have AI policies (Danjoux et al., 2024). In the United States, a survey found that only Oregon and California had official policies, with eleven other states developing guidance for the fall (Dusseault & Lee, 2023). One framework of interest comes from North Carolina and is maintained as a living document (North Carolina Department of Public Instruction, 2024).
Description
Understanding that board and governmental policy is slow to develop and that the need for a useful AI classroom strategy is immediate, we propose that teachers adopt an AI assessment metric based on the work of Perkins et al. (2024), who propose treating the role of AI in assessment as a tiered approach. Using a stoplight metaphor, they ask teachers to consider where any given evaluation is best situated, from AI-free (red) to AI-empowered (blue). By asking teachers to move from curricular goals to assessment strategies, we consciously employ the backward design model proposed by Wiggins and McTighe (1998).
Resources
Below, we offer some resources that can help you challenge the dystopian future of AI-infused classrooms:
- Massachusetts Institute of Technology (MIT) Media Lab:
  - AI and Ethics curriculum,
  - “AI and Data Privacy” workshop;
- Code.org: AI for Oceans activity;
- The Chinese University of Hong Kong (CUHK): AI for the Future Project (AI4Future);
- IBM: Educator AI Classroom Kit;
- Google: Teachable Machine;
- Toronto District School Board (TDSB): AI in Education (curated book list);
- Edutopia: AI tools that help teachers work more efficiently;
- Common Sense Media: ChatGPT and Beyond: How to Handle AI in Schools.
References
Cavoukian, A. (2019, June 27). Dr. Ann Cavoukian: Privacy by design, security by design [Video]. YouTube. https://www.youtube.com/watch?v=xqreZlGL8Dk
Danjoux, I., Leatham, H., & Coulson, E. (2024). AI in education: A discourse analysis of digital use policies in Ontario public school boards. In J. Cohen & G. Solano (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 2006-2010). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/p/224250
Dusseault, B., & Lee, J. (2023, October). AI is already disrupting education, but only 13 states are offering guidance for schools. Center on Reinventing Public Education. Retrieved May 2024, from https://crpe.org/ai-disrupt-ed-13-states/
KPMG. (2023, August 30). While popular with Canadian students, six in 10 consider generative AI tools cheating. Cision. Retrieved May 2024, from https://www.newswire.ca/news-releases/while-popular-with-canadian-students-six-in-10-consider-generative-ai-tools-cheating-821196002.html
Leaton Gray, S. (2020). Artificial intelligence in schools: Towards a democratic future. London Review of Education, 18(2), 163–177. https://doi.org/10.14324/LRE.18.2.02
Municipal Freedom of Information and Protection of Privacy Act, R.S.O. 1990, c. M.56. https://www.ontario.ca/laws/statute/90m56
North Carolina Department of Public Instruction. (2024, January 16). Generative AI implementation recommendations and considerations for PK-13 public schools. Retrieved May 2024, from https://go.ncdpi.gov/AI_Guidelines
Perkins, M., Furze, L., Roe, J., & MacVaugh, J. (2024). The Artificial Intelligence Assessment Scale (AIAS): A framework for ethical integration of generative AI in educational assessment. Journal of University Teaching and Learning Practice, 21(6), 1-18. https://doi.org/10.53761/q3azde36
Wiggins, G., & McTighe, J. (1998). Understanding by design. Association for Supervision and Curriculum Development.