3.2 Emotional Intelligence
In this section, we will consider 4 key components of emotional intelligence:
- Self-Awareness: The ability to recognize your emotions and name them. Recognize the emotional response that you have to Generative AI and how this reaction may change over time in different scenarios or as your knowledge of AI changes.
- Self-Regulation: The ability to manage your emotions and prevent them from controlling your behaviour and response to situations. Recognize how your emotional response to Generative AI may impact your desire to use it in your teaching or other work and your response to others using it in their work.
- Empathy: The ability to recognize the emotions of others. Recognize that different people will have different emotional reactions to AI technologies and use, and that this will impact their adoption of or resistance to AI tools.
- Social Skills: The ability to communicate clearly in a way that acknowledges your own and others’ emotions.
The following section provides multiple case studies exploring different ways in which discussion and use of Generative AI may evoke emotional responses and require Emotional Intelligence. For each case study, you will be asked to reflect on the 4 key components of EI.
Emotional Case Study: Academic Misconduct
You are teaching a first-year writing course and are marking the first assignment. You notice striking similarities in the structure and phrasing of the submissions, an unusual lack of grammatical and semantic errors, and a few tell-tale words and terms (e.g. “It’s important to note that…”, “Both sides have their merits and challenges.”) that make you suspect AI was used to generate the submissions. After hours of marking, you come across an essay with the following text: “I don’t have personal experiences since I am an AI. However, I can tailor content to align with specific experiences or perspectives if you provide more details or context to guide the narrative.”
What are the affective considerations in this scenario?
Feedback
Your initial reaction may be frustration or anger, which may be compounded by the work you’ve already put into providing feedback on assignments. Before acting or responding, it might be a good idea to step away and consider what factors may have led to the misuse of Generative AI in this way. Are students confused about the assignment requirements? Do you have a clear policy around acceptable use of AI in your class?
Consider how you will respond to this individual student and the class as a whole around your concerns about Generative AI use.
Emotional Case Study: Using AI for Marking
You are teaching a third-year psychology course. Students are required to submit weekly reflections. You have developed a Generative AI tool to assess the reflections based on a rubric with detailed criteria. You are transparent about the use of this tool and students are aware that you are using Generative AI for marking. In your midterm evaluations, many students have provided negative feedback about the use of Generative AI for this purpose, with some comments suggesting that you aren’t doing your job as an instructor.
What are the affective considerations in this scenario?
Feedback
You may feel upset or unfairly judged by the student feedback in your midterm evaluations. First, consider your motivations for using Generative AI in this way. Does this use align with your values? (see the section on values [link]). Next, consider what emotions students might feel about this use of AI in your courses. How have you communicated your reasons for using AI in this way? Have you provided a way for them to otherwise voice their concerns?
Emotional Case Study: AI Tutors
Your department has adopted a new Generative AI tool called TutorAI to help support students who are struggling academically. The tool is designed to provide personalised support to learners by providing knowledge checking questions, assessing responses, and providing resources to help learners address knowledge gaps. All students are able to access the tool, but students who are identified as needing remedial support are required to use this tool.
What are the affective considerations in this scenario?
Feedback
Use of Generative AI tools in this way may introduce uncertainty or discomfort, particularly if you feel as though part of your job is being replaced by technology. Consider your professional identity as an educator – is providing personalised support an important part of your practice? If so, how can you integrate this tool into your practices in a way that aligns with your values? (see the section on values [link]). Also consider how students may feel about the use of this tool. Are there particular student populations who may experience unique challenges in using an AI support tool? How do you identify and respond to these needs?
Emotional Case Study: AI Refusal
You are teaching a graduate research skills seminar, supporting students through the research process. You have asked students to use Generative AI to support the literature review for their research proposal. You have provided clear guidelines on how to use it and how to document its use. One of the learning outcomes you’re hoping to achieve is the ability to critically evaluate Generative AI tools and learn how to use them to support knowledge production. One of your students tells you that they believe the use of Generative AI is completely unethical and refuses to use the tool for this assignment.
What are the affective considerations in this scenario?
Feedback
Choices around whether and how to use Generative AI technologies are very personal and tied into our individual ethics and values. You may feel conflicted or judged when another person’s use or views don’t align with your own. Part of emotional intelligence is considering different perspectives: what personal beliefs or values might lead to a student’s decision to not use Generative AI? How can you communicate your rationale for the use of this tool? What alternatives can you provide the learner to allow them to achieve or demonstrate the same learning?
If you do integrate Generative AI technologies in your teaching, learning activities, or assessments, you will also be introducing a need for learners to increase their emotional intelligence with regards to how these tools are being used. The following case provides an example of EI considerations from a student perspective.
Emotional Case Study: An AI Teammate
You are teaching a 4th year business course where students work on weekly business case studies in groups. The groups are established at the beginning of the semester and remain the same throughout the semester. This year, you’ve implemented a new requirement that all groups must create an AI team member. They will assign the AI team member a persona and a role on the team that complements the strengths of the human group members. Groups are required to engage with the AI team member on all case studies and document how and when the Generative AI tool is used. In the middle of the term, a student comes to you with significant concerns about the use of Generative AI in this way. They believe that their team is assigning too much authority to the contributions of the AI team member, constantly deferring to its recommendations and ignoring the ideas of the human members of the team. Because of this, it seems that most of the group members have disengaged from the group work.
What are the affective considerations in this scenario?
Feedback
Students may feel undervalued or less confident in their contributions to a group where the input from a Generative AI tool is being privileged. Consider how you could support the development of group dynamics, including with the AI team member, and how you can instil critical reflection in your students so they evaluate Generative AI outputs more carefully.