"

2.6 Misinformation

Misinformation & Deception

Generative AI can produce fake, inaccurate, or misleading information either unintentionally or deliberately. It can be used to generate fake news stories or fabricated datasets, or otherwise be employed in attempts to deceive.

One example of this is the use of text-to-image and text-to-video Generative AI tools to produce visual media for the purposes of deception, malicious or not. A deepfake is a believable but fake video, audio clip, or image produced by Generative AI, often featuring real people saying or doing something they never actually said or did. Deepfakes have potential benefits for the arts, social advocacy, education, and other purposes, but they present ethical issues because permission to use a person's likeness has often not been obtained and because they have the potential to spread misinformation or mislead people.

Ethical Case Study: Misinformation & Deception

One of your course assignments asks students to produce a piece of speculative fiction reflecting on the future if immediate action isn't taken in response to climate change. One student creates a video of a news report showing the world in crisis. The video includes deepfakes of several world leaders justifying their lack of action over the past 10 years.

What ethical considerations are there around this use of AI?

Feedback

Deepfakes present several important ethical issues, particularly with regard to misrepresentation, intention to deceive, and politics and political agendas. In this case, the student wasn't necessarily attempting to deceive viewers, but if you allow or encourage AI use in your courses, it's important to help students understand the ethics of Generative AI and its potential harms.

License


AI Literacy for Higher Education Copyright © by ddilkes2 is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.