
AI Resources for Instructors

Looking to explore how artificial intelligence can support your teaching? We’ve curated resources to help you get started. Access upcoming TLC workshops on AI in education, review our Quick Guide to AI for practical tips, and enroll in our Introductory Gen AI Literacy Course on Canvas to build your foundational knowledge.


AI Workshops

For a complete list of workshops by the TLC, visit us at https://tlc.ontariotechu.ca/instructional-support/programs-support/workshops/index.php 

Back to B(AI)sics

Date: Monday, September 29, 10:00 am to 11:30 am
Facilitators: Kevin Johnson and Miranda Varricchio

Don’t let generative artificial intelligence (GAI) be the elephant in the room. Let’s explore together the good, the bad and the ugly of generative artificial intelligence in the educational context. Learn the basics of generative artificial intelligence and why critical literacy is important as we move forward. This workshop is being re-offered as part of a series exploring various topics related to GAI.

Learning Outcomes

  • Discuss the benefits and challenges of Generative Artificial Intelligence (GAI) in teaching and learning.
  • Explain the importance of critical literacy in the context of GAI.
  • Identify indicators that GAI has been used in the creation of content.

♦ CUT-eligible: Educational Technologies


Generative Artificial Intelligence in Teaching - A Quick Guide

Generative artificial intelligence (generative AI) is changing how we teach and how we learn. What we want our students to learn – the core knowledge, skills and values of our disciplines – guides how we craft our curriculum and shapes our pedagogical approaches. As educators we have long adapted what and how students learn to changing technology, changes in our disciplinary knowledge, and changes to the context of the University. We care about our students and about what and how they learn.

The capabilities of generative AI to produce coherent, logical and reflective text – as well as images, code, audio and video – invite new, and sudden, change to teaching and learning around the world. How we respond to this change – if we respond – is a personal question, and an institutional one.

While many institutions and organizations are offering guidebooks, webinars and resources for adapting teaching methods and materials to address this rapid shift, the truth is we simply don’t yet know the scale of change required. Will we need to adapt a single assessment? Will we need to rethink the core learning outcomes for a program? Will we need to reconsider the purpose of a post-secondary degree?

To say that any one guide – like this one – can prepare you to teach amid the changes generative AI has brought, and will bring, is foolish. We write this guidebook knowing some of its content will be obsolete in months. We wanted examples – so many examples – that we simply do not yet have to offer. We wanted to provide clear, simple and actionable advice for how to adjust your courses and your teaching methods, but ran up against the reality of idiosyncratic courses with unique assessments, each requiring slightly different guidance.

We offer this guide recognizing its limits. It aims to ground you in what generative AI is and what it might mean for student learning and for your teaching here. It explores some of the ethical questions you may already be grappling with and invites you to share those we haven’t yet considered. It offers specific advice for redesigning assessments and for how you might explore the use of generative AI in your teaching. It tries wherever possible to be clear about what we don’t yet know, but are trying to answer.

While generative AI is not new, OpenAI’s launch of ChatGPT in November 2022 marked the fastest recorded adoption of a technology tool to date. Over the following months, the release of similar text-based generative AI tools – from Microsoft’s Bing to Google’s Bard – along with rapid improvements to existing tools, contributed to a perception of an explosion of AI.

Indeed, the rapid proliferation of tools and advancements in technology saw over 100 leaders in AI technology write an open letter urging a collective pause on AI developments more powerful than GPT-4 to give time for security and safety features to develop and for the creation of regulation and governance structures.

The need for such regulation or governance extends to full nations, but also to specific sectors, such as post-secondary education. Broader issues related to generative AI include privacy of personal data, risks of misinformation, existential risks, concerns about job dislocation or loss, environmental costs, labour exploitation, and copyright.

While the innovation and creativity of generative AI are exciting, these systems do not come without limitations or ethical challenges. Some of these challenges speak to the specifics of our post-secondary context – like academic integrity – while others intersect with communities, the environment, and humanity as a whole. Many AI experts have documented alarming concerns, including the size and scale of large language models, misinformation, AI misalignment, and existential risks to humanity.

“Academic Integrity” refers to honest and ethical behaviours in the pursuit of research, education, and scholarly activities. The University promotes a culture based on the fundamental values of Academic Integrity that is sustained by a balance between:
  • Education about the values and behaviours consistent with Academic Integrity; and,
  • The disciplinary measures necessary for those who violate its fundamental values through breaches of Academic Integrity.

Students’ self-reported reasons for academic misconduct include performance pressure, high-stakes exams, overwhelming workload, being unprepared, feeling ‘anonymous’, increased opportunities to cheat enabled by technology, peer acceptance of cheating, misunderstanding plagiarism, and a belief that it will go unpunished. Rather than positioning the educator as one who detects and surveils, research suggests the instructor’s role be one of designing authentic, scaffolded assessments and of explaining and exploring academic integrity with students.

Providing scalable, supported and realistic assessment redesign will be one of the ongoing areas of need for educators as generative AI is integrated into more tools and more courses. 


Questions around detecting AI-generated writing fall into three categories:

  • the technological – is it possible to reliably detect AI-generated writing?
  • the philosophical – is the role of the educator one of trust or one of surveillance? and
  • the existential – what is the value of a university degree if the academic labour behind it is uncertain?

There are not yet reliable detection tools. Those that are available – GPTZero, Turnitin, Originality.ai, etc. – have been found to misidentify original student content as AI-generated, with some findings demonstrating that “these detectors consistently misclassify non-native English writing samples as AI-generated, whereas native writing samples are accurately identified.”[1]

Moreover, students have not consented to having their work submitted to these tools, with open questions related to data privacy and security.[2]

While technology and a perceived ‘arms race’ between detection and AI tools pose their own challenges[3], there are also questions about the role of educators and their assumptions about students as learners. With significant evidence pointing to a rise in student academic misconduct, particularly over the pandemic, there are arguments that “we must prioritize student learning above catching cheaters”[4] and that understanding why students engage in academic misconduct may point to approaches to reduce these behaviours. Indeed, instances of academic dishonesty and opportunities to cheat predate generative AI; what the tools introduce is “ease and scope”[5] that amplifies an existing challenge.

References

    1. Liang, 2023 
    2. Mortati, 2021 
    3. Ibid. 
    4. Eaton, 2022 
    5. Supiano, 2023 

If you do use generative AI in your teaching materials or assessment practices, share this with your students both in the course outline and in class. Sharing your use of generative AI with your students is intended to build trust and transparency, and to acknowledge that you are also using – and learning about – generative AI. 

It is important to check the accuracy of any AI-created content. These tools “hallucinate” – that is, they produce factually incorrect responses – so verify any content you might use in class, and any feedback offered to a student, before relying on it.

With that said, here are some broad categories where generative AI may be useful to you as an instructor:

Using AI as a Student Tool

If you choose to allow students to use AI in a variety of formats throughout your course, consider using the AI in Assessment Scale to provide consistent, clear guidelines on how you wish AI to be used.

Generating Test Questions and Assignments

By prompting a generative AI tool with the specific context of your course, as well as the subject you are aiming to assess and the kind of question or assignment you are interested in, the generative AI tool can offer many – many – examples of test questions at different levels of complexity, or different types of assignments. You can even ask for assignment ideas that meet the criteria of authentic assessment discussed in the chapter on assessment, or for assignment ideas that incorporate pedagogical approaches you value (e.g. problem based learning, community engaged learning or case based learning).
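For instance, the prompt-building approach described above can be sketched in a few lines of code. This is an illustrative sketch only: the course, topic, and wording below are hypothetical, and the resulting prompt could be pasted into any chat-based generative AI tool.

```python
# Sketch: assemble a question-generation prompt from course context.
# The course name, topic, and phrasing are hypothetical examples.

def build_question_prompt(course: str, topic: str, n: int, level: str) -> str:
    """Combine course context, the subject to assess, and the kind of
    question wanted into a single prompt, as described above."""
    return (
        f"You are helping design assessments for {course}. "
        f"Write {n} {level}-level test questions on {topic}, "
        "and include a brief answer key for each question."
    )

prompt = build_question_prompt(
    course="a second-year introductory statistics course",
    topic="confidence intervals",
    n=5,
    level="application",
)
print(prompt)
```

The same structure extends naturally to requests for assignment ideas – swap the final sentence for a description of the pedagogical approach you value (problem-based, community-engaged, or case-based learning).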

Generating Examples, Explanations, and Counter Positions

Students benefit from practicing what they are learning with examples. Many, many examples. Generative AI can be powerful in producing lots of examples for students to practice with, while also providing students with feedback on whether their submitted responses are correct, or how they might improve on a response. This personalized, immediate feedback is incredibly powerful for learning.

It can be challenging sometimes to describe a concept at many different levels of complexity. Some courses – especially those with no prerequisites – may include students with a wide range of experience and abilities. Using generative AI tools, you can quickly develop (and then check for accuracy) multiple explanations of a course concept. You could even have these explanations written in unique and memorable ways – for example, “explain the carbon cycle in a limerick” or “describe the Canadian political parties as characters on The Simpsons.”

Generative AI tools like ChatGPT can take on different personas through prompting – for instance, you could ask the tool to “pretend you are a heart surgeon” or “act like you are the Prime Minister”. When assigned a persona, the generative AI tool will produce text written as if from that position. This kind of role-play can be useful for inviting unique perspectives into a class discussion, or for providing a provocative counterpoint.
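If you work with a chat-style tool programmatically rather than through its web interface, persona assignment is typically expressed as a role-tagged message list. The sketch below follows that common convention; the persona and question are hypothetical examples, and the exact message format accepted by any given tool should be checked against its own documentation.

```python
# Sketch of persona prompting using the role-tagged message format
# that many chat-style generative AI APIs accept. The persona and
# question below are hypothetical examples.

def persona_messages(persona: str, question: str) -> list[dict]:
    """Build a short conversation asking the tool to answer in character."""
    return [
        # The system (instruction) message assigns the persona.
        {
            "role": "system",
            "content": f"Pretend you are {persona}. "
                       "Answer every question from that position.",
        },
        # The user message carries the actual class-discussion prompt.
        {"role": "user", "content": question},
    ]

messages = persona_messages(
    persona="a heart surgeon",
    question="What should first-year students know about cardiac anatomy?",
)
```

Separating the persona (system message) from the question (user message) lets you reuse one persona across an entire class discussion by appending further user messages.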

Gathering Ideas for Activities and Assessments

Confronted with the challenge of generative AI you may be looking for new ways to teach a concept or skill, or new ways to assess a learning outcome. Generative AI can provide customized suggestions for interactive and engaging classroom activities (e.g. suggest six different interactive ways I could teach an auto-ethnographic research method to a third year, online class of 60 students in Sociology), as well as assessments that either incorporate generative AI or make generative AI less likely to be used.

Assessment has become a thorn in the side of many educators over the past three years. First, the rapid shift to remote teaching during the pandemic forced many educators to adopt assessment approaches that they may not have been comfortable with or that they recognized were not ideal for student learning. Then – just as many of us were returning to the more familiar assessment circumstances of in-person classes – OpenAI released ChatGPT. Any assessment with a non-invigilated written component, including the writing of computer code, now raises questions about whether – and to what extent – students are making use of generative AI.

At the same time, the events of the past three years have highlighted existing troubles with our assessment practices and prompted us to reflect on the purpose of assessment in teaching. The language of care in teaching that became more prevalent during the pandemic helped to reframe the conversation about academic integrity into a deeper consideration of why students cheat. One culprit is poorly designed assessments, which may:

  • only require students to recall what they have already learned,
  • are mismatched with what students expect to do and learn in the course,
  • unfairly disadvantage some students and not others, and/or
  • have unnecessarily high stakes.

Assessment (re)design thus offers educators the opportunity to have a meaningful impact on issues of academic integrity.

We recognize that you may not currently be able to significantly redesign your course assessments, which takes time and effort to do thoughtfully. Trying anything new in the classroom also carries a degree of risk, particularly for educators who are already in precarious roles like sessional instructors, contractually limited appointments and pre-tenure faculty. Even if you do have the capacity to redesign your assessments, we suggest starting small: addressing the assessment that concerns you the most or will have the greatest impact, and then building on your experience. You can do this by permitting some level of AI use in your assessments. We have created the AI in Assessment Scale to support you through this process and to ensure your requirements for AI use are clear and easy to understand.

Assessment design principles that may help you in the age of AI include:

What benefits, if any, does generative AI pose for student learning? What learning outcomes could its use support or enhance? This chapter assumes your familiarity with the risks and challenges of generative AI for post-secondary education (e.g. academic integrity, assessment design, hallucinations) and imagines what benefits there might be, and what opportunities exist for preparing students for a generative AI-supported learning experience.

You can think of the possibilities in two domains:
  • supporting personalized learning and
  • generating academic content.

Generative AI has many capabilities in supporting personalized learning, some of which we detail below. Chief among them is providing actionable, timely and relevant feedback on drafted student content. This feedback might be focused on the grammar or style of the draft, or on the logic of the argument, organization of the piece, or further examples to consider.

With respect to generating academic content or performing academic skills, you want to think carefully about what the core learning outcomes are for the course, and whether and how students can demonstrate these outcomes. Those skills or knowledge that are not essential to the core learning outcomes might be appropriate for ‘cognitive offloading’ to a generative AI tool. Cognitive offloading refers to the use of external resources or tools to change the information processing requirements of a task so as to reduce cognitive demand.[1]

For instance, if your course learning outcomes require students to demonstrate the ability to generate multiple hypotheses to explain a phenomenon, using generative AI to generate these hypotheses would be inappropriate. However, if your course learning outcomes were focused on having students test a hypothesis in a laboratory setting, having a generative AI tool generate the hypothesis which the student would then test would be an example of appropriate cognitive offloading.

When you consider whether you want to invite students to use generative AI in an assessment, familiarize yourself with the AI in Assessment Scale to ensure your expectations are clear. With any of the uses described in the AI in Assessment Scale, it’s important to remind students that what the generative AI tool generates may contain hallucinations or biases. Students should be reminded to review and evaluate the output from the generative AI tool to ensure its accuracy and evaluate its effectiveness.

References

  1. Risko and Gilbert, 2016. 

Content is adapted from Generative Artificial Intelligence in Teaching and Learning at McMaster University Copyright © 2023 by Paul R MacPherson Institute for Leadership, Innovation and Excellence in Teaching is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.