
Schools may be out for the summer, but educators are still reckoning with the chaos that large language models (LLMs) such as ChatGPT introduced to the classroom this year.

“I think it’s really important that we as educators help students learn as much as possible about AI, its capabilities, limitations and its ethical considerations,” said Myke Healy, who chairs the AI task force at Trinity College, a private high school in southern Ontario. 

“Many of us are discussing the topic, at length, every day,” he said.

And yet few have been able to nail down an exact plan for when generative AI can be used, how it should be cited and whether it should be used at all.

Educator guidelines on permissibility of generative AI vary

At the University of Toronto, the decision regarding AI use in coursework rests with the individual professor. The university has released a syllabus guideline with sample statements for professors to use when telling their students where AI use is acceptable.

The restrictions are “similar to indicating to students when they may collaborate, and to what degree, with their classmates, and when an assignment should be solely their own work,” the guideline notes.

McMaster University has likewise released a set of guidelines for generative AI use but acknowledges they are merely a “starting point.” The guidelines provide that “unless otherwise stated, students should assume use of generative AI is prohibited.” 

McMaster says these guidelines “will be regularly reviewed and revised with the aim of updating them before winter course outlines are due.” 

ChatGPT responses are like conversations, making sourcing difficult

There is also the question of how ChatGPT should be cited.

“Educators are looking for clarity on both how and who to cite when using content generated from a large language model,” said Healy.

The University of Toronto recommends its professors instruct students to cite any work informed by generative AI using the tentative guidelines issued by the Modern Language Association (MLA).

The MLA, an international group that studies language and literature, has an established style for citing sources. It recommends that a citation include details such as the prompt itself, the name of the program, the version of the program and the company that makes it. 

For example, if a student asks ChatGPT, “Describe the symbolism of the green light in the book The Great Gatsby,” the MLA recommends the response be cited as:

“Describe the symbolism of the green light in the book The Great Gatsby by F. Scott Fitzgerald” prompt. ChatGPT, 13 Feb. version, OpenAI, 8 Mar. 2023, chat.openai.com/chat.

The American Psychological Association, whose citation style is generally used for publications in the sciences, psychology and education, is likewise collecting feedback from the public and has put out its own guidelines.

One of the challenges, writes Timothy McAdoo for the association on its blog, is that LLM results are like a conversation with a source that is not “retrievable” by another person. 

The association therefore recommends crediting the author of the algorithm, which in this case is OpenAI, the company behind ChatGPT.

For example, the response to the Gatsby query would be cited in text as “(OpenAI, 2023)” with an end reference “OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]”.

While including the full text response in an appendix is optional, the APA notes that it is “particularly important to document the exact text created because ChatGPT will generate a unique response in each chat session, even if given the same prompt.”

The long-term challenge will be motivating students to learn

While educators are grappling with near-term concerns, generative AI also raises longer-term questions.

“The challenge for teachers in the coming years will be convincing students that the effort is worth it,” said Healy. “Because if you are a Grade 9 or 10 student and the generative AI is producing writing better than you could, then the work that is required to learn how to [write] that well or better takes a lot of effort.”

“So how do you as an educator position the value proposition such that it’s worth it to learn the rules of grammar and learn structure and be able to formulate an argument yourself?”

That may be a question even ChatGPT can’t answer. 

Fin de Pencier is a journalist, photographer and filmmaker based in Toronto. Over the past few years, he has reported on the ground from Ukraine, Armenia, Lebanon and Kazakhstan for outlets such as CTV...
