Professors and lecturers at Toronto Metropolitan University (TMU) are updating their syllabuses in response to the recent development of artificial intelligence program ChatGPT.
ChatGPT is a conversation bot launched in late 2022 by San Francisco-based startup OpenAI. It provides human-like answers to prompts and questions generated by users through its online platform. The program is currently in its research preview phase and is free for anyone to use.
In an email sent to instructors last month, journalism chair Ravindra Mohabeer said the program-wide academic integrity policy language in class syllabuses had been updated to specifically address “fast-moving developments in AI, most recently the use of ChatGPT and implications for the post-secondary sector.”
Professors and lecturers in other departments are also changing their course curricula to integrate AI tech into their lessons and assignments.
Robert Dowler, a contract lecturer at TMU’s School of Urban and Regional Planning, teaches a fourth-year capstone course. He changed his syllabus for the first time in six years to focus on higher-order skills such as critical thinking, evaluation, analysis and the “creative spark,” which he says an AI bot can’t match.
“The marking schemes for the assignments give more points for those higher-order human skills,” Dowler said, “recognizing that in the future as planners enter the world of AI, they may be able to rely on new tools for some of the more mechanistic aspects of a job.”
He also said his students are able to use ChatGPT to help them create a literature review or the first draft of their essay, but they must thoroughly review and edit the bot’s work to ensure accuracy.
Assistant production professor Michael Bergmann said he encourages his students to use AI tech as a source of inspiration. For example, if they are putting on a performance, they can use ChatGPT to create a script for them, but as creatives, they shouldn’t use the bot-generated text verbatim.
“It’s a starting point, it shouldn’t be an end point,” Bergmann said.
Due to ChatGPT’s ability to provide synopses of information in a conversational and easy-to-understand manner, Chris MacDonald, an associate professor of law and business at TMU, said he has been trying to determine a role for the AI service in his classroom, including having students use the chatbot as a tutor.
“Saying we want to embrace the technology doesn’t mean we stop worrying about it. But it may mean we find a role for it,” he said.
Concerns around ethics and academic integrity
That said, students could, in theory, ask the bot to do their homework for them. TMU faculty and staff interviewed by On The Record said they are concerned about the potential for widespread academic misconduct, should students decide to use the bot to complete their assignments.
“In university, the challenge is that we’re fundamentally here to evaluate your skills,” MacDonald said. “Anything that makes it harder for us to evaluate you is a problem.”
TMU’s Policy 60 deals with academic integrity and misconduct. While it doesn’t include specific rules around using AI to complete assignments, it defines academic misconduct as “any behaviour that a student knew, or reasonably ought to have known, could gain them or others unearned academic advantage.”
Allyson Miller, an academic integrity specialist at TMU’s Academic Integrity Office (AIO), said using AI tech for assignments when it isn’t permitted by the instructor can be considered cheating and plagiarism.
“If you had a text generator like ChatGPT to write an assignment for you, and somebody submitted it without citation and acknowledging that contribution, then that’s technically plagiarism,” Miller said.
To combat issues of academic misconduct, Bergmann — who previously worked in the AIO — said, “The point shouldn’t be about how to make a better system to detect plagiarism,” but instructors should instead change their assignments to acknowledge AI use, as well as educate students on the ethics of using platforms like ChatGPT.
MacDonald agreed, saying that professors can alter the requirements of assignments to be more complex. For example, they can require students to include citations or references to course materials that ChatGPT doesn’t have access to in order to ensure students do the work themselves.
How can you tell the difference?
As AI tech continues to develop and become smarter, MacDonald said it can be harder to tell whether something was written by a student or a bot. However, he and Miller agreed that ChatGPT has a formulaic writing style that is noticeably different from students’ work.
“Even an A-plus student doesn’t have the same kind of writing style and pattern that ChatGPT does,” MacDonald said. “Writing is a very distinctive thing. We all know that our favourite authors have a very particular rhythm, a very particular style.”
“ChatGPT has one as well.”
What’s next for AI tech?
Even though it’s not a person, the ChatGPT bot has passed exams at the University of Minnesota Law School and the University of Pennsylvania’s Wharton School of Business. The bot’s results were equivalent to those of a C+ student on the law exams and it scored between a B- and a B on the business exam.
Leading tech companies have also joined the interactive AI race. In January, Microsoft announced it was investing $10 billion into OpenAI and has since launched an AI-powered Bing search engine to allow users to chat with a bot to get more detailed answers to their queries. On Monday, Google launched Bard in response to ChatGPT — it also works as part of the search engine to provide users with a short text summary of their results, rather than simply listing links.
When it comes to AI tech use at TMU, Miller said the AIO is not discouraging instructors.
“What we’re recommending the profs do is just be super explicit with students about what the expectations and rules are for their course,” she said.
Contributing editor, On The Record, Winter 2023