Jan. 12, 2024
ChatGPT was released on Nov. 30, 2022. In just over a year, the generative artificial intelligence (gen AI) application and a host of others have sparked conversations about their implications for teaching, learning, research and work.
“There’s a panic surrounding AI right now and I think the antidote is actually using the technology to understand what it can and can’t do,” says Laurier History Professor Mark Humphries.
Humphries has been actively using ChatGPT since its launch in the fall of 2022, exploring novel ways it can be incorporated into research and teaching. He has even launched a blog on the subject as a resource for academics, with articles on topics including the challenge of regulating gen AI use in research and how to introduce ChatGPT in university classrooms. These are questions universities around the world are navigating in a rapidly changing technological landscape.
At Laurier, the university launched a Generative AI Committee during the summer of 2023 to examine gen AI’s implications for the university, from teaching and learning to research and administrative work. The committee and subcommittees are meeting on a regular basis and recently developed and released an institutional statement on the use of gen AI and principles of use for faculty and staff.
Humphries sits on the committee as the representative from the Faculty of Arts. He is excited about the possibilities of gen AI, comparing it to the introduction of the smartphone.
“When the iPhone first came out, it didn’t change the world overnight, but within 10 years the ubiquitous nature of smartphones changed how we do everything, from navigating a vehicle to booking a restaurant table. I think this is going to be similar – but more profound.”
At the same time, Humphries sees risks if the technology is not used properly.
“Gen AI is not always going to do what you ask it to do in the ways you planned,” he says. “If you don’t identify your problem properly, it may appear to complete a task but make mistakes, or just go off in the wrong direction.”
Given gen AI’s disruptive potential and the risks the technology presents, Humphries and a growing number of university professors believe it is critical that students be introduced to the technology and taught to use it ethically and responsibly.
“A lot of students have shared their perception with me that, in the future, they are just going to use generative AI to do their work, whether at school or in the workforce,” says Humphries. “I usually respond by saying, ‘But if you’re hired for a job and all you have to do is ask ChatGPT to complete your work for you while you do something else, that job is not going to exist for very long.’”
Instead, Humphries challenges students to think about how they would use AI to augment their work. Arts students, for example, typically do not have coding skills, but with a little training they could leverage gen AI to do data processing and analysis, something they could not have done previously without a background in statistics or computer science.
“Together, we can figure out how to use this technology to do our work better and more efficiently,” he says.
It’s a sentiment shared by Brandon Mattalo, assistant professor of strategic management in the Lazaridis School of Business and Economics.
“In my classes, I talk with students about how they can incorporate gen AI in their work,” says Mattalo.
One example is designing an assignment in which students work with gen AI to produce a product, then submit their conversation and prompts with ChatGPT as part of the assessment.
“Gen AI is going to require all of us to use higher-order thinking,” says Mattalo. “Students will have to use AI to implement their ideas. It will become increasingly important to learn how to strategize, rather than just complete tasks.”
The other area where Mattalo sees great promise for gen AI is as an individualized tutor. He jokes about a recent teaching experience, where he was explaining a concept in employment law and referenced the movie Office Space.
“No student got the reference – they weren’t born when the movie came out in 1999,” he says.
In this instance, a student could ask ChatGPT for a more generationally or culturally relevant metaphor to explain the concept.
“You can do this in multiple ways until you find an explanation that works for you,” says Mattalo.
Over a year into the broad adoption of gen AI, the future is uncertain.
“It’s a bit scary to me, in the sense of how quickly this is evolving,” says Humphries. “AI is already doing things now that I thought would be years away last winter.”
In the face of this uncertainty, Heidi Northwood, provost and vice-president: academic and chair of Laurier’s Generative AI Committee, says it is critical that faculty, staff and students across the university engage in open conversations about the technology. Northwood says Laurier’s recently released institutional statement on the use of gen AI and principles of use for faculty and staff will help guide those conversations.
“We need to foster a culture of transparency and a shared commitment to learning as we explore gen AI’s implications for teaching, learning, research and privacy,” says Northwood. “I am looking forward to engaging in this important work with colleagues.”