With artificial intelligence (AI) software becoming more sophisticated, Lakehead University officials are considering how it might be positively used in classrooms.
Faculty and staff met in February for a panel discussion on the rapidly emerging technology. They discussed AI’s potential as a learning tool, while also considering ways to limit academic misconduct as its use becomes more widespread.
It’s a conversation taking place at universities around the world, and Lakehead is trying to walk the middle road between preventing AI’s use in plagiarism and harnessing it as a tool to enrich its learning environment.
“It comes with its challenges, but this is really something that we are going to need to embrace,” said Rhonda Koster, deputy provost of teaching and learning. “(Plagiarism) becomes the fundamental concern for most folks, but at the same time, there is a real recognition that these new AI (technologies) provide a really interesting opportunity to engage it as a teaching tool.”
Koster said the university is working on strategies and guidelines to outline the acceptable use of AI in its classrooms and to prevent its nefarious uses — for example, using AI to generate essays students might pass off as their own.
“Our student code of conduct already would cover the unapproved use of AI,” Koster said, noting there are AI detection tools being developed to crack down on plagiarism. “There’s definitely going to be elements that we’re going to have to figure out how to put down so that … it’s very clear for students how it can or cannot be utilized.”
Although there is no timeline yet for implementing any policy or guidelines, further panel discussions are planned, including one in March aimed specifically at engaging students.
Despite the looming issue of plagiarism, Koster said some Lakehead professors are already finding positive ways to use AI in the classroom, and she highlighted several possibilities herself.
One possible application is having students critically analyze the work AI generates on a variety of topics.
“AI is pulling on open-access published materials, much of which can be outdated or even be erroneous thinking. Let’s look at that versus contemporary research, and where are you seeing how this isn’t meeting what current thought is? It might be racist or it might be outdated technology,” she said.
“I think that can be a very powerful tool and something that our educated students would be able to take forward as they move into whatever kinds of things they’re going to be doing in society.”
For computer science professor Vijay Mago, AI presents an opportunity to help students keep up with coursework.
If a student misses a class, for example, they could use ChatGPT to help generate explanations for work done in class.
“If that student needs some additional support, he can go on ChatGPT, use the slide deck (from class) and generate some sample codes,” Mago said. “ChatGPT is very good at generating explanations, but it is not good with generating code … so it could be an assistant.”
Because AI software isn’t perfect, a big part of preventing plagiarism, Mago said, could be understanding its weaknesses and ensuring his assignments lie beyond the scope of what it can do.
“The faculty members have to innovate; they have to keep up with ChatGPT. It is going to be an integral part of our learning tools now,” he said.
With a research background in large language models, in which ChatGPT is rooted, Mago has spent time testing the limitations of the software.
“On the surface level, it looks really cool, but if you go further into details and try to give some challenging pushes to ChatGPT, it does not do very well, especially when it comes to programming,” he said.
“People say that ChatGPT is going to replace software developers because it can generate the code very elegantly. It does, to some extent, but it cannot generate complex codes that require a lot of logic. It cannot generate a code that requires human intelligence.”
To date, Mago has not seen any of his students use AI for plagiarism, and he said the technology should be embraced in post-secondary learning.
“These AI technologies are not bad. It’s evolution and we should embrace it in a positive way,” he said. “People who have used ChatGPT in the right manner — I think they will be more competent as well.”
For now, AI is permitted in classrooms at the discretion of professors, Koster said, and she views it as an integral part of preparing Lakehead’s students for the world.
“We can’t ban it because then our students are going to be graduating, having never used it, and go into practice where it’s being used,” she said. “We need that critical lens to help students understand its appropriate applications and what it’s actually generating for us.”
Simcoe North MPP Jill Dunlop, the Minister of Colleges and Universities, provided a statement when asked to comment on this story, saying Ontario’s post-secondary institutions are responsible for determining the appropriate uses of AI themselves.
“Ontario’s colleges and universities are autonomous institutions and are responsible for developing policies and procedures to govern their institutions, including the use of technology in educational settings,” Dunlop said.
“That said, the government supports postsecondary institutions as they manage emerging technologies through initiatives like eCampusOntario, an online portal which increases access to technology-enabled learning, and Contact North | Nord, a free, local bilingual service that helps residents of rural and remote Ontario communities access online programs and courses from colleges, universities, and training providers without having to leave their communities,” she continued.
“As publicly assisted colleges, universities and Indigenous institutions continue to explore new and emerging technologies, eCampusOntario is available to assist in these activities.”
Lakehead’s next panel discussion on AI will be held March 23.