New Brunswick

Combatting use of AI by New Brunswick university students — or not

While AI can help students with things like brainstorming and grammar checking, concerns remain about its impact on critical thinking and academic integrity.

Many professors resigned to a future with artificial intelligence

The words AI and Artificial Intelligence are seen in this illustration, with a metal female bust and computer keyboard.
Universities have always struggled with cheating, but artificial intelligence has taken that battle to another level. (Dado Ruvic/Reuters)

Like it or not, artificial intelligence is here to stay, and many universities are trying to figure out how to live with it — and even take advantage of it.

"It's not going away," said Toni Roberts, director of the Purdy Crawford Teaching Centre at Mount Allison University in Sackville. 

"So if it's not going away, how do we handle it?"

The key, he said, is to use it "to our advantage" rather than letting students use it to avoid learning and putting in the work. 

Roberts, who teaches in the sociology department, has looked at the research into the use of AI by students. He said more than 80 per cent of students use generative AI. 

Yet, "only about 30 per cent of assignments, papers, what have you, are being identified by faculty as having used generative AI," he said. So a lot of them are slipping past professors. 

WATCH | See how N.B. universities address AI use:

Universities grapple with AI, seek ways to adapt and benefit

Most New Brunswick universities allow individual professors to determine how much AI is allowed in their classes — some embrace its use, while others shun it completely.

One of the fears, said Jennifer Tomes, the dean of science and graduate studies at Mount Allison University, is that AI interferes with the development of critical thinking skills.

"The reason that a professor creates an assignment or a particular piece of writing or whatever it may be is because we want the students to go through the process of doing that, to develop the skills to do that particular kind of thing," she said.

Tomes agrees there are ways for students to effectively use AI. She said some employers even expect students to learn how to use it "appropriately and ethically."  

Roberts said professors have the "academic freedom" to decide how — or whether — to use AI. 

A man holding a small brown and white dog squats down in front of a stately brick building.
Toni Roberts, director of the Purdy Crawford Teaching Centre at Mount Allison University, says he works with his students to 'co-create' a policy on how to use AI. (Submitted by Toni Roberts)

Some have set parameters around the use of AI, while others have banned it completely, said Tomes, the university's academic integrity officer. 

While Mount Allison doesn't have a formal policy about AI, it is working on a set of guidelines or best practices, and it does have an academic integrity policy that says students aren't allowed to cheat or misrepresent information.

So if a student is caught using AI in a class that bans it, they go before Tomes and an academic integrity committee.

Potentially skewing grade levels

If professors do lean into it, there's an expectation that students will do better work, said St. Thomas University associate professor Andrew Klein.

He said if AI can reliably perform at a C level, then a C level is no longer a passing grade. 

"If the grade scale is shifting because we're no longer expecting C level work, that does mean that we're expecting all students to be able to produce work beyond what they could before. So the work will get better. I mean, it's possible that it will make us all smarter." 

Like a lot of professors, Tomes has opted for the middle ground. She said she's clear with students about how they can — and cannot — use AI. 

In his syllabus, Klein tells students that while using such technology may be "cool and part of our world now," using it without his specific say-so isn't allowed and will be considered plagiarism.

One of the goals in his English class is "to guide you through the learning process of becoming a stronger reader and writer," he said. "This requires consistent practice, like most things worth learning, and using generative AI short-circuits that process and shortchanges you on your investment in your education."

Man smiles at the camera, standing in front of a book shelf full of books.
Andrew Klein, an associate professor in the English department at St. Thomas University, says AI cannot be used in his classes unless he specifically says it's OK. (Submitted by Andrew Klein)

Klein said he's pretty confident that he can detect what a "ChatGPT 3.5 level paper looks like. It's very easy to see the telltale signs once you've seen a few." 

"On the other hand, I'm also very confident that I've been fooled many times by the higher level ones," said Klein.

Tomes said some professors use software to detect AI. 

"But we are also aware that those are notoriously bad as well, that many of the programs that are designed to check for artificial intelligence don't produce reliable results."

She said they also tend to flag students whose first language isn't English more frequently. 

Tomes said she often tries to create "AI-proof assignments" by asking students to "find things that are a little bit more obscure." 

A short-haired woman with glasses smiles at the camera.
Jennifer Tomes, the dean of science and graduate studies at Mount Allison University, is also the university’s academic integrity officer. (Submitted by Jennifer Tomes)

Roberts said he works with his students to create their own policy. It's a conversation they have during the very first class. 

He said students "most of the time" decide "we cannot and should not use generative AI to do our work, but that we can use it for things like idea generation, brainstorming, checking grammar, coming up with ideas for the assignment they're working on, that sort of thing."

So students agree that it can't be used to write an entire paper, but it can be used to "scaffold" or support their work, said Roberts. 

No one was available from the University of New Brunswick for an interview, but in an emailed statement, the university said it allows professors to decide whether to allow generative AI tools in their courses.

"This flexibility allows them to better support diverse student learners and enhance the overall learning experience," said Petra Hauf, UNB's provost and vice-president academic.

"UNB maintains a strict stance on academic offences, including plagiarism and cheating and our Academic Offences policy includes allegations involving AI under its definition of plagiarism."

ABOUT THE AUTHOR

Mia Urquhart is a journalist with CBC New Brunswick, based in Saint John. She can be reached at mia.urquhart@cbc.ca.

With files from Raechel Huizinga