AI textbooks and chatbots are already changing the way students learn. Should they?
New AI tools will be part of the curriculum at over 100 Canadian universities and colleges this fall
Whenever Queen's University student Ava Ansari struggles with a multiple choice question on an assignment, an AI chat window immediately pops up on the side of her screen.
"It's almost as if I have my own personal teacher sitting with me," said Ansari of the artificial intelligence tool that's built into her digital textbook.
Ansari was involved with beta testing the AI chatbot last year as a student ambassador for textbook publisher Pearson Education. She's one of many testers for the AI tools that will become part of the curriculum at over 100 Canadian universities and colleges this fall.
"If I were to get a question wrong, instead of just giving me kind of a standard little piece of feedback, it really goes into depth and says, 'This is why, exactly, you got it wrong. This is how we can understand the topic differently.'"
This year, Pearson fully launched three generative AI tools, one for instructors and two that will be embedded into digital textbooks in subjects such as biology, chemistry, business and economics. Another textbook publisher, U.S.-based McGraw Hill, announced earlier this month that it's launching two AI tools for students. Three of its textbooks for business students will be used in Canada during the initial rollout.
Though there's enthusiasm about the possibilities AI offers in education — including its ability to simplify difficult-to-understand concepts and provide immediate feedback — there are also concerns about issues such as bias, misleading information and the lack of student-teacher interaction.
What are AI-powered textbooks?
In Pearson's case, the new AI tools pull from content vetted by the company, such as its digital textbooks, as well as from ChatGPT, a chatbot developed by OpenAI that is trained on data scraped from the internet.
Students can highlight sections or concepts they find confusing and ask the AI tool to generate a simplified explanation. It can also generate quizzes and questions based on what students highlight.
In addition to AI features within the textbooks, Pearson has also launched AI chatbots meant to provide immediate feedback for students as they study and complete assignments. If a student gives an incorrect answer, the chatbot can provide a breakdown of the mistake and guidance on how to fix it.
The publishers suggest that the AI tools being added to the digital textbooks are useful because students don't always reach out to professors when they need help, especially when they're studying late at night.
They also note that AI textbooks can help students when they get distracted after coming across a challenging concept. Once students work through what they've been stuck on, the publishers say, they can engage more deeply with the material.
AI hallucination concerns
But some educators have concerns about AI hallucination, which occurs when an AI produces false information based on patterns it perceives that don't actually exist, according to Joycelyn Kelly, an instructor in Ontario Tech University's artificial intelligence program.
"It's important for students and the general population to understand the ethical implications of using AI and how that can impact them and whether the results that come with AI use are true and factual," Kelly said.
"It's a critical thinking perspective that we really need to focus on for students in this 21st century."
The publishers say there are guardrails in place for the AI tools that will minimize hallucinations. And if they do occur, users can flag issues, including by reporting answers that don't make sense.
The publishers say they also analyze requests being made of the AI tools — without capturing any identifiable user information — to see how they handle those requests to ensure students are getting what they need.
Concerns about AI reflecting human bias
Nikkolas Trillo, a professor in the health, wellness and sciences department at Georgian College in Barrie, Ont., where his courses use McGraw Hill texts, says he will leave it up to students whether to use the new AI tools.
He's also a graduate student at Ontario Tech University whose master's thesis focuses on generative AI in higher education. He says the AI textbooks and tools would be helpful if they allow students to better understand their assignments and readings and provide immediate feedback.
"The teacher is obviously not going to respond to you at 2 a.m. if you're kind of cramming for something, but the chatbot might."
Trillo also notes that authors of textbooks aren't infallible and already have inherent biases. So even if AI tools are trained from the textbook's vetted content, he says, they might "reflect or amplify" the biases within that content.
He thinks the tools should be limited to hard sciences, because "that way, some of the information is less likely up to interpretation and subjective nature."
He says the tool may be a cost-effective alternative for students who can't afford tutors, but he's also concerned that AI chatbots could eventually replace human tutors and teaching assistants, leading to job losses.
The friction in human interaction
There's still one thing AI-powered textbooks can't replace — human interaction.
For her diploma in information and computer systems at Camosun College in Victoria, Purvi Dubey used AI assistance, but also consulted with her teaching assistant. She says she preferred the in-person experience because of the "emotional support that instructors provide."
"AI can't really be there for you and be like, 'Hey, you got this right.' It's something that only the instructors can do."
Christopher Snook, a lecturer in the department of classics at Dalhousie University in Halifax, is deeply concerned with how technologies shape our understanding of what education is about.
"An AI bot promises a relationship without friction," he said. "But there is the necessity of a kind of friction in order to help us think more carefully about the works that we're encountering and the larger questions we're considering."
For example, he says, teaching ancient philosophy is predicated on the teacher asking questions and not providing answers so students are prompted to reflect more deeply on the topic.
"But the chatbot works on exactly the opposite premise," Snook said. "The chatbot will provide endless answers."
He says he understands it's inevitable that schools will make AI part of their curriculum.
"With respect to universities, my concern is, before becoming quick adopters of new technologies, they ought to become early questioners of new technologies."