Use of AI in Manitoba court documents must be disclosed, chief justice says

Experts say AI presents dilemmas for legal system, but also opportunities

Manitoba lawyers will now have to disclose to the Court of King's Bench if they have used artificial intelligence when preparing submissions. (John Woods/The Canadian Press)

Attorneys in Manitoba will now have to disclose if they've used artificial intelligence to prepare court documents in the Court of King's Bench, following a legal mishap south of the border where an AI program generated fake case law. 

Manitoba Chief Justice Glenn Joyal issued the practice direction last week, which acknowledged that artificial intelligence might be used in court submissions in the future. 

"While it is impossible at this time to completely and accurately predict how artificial intelligence may develop or how to exactly define the responsible use of artificial intelligence in court cases, there are legitimate concerns about the reliability and accuracy of the information generated from the use of artificial intelligence," the order states. 

Though Joyal's directive doesn't mention it, it comes after a case in New York in which attorneys blamed ChatGPT for their submission of fictitious legal research in an aviation injury claim. 

Lawyers in Manitoba who spoke to CBC News said they welcomed the decision, but they weren't aware of AI being used to generate court documents within the province. 

Chris Gamby with the Criminal Defence Lawyers Association of Manitoba said he thinks the directive is a good proactive measure given AI's increasing popularity. 

"I think that this technology at this point is in its infancy. It has great potential, it also has the possibility of having a number of drawbacks and we really don't know what those are yet and it's developing so quickly," he said. 

However, he said he would be surprised if any lawyers in Manitoba were actually using AI to help with their court submissions at this point. 

"There really is very much a human component to the profession and if you are going to persuade, you know, jurors sitting in a case, it's probably your own voice that you're going to want to use to do that."

Representatives from the Law Society of Manitoba and the Manitoba Bar Association also said they weren't aware of any instances of AI being used to create court documents. 

Ethical concerns 

As the use of AI continues to expand, experts say it could create ethical dilemmas for the legal profession down the road.

But they also said it presents some potential benefits, such as helping lighten lawyers' workload, and could even increase access to the justice system.  

AI has already been used in the legal profession for years to help lawyers sift through piles of documents, said Maura Grossman, a research professor in the school of computer science at the University of Waterloo who studies AI ethics, among other things.

The New York case raises the question of whether AI should also be used to draft court documents themselves, she said. 

"I think that's what they're worried about, is getting filings where they can't count on that the cases are real," she said. 

Maura Grossman is an AI ethicist and professor at the University of Waterloo. She says using publicly available AI tools to prepare legal documents presents serious privacy concerns. (Submitted by Maura Grossman)

Another concern is private, confidential information being uploaded to publicly available tools like ChatGPT, which could potentially violate attorney-client privilege, she said. 

What really concerns Grossman, however, is whether AI usage could result in fake evidence being used in court, such as AI-generated audio or video that shows something that didn't really happen. 

"We are now moving into a world where I can ... take a few minutes of your voice and put it into a a synthesizer and I can take your voice and have you screaming and threatening your children," she said. 

"That's what I worry about, is what is this going to do to a legal system that relies on evidence and on humans being able to make assessments about the value of of evidence and who's telling the truth, when we're no longer going to be able to use our eyes and ears and to do that."

Access to justice

Abdi Aidid, an assistant professor at the University of Toronto's faculty of law, says he thinks the issue in the New York case was ChatGPT being used as a research tool, when that's not what it's meant for. 

If the technology is used correctly, he said, he's not sure why AI-generated court documents should be considered less reliable than others. 

"It's not entirely clear to me the court needs to interrogate the result of a final product, right? At least not more than they would for a traditional legal filing," he said. 

"I mean, is a court asking what components of a legal filing were done by a summer student and which ones were done by a first-year associate and which ones were done by a senior partner?"

Amid the concerns, Aidid said he thinks AI could help the legal system become more accessible to the average person. 

He said he thinks it will become more common to see people representing themselves in court use AI software to draft legal documents, a task that might otherwise be too difficult to manage on their own. 

Though that might make some in the legal profession nervous, Aidid said he thinks it should be embraced. 

"If we do have an access to justice crisis, and if so many people can't afford lawyers, then why wouldn't we as a public embrace a tool that helps people present their grievances in a way that's legible to our legal system?" he said. 

"There's a lot of legal issues that a layperson can't begin to wrap their head around, and technology can help by making that information more accessible, " he said.

ABOUT THE AUTHOR

Sarah Petz

Reporter

Sarah Petz is a reporter with CBC Toronto. Her career has taken her across three provinces and includes a stint in East Africa. She can be reached at Sarah.Petz@cbc.ca.

With files from Josh Crabb and Associated Press