B.C. couple referenced non-existent, AI-generated court rulings in condo dispute, tribunal finds

A B.C. couple hoped artificial intelligence would help them find legal precedent to win a condo dispute, but it turns out almost all of the court rulings the chatbot generated for them didn't exist.

Chatbot gave couple false information about condo alterations, says Civil Resolution Tribunal

A new Civil Resolution Tribunal decision says nine out of 10 cases produced as precedent by a chatbot were fake. (silvabom/Shutterstock)

Robert and Michelle Geismayr went to the Civil Resolution Tribunal in a bid to get their strata corporation to approve unauthorized alterations in their condo unit, according to a tribunal decision issued late last week.

In strata housing, people own their individual strata lots and, together with other residents, own common property and assets as a strata corporation. All strata corporations are required to have bylaws, which are enforced by a strata council.

The Geismayrs referenced 10 decisions as part of their argument to allow the unit changes and listed Microsoft Copilot as the source of the cases.

Nine of the cases were fake, according to tribunal member Peter Mennie.

"I find it likely that these cases are 'hallucinations' where artificial intelligence generates false or misleading results," Mennie wrote in his Feb. 14 ruling.

Speaking to CBC News Monday, Robert said he hopes his experience can serve as a cautionary tale for other people. 

He said the legal cases produced by AI appeared to be legitimate. 

"It was very disappointing and puts ambiguity about what we can trust," he said. 

He said he will continue to use AI to get general information on a topic but not for future legal or other serious matters. 

The Geismayrs used Microsoft Copilot, a generative AI chatbot, to help in their dispute against their strata corporation. (Joan Mateu Parra/The Associated Press)

The incident is the latest example of artificial intelligence giving false legal information.

Wyoming lawyers representing plaintiffs in a lawsuit against Walmart over alleged injuries from a defective hoverboard toy said last week that they inadvertently included fake, AI-generated cases in a court filing, according to Reuters.

Last year, meanwhile, a B.C. lawyer was ordered to personally compensate her client's ex-wife's lawyers after she included two AI "hallucinations" in a court application related to a family matter.

Alterations made without permit

The Geismayrs bought their Kelowna strata unit in 2020. Before completing the purchase, they were told that the previous owner made alterations to the condo that were not approved by the strata corporation, according to the tribunal decision.

The previous owner had added a loft and fire sprinkler heads and moved a fire alarm. They received a stop-work order because the alterations were done without a permit, the decision said.

The couple's building is near a ski resort and operates as a hotel condominium, meaning the units can be used as short-term rentals.

The Geismayrs tried sealing off an unauthorized loft in their unit so they could rent it out to people visiting the nearby ski resort. (Darren Calabrese/The Canadian Press)

The Geismayrs were aware that the alterations meant their condo could not be rented out. But, the decision said, the couple believed that if they brought the unit in line with rental guidelines, the strata corporation would retroactively approve the alterations the previous owner had made.

On the advice of the rental management organization in charge of their building, the Geismayrs made a number of changes, including sealing off the loft so guests couldn't access it. They were then added to the rental pool for the next three ski seasons.

When the couple then went to the strata corporation, however, it denied their request for alteration approval and demanded that they fully remove the loft.

The corporation said it didn't want to set a precedent for other condo owners to also make alterations without permission and then ask for retroactive approvals.

WATCH | Some U.S. lawyers have been caught using false legal briefs created by ChatGPT:

How A.I. is already impacting courtroom proceedings

Lawyers in the United States have been caught using false legal briefs created by ChatGPT. But that doesn’t mean that artificial intelligence can’t help in justice proceedings.

Case dismissed

The Geismayrs argued, citing the 10 AI-produced court decisions, that a strata corporation could not force owners to remove alterations.

The cases had the parties' names and the years they were published but did not include legal citations, the tribunal decision said.

The one case that was genuine was unrelated to unauthorized alterations, according to the decision.

"The state of the law is very different than what Copilot reported," wrote tribunal member Mennie.

Multiple previous Civil Resolution Tribunal decisions have found that owners "cannot reasonably expect retroactive approval for alterations done without the strata's prior authorization," according to Mennie, who dismissed the Geismayrs' case.

"I find that the strata's refusal to retroactively approve the strata lot's alterations does not rise to the level of significant unfairness."

ABOUT THE AUTHOR

Yasmine Ghania is an Egyptian-Canadian reporter with CBC News, currently based in Vancouver. She covers the courts, sex crimes and more for local and national audiences. She previously reported in Ottawa, Toronto and all over Saskatchewan and was a finalist for a Canadian Association of Journalists award. Reach her at yasmine.ghania@cbc.ca