As generative AI becomes more commonplace, people are (naturally) using it to access important information, but many trust it too much and don't bother to double-check, as one Canadian couple learned the hard way when they tried to win a condominium dispute.
Specifically, Robert and Michelle Geismayr from British Columbia went to the Civil Resolution Tribunal in a bid to get approval for unauthorized alterations to their condo unit, and they used AI to help them find legal precedent to win the case, according to a tribunal decision issued on February 14.
Notably, the unauthorized alterations were made by the previous owner and rendered the condo unrentable under local regulations. The Geismayrs knew this when they bought it, assuming they would obtain approval retroactively and then rent out the place, which is near a popular ski resort.
Generative AI hallucinations
However, it turned out that the generative AI chatbot, in this case Microsoft Copilot, fed them false legal information: nearly all of the 10 court rulings it generated, which they submitted to the tribunal as part of their argument for approving the unit changes, didn't exist.
Per tribunal member Peter Mennie's decision dismissing the case:
“The Geismayrs listed the source of these cases as a ‘Conversation with Copilot’ which is an artificial intelligence chatbot. I find it likely that these cases are ‘hallucinations’ where artificial intelligence generates false or misleading results. (…) The state of the law is very different [from] what Copilot reported.”
As for the Geismayrs, if they want to rent out the unit, they will have to tear down the loft, a wooden platform just below the unit's ceiling that increases its habitable area, which the previous owner built without a permit. They consider their experience a cautionary tale for others.
Robert Geismayr said he would continue to use AI to gather general information on a topic, but not for legal or other serious matters in the future.
Meanwhile, generative AI remains a powerful force in content creation and can be useful for retrieving information. That said, its responses shouldn't be taken at face value: it is still a relatively young technology, and developers have yet to find an effective way to eliminate hallucinations.