Navigating the Legal Implications of AI Chatbots: A Case Analysis of Misrepresentation in Moffatt v Air Canada, 2024 BCCRT 149
Introduction
The intersection of artificial intelligence (AI) and legal liability has become an increasingly difficult terrain to navigate, especially where the actions of AI chatbots are concerned. As technology advances, the legal landscape grapples with novel challenges, one of which is the potential liability for misrepresentation by AI chatbots. This discussion examines the complex legal issues surrounding misrepresentation through AI chatbots, employing a focused analysis of the British Columbia Civil Resolution Tribunal (BCCRT) decision in Moffatt v Air Canada, 2024 BCCRT 149.
The use of AI chatbots has proliferated across industries, offering efficient and automated user interactions. However, this surge in technological reliance has raised questions about accountability when such AI systems are involved in misleading or inaccurate communications. Moffatt v Air Canada serves as a pertinent illustration of the legal intricacies involved when users seek recourse for alleged misrepresentation by an AI chatbot.
Background of the Case
Mr. Moffatt was booking a flight to Toronto after the death of his grandmother and asked the airline’s chatbot about its bereavement rates – reduced fares available to a person who needs to travel due to the death of an immediate family member.
Mr. Moffatt claimed that the chatbot told him these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued. He submitted a screenshot of his conversation with the chatbot as evidence supporting this claim, which said, in part, as follows:
Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family. If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)
He submitted his request within a week of his travel, supported by a copy of his grandmother’s death certificate as required. The airline denied the request, saying that such requests could not be submitted retroactively. Mr. Moffatt’s attempts to receive a partial refund continued for another two-and-a-half months.
In February 2023, Mr. Moffatt emailed the airline, included the screenshot from the chatbot setting out the 90-day window to request a reduced rate, and confirmed that he had filled out the refund form and provided a death certificate.
An airline representative responded and admitted the chatbot had provided “misleading words.” The representative pointed to the chatbot’s link to the bereavement travel webpage, which contained the airline’s full policy stating that bereavement fares could not be claimed retroactively, and said Air Canada had noted the issue so it could update the chatbot.
However, Mr. Moffatt was still unable to obtain a partial refund, prompting him to file a claim with the tribunal.
Analysis
The airline argued that it could not be held liable for information provided by the chatbot, an argument that was emphatically rejected by the Tribunal Member.
“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot,” reads the tribunal’s decision at Paragraph 27.
The airline also argued that the chatbot’s response had included a link to a section of its website outlining the company’s policy, which said refund requests after travel had occurred were not permitted. This, the airline claimed, was the controlling information.
The Tribunal also rejected this argument, stating at Paragraph 28 of the decision that:
While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled “Bereavement travel” was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.
The tribunal accepted Mr. Moffatt’s claim that he had relied on the chatbot to provide accurate information, and found this reliance to be reasonable in the circumstances. The Member noted that “[t]here is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.”
The compensation the tribunal awarded was the difference between what Mr. Moffatt paid for his flight and the discounted bereavement fare.
Mr. Moffatt was awarded $650.88 in damages for negligent misrepresentation. In addition, the airline was ordered to pay $36.14 in pre-judgment interest and $125 in Tribunal fees.
Conclusion
It is clear from this decision that organizations will be held responsible for the representations made by their chatbots. Organizations should ensure that they have processes in place to verify (and update) the information provided by their chatbots, and that such information aligns with the organization’s own policies and positions. It will not be sufficient to rehabilitate a chatbot’s misrepresentation or inaccurate summary merely by providing a link to the actual policy or document.
When chatbots are provided by third parties, both the organization and the chatbot provider should ensure that the contractual terms appropriately set out which party is responsible (and liable) for the chatbot’s content, training, and responses.
Otieno Omondi is an editor at the University of Nairobi Law Journal.