British Columbia Civil Resolution Tribunal rules against Air Canada in a case involving misinformation from its AI chatbot, setting a precedent for accountability in the airline and travel industries. The ruling highlights the importance of accurate AI tools and the need for improved regulatory frameworks.
A recent decision by the British Columbia Civil Resolution Tribunal has held Air Canada liable for misinformation provided by its AI-powered chatbot. The case centers on passenger Jake Moffatt, who was told by the chatbot in 2022 that he could book a full-fare flight for his grandmother’s funeral and apply for a bereavement discount afterward. Air Canada later denied the discount, stating that the request needed to be made before the flight, and argued that the chatbot, as a “separate legal entity,” was responsible for the error.
The Tribunal ruled against Air Canada, ordering the airline to pay Moffatt $812.02 in damages and tribunal fees. Tribunal member Christopher Rivers stated that the airline is responsible for all information on its website, whether it appears on a static page or comes from a chatbot.
This ruling could set a precedent for the airline and travel industries, which are increasingly incorporating AI technology. The consumer advocacy group Air Passenger Rights noted that companies cannot evade responsibility for erroneous information provided by their technology. The case underscores the importance of accurate AI tools, especially as more travel companies, including Expedia, integrate ChatGPT and similar technologies into customer service.
While AI has its advantages, experts caution that passengers should still verify information and seek human assistance when necessary. The incident illustrates the potential pitfalls for businesses that rely heavily on AI, underscoring the need for stronger regulatory frameworks and more accurate AI outputs.