After months of resistance, Air Canada was forced to offer a partial refund to a grieving passenger who was fooled by an airline chatbot that inaccurately described the airline's bereavement travel policy.
The day Jake Moffatt's grandmother died, Moffatt immediately went to the Air Canada website and booked a flight from Vancouver to Toronto. Unsure of how Air Canada's bereavement rates worked, Moffatt asked Air Canada's chatbot for clarification.
The chatbot provided inaccurate information, telling Moffatt that he could book the flight immediately and request a refund within 90 days. In fact, Air Canada's policy explicitly states that the airline does not provide refunds for bereavement travel after a flight has been booked. Moffatt dutifully followed the chatbot's advice and attempted to request a refund, but was shocked when his request was rejected.
Moffatt said he spent months trying to convince Air Canada that he was owed a refund, sharing screenshots showing that the chatbot had clearly made the claim.
Air Canada argued that Moffatt should have known he could not request bereavement rates retroactively, because the chatbot's response linked to a page containing the actual bereavement travel policy. Instead of a refund, the best Air Canada would do was update its chatbot and offer Moffatt a $200 coupon toward a future flight.
Unsatisfied with this resolution, Moffatt rejected the coupon and filed a small claims complaint with Canada's Civil Resolution Tribunal.
According to the tribunal's order, Air Canada argued that Moffatt never should have trusted the chatbot and that the airline should not be held liable for the chatbot's misleading information, because, in effect, Air Canada claimed that the chatbot was "a separate legal entity that is responsible for its own actions."
Experts told the Vancouver Sun that Moffatt's case appears to be the first time a Canadian company has claimed it is not responsible for information provided by its chatbot.
Christopher Rivers, the tribunal member who decided the case in Moffatt's favor, called Air Canada's defense "remarkable."
"Air Canada asserts that it cannot be held liable for information provided by one of its agents, servants, or representatives — including a chatbot," Rivers wrote. "It does not explain why it believes that is the case," nor does it "explain why the web page titled 'Bereavement Travel' was inherently more trustworthy than its chatbot."
Rivers also found that Moffatt had "no reason" to believe that one part of Air Canada's website would be accurate while another part was not.
Air Canada "does not explain why customers should have to double-check information found in one part of its website on another part of its website," Rivers wrote.
Ultimately, Rivers ruled that Moffatt was entitled to a partial refund of $650.88 CAD (about $482 USD) off the original fare of $1,640.36 CAD (about $1,216 USD), plus additional damages to cover interest on the airfare and Moffatt's tribunal fees.
Air Canada told Ars it would comply with the ruling and considered the matter closed.
Air Canada's chatbot appears to be disabled
When Ars visited Air Canada's website on Friday, there appeared to be no chatbot support available, suggesting that Air Canada has disabled the chatbot.
Air Canada did not respond to Ars' request to confirm whether the chatbot remains part of the airline's online support offerings.