Air Canada has to honor the refund policy created by its chatbot


After months of protest, Air Canada was forced to give a partial refund to a grieving passenger who was misled by the airline's chatbot, which incorrectly explained the airline's bereavement travel policy.

The day Jake Moffatt's grandmother died, Moffatt immediately went to the Air Canada website to book a flight from Vancouver to Toronto. Unsure of how Air Canada's bereavement rates work, Moffatt asked Air Canada's chatbot for clarification.

The chatbot provided incorrect information, encouraging Moffatt to book the flight immediately and then request a refund within 90 days. In fact, Air Canada's policy clearly states that the airline will not provide refunds for bereavement travel after the flight has been booked. Moffatt dutifully attempted to follow the chatbot's advice and requested a refund, but was shocked to find that the request was denied.

After Moffatt spent months trying to convince Air Canada that a refund was due, he shared a screenshot from the chatbot that clearly claimed:

If you need to travel immediately or have already traveled and wish to submit your ticket for a reduced bereavement rate, please do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application Form.

Air Canada argued that because the chatbot's response linked to a page containing the actual bereavement travel policy, Moffatt should have known that bereavement rates could not be requested retroactively. Instead of a refund, the best Air Canada would do was promise to update the chatbot and offer Moffatt a $200 coupon to use on a future flight.

Unhappy with the offer, Moffatt refused the coupon and filed a small claims complaint with Canada's Civil Resolution Tribunal.

According to Air Canada, Moffatt never should have relied on the chatbot, and the airline should not be liable for the chatbot's misleading information because, Air Canada essentially argued, "the chatbot is a separate legal entity that is responsible for its own actions," a court order said.

Experts told the Vancouver Sun that Moffatt's case appears to be the first time a Canadian company has tried to argue that it is not liable for information provided by its chatbot.

Tribunal member Christopher Rivers, who decided the case in Moffatt's favor, called Air Canada's defense "remarkable."

"Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot," Rivers wrote. "It does not explain why it believes that is the case" or "why the webpage titled 'Bereavement Travel' was inherently more trustworthy than its chatbot."

Furthermore, Rivers found that Moffatt had "no reason" to believe that one part of Air Canada's website would be accurate and another would not.

Air Canada "did not explain why customers should have to double-check information found in one part of its website on another part of its website," Rivers wrote.

Ultimately, Rivers ruled that Moffatt was entitled to a partial refund of $650.88 CAD (approximately $482 USD) off the original fare of $1,640.36 CAD (approximately $1,216 USD), plus additional damages to cover interest on the airfare and Moffatt's tribunal fees.

Air Canada told Ars that it will comply with the ruling and considers the matter closed.

Air Canada's chatbot appears to be disabled

When Ars visited Air Canada's website on Friday, no chatbot support was available, suggesting that Air Canada has disabled the chatbot.

Air Canada did not respond to Ars' request to confirm whether the chatbot remains part of the airline's online support offerings.