Air Canada, Canada’s largest airline, has been ordered to pay compensation to a passenger who was misled by the chatbot on its website. The chatbot gave the passenger incorrect information about eligibility for a bereavement fare, a discounted ticket for people who need to travel because of the death of a family member or close friend.
Key Takeaways
- Air Canada’s chatbot told Jake Moffatt that he could apply for a bereavement fare refund within 90 days of the date his ticket was issued.
- However, Air Canada’s website stated that bereavement fares do not apply to completed travel and must be requested over the phone.
- Moffatt sued Air Canada for the fare difference and won, as the tribunal ruled that the chatbot was part of Air Canada’s website and that the airline was responsible for the information it provided.
- Air Canada argued that the chatbot was a separate legal entity and was responsible for its own actions, but the tribunal rejected this claim.
- This case raises questions about the reliability and accountability of chatbots and other AI tools that are used by airlines and other companies to interact with customers.
How It All Started
Jake Moffatt, a resident of British Columbia, needed to travel to Toronto in 2022 to attend a family member’s funeral. He contacted Air Canada through the chatbot on its website to find out what documents he needed to qualify for a bereavement fare and whether he could get a refund after his travel.
According to a screenshot of the chatbot conversation that Moffatt provided, the chatbot told him that he could book his ticket online and then apply for a bereavement fare refund within 90 days of the date his ticket was issued. The chatbot also gave him a link to an online form he could fill out to request the refund.
Moffatt followed the chatbot’s instructions and booked his ticket online. He paid $1,800 for a round-trip ticket from Vancouver to Toronto. After his travel, he filled out the online form and submitted it to Air Canada, expecting to receive a refund of about $650, which was the difference between the regular fare and the bereavement fare.
However, Air Canada denied his refund request and told him that the chatbot had given him wrong information. The airline pointed out that its website clearly stated that bereavement fares do not apply to completed travel and that they must be requested over the phone before booking the ticket.
Moffatt was shocked and frustrated by Air Canada’s response. He felt that he had been misled by the chatbot and that the airline was not honoring its promise. He decided to sue Air Canada for the fare difference and filed a complaint with the Civil Resolution Tribunal of British Columbia, a low-cost online dispute resolution service.
How It All Ended
The tribunal heard Moffatt’s case and ruled in his favor. The tribunal member, Christopher Rivers, wrote that Air Canada was responsible for all the information on its website, including the chatbot. He said it should have been obvious to Air Canada that the chatbot was part of its website, and that the airline should not have allowed it to give Moffatt misleading information.
He also said there was no reason Moffatt should have known that one part of Air Canada’s website was accurate and another was not. Moffatt had relied on the chatbot’s information in good faith, he found, and Air Canada had breached its contract with him.
Rivers ordered Air Canada to pay Moffatt a partial refund of $650.88, the amount of the bereavement fare refund he was entitled to under the Canadian Air Passenger Protection Regulations (APPR). The APPR is a set of rules that protects the rights of air passengers in Canada and provides compensation and standards of treatment for flight disruptions such as delays, cancellations, and denied boarding.
Rivers also ordered Air Canada to pay Moffatt an additional $100 for his tribunal fees and $500 for his inconvenience and frustration. The total amount of compensation that Air Canada had to pay Moffatt was $1,250.88.
Air Canada tried to defend itself by arguing that the chatbot was a separate legal entity and that it was responsible for its own actions. The airline said that the chatbot was an AI tool that learned from its interactions with customers and that it was not controlled by Air Canada.
However, Rivers rejected this argument as not credible. Air Canada could not escape liability by blaming the chatbot for its mistake, he wrote: the airline had created the chatbot, had the ability to monitor and update it, had admitted that the chatbot used misleading words, and had in fact updated it after Moffatt’s complaint.
What It All Means
This case is the first of its kind in Canada, and it raises important questions about the reliability and accountability of chatbots and other AI tools that airlines and other companies use to interact with customers. Chatbots are becoming more common because they offer convenience, efficiency, and cost savings for both customers and businesses. However, chatbots are not perfect; they can make errors or give inaccurate or inconsistent information.
Customers who use chatbots should be aware of the potential risks and limitations of these tools and should always verify the information they receive from them. Customers should also know their rights and options if they encounter a problem or a dispute with a chatbot or a company that uses a chatbot.
Companies that use chatbots should ensure they give customers accurate and consistent information and comply with the relevant laws and regulations. They should also monitor and update their chatbots regularly, take responsibility for what those chatbots say, and not try to avoid liability by shifting the blame to their chatbots or other AI tools.