After its chatbot hallucinates and lies, Air Canada is sued and loses the case
If a chatbot lies to you, it’s not its fault – the company that has “employed” it is the one responsible. So cut the poor chatbot some slack (but go after its employer, and take it to court if needed)!
A couple of years back, a man named Jake Moffatt talked to Air Canada’s chatbot and received false, frustrating information that cost him both money and time. He took the matter to court, and now the ruling is in his favor (via Mashable).
The story follows Moffatt, who was trying to find out how to qualify for a bereavement fare for a last-minute trip to attend a funeral. Air Canada’s chatbot explained that he could retroactively apply for a refund of the difference between the regular ticket cost and the bereavement fare cost, as long as he applied within 90 days of purchase.
However, Moffatt later learned that this was completely untrue. The airline’s actual policy, as posted on its website, reads: “Air Canada’s bereavement travel policy offers an option for our customers who need to travel because of the imminent death or death of an immediate family member. Please be aware that our Bereavement policy does not allow refunds for travel that has already happened.”
When Air Canada refused to issue the reimbursement its chatbot had promised, Moffatt took the airline to court. Air Canada tried to argue that it was not responsible for the “misleading words” of its chatbot, claiming the chatbot was a “separate legal entity” that should be held responsible for its own actions, and that the airline is not liable for information given by “agents, servants or representatives — including a chatbot”.

Nice try.

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” responded a member of the British Columbia Civil Resolution Tribunal. “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
Does that mean that if one tricks a company’s chatbot into giving out false information, it could translate into easy money from a lawsuit? Interesting times ahead.