Air Canada's AI chatbot misleads customer, airline says bot ‘responsible for its own actions’

Key Points

Air Canada defended its use of an automated chatbot accused of misleading a customer by arguing that the chatbot was responsible for its own actions.

In 2022, Jake Moffatt contacted Air Canada seeking information about bereavement fares following the death of his grandmother.

In a conversation with Air Canada's support chatbot, Moffatt asked whether bereavement fares could be applied retroactively to a ticket he had already purchased.

When Moffatt later requested the refund, Air Canada said bereavement rates could not be applied retroactively to completed travel, despite the chatbot's earlier indication that he could apply for a refund within 90 days of the ticket being issued.

After Moffatt sued for the fare difference, Air Canada argued that the chatbot was a "separate legal entity" and therefore "responsible for its actions," as reported by The Guardian.

You might be interested in

Air Canada ordered to pay customer who was misled by airline’s chatbot

16 Feb 2024

Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare

Air Canada Has to Honor a Refund Policy Its Chatbot Made Up

17 Feb 2024

The airline tried to argue that it shouldn't be liable for anything its chatbot says.