
The use of AI in customer service faces legal challenges that could affect banks

Air Canada was penalized for misinformation generated by its chatbot; Patagonia was accused of letting a vendor’s AI model listen in on and analyze customer service conversations without consent. These cases have parallels with banking, where many institutions use AI-based chatbots and contact center software.

Two recent legal cases target companies that use artificial intelligence in their chatbots and call centers, as many U.S. banks do.

In one case, a customer whose grandmother had just died booked a flight with Air Canada after the airline’s generative AI-based chatbot assured him he had 90 days to apply for a bereavement discount. That was not in line with the airline’s policy, and Air Canada refused the discount on the grounds that the policy was correctly stated elsewhere on its website. A civil tribunal ruled for the customer, ordering Air Canada to honor the discount and pay fees.

In the other case, several customers sued Patagonia after it emerged that Talkdesk, a contact center technology provider Patagonia uses, was recording and analyzing customer support calls and using them to train its AI model. The customers say they would not have made those calls had they known Talkdesk was listening in, and that the practice violates California privacy law. The lawsuit was filed on July 11 and has not yet gone to court.

The first case calls into question any use of a customer-facing generative AI model by a consumer-facing business such as a bank. The second casts doubt on the use of AI in call centers, which many U.S. banks rely on, mainly to analyze customer sentiment and call center agent performance and to summarize calls for their records.

Hallucinating chatbot

When Jake Moffatt’s grandmother died, he visited Air Canada’s website to book a flight from Vancouver to Toronto and take advantage of the airline’s bereavement fares. While searching for flights, he used a chatbot on the airline’s website that told him he could apply for bereavement fares retroactively.

But when Moffatt later requested the discount, the airline said the chatbot was mistaken – the request had to be made before the flight – and it would not grant the discount. The airline said it could not be held liable for information provided by any of its agents, servants or representatives – including a chatbot. The airline said the chatbot was a “separate legal entity responsible for its own actions” and that Moffatt should have clicked on a link provided by the chatbot, where he would have seen the correct policy. Air Canada did not respond to an interview request.

“This is a remarkable submission,” wrote tribunal member Christopher C. Rivers. “Even though a chatbot has an interactive component, it is still just one part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all information on its website, whether the information comes from a static page or a chatbot.”

Rivers also found that Air Canada did not exercise due diligence to ensure the accuracy of its chatbot. “Although Air Canada argues that Mr. Moffatt could have found the correct information on another part of the website, this does not explain why the webpage titled ‘Traveling for Bereavement’ was inherently more trustworthy than the chatbot,” Rivers wrote in his decision. “Nor does it explain why customers need to verify information they find on one part of the website on another part of the website.”

The tribunal awarded Moffatt $650.88 in damages, $36.14 in interest, and $125 in fees.

This case will likely make companies like Air Canada think twice about the chatbots they “hire” and prompt AI chatbot vendors to improve their offerings, said Greg Ewing, a member of the law firm Dickinson Wright in Washington, D.C.

“One way to start is to put restrictions on what a chatbot can talk about,” he said. “I think that will both drive innovation and motivate companies like Air Canada to be careful about choosing their chatbots.”
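As a rough illustration of what such a restriction could look like, the sketch below routes questions on sensitive topics to pre-approved wording and a link to an official policy page instead of letting the model answer freely. It is a generic sketch, not any vendor’s actual guardrail product: the topic list, the vetted answers, the example.com URLs, and the generate_reply helper are all hypothetical placeholders.

```python
# A minimal sketch of topic restrictions for a support chatbot. The topics,
# canned answers, URLs and the generate_reply callable are hypothetical; a real
# deployment would classify intent with a model rather than match keywords.

RESTRICTED_TOPICS = {
    "bereavement": (
        "Bereavement fares must be requested before travel. "
        "See the official policy page: https://example.com/policies/bereavement"
    ),
    "refund": (
        "Refund eligibility depends on your fare type. "
        "See the official policy page: https://example.com/policies/refunds"
    ),
}


def answer(user_message: str, generate_reply) -> str:
    """Return pre-approved wording for restricted topics; otherwise defer to the model."""
    text = user_message.lower()
    for topic, vetted_reply in RESTRICTED_TOPICS.items():
        if topic in text:
            # Policy questions get vetted text and a link, never generated prose.
            return vetted_reply
    return generate_reply(user_message)


if __name__ == "__main__":
    # Stand-in for a real model call, so the sketch runs on its own.
    fake_model = lambda message: "(model-generated reply)"
    print(answer("Can I get a bereavement fare after my flight?", fake_model))
    print(answer("What carry-on bags are allowed?", fake_model))
```

The obvious trade-off is coverage: a keyword match this simple misses paraphrases, which is why production guardrails usually classify the customer’s intent with a model before deciding whether to hand the question to the generative system.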

Ewing also pointed out that humans make the same kinds of mistakes.

“This is not a unique circumstance,” he said. “It is unique only because of who, or what, actually wrote the words.”

Many banks offer AI-based chatbots on their websites, but most do not use generative AI today (many have said they would like to in the future). Bankers interviewed for this article say they are cautious about pushing generative AI to customers.

“At Citizens, we have run our first use cases internally under human oversight as we actively and safely advance AI adoption,” said Krish Swamy, Chief Data and Analytics Officer at Citizens. “We see the potential of AI to help us serve our customers while helping our colleagues innovate. Smart financial institutions should put appropriate safeguards in place, including human safeguards, protecting customer data, and adhering to data protection obligations, to best support the deployment of AI at scale.”

AI models can be validated and tested with other AI models, said Sri Ambati, CEO of H2O.ai.

“Curation is going to be important, and the best way to do it is with different AI,” he said. His company offers a framework called Eval Studio that helps companies create assessments that test for vulnerabilities.
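The pattern Ambati describes, often called using an LLM as a judge, can be sketched without reference to any particular product. The example below is not Eval Studio’s actual API; call_model is a hypothetical stand-in for whatever model client a team already uses, and the policy text and chatbot answer are invented for illustration.

```python
# Sketch of one model auditing another model's answer against source policy text.
# call_model(model_name, prompt) is a hypothetical helper standing in for any
# model-serving client; this is not H2O.ai's Eval Studio API.

JUDGE_PROMPT = """You are auditing a customer-service chatbot.

Policy text:
{policy}

Chatbot answer:
{answer}

Does the answer contradict the policy? Reply with exactly PASS or FAIL,
followed by one sentence of explanation."""


def audit_answer(policy: str, answer: str, call_model) -> dict:
    """Ask a second model whether a chatbot answer contradicts the policy text."""
    verdict = call_model("judge-model", JUDGE_PROMPT.format(policy=policy, answer=answer))
    return {"passed": verdict.strip().upper().startswith("PASS"), "verdict": verdict}


if __name__ == "__main__":
    policy = "Bereavement fares must be requested before travel; no retroactive requests."
    chatbot_answer = "You can apply for the bereavement fare within 90 days after your flight."
    # A canned judge response lets the sketch run without any model backend.
    fake_judge = lambda model, prompt: "FAIL: the answer allows retroactive requests."
    print(audit_answer(policy, chatbot_answer, fake_judge))
```

Run at scale across logged conversations, checks like this work as regression tests: every policy change or model update can be re-scored before customers ever see the new behavior.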

AI eavesdropping at Patagonia

In their class action lawsuit against Patagonia, filed in California Superior Court in Ventura County, customers accused Patagonia of violating California privacy law by having a Talkdesk AI model analyze customer support conversations in real time.

“When callers call one of Patagonia’s support lines, they are told that the conversation ‘may be recorded for training purposes,'” the complaint states. “This tells reasonable consumers that Patagonia itself may use the recording to train its customer service representatives or improve its products. Reasonable consumers are not told that a third party (Talkdesk) will intercept, listen to, record and use the conversation for its own purposes.”

Patagonia and Talkdesk did not respond to interview requests.

Ewing pointed out that California has a wiretapping law that makes it illegal to record conversations without consent.

“I think they have a pretty strong case on that front because Talkdesk is recording these conversations at Patagonia’s request, and at least according to the complaint and what I’ve seen, there was no real consent to that recording,” he said. “We all know those preambles like ‘This conversation may be recorded for training purposes.’ That doesn’t sound to me like we’re going to share it with anyone.”

The complaint alleges that Talkdesk uses the data to train its AI model, raising questions about potential bias.

“If I were Talkdesk, I’d be worried that the accusation will essentially be, ‘Look, there are Patagonia users who call Patagonia and are angry and have a Southern accent that the AI recognizes,’” Ewing said. “What information is the AI going to use to give its recommendations to the customer service representative the next time that person calls?”

This lawsuit will force Talkdesk and its customers to think about what they disclose, what consent they obtain and how they use AI models in call centers, Ewing said.

Many U.S. banks are using AI to analyze customer support calls, analyze customer sentiment and customer service agent performance, and even rethink products that customers may complain about. Some are using generative AI to help call center agents provide informed responses that take into account AI-generated information.
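As a rough sketch of the sentiment-analysis piece, the snippet below scores individual transcript turns with the off-the-shelf Hugging Face transformers sentiment pipeline. The transcript lines are invented, and a bank would more likely run a vetted in-house model on redacted transcripts, but the shape of the task is the same.

```python
# Sketch: score customer sentiment turn by turn on a call transcript.
# Uses the Hugging Face transformers sentiment pipeline as one off-the-shelf option;
# the transcript below is invented for illustration.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default model on first use

transcript_turns = [
    "I've been on hold for forty minutes and nobody can explain this fee.",
    "Okay, thanks, that actually fixed the problem with my card.",
]

for turn in transcript_turns:
    result = sentiment(turn)[0]  # e.g. {"label": "NEGATIVE", "score": 0.99}
    print(f"{result['label']:>8}  {result['score']:.2f}  {turn}")
```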

Perhaps more disclosure could prevent lawsuits like this one. In addition to the standard message that calls are monitored, companies could add a note that the software analyzes conversations “to help our agents provide the best customer service possible,” Ewing suggested.

Ultimately, customers should own their own data, Ambati said.

“If you own the data, you can rent it out,” he said. “You keep all the ownership rights when you lend it to a large language model to be fine-tuned. You could let it be used to fight Alzheimer’s, for example, but not for political purposes.”
