Generative AI is finding a new home in call centers, and leaders need to balance promised CX improvements with customer trust and ethical concerns.
Amid the excitement around generative AI’s potential to improve outcomes for customers and workers alike, ethical concerns often fall by the wayside, according to Mario Matulich, president and managing director of Customer Management Practice.
Chatbots can make up information, or “hallucinate,” which can lead to high-profile embarrassment. Other important concerns include managing the data needed to fuel AI-powered tech stacks and maintaining consumer trust in how their information is used.
“It's still a very new technology,” Matulich told CX Dive. “It's still in the test phase. A lot of the [vendor] community will explain how they can overcome these challenges, but you have to do your homework.”
Customers are already mistrustful. Only half of customers believe the benefits of AI outweigh the risks, according to a January KPMG survey, and 3 in 5 say they’re wary of AI.
Take care with customer data
Data is the lifeblood of generative AI, and in call centers, that data comes from customers. Smooth handoffs and personalized service require information ranging from personal preferences to records of conversations with chatbots, and companies must safeguard it carefully.
“I think top level concerns are definitely, obviously and without a doubt data privacy and confidentiality,” Julie Geller, principal research director at Info-Tech Research Group, told CX Dive. Leaders are thinking through how to gather information and serve up the results.
As a baseline, companies need to protect customer data with rigorous access protocols, according to Geller. Access permissions should be based on role so only those who need the data can access it, and security best practices like multi-factor authentication are a must.
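In practice, role-based access usually comes down to an explicit permission check before any customer record is served up. The sketch below is a minimal, hypothetical illustration of that idea; the roles, permissions and record fields are made-up examples, not any particular vendor’s design.

```python
# Minimal sketch of role-based access to contact center data.
# Roles, permissions and record fields are hypothetical examples.

ROLE_PERMISSIONS = {
    "agent": {"view_conversation_history"},
    "supervisor": {"view_conversation_history", "view_contact_details"},
    "analyst": {"view_aggregate_reports"},
}

def can_access(role: str, permission: str) -> bool:
    """Return True only if the role explicitly grants the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

def fetch_customer_record(role: str, customer_id: str) -> dict:
    """Serve only the fields the caller's role is allowed to see."""
    # Placeholder record standing in for a real data store lookup.
    record = {
        "customer_id": customer_id,
        "conversation_history": ["chatbot transcript..."],
        "contact_details": {"email": "redacted@example.com"},
    }
    allowed = {"customer_id"}
    if can_access(role, "view_conversation_history"):
        allowed.add("conversation_history")
    if can_access(role, "view_contact_details"):
        allowed.add("contact_details")
    return {k: v for k, v in record.items() if k in allowed}

if __name__ == "__main__":
    # An agent sees transcripts but not contact details; an analyst sees neither.
    print(fetch_customer_record("agent", "C-1001"))
    print(fetch_customer_record("analyst", "C-1001"))
```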
Risk planning and incident response strategy development are also important, Geller said. Companies can analyze what went wrong in case studies from real data breaches as they develop their own escalation procedures and communication protocols.
Continuous monitoring is vital as well. “We don't want to set it and forget it,” Geller said. “Are we deploying advanced analytics and digging deeper into the data to detect anything that might be an issue?”
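One simple, hypothetical way to act on that advice is to compare each agent’s latest volume of customer-record lookups against their recent baseline and flag large deviations for human review. The data, field names and threshold below are illustrative assumptions, not a prescribed tool.

```python
# Sketch of ongoing access monitoring: flag agents whose daily record
# lookups are far above their recent norm. Data and threshold are hypothetical.
from statistics import mean, stdev

def flag_unusual_access(daily_lookups: dict, z_threshold: float = 3.0) -> list:
    """Return agent IDs whose latest lookup count is a statistical outlier."""
    flagged = []
    for agent_id, counts in daily_lookups.items():
        history, latest = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough history to judge
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (latest - mu) / sigma > z_threshold:
            flagged.append(agent_id)
    return flagged

if __name__ == "__main__":
    access_log = {
        "agent_007": [42, 39, 45, 41, 240],  # sudden spike worth reviewing
        "agent_101": [30, 28, 33, 31, 29],
    }
    print(flag_unusual_access(access_log))  # ['agent_007']
```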
Call center agents, who are already taking on more complex responsibilities in AI-powered operations, also have a part to play in security, Geller said. Workers need to be on the lookout for customer complaints that could point to chatbot glitches or data breaches.
As a result, managers will need to provide extra training to help workers spot the security and privacy red flags that could erode customer trust.
“Detecting and reporting any suspicious activities and making sure that agents understand how to do that adequately is really important,” Geller said.
Trust is built into the foundations
Strong security protocols are a start, but call centers still need to build trust with customers as they use AI more in their operations. Consumers aren’t privy to a company’s internal data practices, so further steps are needed to assuage their privacy concerns.
A good first step is for a company to measure its audience’s baseline trust as well as its own internal capabilities to maintain security, according to Amelia Dunlop, CXO at Deloitte Digital.
Leaders should try to get a sense of how consumers feel about their company’s level of transparency, Dunlop said. Customers want to know how their information is collected, how it is protected and what it is used for, and failing to meet this need can breed distrust.
Companies should also be aware of what their new technology can and cannot do reliably, Dunlop added. A clunky chatbot experience can be a source of serious frustration for customers, even if the AI isn’t outright hallucinating.
From there, leaders can work to shore up their shortcomings and continue measuring results to ensure trust is moving in the right direction, Dunlop said. Maintenance is an ongoing process — so long as companies are using customer data, they need to ensure it’s being put to proper use, she said.