How can banks and fintechs get employees’ buy-in on generative AI? | PaymentsSource


From left: Cathy Beardsley, president and CEO of Segpay; Carolyn Homberger, president of the Americas for Featurespace; and Katie Whalen, head of North America issuer processing for Fiserv.

Marcy Vanegas

The tasks that generative AI is most likely to take over are those typically assigned to entry-level employees, and thus companies must be careful not to let the technology cut out the lowest rungs of the career ladder, experts say.

Already, financial services providers have made noise about AI outright replacing certain roles, or eliminating the need to outsource, particularly in areas such as customer service. Additionally, AI tools are being explicitly marketed for their ability to handle tasks such as generating art, writing and code, making them either a boon or a threat to the people who specialize in those skills.

One way to address these concerns is to seek the buy-in of the very people who are most likely to be affected by the addition of AI, said Katie Whalen, head of North America issuer processing at Fiserv.

Of the 60 people who service Fiserv's largest clients, the bank technology giant chose four or five to pilot new uses for AI, Whalen said. Instead of seeing AI as a threat, these people came to "almost serve as evangelists within the broader organization," she said.

Whalen, one of American Banker’s Most Influential Women in Payments for 2024, spoke with other honorees at this year’s Payments Forum on the topic of AI in the workplace.

Fiserv’s evangelists either raised their own hands for the role, or were nominated by their peers for their leadership skills and their enthusiasm for working with new AI tools, Whalen said. 

“Getting those evangelists to … [become] early adopters and then kind of teach other people is really important because of the narrative that is around AI” automating or replacing jobs, she said. 

A company's use of AI is as much about its culture as anything else. Whalen advises against managers pushing AI from the top down, saying organizations get better results if "it's something that can be adopted from the people that are actually doing the work."

Cathy Beardsley, president and CEO of Segpay, agreed that culture is an important part of how an organization implements AI, especially among employees who are in the earliest stages of their careers.

“You’re still going to be bringing in entry-level, junior people to learn your system, your culture, what your business is about, but AI is going to be augmenting those [roles] to help you be a little bit more efficient,” she said. “And hopefully, the junior people start to learn, and they can move up to the next step into a new role.”

Carolyn Homberger, president of the Americas for Featurespace, emphasized that AI still needs to be managed, and that this management process is part of how people develop their skills.

“Very early in my career, I was an accountant and then went into corporate finance, and still had to … be able to explain what it is you want the technology to do, what outcome you’re looking for,” Homberger said. As entry-level people incorporate AI and other technologies into their jobs, “I look for them to articulate that, and how they’re applying it with the tool,” she said.

On the other end of the spectrum, employees can be too enthusiastic about AI. A major risk is "bring your own" AI: employees feeding company data into any of the large language models offered to consumers, without regard to how that data might be exposed.

“At Fiserv, because we deal with a lot of client data and proprietary information, we’ve had a policy that no employee should be using any external kind of AI, like a ChatGPT, for usage in … the work that we do, especially from a client perspective,” Whalen said. “But that also means that you need to make sure that you’re giving your employee base an alternative.”

This is the part where management plays a vital role, Whalen said. Fiserv’s president and chief technology officer jointly oversee a council that advises on AI policies within the company, she said. As a result of this process, Fiserv offers a version of ChatGPT that’s hosted on its own servers — not the cloud — enabling the company to control the tool’s use and guard its data.

Homberger emphasized the importance of implementing internal policies on AI, especially because regulation of generative AI is very open-ended at the moment. A strong stance on risk management will make it more likely that an organization is compliant with any regulations that come down, she said. 

Beardsley suggested that regulation will provide clarity to the organizations that most need it.

“This is like crypto six years ago,” she said. “Regulations came in place and made it more mainstream, weeded out bad players. So I think [it will be the] same thing for AI.”
