
The challenges of GenAI in fintech


Under the cybersecurity disclosure rules the Securities and Exchange Commission (SEC) adopted in 2023, public companies in the US are required to disclose any material cybersecurity incident. Moving forward, these organizations will need in-depth knowledge of the nature, scope, timing and impact of any security incident. In the age of generative artificial intelligence (GenAI), that task is even more complicated.


The financial services industry has historically been slow to adopt new technologies into its offerings, given the incredibly sensitive nature of the personally identifiable information (PII) it handles daily. But GenAI's rapid spread across all industries, and its easy public access, make it difficult to ignore. Public fintech organizations are among those already struggling with the SEC's reporting requirements, and GenAI adds a new layer of uncertainty.

GenAI in fintech

Fintech is just one of many industries working out how best to approach GenAI. Its capabilities can increase productivity and efficiency, freeing employees to focus on higher-value work. Specifically, GenAI can speed up critical processes like fraud detection, customer service, and poring over massive collections of PII and other data.

To do that, GenAI must be trained on accurate, domain-specific data for each use case; otherwise, the model will hallucinate or reproduce underlying bias.
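
One common way to ground a model in niche, vetted data without retraining it from scratch is retrieval-augmented generation, where the model is only shown approved domain documents at query time. The sketch below is a toy illustration of that idea; the document set, function names and keyword-overlap scoring are stand-ins for a real embedding-based retrieval pipeline.

```python
# A minimal sketch of retrieval-grounded prompting: instead of relying on
# whatever the base model memorized, the prompt is built from vetted,
# domain-specific policy text. The scoring here is toy keyword overlap;
# a production system would use embeddings and a vector store.

POLICY_DOCS = [
    "Refunds for bereavement fares must be requested before travel.",
    "Chargebacks are investigated within 10 business days.",
    "Customers may dispute a transaction within 60 days of the statement date.",
]

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_grounded_prompt(question: str) -> str:
    """Constrain the model to answer only from retrieved policy text."""
    context = "\n".join(retrieve(question, POLICY_DOCS))
    return (
        "Answer using ONLY the policy excerpts below. "
        "If the excerpts do not cover the question, say so.\n\n"
        f"Policy excerpts:\n{context}\n\nQuestion: {question}"
    )

print(build_grounded_prompt("Can I get a bereavement refund after my trip?"))
```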

GenAI is already known for making companies the subject of unfavorable news stories. Most recently, Air Canada's notorious chatbot caused trouble when a passenger bought a plane ticket after the AI assured them that, under the airline's bereavement policy, they would receive a refund on the inflated last-minute fare. When the passenger later applied for the refund, Air Canada said the chatbot had provided incorrect policy information and refused to pay. A Canadian tribunal decided otherwise, ruling that AI chatbots are extensions of the companies that deploy them.

No one wants to be the next big headline because of an AI malfunction, but with the SEC's reporting requirements, fintech companies may need to exercise extra caution to stay ahead of such scenarios.

The security implications of GenAI

While some organizations and their boards are all in on GenAI, others are watching and waiting. Fintech companies that have already begun to harness GenAI's power will need to lay the groundwork for total visibility of its usage across their networks. And those taking a slower approach will need the capability to ensure shadow AI hasn't infiltrated their workflows.

As threat actors continue to aggressively pursue data exfiltration and ransomware attacks, industries holding valuable PII will also need to worry about AI-driven attack capabilities, including the use of AI to find vulnerabilities that could result in severe data breaches. Threat actors are already experimenting with AI-generated spear-phishing campaigns that use realistic deepfakes and other content to exploit human employees, and we're seeing evidence of AI-written malware.

Organizations must be prepared for the worst. To both meet the SEC's transparency requirements and ensure GenAI isn't a risk to their overall security posture, laying the foundations of AI infrastructure is a top priority for organization leaders and their boards.

The foundations of AI infrastructure

Boards and executives pursuing solutions that align with the SEC's rules and account for GenAI's public availability should emphasize infrastructure built for holistic visibility and education: forensics, auditability, AI governance and employee training.

You can't manage what you can't see: risks like shadow AI will run rampant until organizations have a bird's-eye view of how, if at all, GenAI is being used across internal processes. Any AI activity on internal networks should be easy to view and monitored for abnormal or unwanted use.
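
In practice, that visibility often starts with the egress logs an organization already collects. The sketch below assumes proxy or firewall logs are available as (user, destination host) records and flags traffic to known GenAI endpoints; the endpoint list and log format are illustrative, not a complete inventory of GenAI services.

```python
# A minimal sketch of shadow-AI detection from egress logs: flag any
# internal user whose traffic reaches a known GenAI endpoint. The host
# list below is illustrative and would need ongoing curation.

KNOWN_GENAI_HOSTS = {
    "api.openai.com",
    "chat.openai.com",
    "api.anthropic.com",
    "gemini.google.com",
}

proxy_log = [
    ("alice", "api.openai.com"),
    ("bob", "intranet.example.com"),
    ("carol", "api.anthropic.com"),
]

def flag_genai_traffic(log):
    """Return (user, host) pairs that hit known GenAI endpoints."""
    return [(user, host) for user, host in log if host in KNOWN_GENAI_HOSTS]

for user, host in flag_genai_traffic(proxy_log):
    print(f"GenAI traffic: {user} -> {host}")
```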

Furthermore, logging and tracking GenAI usage across internal networks as part of AI forensics allows fintech companies to automatically identify, trace and mitigate potential security risks from GenAI. Because the SEC's requirements include providing full details of security incidents, the ability to audit AI activity on internal networks will be a paramount capability moving forward.
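
A minimal sketch of what such an audit trail might look like, assuming GenAI calls are funneled through one internal gateway: each request is written as an append-only JSON record capturing who, when and which model, so responders can later reconstruct the timing and scope the SEC rules ask about. The function and file names here are illustrative.

```python
# A minimal sketch of an AI audit trail: one structured, append-only
# JSON line per GenAI request. Storing a digest rather than the raw
# prompt is one option when the prompt text itself is sensitive.

import hashlib
import json
import time

LOG_PATH = "genai_audit.jsonl"  # illustrative location

def audit_genai_call(user: str, model: str, prompt: str) -> None:
    """Append one audit record per GenAI request."""
    record = {
        "ts": time.time(),
        "user": user,
        "model": model,
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

audit_genai_call("alice", "gpt-4o", "Summarize Q3 chargeback trends.")
```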

Another key aspect of AI forensics and GenAI auditability is the capability to provide forensic information down to individual prompts. Right now, most companies lack the infrastructure to track and monitor AI usage at that level. In instances where employees accidentally or deliberately feed sensitive information to AI in their prompts, a stored GenAI history showing each prompt used internally will be invaluable for reporting purposes.
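
Given a stored prompt history, prompt-level forensics can be as simple as scanning it for PII-like patterns so reporting teams know which prompts may have exposed sensitive data. The sketch below uses an in-memory history and illustrative, US-centric regexes; a real deployment would scan the audit store and use dedicated detectors.

```python
# A minimal sketch of prompt-level forensics: scan stored prompts for
# PII-like patterns and surface (user, pii_type, prompt) hits for review.

import re

PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

prompt_history = [
    {"user": "alice", "prompt": "Draft a fraud alert for SSN 123-45-6789."},
    {"user": "bob", "prompt": "Explain our refund policy."},
]

def scan_prompts(history):
    """Yield (user, pii_type, prompt) for every suspicious prompt."""
    for entry in history:
        for label, pattern in PII_PATTERNS.items():
            if pattern.search(entry["prompt"]):
                yield entry["user"], label, entry["prompt"]

for user, label, prompt in scan_prompts(prompt_history):
    print(f"possible {label} in prompt from {user}: {prompt!r}")
```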

Employee education and training in responsible GenAI use are other key factors in complying with the SEC's regulations. Many popular large language model (LLM) services, such as ChatGPT and Copilot, may retain and learn from the text they are fed, meaning any PII accidentally entered into a prompt has the potential to become a data leak. With proper education and training, employees will better understand how to use GenAI appropriately and minimize the potential for data breaches caused by misuse.
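
Training is often paired with a technical guardrail that masks obvious PII before a prompt ever leaves the network. The sketch below shows that idea with the same kind of illustrative patterns as above; production systems would rely on dedicated data-loss-prevention tooling rather than hand-rolled regexes.

```python
# A minimal sketch of a pre-submission guardrail: redact PII-like
# substrings before a prompt is sent to a public LLM service.

import re

REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def redact(prompt: str) -> str:
    """Replace PII-like substrings with placeholders."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

print(redact("Customer jane.doe@example.com, SSN 123-45-6789, disputes a charge."))
```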

As boards and organization leaders weigh GenAI's implications across fintech and decide whether to charge full speed into adoption or wait, the SEC's impact on that calculus is clear. The onus is now on public companies to better track and mitigate security risks, forcing high-value industries to reconsider their security and AI strategies.

By laying the foundation for GenAI governance and auditability, fintech companies can better prepare themselves for the risks that come whether they hold back on GenAI or push ahead with its adoption. In fact, it's the next logical step.


