AI governance and cybersecurity certifications: Are they worth it?
The International Association of Privacy Professionals (IAPP), SANS Institute, and other organizations are releasing new AI certifications in governance and cybersecurity, or adding AI modules to existing programs. These may help professionals find employment, but because the field is so new, experts warn the certifications could be out of date almost immediately.
Data protection and privacy make up about a third of AI governance, J. Trevor Hughes, IAPP’s founder and CEO, tells CSO. The rest includes algorithmic bias and fairness, intellectual property rights and copyright for both training data and AI outputs, content moderation, trust and safety, and the management work of putting together a team to oversee all of these issues. “When we look out at the broad world, we see privacy professionals and cybersecurity professionals have incredibly transferable skills. If they can layer in some AI governance training and awareness, we can scale much more quickly to respond to a need for hundreds of thousands of governance professionals in the next decade.”
The case for AI governance and cybersecurity certifications
As adoption of generative AI proceeds at a breakneck pace, companies will be increasingly desperate for AI governance and cybersecurity experts. And because the field is so new, few people have any actual work experience in the area. Training and certification programs will proliferate to help fill the gap.
Forrester analyst Jess Burn calls this a “certification industrial complex.” “Everyone wants a piece of this,” she says. But the certifications come at a steep price once all the necessary training is added in. And just because someone holds a certification doesn’t mean they are competent in the subject.
This is absolutely the right time to start thinking about AI governance, says Dan Mellen, EY principal and cybersecurity CTO. “Generative AI is becoming real and starting to move forward. The level of sensitivity requires some sort of baseline understanding — and generative AI certifications do that.”
David Foote, chief analyst at Foote Partners, says his company is tracking eight dedicated AI certifications that have enough data to be included in the company’s IT Skills and Certifications Pay Index. In addition, AI-related content is being added to other cybersecurity certifications he’s tracking. But, he says, companies typically pay more for demonstrated abilities than for certifications. “They don’t care if there’s a certification or not, as long as the candidate can demonstrate that they have acquired and can apply skills in AI governance or cybersecurity. They trust their ability to identify and reward skills a lot more than they trust passing a certification exam.”
Other critics say the space is too new, best practices are not yet defined, and laws and regulations are still evolving. Even if a certification covers useful material, it could be out of date almost immediately.
On the other hand, the new AI governance and cybersecurity certifications cover the basics needed to get up to speed, create a foundation on which people can build later, establish a common language for practitioners, and typically include ongoing training requirements to help people stay current.
AI governance training and certificates
The first AI Governance Professional (AIGP) exam from the IAPP was taken by 200 people in April, but future exams, like many others, will be held at testing centers around the world and virtually.
The IAPP exam, which costs $649 for members and $799 for non-members, consists of 100 questions and takes about three hours. In addition, the training program costs $1,000 and up, depending on whether it’s in-person or online, and covers eight modules.
ISACA has an AI Fundamentals certificate that includes risks and ethical requirements. Tonex offers a Certified AI Security Practitioner certification course. GSDC offers a Generative AI in Cybersecurity certification that covers not only how generative AI can be used to help in cybersecurity, but also ethical considerations and best practices for responsible use.
Here, in alphabetical order by organization, are all the AI governance training programs and certificates known at the time of publishing.
Benefits of AI governance and cybersecurity certifications
The ability to have a common language and a set of core foundational principles was why Wipro sent its entire AI taskforce to take the IAPP’s AIGP training, Wipro chief privacy and AI governance officer Ivana Bartoletti tells CSO. “We have people who come from a legal background, people who come from a technical background, a risk management background,” she says. “Whether it’s change management, or programmers, or lawyers, it’s important for them to be aligned on terminology and key points in our governance.”
Bartoletti believes it is not too early because “AI needs to be governed.” In fact, there are already some laws on the books, such as Europe’s AI Act and President Biden’s executive order. But even without those, there are privacy laws that also apply to generative AI, as well as security controls, non-discrimination legislation, and much more.
“Governance is, of course, an evolving matter,” Bartoletti says. “But we can’t just look at it in relation to legislation or wait for standards. Governance is really about saying: How do I, as an organization, put controls around the development and deployment of the systems?”
The first step is alignment, so that everybody from HR to engineering has the same core understanding of what the principles of AI governance are.
For Bartoletti, the advantage of the IAPP’s AIGP certification is that it comes out of the privacy side, so it was a natural choice for the taskforce. “To me, and maybe I’m biased because privacy is my baby, but I feel that privacy professionals are very well equipped to deal with AI governance. We know the risk-based approach to technology. We have to have the controls in place — but we also have to have the business keep running.”
Bartoletti has gone through the training herself. She says it took two weeks and was conducted by a real human — all remotely, since Wipro is a global company. She says that clients like to see that their consultants have official certifications. It demonstrates that a person has taken the time to study and isn’t just improvising as they go along. “The area of AI governance and risk is an area where you don’t want to improvise.”
When the certifications are combined with a strong history of putting them into action, they become a real competitive advantage. Once formal standards are released, Bartoletti expects to see many more certifications covering specific topics, such as how to comply with the EU AI Act, with NIST, or with other rules and regulations. “I think there will also be a lot of attention on specific sectors, like governing AI in healthcare, or in financial services.”
Certifications like the AIGP are particularly valuable for consultants, agrees Steve Ross, director of cybersecurity for the Americas at S-RM Intelligence and Risk Consulting. “Our clients are feeling the uncertainty,” Ross tells CSO. “They would like to increase the use of AI, but nobody knows how to use it safely, securely, ethically — and they are looking for someone they can trust to do that.”
As a result, clients will be looking for these certifications over the next two or three years. “I don’t have one of these certifications, but am thinking of pursuing them,” Ross adds. The one he finds most interesting is the AIGP. “I attend IAPP events and appreciate that the community isn’t just focused on data privacy but the legal implications. That’s the certification that I’ll pursue first.”
Ross is also interested in the SANS AI Security essentials training, as he likes “the quality of content SANS puts out.” And he will consider certifications when hiring people. “I prefer folks who have a well-rounded skill set. A background in AI is fantastic, but so is an understanding of governance, risk and compliance.”
But not all firms find that having the certification itself is the most important thing. “Our reputation tends to precede itself,” EY’s Mellen tells CSO. “I’m more interested in folks having hands-on technical experience than having letters after their name in their LinkedIn profile. I’ve encountered a number of career certification folks in my 25 years working in this space. And sometimes it’s great to understand the textbook answer, but the textbook answer doesn’t always fly in reality.”
The cons of AI governance certifications
For critics of the new AI certifications, the space is just too new. “It is extremely important to have governance in place. But it’s also a place that needs to evolve more,” says Priya Iragavarapu, VP of data science and analytics at AArete, a management consulting firm.
Meanwhile, companies already have data governance, technology governance, and industry-specific risk management requirements, such as those in financial services. There are also specific technical certifications for individual cloud platforms, for machine learning, and for data security. “I would stick with the technical certifications for now,” she says. “Get the technical capabilities going, but the AI governance is still not there yet.”
“I’d put more value on someone’s experience with data governance than a certification in AI governance,” says Nick Kramer, vice president for applied solutions at SSA & Company, a management consultancy. “With these new skills, by the time you finish the course, they’ve probably already changed.”
Taylor Dolezal, CIO and head of ecosystems at Cloud Native Computing Foundation, says he’s appreciative of the efforts to create AI governance standards, but that the space is too new and changing too fast. “We’re still trying to figure out how to compose everything together. We haven’t had those standards come out yet.”
It’s too early to say what path an organization should follow to bring about the outcomes it wants to see.
Another problem is that certifications typically last for two or three years. The IAPP’s AIGP lasts for two. “My concern would be how long that certification is good for. The space is changing so fast,” Dolezal says.
“One of the basic premises of any certification is that it has to be a mature domain,” says Chirag Mehta, analyst at Constellation Research. “You cannot certify someone until you’re sure what it is. We’re not there yet. To some extent, it’s smoke and mirrors.” In his view, AI certifications don’t provide much value because the technology is changing — not on a monthly or weekly basis, but daily. “Our guidance to CISOs is that if you want someone who has been exposed to generative AI, a certification shows their potential to learn new technology and embrace what is coming,” Mehta says. “Take that as a positive signal but not as evidence that they know something.”
Industry should not wait for standards to stabilize
IAPP’s Hughes admits that AI governance is a moving target but says that waiting for the standards and best practices to crystallize is a silly argument. “There’s enormous risk in AI. We know there’s an enormous risk. Should we stop building governance controls? Should we stop training professionals? I don’t want to wait for a perfectly formed future to arrive. We need to exert good governance and control on AI from the outset, not at some point in the future when courts and other slow-moving public policy systems have made decisions. And you don’t need settled law to run an AI impact assessment. You don’t need a court to tell you that a particular outcome is discriminatory. You should be looking at that regardless of what the law says, because it’s good business and building trust and safety into these technological innovations allows them to move quickly and in a more stable way.”
Hughes says IAPP is trying to develop safety standards to allow this technology to be adopted faster, to move faster in a safe way. “If we launch AI with good assessments and controls, the technology will move more smoothly in society, and we’ll be able to accelerate more quickly into a positive beneficial future.”
That’s an attitude that Christopher Paquette, chief digital transformation officer at Allstate, completely agrees with. “It’s important to have those warning signs thought out. And to worry about that stuff before the bad stuff happens.”
Being thoughtful about responsible AI and paying attention to AI governance is important for companies today. “Whether it’s a certificate or something else, it is something we absolutely need,” Paquette says.