
Boards Need to Weigh Risk and Value With Generative AI Deployment


The generative artificial intelligence era is here to stay, as both a table-stakes technology and a transformative differentiator in the marketplace. For boards, AI proliferation has opened a landscape of business opportunities and complex risks, and it is prompting questions about how to prepare corporate boards to govern this new terrain effectively.

While some enterprises are in the early stages of considering generative AI adoption, others are developing a strategy for implementation. Still others are already deploying generative AI use cases throughout the organization.

The board’s role is to protect the brand and help the organization flourish into the future. But board members must look beyond the technology’s base value and take a broader view of risk, reputation, and long-term value creation. The challenge is to leverage experience and business acumen to steer strategy and deployment decisions as this technology permeates the organization.

Boards have a proven track record of successfully guiding technology adoption through periods of change while weighing impacts on all stakeholders—and humanity at large. This has positioned boards to help determine generative AI guardrails.

Boards have also had to learn how to calibrate messaging about generative AI so that inflated expectations give way to a more grounded vision. In collaboration with management, boards should adopt a strategic and intentional mindset, particularly as it relates to structure, skills development, and trust in the technology.

Oversight Structure

How boards will oversee and govern generative AI is a foundational question and partially depends on where an organization is in its generative AI adoption and maturity.

Boards should consider whether and how generative AI, like other technologies and enablers, is embedded in an organization’s enterprise strategy, including short- and long-term operating plans and enterprise risk management. They should also understand how generative AI’s pervasiveness will affect strategic and operational planning, as well as how the management team is organizing around it.

When determining responsibility, boards will need to assess which aspects of generative AI oversight and governance are relevant for the full board (generally the more pervasive topics) and which are more appropriate for a committee. If the latter, it’s crucial to determine which committee, and how oversight will be shared when some topics transcend committees.

Emerging priorities often fall to the audit committee because it oversees process, policy, and financial risk. But the audit committee doesn’t own all risk oversight. For generative AI efforts, consider whether the audit, risk, technology, or compliance committee (or some combination) is the right fit, or whether establishing a generative AI-specific committee is appropriate.

Stay flexible during this period of change. What is appropriate and strategic today may shift over time. Boards must commit to an adaptable approach that meets the organization’s short- and long-term needs.

Skills and Qualifications

One of a board’s greatest contributions is lending an outside-in perspective to help calibrate what management encounters in day-to-day operations. As a starting point, develop a matrix pairing business strategy with existing skills and experience to reveal gaps in qualifications. The approach to filling those gaps may take different forms.

Establishing AI literacy among board members is crucial. Tactics such as learning modules, self-directed education, and expert-led briefings can help board members develop the working knowledge of generative AI they will need to be effective in their oversight and decision making.

As enterprise strategy and needs change, boards might consider retiring an existing member to create an open seat at the decision-making table. Alternatively, boards may see expanding the board as the best approach.

No matter the approach, it’s important to consider the relevancy and timeliness of all board members’ expertise. Boards should take care to confirm that current and prospective members are properly educated on generative AI and can capably uphold their oversight responsibilities.

Maintaining Trust

Trust is one of the major barriers to large-scale generative AI deployment, and it is directly related to generative AI risk. Each use case is unique and needs to be assessed on its own merits and function. To do this successfully, a common ethics architecture should be incorporated into the integrated strategy.

A nuanced assessment of AI risk supports governance that enables the organization to use and manage AI in a trustworthy way. It informs guardrails such as documented accountability, human validation that draws on emotional intelligence and creativity, and assessment and communication channels that maintain responsible deployment.

Ultimately, boards must consider the human element and generative AI’s impact on people. If the technology is approached as a tool to replace humans rather than to support and enhance their work, it will erode trust and limit effective use, in turn hindering business value.

Boards with a deep understanding of an organization’s purpose and the intended value of generative AI can help propel a human-centric approach. Diverse perspectives and lived experiences matter: a diverse group of business leaders (and board members) is best positioned to consider generative AI’s range of outcomes and societal impact.

By focusing the vision for generative AI on human trust and net benefits for society, boards can help their organizations maximize business benefit, mitigate enterprise and societal risks, and promote generative AI value that is equitable and sustainable.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Lara Abrash is chair of Deloitte US. She leads the board of directors in governing all aspects of the organization.



