How to improve analytics maturity
Organizations face many challenges when adopting analytics to extract maximum value from their data infrastructure. It’s important to facilitate data literacy and to consider how employees and stakeholders with different technical or business expertise collect, process and reuse data. Improving analytics maturity is a combination of people, processes and technology.
“An advanced analytical capability can provide a strategic advantage in the form of real-time insights that enable and support better, faster decision making,” said Donncha Carroll, partner and chief data scientist at Lotis Blue Consulting.
Increasing maturity, meaning a higher level of capability across people, processes and technology, can improve business operations and results. Developing data assets, infrastructure and skills is a starting point for achieving higher performance levels. It is equally important to develop a culture of deriving insights from data rather than trusting only the intuition of a few individuals, which is fraught with risk.
What is analytics maturity?
Software developers use the capability maturity model to evaluate the evolution of the software development process. The five stages of maturity are initial, repeatable, defined, managed and optimized.
“Given that analytics deployment fundamentally involves software development, a similar framework can be adopted for an analytics maturity model,” said Ram Bala, associate professor of business analytics at the Leavey School of Business, Santa Clara University.
Analytics maturity is not synonymous with software maturity because the value propositions of analytics and software aren’t the same.
“An organization that is very mature in software development could be quite primitive in terms of its analytics deployment and usage,” he said.
An analytics maturity model can help an organization become proficient in using data and advanced analytics to drive business value, said Bharath Thota, partner at the consulting firm Kearney.
Analytics maturity escalates from reactive insights to more advanced forms of analytics that provide foresight and enable an organization to be more proactive. Thota finds it helpful to summarize each maturity level by its capability, illustrated in the code sketch after the list:
- Descriptive analytics looks at past data to figure out what happened and identify historical trends or patterns.
- Diagnostic analytics takes descriptive analytics results and provides insight into why it happened.
- Predictive analytics uses current and historical data to forecast activity, behavior and trends.
- Prescriptive analytics helps determine the best outcome or action to take based on forecasted possibilities.
- Cognitive analytics simulates human thinking by applying cognitive computing technologies to analytics. It uses AI and machine learning (ML) to automatically analyze large amounts of unstructured data and generate insights for the analyst.
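For readers who prefer code, here is a minimal sketch of how the first four levels might look against a toy monthly sales dataset. The figures, the discount variable, the forecast method and the decision rule are all hypothetical, and cognitive analytics is only noted in a comment because it typically applies AI and ML to large volumes of unstructured data. The sketch assumes Python 3.10+ for statistics.correlation and statistics.linear_regression.

```python
# A minimal, self-contained sketch of the first four analytics maturity levels
# applied to toy monthly sales data. All numbers and names are hypothetical.
from statistics import mean, correlation, linear_regression

months = list(range(1, 13))
sales = [100, 104, 102, 110, 115, 113, 120, 126, 124, 131, 137, 140]  # units sold
discount = [0, 0, 5, 0, 10, 5, 10, 15, 10, 15, 20, 20]                # % discount offered

# Descriptive: what happened?
print("Average monthly sales:", round(mean(sales), 1))

# Diagnostic: why did it happen? (here, a simple correlation check)
print("Correlation of discount with sales:", round(correlation(discount, sales), 2))

# Predictive: what is likely to happen next? (naive linear trend)
slope, intercept = linear_regression(months, sales)
forecast = slope * 13 + intercept
print("Forecast for month 13:", round(forecast, 1))

# Prescriptive: what should we do? (toy decision rule applied to the forecast)
action = "increase promotion budget" if forecast < 150 else "hold promotion budget"
print("Recommended action:", action)

# Cognitive analytics would go further, applying AI/ML to large volumes of
# unstructured data (text, images, logs) to surface insights automatically.
```

A real implementation would replace the naive trend line with proper forecasting models and the one-line decision rule with optimization, but the progression from describing the past to recommending an action is the same.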
It’s helpful to include measures of the efficiency, effectiveness, resiliency, privacy, ethics and adoption of analytics capabilities within an organization in an analytics maturity model, said Mark Carson, managing director at Protiviti.
Maturity is a sliding scale that can vary across organizations. It is also important to ensure that the ways of assessing key factors are flexible enough to accommodate the ever-changing analytics and AI landscape. Consider how the technical and functional aspects of analytics differ from software and security maturity models: analytics maturity should focus on the ability to deliver business value and ROI.
“What one organization considers low maturity may be the target goal of another,” Carson said. “The same usually cannot be said, for example, in the context of security, where it’s more of a yes or no scale.”
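As one illustration of how such measures could be rolled up, here is a small, hypothetical scorecard sketch. The dimension names come from Carson’s list above, while the 1-to-5 scale, the weights and the scores are assumptions invented for illustration, not a Protiviti methodology.

```python
# Hypothetical maturity scorecard: score each dimension on a 1-5 scale,
# weight it by business priority and roll up a single maturity score.
# Dimension names come from the article; weights and scores are invented.
weights = {
    "efficiency": 0.20, "effectiveness": 0.25, "resiliency": 0.15,
    "privacy": 0.15, "ethics": 0.10, "adoption": 0.15,
}
scores = {  # 1 = initial, 5 = optimized (example self-assessment)
    "efficiency": 3, "effectiveness": 2, "resiliency": 4,
    "privacy": 3, "ethics": 2, "adoption": 1,
}

overall = sum(weights[d] * scores[d] for d in weights)
print(f"Weighted maturity score: {overall:.2f} out of 5")

# List dimensions from weakest to strongest to highlight where to invest next.
for dim in sorted(weights, key=lambda d: scores[d]):
    print(f"  {dim}: {scores[dim]}/5")
```

Because maturity is a sliding scale, each business area could keep its own weights and target scores rather than chasing a single organization-wide number.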
Five maturity levels is a relatively arbitrary number, arising mostly from the fact that humans have five fingers on each hand. The analytics consultancy DAS42 uses a six-stage maturity model that starts with a totally data-blind organization, moves on to data awareness and ultimately culminates in an effective data-driven organization, said Teresa Kovich, principal consultant at DAS42.
Kovich likes to compare the maturity model to Maslow’s hierarchy of needs from the world of human development. The basic idea is that people can’t fully actualize their potential until they have a solid foundation. In DAS42’s approach, the focus is on outcomes rather than inputs. “Just because you’re doing machine learning doesn’t automatically mean you’re higher on the spectrum,” she explained.
Challenges of improving analytics maturity
Developing analytics maturity requires commitment and persistence. Even if the appropriate technical systems are in place, it’s important to foster a culture that enables analytics maturity efforts to succeed. Engaging stakeholders with the tangible value of analytics and growing the data literacy of employees determine how successful maturity efforts are.
Agility
Organizations should take an agile approach to assessing their analytics maturity, Carson said. They must assess specific business areas independently because different areas may have different definitions of maturity. Those areas can include employee skills, data literacy, cost, security, privacy and whether to rent or buy analytics capabilities. Individual aspects of analytics and AI should also be assessed separately because they can mature at different paces.
Balancing time and investment
People often want the shiny new tech, but don’t always want to spend the time it takes to get it right, Kovich said. Many DAS42 clients try to cut corners on the tools they use or build in-house. Understanding data from end to end is critical to avoid wasting money on new analytics tools or processes that don’t deliver value.
Stakeholder engagement
Analytics maturity requires top-to-bottom engagement. It’s important to include frontline employees in the analytics maturity journey.
“If your team members — not just your executives, but decision-makers and day-to-day players across your business — don’t believe in the value of data, your organization is going to be limited in what it can accomplish,” Kovich said.
Overreliance on tools
Many organizations rely too heavily on tools for operations such as governance and cataloging, said Kovich. Tools are useful, but building a data-driven organization requires significant human effort.
“There’s no technology that can replace the difficult and immensely gratifying work of talking about your company, your processes, your definitions, your measurements and your goals,” she said. Organizations should establish a center of excellence and education programs to bring the human element into data programs.
Skills
One of the main challenges organizations have is a lack of skilled professionals who can effectively use data analytics tools and techniques, said Robert Parr, chief data officer at KPMG US advisory. Lack of skills can be an issue when there is a shortage of qualified candidates in the job market, a lack of training and development opportunities for existing staff, or difficulty in retaining top talent.
Parr recommends organizations invest in training and development programs for their teams, partner with educational institutions to develop new talent pipelines and create a culture of continuous learning and development.
Data literacy
Another challenge organizations face is a lack of data literacy among their employees. If employees don’t understand how to interpret and analyze data, it becomes difficult to effectively use data to drive business decisions. Organizations can improve employee data use through data literacy training programs. They can also develop data visualization tools and foster a culture of data-driven decision-making, said Parr.
Regulatory compliance
Organizations may face challenges in ensuring that their data analytics practices comply with relevant regulations and standards, particularly in industries such as healthcare and finance. Parr said data analytics leaders must work closely with legal and regulatory experts to ensure their practices align with relevant requirements.
Starting too big
Sometimes, data leaders don’t understand how limitations in their current data infrastructure constrain analytics programs. As a result, they enthusiastically embark on large-scale programs that fall apart because high-value data assets are not stored, structured or made available for use by different teams, Carroll said.
He finds that business leaders are often unaware of, or don’t fully appreciate, these more abstract constraints. They can get frustrated when the team cannot move quickly and easily to extract value from sales, customer or operational data.
Building a compelling narrative
Data leaders don’t always effectively communicate analytics benefits to executives and boards in a way that drives investment. It is important to make a clear connection between investments in data infrastructure and the top or bottom line. Data strategists and practitioners must build a compelling narrative around data benefits that can serve different needs simultaneously. Without a compelling narrative, leadership may be skeptical of large investments that can take years to pay off.
“I believe you need to build trust in leadership by demonstrating success with smaller investments that deliver meaningful results in a few high impact use cases on a shorter timeline,” Carroll said. “Then, you need to market those successes internally far and wide.”
Slower development
It’s also important to consider how analytics maturity may progress more slowly than software development, said Bala. Software projects can track immediate gains in transactional efficiency. In contrast, the value of analytics projects is more strategic and accrues over a longer time horizon.
“This makes investment in analytics quite sporadic and evolutionary progress quite slow,” he said.
Technology evolves quickly
Analytics technology changes at a rapid rate.
“Just when organizations were getting comfortable with basic descriptive analytics, they encountered the data science revolution, which has now been supplanted by the generative AI wave,” said Bala.
Although progress may seem fast, it is not the same as evolution, which is about moving toward increased standardization. Fast changes in technology might inhibit the process of standardization.
The future of analytics maturity
Carson expects analytics maturity assessment to become more frequent, agile and focused on specific areas of the business to drive ROI. New analytics and AI tools could improve processes for capturing, integrating and analyzing unstructured data. Advanced tools can suggest correlations and causations from raw data and support the generation of high-quality, usable output.
Despite data scientists’ excitement around ML and generative AI, organizations don’t necessarily get as much value out as they put in. Kovich recommends focusing on larger opportunities in data sharing, data applications and smart strategies, such as user segmentation and feature testing.
Maturation should include evolution in the work different roles perform and in how people engage more effectively with data and technology, Carroll said. As computational resources become less expensive, organizations should place increasing focus on their application in a human context. They must acknowledge that technology is just one piece of the answer and do a better job of designing analytical systems around the user.
“We need to think about how these new tools and methods fit into broader business systems that are governed by humans and shaped by human behavior,” Carroll said.
George Lawton is a journalist based in London. Over the last 30 years, he has written more than 3,000 stories about computers, communications, knowledge management, business, health and other areas that interest him.