Harnessing AI and Analytics for Advanced Procurement Strategies
In today’s tumultuous business landscape, where price volatility, geopolitical tensions, and sustainability imperatives converge, one function stands at the forefront of navigating these complexities: procurement. In this environment, procurement will act as a strategic lever for both protecting and creating value.
Procurement sits at the confluence of huge quantities of data, flowing from within the organization (for example, spend, demand patterns, specifications) and from without (suppliers, market insights databases, and the wider web). Today’s digitized, connected organizations must tap into this data and develop new tools to make faster, better sourcing decisions. Mastering the data will empower procurement teams to achieve strategic objectives that go far beyond traditional cost, quality, and delivery metrics.
What data can deliver
Better data can support activities and decisions across the sourcing life cycle, from the development of category strategies and the assessment of potential suppliers to the execution of negotiations and ongoing supplier performance management. Done well, this can increase the pipeline of value creation initiatives by up to 200 percent. Let’s look at five areas where data will have the most impact:
- Optimizing spend and demand. AI and generative AI (gen AI) technologies can automate and accelerate category management in multiple ways. First, spend categorization algorithms can create cleaned spend cubes seamlessly (a categorization sketch follows this list). Second, demand forecasting and optimization will see a step change in accuracy, making the control and optimization of sourcing, demand, and supply chains far more effective. Third, gen AI interfaces allow procurement leaders to interrogate spend, market, or specifications data, asking, for example, what share of spend is exposed to a particular climate or geopolitical event, how much should-cost will rise with oil price fluctuations, or which alternative sources could replace a supplier in distress. For standardized items in highly competitive markets, such as transportation or temporary labor, buyers need not intervene at all: bots can make trade decisions autonomously based on predefined objective functions. Gen AI can also automate contract generation for a category, as some airline companies are starting to do at scale. Another disruption will touch supply market analysis and strategy optimization, identifying sources that expose the company to high risks and automatically finding alternatives within a defined price range and lead time. Finally, companies can use machine learning to analyze usage patterns and forecast demand, automatically generating sourcing scenarios and strategies based on inventory-level intelligence, especially for frequently bought items.
- Managing external drivers of profitability. Procurement teams will be able to combine internal data with external market reports and databases, using machine learning algorithms to uncover patterns and trends in commodity prices. Chief procurement officers (CPOs) and category managers will rely on such predictions to stay at the forefront of industry profitability, with real-time transparency on their exposure to price volatility. They will be able to dynamically compute the should-cost of their most volatile commodities and negotiate with suppliers based on facts (see the should-cost sketch after this list). Procurement will also be able to assess the impact of any variation in input prices on product margins and analyze multiple scenarios to define the right actions to protect those margins. Reactive actions may include shifting to alternative approved recipes or value chains, adjusting planning or stock levels, financially hedging commodities, or passing price changes on to customers. For example, a consumer company developed a price forecasting model, combined with a hedging optimizer, that recommends each month the volume of palm oil derivatives to hedge. Mid- and long-term actions may include optimizing the contracting strategy by adjusting contract lengths and start dates in negotiations with suppliers (for example, spot buying versus fixing prices for a few months), redefining the supplier regional footprint to take advantage of price differences across countries, or obtaining cost advantages through strategic agreements with selected tier-n suppliers, including partial acquisition of, or vertical integration with, value chain players capturing disproportionate margins.
- Managing supplier performance. Digital dashboards can combine contract, invoice, and supplier delivery performance data to provide a comprehensive picture of supplier adherence to service-level agreements. By providing early warnings of performance deviations, these systems can steer operational interventions or supplier collaboration projects. Parametric cleansheet tools can automatically calculate the should-cost of thousands of items, helping companies optimize specifications and conduct robust, fact-based negotiations with their suppliers (a cleansheet sketch appears below the list). Gen AI is changing supplier management too, with the emergence of automated tools that can produce smart intelligence on supplier risk profiles from public data such as social media reports. Gen AI can also help procurement personnel optimize their dialogue with suppliers, for example by automating the creation of negotiation scripts, reports, emails, and contracts.
- Managing supply risks. By 2030, best-practice procurement functions will be equipped with a digital twin of their supply chain, modeling all nodes across the globe: sub-tier suppliers of raw materials, direct suppliers, the internal manufacturing network, customers, and the logistics channels connecting them. These digital twins will be developed by combining two approaches: material flows from tier-one suppliers will be mapped in collaboration with those suppliers, while web data mining will close data gaps by building a picture of flows from tier two to tier-n. Each node will have a near-live view of the associated supply risk, cost, and carbon intensity, computed with logic that filters signals by their actual risk and the availability of mitigation actions. Procurement will not only have a thorough understanding of the present state of its supply chain but will also be able to simulate risk levels given forecast business growth, the occurrence of risk events, and the effect of mitigation actions (a toy risk-propagation sketch follows the list). The digital supply chain twin will allow procurement to assess the implications of any change or disruption in highly complex, interconnected value chains and to react to adverse events much earlier. Teams that adopt this technology will react faster than their peers to changes in supply signals and, as a result, have the right product in the right place at the lowest cost and carbon footprint.
- Leading on sustainability. Sustainability-centric data and effective analytical tools are vital as companies strive to meet demanding goals for carbon reduction, pollution prevention, and the elimination of unfair labor practices in the supply chain. On environmental sustainability, expenditure-based procurement data can be used to estimate a baseline for carbon emissions within the supply chain, an approach many companies have adopted to evaluate their upstream Scope 3 emissions (a spend-based baseline sketch follows the list). Enhancing bilateral carbon transparency between suppliers and customers requires additional effort, however, including collaboration with suppliers to establish product-level emission reporting (for instance, consumption-based emission factors) and sharing standards through the existing procurement data infrastructure. Integrating these sustainability metrics with procurement data and advanced analytics can help companies engage with, evaluate, and monitor the sustainability progress of their suppliers, and ultimately reduce supply chain emissions by selecting products and suppliers that align with their sustainability goals.
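To make the first of these ideas concrete, here is a minimal sketch of AI-assisted spend categorization: a text classifier that maps free-text invoice lines to spend categories, from which a cleaned spend cube can be assembled. It assumes scikit-learn is available; the invoice lines, categories, and model choice are illustrative placeholders, not a reference implementation.

```python
# Sketch: classify free-text invoice lines into spend categories with
# TF-IDF features and logistic regression. All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A few labeled invoice lines; in practice, thousands of historical,
# manually categorized records would be used for training.
lines = [
    "road freight rotterdam to lyon, 24t",
    "contract driver, 3-week assignment",
    "palm oil derivative, bulk, cif",
    "ocean container 40ft shanghai-hamburg",
]
categories = ["transportation", "temporary labor", "raw materials", "transportation"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(lines, categories)

# Newly arriving, uncategorized spend flows straight into the spend cube.
print(model.predict(["rail freight antwerp to milan"]))  # expected: ['transportation']
```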
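The should-cost logic behind managing external drivers of profitability reduces, in its simplest form, to propagating commodity index moves through a product’s cost breakdown. The sketch below does exactly that; the cost shares, base cost, and index moves are invented placeholders, not market data.

```python
# Sketch: propagate assumed commodity price moves into a product's
# should-cost. All shares and figures below are invented.
cost_breakdown = {            # share of unit cost by input
    "palm_oil": 0.35,
    "packaging_resin": 0.15,
    "energy": 0.10,
    "labor_and_overhead": 0.40,
}
index_moves = {               # assumed price change per input
    "palm_oil": 0.12,         # +12 percent
    "packaging_resin": -0.04,
    "energy": 0.08,
    "labor_and_overhead": 0.02,
}

base_cost = 2.40              # current unit cost, EUR
delta = sum(share * index_moves[k] for k, share in cost_breakdown.items())
print(f"should-cost: {base_cost * (1 + delta):.2f} EUR ({delta:+.1%} vs. base)")
```

A fact-based negotiation then starts from this number: if a supplier asks for more than the computed increase, the gap becomes the discussion.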
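Parametric cleansheets apply the same logic at item level, deriving should-cost from physical and process parameters instead of quoted prices. The sketch below is a toy version; the rates, scrap, overhead, and margin factors are illustrative assumptions, not industry benchmarks.

```python
# Sketch: a parametric cleansheet estimating an item's should-cost from
# its parameters. All rates and factors below are invented.
def cleansheet_cost(weight_kg, material_eur_kg, cycle_time_s,
                    machine_rate_eur_h, scrap_rate=0.03,
                    overhead_factor=0.18, margin_factor=0.08):
    material = weight_kg * material_eur_kg * (1 + scrap_rate)
    conversion = (cycle_time_s / 3600) * machine_rate_eur_h
    return (material + conversion) * (1 + overhead_factor) * (1 + margin_factor)

# Run across thousands of items from a bill-of-materials extract, the
# output anchors negotiations against quoted prices.
print(f"{cleansheet_cost(0.42, 2.10, 38, 55):.2f} EUR per unit")
```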
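The risk logic of a digital supply chain twin can be caricatured as propagating risk scores through a multi-tier graph. The sketch below assumes a simple max-propagation rule; the node names and scores are invented, and a production twin would use far richer logic covering lead times, inventories, and mitigation options.

```python
# Sketch: propagate supply risk through a toy multi-tier value chain.
# Node names, scores, and the max rule are illustrative assumptions.
suppliers_of = {                 # node -> the upstream nodes it depends on
    "plant_eu": ["tier1_a", "tier1_b"],
    "tier1_a": ["tier2_mine"],
    "tier1_b": ["tier2_mine", "tier2_chem"],
}
own_risk = {"plant_eu": 0.1, "tier1_a": 0.2, "tier1_b": 0.1,
            "tier2_mine": 0.7, "tier2_chem": 0.3}

def exposed_risk(node):
    # A node's exposure is the worst of its own risk and its suppliers'.
    upstream = [exposed_risk(s) for s in suppliers_of.get(node, [])]
    return max([own_risk[node]] + upstream)

print(exposed_risk("plant_eu"))  # 0.7, driven by the tier-two mine
```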
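Finally, a spend-based Scope 3 baseline is, at its core, a multiplication of category spend by spend-based emission factors. The figures in the sketch below are invented; real factors come from sources such as environmentally extended input-output (EEIO) databases or, better, supplier-specific reporting.

```python
# Sketch: a spend-based upstream Scope 3 baseline. All figures invented.
spend_by_category = {          # annual spend, EUR millions
    "road freight": 48.0,
    "packaging": 120.0,
    "chemicals": 75.0,
}
emission_factor = {            # tCO2e per EUR million of spend (assumed)
    "road freight": 610.0,
    "packaging": 380.0,
    "chemicals": 520.0,
}

baseline = {c: s * emission_factor[c] for c, s in spend_by_category.items()}
print(f"upstream Scope 3 baseline: {sum(baseline.values()):,.0f} tCO2e")
```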
The data-driven procurement revolution is already under way. The World Economic Forum (WEF) Global Lighthouse Network is a diverse community of manufacturers united by their leadership in the use of Fourth Industrial Revolution technologies to transform factories, value chains, and business models. Two recent members of the network earned their place by transforming their procurement operations with digital and AI technologies.
Pharmaceutical company Sanofi applied should-cost modeling to inform make-versus-buy design across multiple categories, achieving an average 10 percent reduction in spend. An advanced analytics platform reduced the time required to evaluate tenders by two-thirds, and digitally enabled negotiations helped it increase the savings achieved by 281 percent.
Teva Pharmaceuticals used analytics-driven procurement supported by spend intelligence and an automated spend cube to achieve a more-than-tenfold improvement in supply resilience. The company’s smart spend category creation systems cut the time required to develop category strategies by 90 percent. Teva’s global procurement unit, based in Amsterdam, is the main contributor to its ambitious gross margin improvement program.
Significant challenges remain
Senior procurement leaders understand that more-effective use of data is vital. Yet many procurement functions still struggle to transform themselves into data- and technology-driven organizations. In a recent McKinsey survey, CPOs highlighted three key problems that they believe are holding back their digital ambitions: issues with data quality and access, lack of clarity over the business case for new digital or AI applications, and difficulty driving adoption of the new tools at scale.
Data quality and access challenges. CPOs expect data, analytics, and gen AI to play a core role in every business decision by 2030, but respondents to our survey admit that their data infrastructure is not ready to support this ambition. Twenty-one percent say their data infrastructure maturity is low, with less than 70 percent of spend data stored in one place. An additional 30 percent think they have average levels of data maturity, and even those who have implemented systems to give them a single source of truth for all spend data admit that this data is not cleaned and categorized. These systems may also lack important information from outside the procurement function, such as quality or specification data, or external data from suppliers, customers, and the wider market.
Difficulty articulating the business case. Procurement teams also find it difficult to secure funding for analytics and AI projects, often due to the lack of a compelling business case. This challenge is typical in organizations that follow a “technology-back” approach, that is, selecting software and solutions without a clear link to business value creation opportunities.
Low levels of adoption. Organizations that overcome the first two challenges often run into the third. Even when they have built a business case and proved the effectiveness of a digital use case in tests, they find it difficult to embed its use in their core processes and teams’ ways of working across the organization. This is a common challenge in data analytics transformations regardless of business function, leaving many organizations stuck in pilot purgatory. It is especially common in procurement, where teams are often focused on delivering quarterly results, or swamped by short-term obligations, and do not take the time to understand and adopt new technical solutions.
The recipe for success at scale
Scaling a data analytics transformation is what differentiates Lighthouses from the rest of the pack. Companies that have achieved this tend to get a few vital things right. They focus on a small number of high-value digital and AI use cases. They build and, crucially, own a robust data infrastructure to support those use cases. And they spend as much time on people and processes as on technology: adapting their core business processes and operating model, upskilling and reskilling their people, and steering change across the organization.
A focus on high-value use cases
Many procurement organizations have road maps that aim to deploy fifteen or more data products every year, including the rollout of complex data architecture and end-to-end suites. In our experience, however, organizations’ capacity to test, validate, industrialize, and scale such a volume of technical solutions is limited. The same is true of the IT function’s capacity to steer and industrialize new tools, and of category and buying teams’ ability to integrate those tools while delivering on their annual business objectives.
Companies that have been able to scale analytics successfully have focused their annual road maps on a prioritized set of five or six technical solutions, selected on the basis of the value potential each can create and how well each addresses core business questions and users’ needs. The Pareto principle applies: a handful of data products delivers 60 to 80 percent of the value at stake, turning investments net positive within eight to twelve months.
One WEF Lighthouse organization prioritized six use cases for its procurement analytics transformation: category analytics, parametric cleansheets, predictive pricing, and digital trackers to monitor input costs, supplier performance, and supply risks. With just those six use cases, the company doubled the value creation opportunities identified by the procurement function.
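As a caricature of this prioritization logic, the sketch below ranks candidate use cases by estimated value and keeps only what the organization can absorb in a year. The use case names echo the example above, but every value figure and the capacity number are invented placeholders.

```python
# Sketch: pick the handful of use cases that carry most of the value.
# All value estimates below are invented placeholders.
candidates = [                   # (use case, estimated annual value, EUR m)
    ("category analytics", 9.0), ("parametric cleansheets", 7.0),
    ("predictive pricing", 6.0), ("input-cost tracker", 5.0),
    ("supplier performance tracker", 4.0), ("supply risk tracker", 3.0),
    ("contract drafting", 2.5), ("tail-spend bot", 2.5),
    ("invoice OCR", 2.5), ("spec search", 2.5),
]
capacity = 6                     # use cases the teams can scale per year

roadmap = sorted(candidates, key=lambda c: c[1], reverse=True)[:capacity]
share = sum(v for _, v in roadmap) / sum(v for _, v in candidates)
print(f"top {capacity} use cases capture {share:.0%} of estimated value")
```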
A dedicated data platform, domain, and technical resources
An encouraging number of companies have started to disrupt the way data transforms procurement by enriching their spend data with a mix of AI-powered data categorization and rigorous master data management practices. This allows them to build data models that integrate a comprehensive set of relevant data sources, both internal and external, such as market insight databases.
Leading organizations create their own procurement data model with a dedicated team. Rather than attempting to fix all data at once, leaders focus on the data needed for high-priority use cases and work backward from there. This ensures that every data component processed creates value for the organization, rather than ingesting diverse data sets before evaluating their uses. And while systems exist to cleanse data, strengthening data governance processes remains a priority.
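A use-case-first data model can start as small as the sketch below: typed records covering only the fields the priority use cases need, with quality, specification, and market data joined in as later use cases demand them. All class and field names are illustrative assumptions.

```python
# Sketch: a minimal procurement data model, use cases first. Names are
# illustrative; the real model is built with IT on the data platform.
from dataclasses import dataclass
from datetime import date

@dataclass
class Supplier:
    supplier_id: str
    name: str
    country: str            # enables risk and footprint analyses

@dataclass
class SpendTransaction:
    po_number: str
    supplier_id: str        # join key to the supplier master
    category: str           # from AI-assisted categorization
    amount_eur: float
    order_date: date

tx = SpendTransaction("PO-1042", "S-007", "road freight", 18_500.0, date(2025, 3, 2))
print(tx.category, tx.amount_eur)
```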
Equally important is partnership with the IT function from the beginning. Driving a holistic analytics transformation is as much a technology transformation as a business practices transformation. It requires the support of the company’s best technical architects and engineers to succeed. Partnering with IT and digital at the outset is critical to ensure all design choices are made in line with best practice, and to get sufficient technical capacity and expertise to build the required data models and pipelines.
Putting users at the center
Considering user-centricity from day one is crucial to driving fast adoption throughout the life cycle of a data product. This starts by ensuring the needs, pain points, and preferences of procurement professionals are understood early on. Unnecessary features that drive complexity and ultimately degrade the user experience must be avoided. In addition, the user interface needs to be intuitive and easy to navigate, ensuring that users can quickly grasp the functionality and benefits. It is also important to consider existing procurement systems and processes so that new data products integrate seamlessly, reducing disruption for users and the amount of change management required.
Meanwhile, communication and training are key to the successful launch of data products. Clearly articulating how new data products can enhance efficiency, streamline processes, and improve decision making motivates early adoption. This communication needs to be bidirectional, actively listening to users, incorporating their input, and upgrading features, while conveying a clear change narrative from CPOs and their leadership teams.
Talent and skills
Procurement teams don’t typically have enough people skilled in data, analytics, and AI to support their digital ambitions. Our survey highlighted a direct correlation between an organization’s level of digital advancement and its share of analytics resources: best-in-class companies place 22 percent of procurement employees in analytics teams. To scale, companies will need to invest in expanding their pool of data talent, whether by hiring data-savvy profiles externally or by reskilling existing teams.
Track impact and manage performance
Leading organizations invest resources to keep their data transformations on track. Typically, they set up a transformation office that monitors progress against the initial road map and tracks impact delivery rigorously, raising flags when solutions underdeliver. This allows procurement to step back, analyze what is not going as planned, and course correct. One WEF Lighthouse tracks the value created by each data product and its number of users on a weekly basis, reviewing progress with the CPO monthly and addressing pockets of resistance early. In the same organization, the impact each data product delivers on categories is recorded in the value tracking system, ensuring full transparency on which data products work and can be scaled, and which are stalling and require support.
We have seen a leading company interrupt the development of a data product that required heavy manual work to compile product specification data, and for which neither the organization nor the technical architecture was ready. In another example, a leading consumer packaged goods player engaged in developing digital design-to-value capabilities identified early on that business partners had low ownership of the value creation ideas it generated. Stepping back, it adapted its approach by bringing the business to the core of solution deployment.
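In practice, the weekly tracking described above can begin as a simple aggregation of value and usage per data product. The sketch below uses pandas and invented figures; a flat user count and zero recorded value are exactly the early signals that flag a stalling product.

```python
# Sketch: weekly value-and-adoption tracking per data product, as a
# transformation office might run it. All figures below are invented.
import pandas as pd

log = pd.DataFrame({
    "week":         ["W14", "W14", "W14", "W15", "W15", "W15"],
    "data_product": ["cleansheets", "predictive pricing", "risk tracker"] * 2,
    "active_users": [34, 12, 5, 41, 15, 4],
    "value_eur_k":  [220, 90, 0, 310, 120, 0],
})

weekly = log.groupby(["data_product", "week"]).sum()
print(weekly)  # shrinking users and zero value flag the risk tracker early
```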
From vision to transformation
Turning the procurement function into a data-driven, AI-enabled organization is a transformation that typically takes six to eighteen months to deliver a step change. Every successful transformation requires vision, ambition, and sustained commitment from senior leadership. It also depends upon teamwork, engagement, and excitement from across the organization. Any CPO embarking on such a journey should begin as they mean to continue: by collaborating with internal and external stakeholders.
As a first step, engage with stakeholders across the business to understand what they need from a high-performing procurement function, and where their major pain points are today. That will help procurement get its priorities right, allowing it to identify the data products that will deliver significant value quickly. Technology partners will be a second key group of collaborators. They include the organization’s internal IT function and external suppliers of data platforms, AI technologies, and analytical tools.
Armed with a clear picture of business needs and potential solutions, procurement can revise its technology road map. It should do so with twin objectives in mind: implementing value-creating AI and analytics solutions early, while building the foundations of a data platform that will meet the organization’s long-term needs. Quick wins from the first use cases are critical in building momentum for the transformation. By showing end users and business leaders what data and analytics can deliver, they help foster excitement and drive engagement across the organization.
And, as they begin to deploy those high-priority solutions, procurement leaders should keep another group of collaborators front of mind: the procurement teams who translate data-driven insights into value for the business. Focusing on the adoption of AI technologies from day one helps procurement build solutions that work better, scale faster, and create more value for the organization.