Stanford AI Index 2024 Report: Growth of AI Regulations and Generative AI Investment


Stanford University’s Institute for Human-Centered Artificial Intelligence (HAI) has published its 2024 AI Index annual report. The report identifies top trends in AI, such as 8x growth in Generative AI investment since 2022.

This year marks the seventh edition of the AI Index report, which is compiled by an interdisciplinary team in cooperation with government, industry, and academia. The report contains nine chapters, and the editors have distilled several key takeaways from the Index, including: that the number of AI regulations in the USA increased 56.3% in the past year; that model training costs, especially for LLMs, have increased “significantly” over recent years; and that despite growth in Generative AI investment, overall private investment in AI has decreased since 2021. The Index’s co-directors Ray Perrault and Jack Clark wrote:

The 2024 Index is our most comprehensive to date and arrives at an important moment when AI’s influence on society has never been more pronounced. This year, we have broadened our scope to more extensively cover essential trends such as technical advancements in AI, public perceptions of the technology, and the geopolitical dynamics surrounding its development. Featuring more original data than ever before, this edition introduces new estimates on AI training costs, detailed analyses of the responsible AI landscape, and an entirely new chapter dedicated to AI’s impact on science and medicine.

The report is organized into nine chapters: Research and Development, Technical Performance, Responsible AI, Economy, Science and Medicine, Education, Policy and Governance, Diversity, and Public Opinion. The Science and Medicine chapter is a new addition this year and covers the increasing role of AI models in scientific and medical research, calling out models such as DeepMind’s AlphaDev, which discovered a more efficient sorting algorithm. The report also notes a 12.1% increase since 2021 in the number of FDA-approved AI-related medical devices.

In the chapter on Research and Development, the report delves into the training costs of foundation models, particularly LLMs. Because “detailed information on these costs remains scarce,” the AI Index team collaborated with AI research institute Epoch AI to estimate them. The report includes a chart showing exponential growth in training cost over time, with Google’s original Transformer model estimated to have cost less than $1k to train, compared with recent models such as GPT-4 and Gemini, whose training is estimated to have cost tens to hundreds of millions of dollars.

Estimated Training Cost of Notable Models over Time. Image Source: 2024 AI Index Report
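
To put these estimates in perspective, a rough back-of-the-envelope calculation (not from the report; the endpoint figures below are approximations of the Index’s estimates) shows the compound annual growth implied by going from under $1,000 in 2017 to roughly $100 million in 2023. A minimal Python sketch:

import math

# Approximate endpoints based on the AI Index estimates (assumed figures):
# the original Transformer (2017) at under $1,000, and a 2023 frontier model
# at roughly $100 million.
cost_2017 = 1_000         # USD, upper-bound estimate
cost_2023 = 100_000_000   # USD, order-of-magnitude estimate
years = 2023 - 2017

# Implied compound annual growth factor g, where cost_2023 = cost_2017 * g**years
growth_factor = (cost_2023 / cost_2017) ** (1 / years)
print(f"Implied growth: ~{growth_factor:.1f}x per year")                                  # ~6.8x
print(f"Overall increase: ~{math.log10(cost_2023 / cost_2017):.0f} orders of magnitude")  # ~5

Under those assumptions, estimated training costs grew by roughly a factor of seven per year, or about five orders of magnitude overall, between the original Transformer and today’s frontier models.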

According to the report, this growth in training cost has “effectively excluded universities” from developing leading models. The report’s data shows that in 2023, industry labs produced 51 “notable” models, compared to 15 from academia; before 2016, by contrast, academia produced as many notable models as industry, or more. On the other hand, industry-academia collaborations created 21 notable models in 2023, an all-time high.

Notable Models by Sector. Image Source: 2024 AI Index Report

Artur Skowroński, editor of JVM Weekly Newsletter, wrote about the report on LinkedIn:

For someone who wants to understand what’s happening, but doesn’t have time to follow it continuously (and recently I had this type of thought; it is extremely difficult with this amount and pace of announcements, especially when you want to verify anything), it’s essential reading. 500 pages, but accessible and well organized: each topic is presented from a general overview down to the fine details.

The full report can be downloaded from the AI Index website. The report’s raw data and charts are publicly available on Google Drive. The report is licensed under the Creative Commons Attribution-NoDerivatives 4.0 International license.




