People don’t trust the news media to use generative AI responsibly, RISJ finds


A new report from the Reuters Institute offers up insight into how readers around the world are thinking about the place of generative AI in their news. The study cast a wide net, surveying the general public in six countries: Argentina, Denmark, France, Japan, the U.K., and the U.S.

There has been much debate about the real impact generative AI has already had on news publishing. But among the public, there is a relatively strong perception that generative AI is already in wide use, and that it will only continue to change the industry.

On average across the six countries, a majority of respondents said journalists are currently using generative AI "always" or "sometimes," "with some human oversight." That held true across 11 different editorial tasks, including writing the text of an article, creating AI-generated images to use in place of photographs, doing data analysis, copyediting, and writing headlines.

Looking ahead, 66% of respondents also said they expect generative AI will have a “very or somewhat large impact” on the news media industry.

The study’s outlook is bleak, though, when it comes to perceptions of whether generative AI will make the news media better or worse. Scientific research, healthcare, and shopping all received largely positive responses. News and journalism, meanwhile, received the second-least-positive outlook of the 14 categories, sandwiched between "equality" and "job security."

Despite that overall pessimism, comfort levels with AI adoption change substantially depending on what task journalists are using it for. The study got granular, breaking the editorial process down into its component parts. People were, on net, most comfortable with AI tools being used for basic copyediting (+38) and for translating stories into other languages (+35). More people were also comfortable than uncomfortable with using AI to write headlines (+16) or to produce an audio or video version of a written article (+15).

When it comes to producing the core elements of a story, though, that sentiment begins to change. The net score fell to -1 for using AI tools to write the body of an article, -13 for using an AI-generated image when a photograph is not available, and -24 for creating an artificial author or persona to write and present the news.

Those comfort levels also varied greatly by beat. People were relatively comfortable with generative AI being used to produce stories on fashion, sports, and arts and culture. That changed substantially for topics like international affairs and, especially, politics.

Whether or not respondents were comfortable with stories written using AI tools, there was an overwhelming view that AI-generated news isn’t as deserving of their dollar, yen, or euro. Only 8% of people thought that news produced by AI is "more worth paying for" than news produced by a human.

Respondents also tended to think that news produced using AI "with some human oversight" would be cheaper to make; 33% said so, on average across the six countries. That’s one window into the perceived intention behind AI adoption in news media. From the outside, readers appear to think a driving factor in bringing generative AI into the newsroom is cost cutting.

News publishers who adopt AI risk eroding the already delicate trust they’ve built with their readers. Respondents thought that news produced using AI would be, on net, less trustworthy.

Across 12 different sectors, news media was also, on average, one of the least trusted to use generative AI responsibly. The share who "strongly" or "somewhat" trusted the news media on this point ranged from 12% in the U.K. to 30% in Argentina. In most countries, it ranked above only "the national government," "social media companies," and "politicians and political parties."


