AI Evaluations: What Readers Are Telling Publishers


ChatGPT turned a whole 1 year old last week, and proud parent OpenAI rounded out that inaugural year in the spotlight with enough boardroom drama to fast-track a Netflix series.

In that first year, study after study has come out revealing how individual businesses and entire industries are either tiptoeing or cannonballing into the AI waters. I’m of the belief that publishers have much to gain from harnessing a wide variety of AI’s current capabilities, and with the launch of Google’s Gemini, I can’t wait to see what 2024 has in store for the technology.

Within the confines of journalism, however, acceptance still seems to elude AI.

A recent report from JournalismAI at the London School of Economics and Political Science (LSE) said more than three-quarters of news organizations are using AI in “at least one of the areas across the news value chain of news gathering, production, and distribution.” But it seems any real-world example of AI use in the newsroom — whether it was CNET publishing stories it later had to correct or, more recently, Sports Illustrated attaching AI-generated authors to articles it later deleted — is met with immediate backlash from insiders and readers alike.

One experiment, conducted jointly by researchers from the University of Minnesota and the University of Oxford, examined how audiences perceive AI-generated news. “On an 11-point trust scale,” Nieman Lab’s Sarah Scire writes, “survey respondents who saw the news stories labeled as AI-generated rated the mock news organization roughly half a point lower than those shown the article without the label — a statistically significant difference.”

The researchers found that more than 80% of respondents believe news organizations should disclose if AI was used, with 78% calling for an “explanatory note” describing exactly how it was used. “I wonder if there are ways of describing how AI is used that actually offer audiences more assurances in the underlying information being reported,” says researcher Benjamin Toff, “perhaps by highlighting where there is broad agreement across a wide range of sources reporting the same information. I don’t think all audiences will inevitably see all uses of these technologies in newsrooms as a net negative … and I am especially interested in whether there are ways of describing these applications that may actually be greeted positively as a reason to be more trusting rather than less.”

Until then, trust remains an issue for audiences. Readly recently found that only 7% of Brits it surveyed believe AI can enhance journalism, with 38% saying it’s potentially harmful. On this side of the pond, Readly found that only 10% thought AI should be used in news. (As for the larger AI benefits outside of journalism, 19% of Brits were optimistic, while 62% had mixed feelings or were worried.)

“These insights underpin the importance of human touch and oversight particularly in sectors like journalism, teaching and banking,” says Chris Couchman, Readly’s Head of Content. “Brits are happy to embrace technology when it has clear benefits to our daily lives, but are wary of overreliance on AI in areas where human judgment plays an important role.”

I’m reminded of Google’s reported pitch of a proprietary AI tool to executives at The New York Times. According to sources in the room, Google tried to sell it as a personal assistant for journalists. And on paper, that’s exactly how users, including those in publishing, should consider AI, its capabilities, and any role it might play: an assistant to human efforts.

That Google’s larger pitch was seen by NYT execs as “unsettling” and as taking for granted “the effort that went into producing accurate and artful news stories” speaks more to Big Tech’s undervaluing of human judgment than to an indictment of AI as a whole.

After the year we’ve seen and the future we’re staring down, it should be heartening that readers haven’t lost sight of the value of the human touch.

