As newsrooms experiment with artificial intelligence to work more efficiently, one question looms large: Are their audiences comfortable with them using AI?
A new national survey funded by the Walton Family Foundation and conducted by Local Media Association and Trusting News offers one of the clearest answers yet — and it comes directly from engaged local news consumers.
The findings were unpacked on a recent episode of Keep It Local, featuring John M. Humenik, Chief Strategy and Operations Officer for Local Media Association, and Lynn Walsh, Assistant Director of Trusting News.
Together, they discussed what the survey data reveals about trust, transparency and the non-negotiable role of humans when AI is used by newsrooms.
Inside the survey: Who was asked and why it matters
LMA’s AI Community Journalism Lab newsrooms invited their audiences, through articles, editors’ columns and social posts, to participate in an 18-question survey titled “How should newsrooms use AI?” More than 1,400 local news consumers responded. In all, survey responses came from 16 states and Washington, D.C.
Key points:
- Nearly half of the survey respondents consume local news multiple times per day
- About 50% were age 65 or older
- 49.8% responded that they have used AI features in online search
In short: this wasn’t theoretical feedback. It came from people who actively rely on local news. And overwhelmingly, 97.8% of respondents said they want to know when a newsroom has used AI.
The big takeaway: Humans are non-negotiable
If there is one data point newsrooms should highlight, it’s this: Nearly 99% of survey respondents said it was important to have humans involved in reviewing content before it is published.
While respondents were open to behind-the-scenes AI uses and experimentation, one thing was clear: 85% said that writing and compiling stories without human review is “not acceptable at all” or “mostly unacceptable.”
This finding reinforces why many newsrooms have established internal policies regarding how and when AI is used. The survey results provide audience-backed evidence to support increased community outreach and transparency about AI use.
What respondents are (and aren’t) comfortable with
The survey uncovered a clear hierarchy of AI acceptance:
More acceptable uses
- Language translation
- Converting text to audio
- Editing stories to improve clarity, spelling, and grammar
- Transcribing interviews
Less acceptable uses
- Writing and compiling stories without human review
- Creating and producing images
- Creating and producing audio and video
Even when newsrooms guided and verified the AI’s output, 47.6% of respondents said they were uncomfortable with AI use in news.
By comparison, 46.4% said they would support even greater use of AI by newsrooms if the resulting work were held to the same standards as the newsroom’s other work.
Transparency isn’t optional, but it doesn’t have to be complicated
One consistent expectation from respondents: newsrooms should disclose how they use AI.
That doesn’t mean a dense policy page. Walsh emphasized that transparency can take many forms:
- A clearly written AI ethics statement
- A newsroom explainer story
- Ongoing disclosure about AI-assisted workflows
Trusting News is also testing whether AI literacy content — explaining what AI is, how it works, and its limitations — can help audiences feel more confident and less fearful.
The goal isn’t promotion. It’s understanding.
A surprising insight: Familiarity breeds trust
One of the most unexpected findings was the gap between AI users and non-users.
Respondents who already use AI tools in their own lives were significantly more comfortable with newsrooms experimenting with AI. That insight points to a powerful opportunity:
Educating audiences about AI may increase trust — not decrease it.
For newsrooms, that reframes AI communication as community outreach: an opportunity to bring audiences along and learn about AI use together. “AI is the unknown for a lot of them (news consumers). Let’s be their introduction to it,” Walsh said.
What newsrooms should do next
Based on the survey and AI Lab experience, Humenik and Walsh highlighted several strategies and next steps for 2026:
- Keep experimenting — deliberately and ethically
- Document and explain AI use clearly
- Involve humans at every step
- Ask audiences directly what they expect
- Revisit policies as technology and norms evolve
The survey is designed to be a working tool — something newsrooms can bring into meetings, use to shape workflows and adapt for their own communities.
Where to read the full survey
The complete survey report is available on the Local Media Association website at localmedia.org. It includes:
- Full survey questions
- Detailed response breakdowns
- Practical guidance for newsroom discussions
- Transparency and disclosure insights
Trusting News also offers a reusable version of the survey through its AI Trust Kit, allowing any newsroom to run a similar audience check-in.
Why this matters
Audiences are paying attention. They’re curious, cautious and not yet settled on what AI should mean for newsrooms and local news.
This survey makes one thing clear: trust with audiences will be earned through transparency, human judgment when AI is used, and ongoing dialogue.
🎧 Listen on Spotify: https://open.spotify.com/episode/5tcGqSGdWhXfAjOC8bvrIX
🎧 Listen on Apple: https://podcasts.apple.com/us/podcast/how-news-audiences-feel-about-ai-use-by-newsrooms-what/id1808196993?i=1000744038248
🎧 Watch on YouTube: https://www.youtube.com/watch?v=TFNMDLmASrM
If you’ve listened to and enjoyed this podcast, please take a moment to leave us a review. That helps platform algorithms surface the content so that it can benefit other newsrooms.
Editor’s note: Artificial intelligence was used to transcribe and create an initial summary of this article, which was then edited by LMA staff.
