Artificial intelligence is no longer theoretical in journalism. By early 2026, it’s already embedded in many newsroom workflows, whether formally acknowledged or not.

In the latest episode of the Keep It Local podcast, Local Media Association board member and Draper Digital Media vice president Ethan Holland joined host Ryan Welton to discuss how AI is actually being used, what newsrooms tend to misunderstand, and where the technology is heading next.

AI is becoming infrastructure

Holland compares today’s AI moment to the early days of the internet. There may be financial hype around the technology, but its underlying impact is real and lasting.

By some estimates, roughly half of workers now use AI tools in some form during their workday. For newsrooms, that's significant.

Moving beyond the chatbot mentality

One of the most common mistakes Holland sees is treating AI primarily as a writing tool.

He argues that AI's greatest value lies in helping journalists process information more efficiently. That includes summarizing long documents, analyzing audio and video, and helping reporters make sense of large amounts of data.

Used this way, AI acts as a partner rather than a replacement for editorial judgment.

Why context matters more than prompts

Prompt engineering often gets framed as a technical skill, but Holland says most traditional prompting tricks no longer make a meaningful difference.

Instead, he focuses on providing AI with as much context as possible, even if the input is informal or unpolished. More information generally leads to better results than carefully structured prompts.

Tools that reduce friction in newsrooms

Beyond chatbots, Holland highlights a growing ecosystem of AI-powered tools already improving newsroom efficiency.

These include tools for audio cleanup, image editing, visual analysis and the conversion of unstructured information into usable data. Many of these tools don't generate content themselves but remove time-consuming steps from the reporting process.

Ethics come down to accountability

Holland emphasizes that AI ethics in journalism are not fundamentally new.

If a journalist’s name is on a piece of work, that journalist is responsible for its accuracy and authenticity. AI can assist with editing and production, but it should never be used to fabricate facts, images or events.

He also cautions against overly restrictive policies that prevent journalists from learning how to use these tools responsibly.

One area where Holland draws a clear line is the use of AI-generated likenesses.

Journalists should always control whether their image or voice is used in synthetic media. Any use of avatars or AI-generated representations should be voluntary and transparent.

The larger risk, he notes, comes from bad actors using powerful tools without ethical constraints.

Predictions for AI in 2026

Holland expects AI’s next phase to be less visible but more impactful. Advances in areas like medicine, automation and agent-based systems point toward AI taking on routine tasks rather than replacing human expertise.

For journalism, that could mean faster research, better analysis of public records and more time for enterprise and investigative reporting.

Key takeaways for newsrooms

• AI is becoming part of newsroom infrastructure
• Its biggest value is in processing information, not writing stories
• Context matters more than clever prompts
• Ethics hinge on individual responsibility and accuracy
• Journalists should control their own likeness
• Time saved should be reinvested in reporting

🎧 Listen on Spotify: https://open.spotify.com/episode/4OvDNIi8fwFyhO56IoPx3c
🎧 Listen on Apple: https://podcasts.apple.com/us/podcast/looking-ahead-to-ai-in-2026-ways-newsrooms-can-use-it-well/id1808196993?i=1000746878914
🎧 Watch on YouTube: https://www.youtube.com/watch?v=F2X_0SEBRA4

If you've enjoyed this episode, please take a moment to leave us a review. That helps platform algorithms surface the content so it can benefit other newsrooms.

Editor’s note: Artificial Intelligence was used to transcribe and create an initial summary of this article, which was then edited by LMA staff.