Artificial intelligence is rapidly changing the world, and news is no exception. News organizations are using AI to automate many tasks for employees. Still, the question on many people’s minds at the Local Media Association’s recent LMA Fest was, “Can we move fast enough to keep up with new technologies?”

Frank Mungeam, LMA chief innovation officer; Dorrine Mendoza, lead director of LMA’s Family and Independent Media Sustainability Lab; and Aimee Rinehart, senior product manager, AI strategy, The Associated Press, led an interactive workshop to help attendees identify and explore the most promising AI opportunities. Here are the top takeaways:

Automate the boring stuff

“AI can save you time today in every single function in your organization,” said Mungeam. A great way to start is to “figure out the boring, repetitive, annoying stuff in your job and everyone else’s, and you can use AI to chip away at that.”

AI can automate many tasks for newsroom employees, such as transcribing audio and video and summarizing large amounts of data. This automation can free journalists to focus on more creative and strategic work, such as investigating complex stories.

In fact, in the latest McKinsey Global Survey, one-third of respondents said their organizations are using generative AI regularly for tasks such as summarizing, analyzing, drafting, composing, reviewing and optimizing data and text.

Here are some specific examples of how AI is used in journalism:

  • The Associated Press uses AI to automate stories such as sports coverage and corporate earnings reports.
  • ARLnow uses AI to summarize the news in its morning newsletter.
  • Semafor used eyewitness accounts as text prompts for an AI image creator in combination with the style of an artist in a series called “Witness.”

Establish guidelines and guardrails

AI is a powerful tool that can improve the efficiency and effectiveness of journalism; however, one key takeaway was that it must be used responsibly and ethically.

“AI can take you 80% of the way there. The rest should be human,” Rinehart said. “Nothing should [be published] without a human review.”

It is important to know the risks before using AI. AI can return confident but false results; it can repeat and amplify bias; and there are copyright and privacy implications from using these models that we don’t yet fully understand.

Experiment and encourage experimentation

The workshop leaders agreed that news organization leaders should experiment with AI and encourage their staffs to do the same.

Rinehart expressed frustration at taking marching orders from big tech. “This is our opportunity to set the agenda for ourselves about how it should be,” she said.

She said journalists are smart, curious and engaged with the world — the best people to experiment with AI and help shape its development. She added that news organizations should create a way to document the learnings from these experiments, such as spreadsheets or checklists.

Note: Trint, an AI transcription service, was used to record and transcribe the audio from the workshop, and Nota, an assistive AI tool for journalism, was used to pull out summaries, key points and quotes from the transcription.