Newsroom 2.0: How AI is changing the game

In today's digital age, AI is transforming the way news is produced and consumed. With its ability to quickly analyze data and automate tedious tasks, AI helps journalists to uncover stories and deliver news faster and more efficiently than ever before. However, as with any new technology, the use of AI in journalism must be approached with caution and responsibility.

At the 2023 News Product Alliance Summit, three insightful sessions centered on AI’s application in journalism. In this article, we will discuss:

  • Important AI terms mentioned in the NPA Summit AI sessions

  • How AI can help newsrooms in editorial content generation and automation

  • The potential pitfalls newsrooms need to be aware of while adopting AI into work processes

Key AI terms to familiarize yourself with

Before we dive in, let's go over some key AI concepts. Understanding these will make it easier to follow the topics we'll be discussing.

Prompt: A prompt refers to the input or instruction provided to the model to generate a desired output. It serves as a starting point or a set of guidelines for the model to follow when producing a response or completing a task.

Response: The output generated by a machine learning model in response to a prompt.
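To make these two terms concrete, here is a minimal sketch of sending a prompt to a hosted language model and printing its response. It assumes the OpenAI Python client and an API key in the environment; the model name and prompt text are placeholders, and the same pattern applies to other providers.

```python
# Minimal prompt -> response sketch (assumes the OpenAI Python client and an
# OPENAI_API_KEY environment variable; model name and prompt are placeholders).
from openai import OpenAI

client = OpenAI()

prompt = "Summarize today's city council meeting in two sentences for a local news brief."

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",                            # placeholder model name
    messages=[{"role": "user", "content": prompt}],   # the prompt is the input we supply
)

print(completion.choices[0].message.content)          # the response is the generated output
```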

Generative AI: A type of artificial intelligence capable of producing new content, such as text, images, or music, rather than only analyzing or classifying existing data.

Large language model: A type of machine learning model that uses natural language processing (NLP) to digest large quantities of text data and infer relationships between words within the text.

Metadata (in the context of newsrooms): Often explained as “data about data.” Before AI was integrated, managing digital assets like images and videos fell to editors and journalists, a tedious and unappealing task. With the help of AI, processes such as asset tagging and keyword extraction become much easier, enabling newsrooms to create and distribute content with ease.

Hallucination: In the context of artificial intelligence, a hallucination occurs when a system generates information or content that is not based on real or accurate data: outputs that sound plausible but are actually fabricated by the model itself. In journalism, this raises particular concern because it can produce misinformation or biased content.

AI use cases in the newsroom

In the session "Let AI do the boring stuff," Joe Amditis, assistant director of products and events at the Center for Cooperative Media, and Simon Galperin, founder and director of the Community Information Cooperative, delved into the potential of AI for local news organizations. They discussed how AI can streamline workflows, enhance efficiency, and give reporters more time to excel in their primary role: serving the community.

Here is a list of AI use cases in the newsroom, curated by Joe Amditis:

Content creation

Joe's "Beginner's Prompt Handbook: ChatGPT for Local News Publishers" provides valuable insights on streamlining content creation in the newsroom. According to Joe, utilizing sample prompts like the ones below can significantly enhance workflow efficiency (a scripted sketch of the first transcript prompt follows the list):

  • Simplifying and cleaning interview transcripts:

    • "Please summarize the key points made in the following interview and provide a cleaned-up transcript. [PASTE TRANSCRIPT]"

    • "Can you extract the most important quotes from this interview and turn them into a concise summary? [PASTE INTERVIEW/TRANSCRIPT]"

    • "Create a brief news article from the information gathered in this interview.” [PASTE CONTEXT/TRANSCRIPT]

  • Generating outlines, agendas and documentation:

    • "Please generate an outline for an upcoming town hall meeting on the topic of [XYZ]."

    • "Can you create an agenda for an editorial board meeting, including sections for the topics to be discussed and a breakdown of the order in which they will be addressed?"

    • "Create a summary report of [EVENT OR MEETING], including details on attendance, key speakers, and important takeaways. [PASTE NOTES/TRANSCRIPT]"

  • Creating social media and promotional copy:

    • "Write a series of social media posts promoting [XYZ], including key details and information on how to register or attend. [PASTE CONTEXT/EVENT INFO]"

    • "Can you create a promotional video script for an upcoming [EVENT/CAMPAIGN/PROJECT], highlighting the goals, featured speakers, and ways to get involved? [PASTE CONTEXT]"

    • "Generate a press release promoting [PROJECT/ORG/BUSINESS/EVENT], including information on the mission, goals, services, and unique value proposition. [PASTE CONTEXT]"

  • Brainstorming story ideas and reporting tips:

    • "What are some unique angles or perspectives that we can take on local news stories about [SUBJECT/TOPIC]?"

    • "What are some tips for conducting effective interviews for local news stories about [SUBJECT/TOPIC]?"

    • "How can we incorporate multimedia elements such as photos, videos, and audio into our local news stories about [SUBJECT/TOPIC]?"


Content automation

In addition to editorial content creation, AI can expedite manual tasks in the newsroom through content automation.

Metadata, often referred to as “data about data,” serves multiple purposes within the newsroom. It plays a vital role in content management, search, information retrieval, and more. Below are some commonly used types of metadata in newsrooms (a sketch of how these fields might be stored follows the list):

  1. Title and headline

  2. Author and contributor information

  3. Date and timestamp

  4. Tags and keywords

  5. Source and attribution

  6. Location and geographic metadata

  7. Multimedia metadata

  8. Copyright and usage rights

The async Global Conversations on Slack, hosted by Samya Ayish, the communication manager of Arab Reporters for Investigative Journalism (ARIJ), focused on the implementation and usage of AI in small and medium-sized newsrooms. Here are some examples of newsrooms that have adopted an AI strategy:

  • ARIJ leveraged AI to create a maturity test for other organizations that want to assess their knowledge, infrastructure, and capabilities.

  • Gannett uses natural language generation to automate stories from templates; a simplified template-filling sketch follows.
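As a rough illustration of how template-driven story automation works in general (not Gannett's actual system), structured data such as a sports score feed can be slotted into pre-written sentence templates:

```python
# Rough illustration of template-based story generation (not any outlet's actual system).
# Structured data (e.g., a score feed) fills a pre-written sentence template.
GAME_TEMPLATE = (
    "{winner} defeated {loser} {winner_score}-{loser_score} on {day}, "
    "led by {top_player} with {points} points."
)

game = {
    "winner": "Ridgewood High",
    "loser": "Lakeside Prep",
    "winner_score": 78,
    "loser_score": 64,
    "day": "Friday",
    "top_player": "J. Alvarez",
    "points": 27,
}

print(GAME_TEMPLATE.format(**game))
```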

The perils of AI applications in newsrooms

Harnessing the vast potential of AI in the newsroom requires careful consideration of its limitations. Defining guardrails and building a rules-based hierarchy of content to flag or block should be a priority for these systems.
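As a toy sketch of what such a rules-based check might look like (the term lists and categories here are invented for illustration, not a real editorial policy), an AI-assisted draft can be screened and routed to an editor, a fact-checker, or the normal workflow:

```python
# Toy sketch of a rules-based guardrail for AI-assisted drafts.
# The term lists and categories are invented for illustration, not a real policy.
BLOCK_TERMS = {"unverified death toll", "leaked medical records"}
FLAG_TERMS = {"allegedly", "sources say", "exclusive"}

def screen_draft(text: str) -> str:
    lowered = text.lower()
    if any(term in lowered for term in BLOCK_TERMS):
        return "block"   # hold entirely until an editor signs off
    if any(term in lowered for term in FLAG_TERMS):
        return "flag"    # route to a fact-checker before publication
    return "pass"        # continue through the normal editorial workflow

print(screen_draft("Sources say the council allegedly shelved the audit."))  # -> "flag"
```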

In the session "Master Class: My boss thinks AI is cool. What do I do now?", led by Eric Ulken, head of product at The Baltimore Banner, and Mutale Nkonde, founding CEO of AI For the People (AFP), a nonprofit communications agency, an important concern came to light: the potential for AI to generate biased, harmful, or unsafe content, highlighting the critical importance of exercising caution in its implementation.

When using AI-generated content, intellectual property and plagiarism become significant considerations. While AI can assist in creating news articles and other content, it is essential to ensure the content's originality and attribute sources correctly.

Another risk associated with AI in the newsroom is factual error, particularly in the form of hallucinations: cases where an AI system produces information or content that isn't grounded in actual data or facts. To maintain accuracy and reliability, human oversight and fact-checking are crucial when using AI for news generation.

The curation and personalization of stories through AI also pose risks. While this approach may boost conversion rates, it can confine readers to content aligned with their existing preferences, narrowing their exposure to diverse perspectives and ultimately harming the quality of journalism.

To ensure the ethical and responsible use of AI in newsrooms, it is necessary to develop an AI Ethics Code specifically tailored for this purpose. Such a code would address concerns and guidelines unique to the news industry, ensuring that AI technology is harnessed effectively while maintaining ethical standards.

"It is crucial to remember that AI should work for people, not the other way around," Nkonde said.

Aria Yang

Aria Yang is a second-year graduate student at the UC Berkeley School of Journalism, with a focus on multimedia and data visualization. She is currently based in Berkeley, CA and loves writing blogs about pop culture, Gen Z and all things technology. Aria has worked in audience engagement and product management. During the Summit, she hopes to hone her understanding of news products, as she wants to work in a newsroom product role in the future.
