Generative Artificial Intelligence

WBUR, as a public media organization, prioritizes trust and transparency in our journalism and engagement, particularly in an era of declining media trust and rampant misinformation. In response to the transformative impact of generative artificial intelligence (GAI) on journalism, WBUR has developed the following ethical guidelines to navigate this evolving landscape.

The guidelines serve as ethical standards and best practices, drawing from WBUR's existing Ethical Guidelines and NPR's GAI guidelines. WBUR will only use GAI services when we understand how they work and know their strengths and vulnerabilities. The following guidelines are dynamic and subject to frequent updates to adapt to evolving knowledge and technological advancements.


WBUR’s journalism rests on the pillars of integrity, accuracy, independence, humanity and transparency. Our audience counts on our journalists to maintain the highest standards in all of these areas, through their knowledge, thoroughness, and creativity. As we evaluate artificial intelligence tools, each one must demonstrably strengthen and maintain these pillars without endangering or putting any of them at risk.

WBUR’s editorial work is created by human journalists: reporters, editors, producers, hosts, illustrators, and photographers. On extremely rare occasions, content generated by GAI may be used under the WBUR name, and only with the express approval of the editorial employee directly involved and a senior manager. In these exceptional cases, we commit to appropriately disclosing the use of GAI to our audiences.

Our work in GAI should be guided by what will be helpful to audiences as we serve them. We have made a promise to our audiences to produce high-quality journalism and enriching experiences that foster understanding, connection and community for an expanding circle of people.


Guideline: You are responsible for the content you create, with or without the use of GAI

  • GAI-assisted results must undergo thorough scrutiny, and should be fact-checked and supplemented through the organization's internal editorial judgment and processes. Ultimately, WBUR journalists bear the full responsibility of their reporting work and are subject to the same values and tenets prescribed in our Ethical Guidelines – regardless of whether GAI plays a role in our journalistic processes.
  • As technology evolves our use of GAI will evolve as well, but our use will always be within the framework of the standards that guide WBUR’s journalism.


Guideline: If GAI played a significant role in your reporting, you should share that fact with your audience

  • WBUR will adhere to a “no-surprises strategy.” WBUR will clearly inform the public in the rare cases where our journalism does include GAI-generated material.
  • If a journalist is using GAI to assist certain aspects of the reporting workflow such as helping transcribe an interview, organizing large data sets, or brainstorming interview questions, it may not be necessary to disclose these applications to the audience. When in doubt, staff should consult with an editor or a supervising editor.

Guideline: Our GAI use should be plainly understood

  • Within our organization, we will keep our colleagues and editorial leadership informed as we integrate GAI into our workflow processes. This transparency will foster shared learning and enable us to develop relevant, adaptable policies as these technologies continue to advance. Whether utilizing GAI for research, headline generation, or mining public databases, it's essential to ensure that your supervisor is informed so you can discuss best practices for the use of GAI outputs and whether any public disclosures are suitable.

Guideline: We must protect our own work

  • By their nature, GAI systems often incorporate any information provided into their knowledge base. Therefore, WBUR employees must not input any confidential information into any GAI system without prior consent from WBUR leadership. WBUR's intellectual property encompasses everything generated in the course of work, including notes, story drafts, memos, and internal documents. Consider anything unique and internal to WBUR as proprietary. For clarification on what constitutes proprietary and confidential information at WBUR, please consult your manager and/or Boston University’s Office of the General Counsel.


Guideline: WBUR will not use GAI to replicate, reproduce or replace a journalist's human voice, image or likeness in its journalism

  • In the rare cases where there is a compelling need for doing so (to report on GAI itself, for example), WBUR will obtain the express consent of the journalist involved.
  • There may be occasions when GAI content is newsworthy and WBUR may choose to publish and/or broadcast the material, such as to fact-check mis/disinformation (see examples). WBUR should exercise caution and clearly alert the audience that the material was generated by AI.
  • WBUR will not use GAI to replicate, reproduce or replace a journalist's voice, image or likeness in marketing or other business uses without the express consent of the journalist involved.


Guideline: Editorial staff are expected to flag GAI-related concerns and to propose new tools and potential solutions for consideration

  • WBUR Editorial commits to upholding a committee dedicated to ensuring the ethical and responsible use of GAI throughout our journalism. This committee will consist of three (3) senior managers and three (3) staff journalists who are members of SAG-AFTRA. Its responsibilities include convening regularly (quarterly, starting in Q4 FY24) to assess existing GAI use cases, address any recently identified ethical concerns, and evaluate proposals for new services or solutions. This committee can also gather on an immediate basis, should the need arise.
  • Staff are encouraged to contact WBUR’s JLMC-AI committee with any questions, input or recommendations via the following email address: ai (at)


Guideline: GAI, when used responsibly, can provide support to journalists

  • GAI can help journalists complete tasks that are "challenging yet verifiable," such as summarizing legal documents, optimizing social media copy, creating templates for internal documentation and discovering synonyms, on top of transcript tools already commonplace throughout WBUR’s production environments.
  • Journalists who want to use GAI tools should seek guidance from their editor and/or a member of the JLMC-AI task force. Through continued usage, we will gain deeper insights into the capabilities and limitations of these tools, prompting revisions to our guidelines.
  • Undoubtedly, there will be other media outlets exploring the use of GAI for content generation. This may lead to a bifurcated media landscape, with some outlets seeking to supplant humans with GAI. At WBUR, humans will always create our journalism.