Generative AI at the BBC

11th October 2023
Rhodri Talfan Davies, the BBC’s Director of Nations, sets out the latest on the BBC’s approach to working with Generative AI.
This blog was originally published by the BBC.

Rhodri Talfan Davies is the BBC’s Director of Nations. Recently, he has taken on responsibility for bringing teams together across the BBC to shape our response to an emerging area of technology called Generative AI (or Gen AI). Here he sets out the latest on our plans.

Innovation has always been at the heart of the BBC. From the very first radio broadcasts in 1922 to colour television in the 1960s and the rapid development of our online and mobile services over the last 25 years – innovation has driven the evolution of the BBC at every step.

Whenever the BBC embraces new technologies, we do so in a way that puts our values first. We want to use new technologies to benefit all audiences and help us deliver our public mission in new and exciting ways. More than that, we want to positively influence how new technology develops to support the supply of trusted public media and information.


The emergence of Generative Artificial Intelligence, or Gen AI, is expected to herald a new wave of technology innovation that could impact almost every field of human activity. These new tools can generate text, images, speech, music and video in response to prompts from a user. Their current capabilities are impressive and are expected to develop rapidly. You may have heard of some of these tools, such as ChatGPT and Midjourney, two of the best-known.


We believe Gen AI could provide a significant opportunity for the BBC to deepen and amplify our mission, enabling us to deliver more value to our audiences and to society. It also has the potential to help our teams to work more effectively and efficiently across a broad range of areas including production workflows and our back-office.

Alongside these opportunities, it is already clear that Gen AI is likely to introduce new and significant risks if not harnessed properly. These include ethical issues, legal and copyright challenges, and significant risks around misinformation and bias.

These risks are real and must not be underestimated. This wave of innovation will demand vision and vigilance in equal measure. But we believe a responsible approach to using this technology can help mitigate some of these risks and enable experimentation.

With all this in mind, today we’re outlining three principles that will shape our approach to working with Gen AI:

  1. We will always act in the best interests of the public – We will explore how we can harness Generative AI to strengthen our public mission and deliver greater value to audiences. At the same time, we will seek to mitigate the challenges Generative AI may create, including risks to trust in media, the protection of copyright, and content discovery. We will also seek to work with the tech industry, media partners and regulators to champion safety and transparency in the development of Gen AI and protection against social harms.
  2. We will always prioritise talent and creativity – No technology can replicate or replace human creativity. We will always prioritise and prize authentic, human storytelling by reporters, writers and broadcasters who are the best in their fields. We will work with them to explore how they could use Generative AI to help them push new boundaries. Creators and suppliers play a vital role in our industry. The BBC will always consider the rights of artists and rights holders when using Generative AI.
  3. We will be open and transparent – Trust is the foundation of the BBC’s relationship with audiences. Our leaders will always remain accountable to the public for all content and services produced and published by the BBC. We will be transparent and clear with audiences when Generative AI output features in our content and services. Human oversight will be an important step in the publication of Generative AI content and we will never rely solely on AI-generated research in our output.


In the next few months, we will start a number of projects that explore the use of Gen AI in both what we make and how we work – taking a targeted approach in order to better understand both the opportunities and risks. These projects will assess how Gen AI could potentially support, complement or even transform BBC activity across a range of fields, including journalism research and production, content discovery and archive, and personalised experiences.

At the same time, we are taking steps to safeguard the interests of Licence Fee payers as this new technology evolves. For example, we do not believe the current ‘scraping’ of BBC data without our permission in order to train Gen AI models is in the public interest, and we want to agree a more structured and sustainable approach with technology companies. That’s why we have taken steps to prevent web crawlers like those from OpenAI and Common Crawl from accessing BBC websites.
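Blocking of this kind is typically signalled through a site’s robots.txt file, which well-behaved crawlers consult before fetching pages. As a rough illustration (not the BBC’s actual robots.txt, which may differ), the sketch below uses Python’s standard-library `urllib.robotparser` to show how directives disallowing GPTBot (OpenAI’s crawler) and CCBot (Common Crawl’s crawler) would be interpreted:

```python
from urllib import robotparser

# Hypothetical robots.txt directives of the kind a publisher might use
# to opt out of Gen AI training crawlers, while still allowing other
# agents. This is an illustrative example, not the BBC's real file.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named AI crawlers are refused; other user agents are permitted.
print(rp.can_fetch("GPTBot", "https://example.org/news/article"))        # False
print(rp.can_fetch("CCBot", "https://example.org/news/article"))         # False
print(rp.can_fetch("SomeOtherBot", "https://example.org/news/article"))  # True
```

Note that robots.txt is a voluntary convention: it only deters crawlers that choose to honour it, which is one reason the piece argues for a more structured agreement with technology companies.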

We are also looking at how Gen AI may influence the media industry more broadly. For example, how the inclusion of Gen AI in search engines could impact how traffic flows to websites, or how the use of Gen AI by others could lead to greater disinformation.

Throughout this work, we will harness the world-class expertise and experience we have across the organisation, particularly in BBC R&D and our Product teams, who are already exploring the opportunities for public media.

We look forward to sharing more with you – and providing updates about what we learn from these experiments in the coming months.

Thanks for reading.