
Storytelling at the BBC’s R&D department

Doctor Who stories in linear time, powerful archive searching, and using technology to tell the story of the Radiophonic Workshop… Our Research and Development season continues as Chris Lowis, Senior Researcher at BBC R&D, shares some of the projects the corporation has been working on recently.

The BBC is the largest broadcaster in the world, but did you know that under the terms of the Royal Charter (by which the BBC is given licence to operate), it is also constitutionally required to carry out research? BBC R&D has been involved in many of the technological advances in broadcasting over the last 100 years, and as the Internet emerged and, later, the web became an important platform for our content, R&D helped prepare the organisation for this future.

In the Internet Research and Future Services team at BBC R&D, we explore the future of broadcasting in an increasingly connected and multi-platform world. We work with standards organisations to make sure the enabling technology is open and available to all, and we work with our colleagues at the BBC to find new ways to bring their content to life. We’re a multi-disciplinary team of designers, engineers and producers, some with a background in R&D or academia, and others with more traditional broadcasting, design or software engineering backgrounds. In this post I’ll talk about a few of our current projects and discuss some of the benefits and challenges of the way we work.

Snippets
The Snippets team are working hard to help storytellers in the rest of the organisation find what they are looking for in the BBC’s large TV and radio archives. With only titles and broadcast dates to search on, finding what you need can be very difficult. The Snippets project helps with this task by indexing the subtitles of every TV show. You can search for a key phrase or half-remembered piece of dialogue and then jump to the exact point in the programme where it occurs. If you want to use a piece of footage in a rough cut, you can snip out a segment and export it in the correct format. To reuse the footage in a broadcast, you can order a high-quality version of the content from the main archives.
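The core of this kind of subtitle search can be sketched as an inverted index from words to the cues they appear in. The cue data, programme IDs and function names below are illustrative, not Snippets’ real schema:

```python
from collections import defaultdict

# Hypothetical subtitle cues: (programme_id, start_time_seconds, text).
CUES = [
    ("prog_001", 12.0, "welcome to the archive"),
    ("prog_001", 47.5, "the radiophonic workshop was founded in 1958"),
    ("prog_002", 3.2, "doctor who returns tonight"),
]

def build_index(cues):
    """Map each word to the list of cues it appears in."""
    index = defaultdict(list)
    for prog, start, text in cues:
        for word in text.lower().split():
            index[word].append((prog, start))
    return index

def search(index, phrase):
    """Return cues that contain every word of the phrase."""
    words = phrase.lower().split()
    if not words:
        return []
    hits = set(index.get(words[0], []))
    for word in words[1:]:
        hits &= set(index.get(word, []))
    return sorted(hits)

index = build_index(CUES)
results = search(index, "radiophonic workshop")  # [("prog_001", 47.5)]
```

Each hit carries a timestamp, which is what lets a tool jump straight to the moment in the programme where the dialogue occurs.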

The team are also exploring new ways to show the structure of programmes to help people navigate. A “grid view” is in development which shows a key frame every few seconds in a grid and allows snippets to be created by “lassoing” a region of the programme. For audio-only programmes a “waveform” view is in development which will allow various overlays, such as identified speakers or regions of music.

World Service Archive 

Making archive content findable is a recurring theme for the team. We’ve been hard at work on a knotty problem for the BBC and other owners of large archives: how people can find fantastic content, produced by colleagues in production divisions, when the metadata attached to it is patchy or non-existent. Alongside a start-up company, MetaBroadcast, we successfully pitched for a grant from the UK Technology Strategy Board to examine automatic metadata generation for large digital archives. We’ve used speech-to-text systems to help extract key words from old radio programmes, and we are now concentrating on getting our prototype in front of people to help refine it, and to see if a crowd-sourced approach to adding metadata can work.
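One simple way to pull key words out of a transcript is to weight how often a word occurs in one programme against how many programmes it occurs in, so common filler words score zero. This is a minimal TF-IDF sketch with made-up transcripts; the real pipeline would feed in speech-to-text output and is certainly more sophisticated:

```python
import math
from collections import Counter

# Illustrative transcripts keyed by a hypothetical programme id.
TRANSCRIPTS = {
    "ws_001": "the election results in the election were announced today",
    "ws_002": "the orchestra played the symphony in the concert hall",
    "ws_003": "the weather today brought rain and more rain to the coast",
}

def keywords(transcripts, prog_id, top_n=3):
    """Rank a programme's words by term frequency times inverse document frequency."""
    docs = {pid: text.lower().split() for pid, text in transcripts.items()}
    n_docs = len(docs)
    counts = Counter(docs[prog_id])
    scores = {}
    for word, tf in counts.items():
        df = sum(1 for words in docs.values() if word in words)
        scores[word] = tf * math.log(n_docs / df)  # words in every doc score 0
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [word for word, _ in ranked[:top_n]]

top = keywords(TRANSCRIPTS, "ws_001")  # "election" ranks first
```

Words like “the”, which appear in every transcript, get a score of zero, while programme-specific terms float to the top, exactly the kind of candidate metadata a human could then confirm or reject.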

Radiophonic Workshop and the Web Audio API

One of the responsibilities of our team is to engage with the W3C to help define standards for use on the web. We’ve been involved with the Audio working group since its inception. The standards that this group is defining will bring low-latency game audio, synthesis and even MIDI control to your browser.

In this project, we wanted to build something to help test the emerging APIs, and also to use the BBC’s influence to raise awareness of the work. We decided to tell the story of the early days of the Radiophonic Workshop, the BBC department once responsible for sound effects and theme tunes, by recreating some of their sounds in the browser.

This project was particularly challenging as we wanted to tell the story of the Workshop in an engaging way, but also leave space for the code samples and technical detail that would satisfy those who wanted to dig deeper into the technology. We tackled it by building retro-styled “machines” that could be played with, and we added atmosphere using BBC library photography.
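The basic building block behind those machines, an oscillator feeding a gain envelope, which the Web Audio API exposes as `OscillatorNode` and `GainNode`, can be sketched offline. This Python rendering is just an analogue to show the signal maths; it is not the project’s browser code:

```python
import math

SAMPLE_RATE = 44100  # samples per second, the usual audio rate

def render_tone(freq=440.0, duration=0.5, sample_rate=SAMPLE_RATE):
    """Render a sine oscillator through a linear fade-out envelope.

    Roughly what an OscillatorNode feeding a GainNode does in the
    Web Audio API, but computed as a plain list of samples.
    """
    n = int(duration * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        envelope = 1.0 - (i / n)  # gain ramps linearly from 1.0 to 0.0
        samples.append(envelope * math.sin(2 * math.pi * freq * t))
    return samples

tone = render_tone()  # half a second of a decaying 440 Hz tone
```

In the browser, the same patch is a node graph: an oscillator connected to a gain node whose value is scheduled to ramp down, connected to the audio destination.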

Mythology Engine

The Mythology Engine is a prototype exploring new ways of telling stories on the web. It began as an experiment to bring storylines from Doctor Who to the web. Rather than create a basic page for each episode and character in Doctor Who, the Engine allows the storyline to be described using an ontology. This storyline can then be presented either in a linear way, mapped to the traditional TV structure of episodes and series, or deconstructed, for example to let you examine a single story arc within a complex narrative. This is particularly interesting when considering the time-travelling escapades of the Doctor: his stories can be viewed in linear time, or in the order they are presented on screen.
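The key idea, that one set of story events can carry several orderings, can be shown with plain records. The Mythology Engine itself used an ontology rather than a list of dictionaries, and these events and field names are invented for illustration:

```python
# Hypothetical story events with two timelines: the order episodes were
# broadcast, and the in-universe ("linear") year in which events happen.
EVENTS = [
    {"title": "The Doctor meets a companion", "broadcast": 1, "story_year": 2007},
    {"title": "A visit to Victorian London", "broadcast": 2, "story_year": 1888},
    {"title": "The end of the universe", "broadcast": 3, "story_year": 100_000_000},
]

def broadcast_order(events):
    """Events as the audience saw them, episode by episode."""
    return [e["title"] for e in sorted(events, key=lambda e: e["broadcast"])]

def linear_time_order(events):
    """The same events rearranged into in-universe chronology."""
    return [e["title"] for e in sorted(events, key=lambda e: e["story_year"])]
```

Because the ordering lives in the data rather than the page structure, the same events can back a broadcast-order episode guide or an in-universe timeline without being re-authored.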

The promise of such a system is that it could be reused by other programme teams to present their own stories on the web. The team demonstrated this as a proof of concept by modelling some EastEnders stories and loading them into the Engine with a different visual treatment. There are many remaining challenges: how do you present cliffhangers or uncompleted storylines without giving away the ending before broadcast, for example? Our colleague, Paul Rissen, investigated this in a follow-up project called StoryBox.

One of the major challenges is to understand and address how much work is involved for production teams to model their stories in a suitable fashion. TV programme makers often prioritise their TV output, and the online experience sometimes falls by the wayside. For storytelling projects like the Mythology Engine to work, the description and modelling has to be tightly integrated with the production workflow. We have yet to fully explore whether this work can be made easier by automatically examining scripts and video files, or by asking our audiences to help.


I hope this post has given you a flavour of the kind of projects and challenges that our team is involved with. If you’d like to find out more, our blog, where we post regular weeknotes, is a good place to start.
