Structured machine listening for soundscapes with multiple birds

Lead Research Organisation: Queen Mary, University of London
Department Name: Sch of Electronic Eng & Computer Science

Abstract

In this Early Career fellowship I will establish a world-leading capability in automatic inference about songbird communications via novel "machine listening" methods, working collaboratively not only with experts in machine listening but also with experts in bird behaviour and communication. Automatic analysis has already shown benefit to researchers in efficiently characterising recorded bird sounds, but many limitations on its applicability remain, such as when many birds sing together. The techniques developed will be specifically designed to handle noisy multi-source audio recordings, and to infer not just the presence of birds but the structure of their signals and the interactions between them. Such methods will be a leap beyond the current state of the art in bioacoustics, allowing researchers to study not only sounds recorded in the lab under controlled conditions, but also field recordings and recordings held in public audio archives.

I will develop my techniques through specific application case studies. First, I will collaborate with David Clayton, an international expert on zebra finch behaviour and genetics who recently moved his lab to my proposed host institution. The zebra finch is an important "model organism" in biology: its genome is fully sequenced and it is a useful species for probing aspects of songbird vocal development. I will work with the Clayton lab to develop methods for automatically inferring the social interactions implicit in audio recordings of zebra finch colonies. Second, I will conduct international research visits to collaborate with other research groups who analyse bird sounds and bird social interactions. Third, I will study the case of automatically detecting bird activity in arbitrary sound archives, such as the soundscape recordings held by the British Library Sound Archive.

Importantly, not only will I apply modern signal processing and machine learning techniques, but I will also develop new techniques inspired by this application area. This fellowship is not about contributing from one field to another, but about building up UK research strength in this cross-disciplinary research topic. To make the most of this opportunity, I will host research workshops and an open data contest to serve as focal points for research attention, and I will also conduct a public engagement initiative to generate the widest possible enthusiasm for this exciting field.

This fellowship directly aligns with the "Working Together" priority, which is EPSRC's current overriding priority for ICT fellowships.

Planned Impact

The prime beneficiaries outside my immediate field will be in research fields that benefit from the structured analysis of animal sounds and interactions. For example, the improved techniques in zebra finch analysis will complement ongoing research into songbird genetics and individual differences, as well as research into conversational interactions in linguistics: the availability of more structured naturalistic data about animal communication could provide a stronger empirical foundation for studies of the evolution of communication systems. (This impact overlaps to some extent with that described in the Academic Beneficiaries section.)

The availability of these sound analysis techniques is also of interest to wildlife monitoring organisations such as the British Trust for Ornithology (BTO). They largely rely on manual surveying by volunteers and professionals to quantify the distributions of species; if high-quality automatic analysis were available, however, their work could be made more efficient. Current academic and commercial software packages (examples include Raven, XBAT, Seewave, Praat and Sound Analysis Pro) allow users to inspect and detect bird sounds, but they are unable to analyse communication networks, nor can they use models of communication interactions to ensure high-quality detection. Analysing not just the presence of bird song and calls, but the networks of interaction between them, could be used as an indicator of population health, reflecting issues such as habitat fragmentation which can affect the viability of a bird population. Downstream, detailed analysis of animal sound can thus form a strong evidence base for ecological policy decisions.

The application to audio archives offers another direct route to impact. Large audio collections such as those in the British Library Sound Archive are highly valuable to society, yet much of their value remains locked away because there is very little associated metadata to support different types of query. This research will directly help unlock some of this value, enabling people to discover the presence of birds in large audio archives that may not be annotated for their bird sounds, and may indeed have been collected for entirely different reasons.

The fellowship will also have an impact on public understanding of bird sounds, bird social interactions, and signal processing and machine learning. This will be explicitly encouraged through the public engagement activities: through engaging talks, articles and exhibits I will aim not to place the technology between the public and the birds, but to enchant the public with both the wonders of technology and the wonders of bird vocal communication.
 
Title Robotic bells birdsong 
Description Collaboration with composer/roboticist Sarah Angliss to create a musical score rendering the sound of birdsong through her robotic bells (carillon). First shown at the Listening in the Wild 2015 workshop; subsequently shown at "SoundCamp" in a London park on International Dawn Chorus Day. 
Type Of Art Artefact (including digital) 
Year Produced 2015 
Impact Public engagement with sound, computation and birds through a novel medium. Reached approximately 100 people in a public park. 
 
Description Warblr bird recognition app: Dan, together with an external business partner, secured a £10,000 grant from QMUL's EPSRC Innovation Fund in summer 2014, which allowed them to work with developers to build a social enterprise involving the British public in a citizen science project to identify and collect bird sound recordings. Following this, the Warblr team launched a Kickstarter campaign, founded a spin-out company, and in summer 2015 released the smartphone app. It has amassed over 5,000 paying users. To date, Warblr has received more than 45,000 submissions to its database, at an average of 80 submissions per day.
First Year Of Impact 2015
Sector Environment
Impact Types Societal,Economic
 
Description QMUL Innovation Fund
Amount £10,000 (GBP)
Organisation Queen Mary University of London (QMUL) 
Sector Academic/University
Country United Kingdom of Great Britain & Northern Ireland (UK)
Start 07/2014 
End 07/2015
 
Description QMUL Innovation Fund Supplementary Award
Amount £10,000 (GBP)
Organisation Queen Mary University of London (QMUL) 
Sector Academic/University
Country United Kingdom of Great Britain & Northern Ireland (UK)
Start 07/2015 
End 09/2015
 
Title freefield1010bird 
Description Annotated dataset of audio clips from the "freefield1010" collection, sourced from Freesound. Annotations indicate the presence/absence of birds. 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact Facilitated the "Bird Audio Detection challenge", which directly stimulated 30 research teams from around the world to develop and test new algorithms for detecting bird sounds. 
URL http://machine-listening.eecs.qmul.ac.uk/bird-audio-detection-challenge/
 
Title warblrb10k 
Description Annotated dataset of audio clips from the Warblr birdsong app. Annotations indicate the presence/absence of birds. 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact Facilitated the "Bird Audio Detection challenge", which directly stimulated 30 research teams from around the world to develop and test new algorithms for detecting bird sounds. 
URL http://machine-listening.eecs.qmul.ac.uk/bird-audio-detection-challenge/
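As an illustration of how the presence/absence annotations in these two datasets can be consumed, below is a minimal Python sketch that loads a label file and reports the class balance. It assumes the annotations are distributed as a CSV of per-clip identifiers and binary labels; the file name and the column names "itemid" and "hasbird" used here are assumptions for illustration, not a description of the released files.

    # Minimal sketch: load presence/absence annotations for a bird audio
    # detection dataset and report the class balance.
    # The CSV layout (columns "itemid" and "hasbird") is an assumption.
    import csv

    def load_annotations(csv_path):
        """Return a dict mapping clip identifier -> 1 (bird present) or 0 (absent)."""
        labels = {}
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                labels[row["itemid"]] = int(row["hasbird"])
        return labels

    if __name__ == "__main__":
        labels = load_annotations("freefield1010bird_labels.csv")  # hypothetical file name
        n_positive = sum(labels.values())
        print(f"{len(labels)} clips, {n_positive} containing bird sound")

A balance check of this kind is typically the first step before training or evaluating a detector on such data, since heavily imbalanced labels affect the choice of evaluation metric.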
 
Description Clayton Lab zebra finch recordings 
Organisation Queen Mary University of London (QMUL)
Country United Kingdom of Great Britain & Northern Ireland (UK) 
Sector Academic/University 
PI Contribution Provided recording equipment, my time for running the recording sessions, and paid annotator time to annotate the data.
Collaborator Contribution Provided access to zebra finch facility, advice on study design, and practical support in setting up the recording sessions.
Impact One journal publication introducing and evaluating a new method for animal communication analysis. One conference paper on a method for detecting overlapping audio events. A dataset of audio recordings and annotations.
Start Year 2014
 
Description MPIO Seewiesen 
Organisation Max Planck Society
Department Max Planck Institute for Ornithology
Country Germany, Federal Republic of 
Sector Public 
PI Contribution Data, collaboration time, hosting research visit, and novel analysis methodology
Collaborator Contribution Data, collaboration time, hosting research visit.
Impact 2 journal articles and 1 peer-reviewed conference paper. Collaboration is multi-disciplinary, across computer science and animal behaviour / ornithology.
Start Year 2015
 
Description BBC Radio 4 - Costing The Earth - Acoustic Ecology 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Media (as a channel to the public)
Results and Impact March 2016 BBC Radio 4 "Costing The Earth" programme, featuring an interview with me about our "Warblr" birdsong app and about sound recognition
Year(s) Of Engagement Activity 2016
URL http://www.bbc.co.uk/programmes/b071tgby
 
Description Media coverage of Warblr app kickstarter and launch 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact While launching the Warblr app, which is both a spinout company and a citizen-science data gathering initiative, we were featured in many national and international press outlets, through TV/radio/newspaper interviews as well as second-hand coverage. Warblr has been featured across the BBC (including 5 Live, BBC London News, Radio 4, and the BBC News homepage), on Sky News, in print and online through newspapers such as The Telegraph, The Times, The Sun, The Guardian, The Metro and The Daily Mail, and in publications such as Stylist, Shortlist, Wired, Stuff.TV and Engadget.

The project has gained support from the likes of Stephen Fry, Chris Packham, the Urban Birder and the Royal Society of Arts (RSA), as well as thousands of tweets, posts, mentions and shares across social media, blogs and forums.

The Warblr team have also spoken at conferences including Digital Shoreditch, London National Park, UnLtd Living It Festival, and Stylist Live.

Warblr has won a Queen Mary University of London Public Engagement Award, and was shortlisted for the TechCityinsider's TechCities awards and the IAB (Interactive Advertising Bureau) Creative Showcase. Warblr is one of TechRadar's "Best iPhone apps of 2015" and The Next Web's "Apps of the year".
Year(s) Of Engagement Activity 2015,2016
URL http://warblr.net/
 
Description Media coverage of automatic bird classification results (Summer 2014) 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Issued a press release and participated in media interviews (live on-air discussion on Irish national radio RTÉ Radio 1; print interviews for the BBC and Science Magazine), plus secondary press coverage resulting from those.

Science
Computer becomes a bird enthusiast
By Kelly Servick
http://news.sciencemag.org/plants-animals/2014/07/computer-becomes-bird-enthusiast
Tweeted at least 40 times

BBC
Software can decode bird songs
By Claire Marshall BBC environment correspondent
http://www.bbc.co.uk/news/science-environment-28358123
Tweeted at least 519 times

The Sun
A little bird told me..

Europa Press
Un 'shazam' para identificar qué pájaro está cantando ("A 'Shazam' to identify which bird is singing")
http://www.europapress.es/ciencia/laboratorio/noticia-shazam-identificar-pajaro-cantando-20140717173434.html
Tweeted at least 105 times

El Economista
Un 'shazam' para identificar qué pájaro está cantando ("A 'Shazam' to identify which bird is singing")
http://ecodiario.eleconomista.es/ciencia/noticias/5949497/07/14/Un-shazam-para-identificar-que-pajaro-esta-cantando.html
Tweeted at least 22 times

Live Science
'Voice Recognition' System for Birds Can Tell Two Chirps Apart
http://www.livescience.com/46840-bird-songs-decoded.html
Tweeted at least 26 times

Science 2.0
Birdsongs Decoded
http://www.science20.com/news_articles/birdsongs_decoded-140731
Tweeted at least 2 times


I have received various email and Twitter contacts, both from members of the public and from academics and industry, enquiring about the state of the art and possible future deployments, spin-outs, etc.
Year(s) Of Engagement Activity 2014
URL http://www.bbc.co.uk/news/science-environment-28358123
 
Description The Conversation article 
Form Of Engagement Activity Engagement focused website, blog or social media channel
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Wrote an article published on The Conversation, a long-form news website. It received over 100 Twitter shares, over 300 Facebook shares, and more:
https://theconversation.com/we-made-an-app-to-identify-bird-sounds-and-learned-something-surprising-about-people-65742
Year(s) Of Engagement Activity 2016
URL https://theconversation.com/we-made-an-app-to-identify-bird-sounds-and-learned-something-surprising-about-people-65742