Mood Conductor is a system that allows audience members to interact with performers during musical improvisations. The term 'conductor' is used metaphorically: rather than directing a musical performance through visible gestures, spectators act as conductors by communicating emotional intentions to the performers via our smartphone-friendly Mood Conductor app. Performers and audience members can follow, on a shared screen, the audience's emotional directions as they evolve over the course of the performance. The emotions selected by the audience are represented as coloured bubbles in a two-dimensional space spanning the core dimensions of emotion: "arousal", or excitation (vertical axis), and "valence", or pleasantness (horizontal axis). When several audience members choose similar emotions, the corresponding bubble grows. The audience's votes are averaged democratically, yielding "elected" emotions that the performers aim to follow, shaping their musical techniques and vocabulary over the course of the performance to convey the intended emotions musically.
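As a rough illustration of the voting mechanism described above, here is a minimal sketch (not the actual Mood Conductor implementation; the function name, the grid-based grouping, and the coordinate range are assumptions for illustration) of how votes in the valence-arousal space could be grouped into bubbles and averaged into an "elected" emotion:

```python
from collections import Counter

def bubbles_and_elected(votes, grid=0.25):
    """Group audience votes on a valence/arousal grid and average them.

    votes: list of (valence, arousal) pairs, each assumed in [-1, 1].
    Returns (bubbles, elected): bubbles maps a grid cell to its vote
    count (the bubble size); elected is the mean of all votes.
    """
    # Similar emotions fall into the same grid cell, so its bubble grows.
    cells = Counter((round(v / grid) * grid, round(a / grid) * grid)
                    for v, a in votes)
    # The "elected" emotion is the plain average of all votes.
    n = len(votes)
    elected = (sum(v for v, _ in votes) / n,
               sum(a for _, a in votes) / n)
    return dict(cells), elected

# Two nearby high-arousal, positive-valence votes and one calmer,
# negative-valence vote:
votes = [(0.8, 0.6), (0.75, 0.55), (-0.5, -0.2)]
bubbles, elected = bubbles_and_elected(votes)
```

Here the two similar votes share a grid cell and produce one larger bubble, while the averaged "elected" point is pulled toward the positive-arousal, positive-valence region where most votes lie.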
Mood Conductor was invented and developed by Mathieu Barthet and George Fazekas (Centre for Digital Music, Queen Mary University of London).
Selection of photos from various Mood Conductor performances
+ International Conference on Affective Computing and Intelligent Interaction with the VoXP vocal quartet (Geneva, Switzerland, 2013)
+ Hack The Barbican festival with a quintet (Barbican Arts Centre, London, 2013)
+ New Musical Interfaces festival with a jazz/rock trio (Queen Mary University of London, 2013)
+ Cathedral of Strasbourg with the VoXP vocal quartet (France, 2012)
If you are interested in the project and wish to collaborate, please contact Mathieu Barthet (email@example.com) and/or George Fazekas (firstname.lastname@example.org).
We acknowledge the kind contribution of the vocal quartet VoXP and of Matthias Georgi from SoundCloud Ltd., who implemented the first version of the mobile client interface. We also thank Ting Lou, who worked on Mood Conductor during her Media and Arts Technology MSc placement.
This work was partly funded by the EPSRC Grant EP/K009559/1 and the TSB project 12033-76187 Making Musical Mood Metadata (TS/J002283/1).