The Harmonium Project: Visualising the Festival Chorus from the inside out and the outside in.

Official event webpage: http://www.eif.co.uk/2015/harmonium

A novel arts and science project involving the University of Edinburgh, 59 Productions and the Edinburgh Festival Chorus, to produce a visual accompaniment for John Adams’s Harmonium (performed by the Royal Scottish National Orchestra and the Edinburgh Festival Chorus). This will be projected onto the outside of the Usher Hall (the location of the performance) on Friday 7 August.

An image of me and a chorus volunteer ready for an EEG experiment

My work involves gathering bio- and psycho-physical data from individual members of the chorus, then correlating and combining them to investigate the sort of organic entity a chorus like this becomes when performing together. Previously I’ve researched the cognitive processes of reading, dialogue and human collaborative behaviour; Harmonium now gives me the opportunity to study how people read music and interact musically, and how they synchronise with each other and with the music. So far, the data gathered have included eye movements (EyeLink remote eye-tracker), body movements (Microsoft Kinect), electroencephalography (single- and 64-channel EEG), basic electrocardiography (heart rate) and ultrasound recordings of the tongue.
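To give a flavour of what "correlating and combining" means in practice, here is a minimal sketch of one standard approach to measuring synchrony between two singers: a lagged cross-correlation of their signals. Everything here is illustrative — the two heart-rate traces are synthetic, and the function is a generic textbook technique rather than the project's actual analysis pipeline.

```python
import numpy as np

def lagged_xcorr(a, b, max_lag):
    """Correlation of two equal-length signals at lags -max_lag..max_lag.

    Returns (lags, correlations); the lag with the highest correlation
    suggests how far one signal leads or trails the other.
    """
    lags = range(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag < 0:
            c = np.corrcoef(a[:lag], b[-lag:])[0, 1]
        elif lag > 0:
            c = np.corrcoef(a[lag:], b[:-lag])[0, 1]
        else:
            c = np.corrcoef(a, b)[0, 1]
        corrs.append(c)
    return list(lags), corrs

# Two hypothetical heart-rate traces, the second delayed by 3 samples.
t = np.linspace(0, 10, 200)
hr1 = 70 + 5 * np.sin(t)
hr2 = 70 + 5 * np.sin(t - 3 * (t[1] - t[0]))
lags, corrs = lagged_xcorr(hr1, hr2, max_lag=10)
best = lags[int(np.argmax(corrs))]  # recovers the 3-sample offset
```

The peak lag tells you who is leading whom; applied pairwise across the chorus, this kind of measure can be aggregated into a picture of group-level synchrony.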

All of this will then be woven together into a visual narrative by 59 Productions, who are also the projection tech-wizards. See their fantastic previous work on the Sydney Opera House and Hampton Court, for example.

A sample of data:

Here is an example of eye-tracking data from a soprano singing from the score. The first row shows fixations (blue circles) and saccades (gold lines); the second row shows traditional heat-map images of visual attention; and the third row gives a sense of which information was selected. You can click on the images for a larger version.

Eye movement data for a soprano.
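For the curious, attention heat maps like those in the second row are conventionally built by placing a Gaussian blob at each fixation, weighted by how long the gaze dwelt there. A minimal sketch, using made-up fixation coordinates rather than the real EyeLink data:

```python
import numpy as np

def fixation_heatmap(fixations, width, height, sigma=30):
    """Build an attention heat map from (x, y, duration_ms) fixations.

    Each fixation contributes a 2-D Gaussian centred on its position,
    weighted by its duration; the result is normalised to [0, 1].
    """
    ys, xs = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width))
    for x, y, dur in fixations:
        heat += dur * np.exp(-((xs - x) ** 2 + (ys - y) ** 2)
                             / (2 * sigma ** 2))
    return heat / heat.max()

# Hypothetical fixations: two long dwells on one stave, one brief glance.
fix = [(120, 80, 400), (140, 85, 350), (300, 200, 60)]
heat = fixation_heatmap(fix, width=400, height=300)
```

The bandwidth `sigma` is a free choice (here an arbitrary 30 pixels); larger values give the smoother, blurrier maps typical of published gaze visualisations.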

And now where a bass looks during the same piece. Note that there is no Bass Section line until page 9 (third image), so until three bars before his entry he tends to follow the Alto Section, perhaps because it is closest to his own in terms of both pitch and spatial location on the page.

Eye movement data for a bass.

However, a different bass seems to prefer to follow the sopranos, leaving it to the last moment to switch to his own line.

Eye movement data for another bass.

And here is an example screenshot from an ultrasound recording of a soprano's tongue while singing. I'll try to put up a video clip soon.

Ultrasound image of a soprano's tongue


Project partners:

College of Humanities and Social Science, University of Edinburgh
School of Informatics
Design Informatics
Edinburgh College of Art
59 Productions
Edinburgh International Festival

Press coverage:

Singers wired in for Edinburgh festival opener (Scotland on Sunday, 28 June 2015)