robotics Tag

Erin Gee - Swarming Emotional Pianos

Swarming Emotional Pianos

(2012 – ongoing)

Aluminium tubes, servo motors, custom mallets, Arduino-based electronics, iCreate platforms

Approximately 27” x 12” x 12” each

Swarming Emotional Pianos is an installation that features a large, looming projection of a human face surrounded by a set of six musical chime robots.

The projected face is that of an actor (Laurence Dauphinais or Matthew Keyes), who over the course of 20 minutes moves between extreme emotional states of surprise, fear, anger, sadness, sexual arousal, and joy in five-minute intervals. During the performance, Gee connects the actor to a series of biosensors that monitor how heart rate, sweat, and respiration change between these emotional states.

The music that the robots surrounding the projection screen play as the actress moves between emotional states is a reaction to these physiological responses: the musical tones and rhythms shift and intensify as her heart rate, sweat bursts, blood flow, and respiration change. While the musical result is almost alien to assumptions of what emotional music might sound like, one might encounter the patterns as an abstracted lie-detector test, displaying the unique internal fluctuations that move beneath the surface of her large, projected face. Does emotion lie within the visibility of facial expression, or somewhere in the invisible made audible, the patterns of sensation in her body? Is the actor sincere in her performance if the emotion is felt rather than displayed? Micro-bursts of emotional sentiment are thus amplified by the robots, providing an intimate and abstract soundtrack for this "emotional movie".
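As a rough illustration of this kind of sonification, the fragment below shows one simple way incoming biosignal values could be scaled into musical parameters such as tempo and intensity. It is a conceptual sketch only: the actual mapping in Swarming Emotional Pianos was composed algorithmically in Max/MSP, and the variable names, ranges, and weights here are assumptions made for demonstration, not Gee's composition logic.

  // Conceptual sketch of a biosignal-to-music mapping. The ranges, weights,
  // and formulas are illustrative assumptions only; the actual mapping in
  // Swarming Emotional Pianos was composed algorithmically in Max/MSP.
  #include <algorithm>
  #include <cstdio>

  struct MusicalParams {
      double tempoBpm;    // how quickly a chime robot strikes
      double intensity;   // 0.0-1.0, e.g. mallet velocity or note density
  };

  MusicalParams mapBiosignals(double heartRateBpm,     // e.g. 50-150 bpm
                              double skinConductance,  // normalized 0.0-1.0
                              double respirationRate)  // breaths per minute
  {
      MusicalParams p;
      // A faster heart rate drives a faster striking tempo, clamped to a playable range.
      p.tempoBpm = std::clamp(heartRateBpm * 1.5, 60.0, 220.0);
      // Sweat bursts and quickened breathing push the intensity upward.
      p.intensity = std::clamp(0.6 * skinConductance + 0.4 * (respirationRate / 30.0),
                               0.0, 1.0);
      return p;
  }

  int main() {
      MusicalParams p = mapBiosignals(110.0, 0.8, 22.0);  // an agitated moment
      std::printf("tempo = %.1f bpm, intensity = %.2f\n", p.tempoBpm, p.intensity);
      return 0;
  }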

Emotional-physical outputs are extended through robotic performers as the human actors focus on their internal states, and in fact activate their emotions mechanistically, as a means of creating change in their bodies, thus instrumentalizing emotion.

Custom open-source biosensors that collect heart rate and signal amplitude, respiration amplitude and rate, and galvanic skin response (sweat) have been in development by Gee since 2012. Click here to access her GitHub page if you would like to try the technology for yourself, or to contribute to the research.
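For readers curious about the general approach, the following is a minimal, illustrative Arduino-style sketch of how a board might sample a galvanic skin response signal and stream it to a host computer over serial. The pin assignment, sample rate, and smoothing factor are assumptions made for demonstration; they are not details of Gee's published sensor design, which is documented on her GitHub page.

  // Illustrative sketch only: sample a galvanic skin response (GSR) signal on
  // analog pin A0 and stream smoothed readings over serial at roughly 20 Hz.
  // Pin, rate, and filter constant are assumptions, not Gee's actual firmware.

  const int GSR_PIN = A0;               // analog input from the GSR voltage divider
  const unsigned long SAMPLE_MS = 50;   // ~20 Hz sampling
  float smoothed = 0.0;                 // exponentially smoothed reading

  void setup() {
    Serial.begin(9600);                 // stream readings to the host computer
  }

  void loop() {
    int raw = analogRead(GSR_PIN);           // 0-1023 from the 10-bit ADC
    smoothed = 0.9 * smoothed + 0.1 * raw;   // simple low-pass filter to tame noise
    Serial.println(smoothed);                // host software maps this value to sound
    delay(SAMPLE_MS);
  }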


Thank you to the following for your contributions:

  • Martin Peach (my robot teacher) – Sébastien Roy (lighting circuitry) – Peter van Haaften (tools for algorithmic composition in Max/MSP) – Grégory Perrin (Electronics Assistant)
  • Matt Risk, Tristan Stevans, Simone Pitot, and Jason Leith for their hours of dedicated studio help
  • Concordia University, the MARCS Institute at the University of Western Sydney, Innovations en Concert Montréal, Conseil des Arts de Montréal, Thought Technology, and AD Instruments for their support.

Swarming Emotional Pianos (2012-2014) Machine demonstration March 2014 – Eastern Bloc Lab Residency, Montréal

Erin Gee and Stelarc - Orpheux Larynx

Orpheux Larynx


Vocal work for three artificial voices and soprano, feat. Stelarc.

Music by Erin Gee, text by Margaret Atwood.

I made Orpheux Larynx while in residence at the MARCS Auditory Laboratories at the University of Western Sydney, Australia in the summer of 2011. I was invited by Stelarc to create a performance work with an intriguing device he was developing there called the Prosthetic Head, a computerized conversational agent that responds to keyboard-based chat input with an 8-bit baritone voice. I worked from the idea of creating a choir of Stelarcs, and developed music for three voices by digitally manipulating the avatar's voice. Eventually Stelarc's avatar voices were given the bodies of three robots: a mechanical arm, a modified Segway, and a commercially available device called a PPLbot. I sang along with this avatar-choir while carrying my own silent avatar with me on a digital screen.

It is said that after Orpheus’ head was ripped from his body, he continued singing as his head floated down a river. He was rescued by two nymphs, who lifted his head to the heavens, where it became a star. In this performance, all the characters (Stelarc’s voice, my own, Orpheus, Eurydice, the nymphs) are blended into intersubjective robotic shells that speak and sing on our behalf. The flexibility of the avatar allows a plurality of voices to emerge from relatively few physical bodies, blending past subjects into present but also possible future subjects. Orpheus is tripled to become a multi-headed Orpheux: simultaneously disembodied head, humanoid nymph, and deceased Eurydice. The meaning of the work lies in the dissonant proximity between the past and present characters, as well as in my own identity inhabiting the bodies and voices of Stelarc’s prosthetic self.


Music, video and performance by Erin Gee. Lyrics “Orpheus (1)” and “Orpheus (2)” by Margaret Atwood. Robotics by Damith Herath. Technical support by Zhenzhi Zhang (MARCS Robotics Lab, University of Western Sydney). Choreography coaching by Staci Parlato-Harris.

Special thanks to Stelarc and Garth Paine for their support in the creation of the project.

This research project is supported by the Social Sciences and Humanities Research Council of Canada and MARCS Auditory Labs at the University of Western Sydney. The Thinking Head project is funded by the Australian Research Council and the National Health and Medical Research Council.

Music: Orpheux Larynx © 2011. Lyrics are the poems “Orpheus (1)” and “Orpheus (2)” by Margaret Atwood, from the poetry collection Selected Poems, 1966 – 1984, currently published by Oxford University Press, © 1990 by Margaret Atwood. In the United States, the poems appear in Selected Poems II, 1976 – 1986, currently published by Houghton Mifflin, © 1987 by Margaret Atwood. In the UK, these poems appear in Eating Fire, Selected Poetry 1965 – 1995, currently published by Virago Press, © 1998 by Margaret Atwood. All rights reserved.