
Erin Gee - Swarming Emotional Pianos

Swarming Emotional Pianos

(2012 – ongoing)

Aluminium tubes, servo motors, custom mallets, Arduino-based electronics, iCreate platforms

Approximately 27” x 12” x 12” each

Custom biosensors that collect heart rate and signal amplitude, respiration rate and amplitude, and galvanic skin response (sweat).

Biodata collection software and affective-data-responsive algorithmic music software built in Max/MSP.

A cybernetic musical performance work that bridges robotics and emotion to create biologically harmonic chamber music. Swarming Emotional Pianos features a set of mobile robots that each house a bell instrument and lighting components. The music these robots play is determined by the physiological responses of a human subject in a given emotional state, an approach that reflects affective computing research. These physiological markers include breathing, heart rate, sweat-gland activity, and blood pressure. Research is ongoing to integrate skin-nerve neural activity, measured through microneurography, into the system.
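As a rough illustration of the kind of mapping described above, the sketch below scales biosignal features into musical parameters. It is not the project's actual Max/MSP software; the feature names, ranges, and mapping choices are assumptions made for the example.

```python
# Hypothetical sketch: mapping affective biosignal features to musical parameters.
# The signal ranges and mappings are illustrative assumptions, not the actual
# Max/MSP patches used in Swarming Emotional Pianos.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a sensor reading into a musical parameter range."""
    value = max(in_min, min(in_max, value))  # clamp to the expected sensor range
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

def map_biosignals_to_music(heart_rate_bpm, gsr_microsiemens, respiration_rate):
    """Turn one frame of biosensor readings into note density, dynamics, and phrasing."""
    return {
        # Faster heart rate -> denser, quicker mallet strikes (notes per minute).
        "note_density": scale(heart_rate_bpm, 50, 120, 30, 180),
        # Higher skin conductance (arousal) -> louder dynamics (MIDI-style velocity).
        "velocity": int(scale(gsr_microsiemens, 1.0, 20.0, 40, 127)),
        # Slower breathing -> longer phrases, in seconds.
        "phrase_length_s": scale(respiration_rate, 6, 30, 12, 2),
    }

if __name__ == "__main__":
    print(map_biosignals_to_music(heart_rate_bpm=88, gsr_microsiemens=7.5, respiration_rate=14))
```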

My final goal is a live performance in which actors are hooked up to biosensors and their emotional data is wirelessly streamed to the robotic musical instruments. This will require extensive biofeedback testing. I maintain an active dialogue with microneurographer and neurophysiologist Vaughan Macefield in anticipation of networked, telematic performances involving tiny needles inserted directly into nerves that reflect emotional arousal. The use of microelectrode needles inserted directly into the nerves of awake human performers to pick up direct electrical neural activity is a unique technical component of this project.
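For the networked performances described above, one common approach (assumed here, since the project's own transport is not specified) is to stream sensor frames as OSC messages over UDP to the machines driving the instruments, for example with the python-osc library. The OSC addresses, host, and port below are placeholders.

```python
# Hypothetical sketch of wirelessly streaming biosensor frames as OSC messages,
# a common way to feed live data into Max/MSP. The addresses, host, and port
# are placeholder assumptions, not the project's actual network setup.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 9000)  # placeholder receiver address

def stream_frame(heart_rate_bpm, gsr_microsiemens, respiration_rate):
    """Send one frame of readings, one OSC address per signal."""
    client.send_message("/bio/heart_rate", heart_rate_bpm)
    client.send_message("/bio/gsr", gsr_microsiemens)
    client.send_message("/bio/respiration", respiration_rate)

while True:
    stream_frame(88.0, 7.5, 14.0)  # replace constants with live sensor readings
    time.sleep(0.1)                # ~10 frames per second
```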

The goal in creating this work is to illuminate and explore the complex relationships between body and mind in human emotion. Emotional-physical outputs are extended through robotic performers as human actors focus on their internal states and, in fact, activate their emotions mechanistically as a means of creating change in their bodies, thus instrumentalizing emotion.

Credits

Thank you to the following for your contributions:

  • Martin Peach (my robot teacher)
  • Sébastien Roy (lighting circuitry)
  • Peter van Haaften (tools for algorithmic composition in Max/MSP)
  • Grégory Perrin (electronics assistant)
  • Matt Risk, Tristan Stevans, Simone Pitot, and Jason Leith for their hours of dedicated studio help
  • Concordia University, the MARCS Institute at the University of Western Sydney, Innovations en Concert Montréal, Conseil des Arts de Montréal, Thought Technology, and AD Instruments for their support.

Swarming Emotional Pianos (2012-2014) Machine demonstration March 2014 – Eastern Bloc Lab Residency, Montréal

Erin Gee and Stelarc - Orpheux Larynx

Orpheux Larynx

(2011)

Vocal work for three artificial voices and soprano, feat. Stelarc.

Music by Erin Gee, text by Margaret Atwood.

I made Orpheux Larynx while in residence at the MARCs Auditory Laboratories at the University of Western Sydney, Australia in the summer of 2011. I was invited by Stelarc to create a performance work with an intriguing device he was developing there called the Prosthetic Head, a computerized conversational agent that responds to keyboard-based chat input with an 8-bit baritone voice. I worked from the idea of creating a choir of Stelarcs, and developed music for three voices by digitally manipulating the avatar’s voice. Eventually Stelarc’s avatar voices were given the bodies of three robots: a mechanical arm, a modified Segway, and a commercially available device called a PPLbot. I sang along with this avatar-choir, while carrying my own silent avatar with me on a digital screen.

It is said that after Orpheus’ head was ripped from his body, he continued singing as his head floated down a river. He was rescued by two nymphs, who lifted his head to the heavens, to become a star. In this performance, all the characters (Stelarc’s voice, my voice, Orpheus, Euridice, the nymphs) are blended into intersubjective robotic shells that speak and sing on our behalf. The flexibility of the avatar allows a plurality of voices to emerge from relatively few physical bodies, blending past subjects into present but also possible future subjects. Orpheus is tripled to become a multi-headed Orpheux: simultaneously disembodied head, humanoid nymph, deceased Euridice. The meaning of the work lies in the dissonant proximity between the past and present characters, as well as my own identity inhabiting the bodies and voices of Stelarc’s prosthetic self.

Credits

Music, video and performance by Erin Gee. Lyrics “Orpheus (1)” and “Orpheus (2)” by Margaret Atwood. Robotics by Damith Herath. Technical Support by Zhenzhi Zhang (MARCs Robotics Lab, University of Western Sydney). Choreography coaching by Staci Parlato-Harris.

Special thanks to Stelarc and Garth Paine for their support in the creation of the project.

This research project is supported by the Social Sciences and Humanities Research Council of Canada and MARCS Auditory Labs at the University of Western Sydney. The Thinking Head project is funded by the Australian Research Council and the National Health and Medical Research Council.

Music: Orpheux Larynx © 2011. Lyrics are the poems “Orpheus (1)” and “Orpheus (2)” by Margaret Atwood, from the poetry collection Selected Poems, 1966 – 1984, currently published by Oxford University Press, © 1990 by Margaret Atwood. In the United States, the poems appear in Selected Poems II, 1976 – 1986, currently published by Houghton Mifflin, © 1987 by Margaret Atwood. In the UK, these poems appear in Eating Fire, Selected Poetry 1965 – 1995, currently published by Virago Press, © 1998 by Margaret Atwood. All rights reserved.