biodata Tag

AFFECT FLOW

AFFECT FLOW (2022)
Performance at MUTEK Montreal 2023. Photography by Vivien Gaumand.

2022

AFFECT FLOW is a music performance work of approximately 30 minutes that initiates listeners into a state of “non-naturalist emotion”: emotional manufacture as a technology for survival or pleasure. It is a hybrid of electroacoustic music, live-spoken verbal suggestion, an ensemble of live biofeedback rendered as sound by hardware synthesizers, and song.

In AFFECT FLOW I use psychological hacks borrowed from method acting and clinical psychology to move beyond “natural” emotion, playing with biofeedback music paradigms and group participation through folk hypnosis, verbal suggestion, roleplay, song, and textural sounds.

These performance techniques, which I call “wetware,” challenge the authoritarian aura of quantification, transforming biofeedback into a feminist space of posthumanist connection and expression.

The biofeedback performers (up to 10) in AFFECT FLOW are volunteers, referred to as surrogates, who meet me half an hour before the performance. After a brief musical interlude, I invite the audience to join us in guided visualization and hypnosis led by my voice. Each surrogate operates a BioSynth, a musical instrument of my design that uses physiological markers such as heart rate, breathing, and skin conductance as control parameters for electronic sound. The mechanics of the BioSynths are explained clearly, allowing listeners to perceive the shifting mood in the room through the bodies of the performers. This collaborative interplay of bodies gives rise to affect as an ecological relation, transcending individual subjectivity.
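The BioSynth hardware and firmware are open source (see the GitHub link under Credits below). Purely as an illustration of the sonification idea, and not the actual BioSynth code, an Arduino-style sketch might map the three signals to sound like this:

```cpp
// Hypothetical sketch of biofeedback sonification, not the actual
// BioSynth firmware: three physiological signals become control
// parameters for a simple tone generator. Pin choices are assumptions.
const int PULSE_PIN  = A0;  // pulse (PPG) sensor
const int GSR_PIN    = A1;  // skin conductance sensor
const int BREATH_PIN = A2;  // respiration belt
const int AUDIO_PIN  = 9;   // piezo or line out via PWM

void setup() {
  pinMode(AUDIO_PIN, OUTPUT);
}

void loop() {
  int pulse  = analogRead(PULSE_PIN);   // 0–1023
  int gsr    = analogRead(GSR_PIN);
  int breath = analogRead(BREATH_PIN);

  // Example mapping: skin conductance sets pitch, the pulse wave adds
  // vibrato, and the tone sounds only while the chest is expanded.
  int pitchHz = map(gsr, 0, 1023, 110, 880);
  int vibrato = map(pulse, 0, 1023, 0, 20);
  if (breath > 200) {
    tone(AUDIO_PIN, pitchHz + vibrato);
  } else {
    noTone(AUDIO_PIN);
  }
  delay(10);  // ~100 Hz control rate
}
```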

A lightbulb illuminates at the feet of each performer when their signals are amplified. Because I control the audio output of each body via a mixing board, I can highlight solos, duets, trios, and ensemble moments in real time.

Credits

Affect Flow (2022)
Music composition and performance by Erin Gee.

Dramaturgy and text by Jena McLean. Poetry by Andrew C. Wenaus.

BioSynth affective hardware synthesizers are an open-source project by Erin Gee. Programming for this iteration by Etienne Montenegro with sonification programming by Erin Gee. PCB design by Grégory Perrin.

Click here for the BioSynth GitHub.

Click here for the Tech Rider.

Performances:

International Symposium on Electronic Art, CCCB Barcelona, ES, May 2022.

Society for Arts and Technology, Montreal CA, July 2022.

Vancouver New Music – Orpheum Annex, Vancouver CA, November 2022.

Electric Eclectics Festival, Meaford ON, CA, August 2023.

MUTEK Montreal, CA, August 2023.

AFFECT FLOW (2022) at Vancouver New Music, Vancouver.

NOT POMPIDOU – Paris, FR

As part of the 2020 exhibition NEURONS: Simulated Intelligence at Centre Pompidou, Paris, my work as a media artist in AI, affective computing, and interactive sound was misattributed to American composer Erin Gee, a professor at Brandeis University who shares my name.

While the two Erin Gees have been aware of one another’s practices for several years (we share many peers who have teased us about our shared name, and we were even programmed in the same music festival in 2019), to our knowledge our works had never before been mixed up or misattributed in a professional capacity.

A public statement was made by the publishers of the catalogue (HYX editions) on their website and as a digital addendum/downloadable PDF, available here.

This post is intended to clarify the following points:

  1. The work that was presented at Centre Pompidou, Shillim: Mouthpiece 34 (2019), is not my work. Mouthpieces is the name of a body of work by the homonymous American composer Erin Gee. She is best known for her non-semantic vocal music, typically vocal and instrumental compositions titled Mouthpiece followed by a number. She has been contributing compositions to the Mouthpieces series for over twenty years.
  2. The artworks Machine Unlearning (2020) and Laughing Web Dot Space (2018), referenced in the wall text of the exhibition (see below) and attributed to the American composer, are my works. These are new media artworks, incorporating technologies such as neural networks and interactive HTML in their creation. In addition, my work of the soone (2017), attributed to the American composer in the catalogue (see below), is another new media artwork of mine that uses machine learning / AI, and is a collaboration with Canadian media artist Sofian Audry, who is not acknowledged in the catalogue.
  3. As part of the events surrounding the exhibition, American composer Erin Gee was also invited to speak on a panel, Forum Vertigo: Human and Artificial Perception, dealing with generative music and artificial intelligence. The opinions expressed and works she references in this panel discussion are entirely her own and are unrelated to my practice.


Following the discovery of the misattribution at the exhibition opening (thanks to Parisian peers who were on site), I worked with Robin Dupuis, the Director of the organization perte de signal, which represents my work, to communicate the seriousness of this error to the exhibition’s curators, Frédéric Migayrou and Camille Lenglois. Unfortunately, the wall text misattributing my work and research in new media art to American composer Erin Gee remained on the wall of the exhibition for weeks before being replaced by a text truly dedicated to the research of the American composer.

In response, the curators of Neurones apologized for these misattribution errors. They expressed that they were unable to do anything further to mitigate the issue of the 200-page catalogue, which also attributed other new media artworks of mine to the American composer who shares my name. During this period I also reached out to the American artist, who was likewise onsite; however, for personal reasons she was not available to respond to the situation for several months.

A photo of the original wall text from the Neurones exhibition at Centre Pompidou, combining the works and research of Canadian artist Erin Gee with those of American composer Erin Gee.

I am very grateful for the assistance of Robin Dupuis at perte de signal, as well as Editions HYX, for working together on a digital addendum addressing the error in the catalogue, published a month after the error was discovered. The publishers were a pleasure to work with on this solution. Despite this, a digital addendum has only a limited impact, as the printed copies remain in circulation without any printed addendum (see below).

I have recently been in touch with American composer Erin Gee to share a horrified laugh and work on solutions – we have both agreed to be diligent and aware of potential confusion this situation might create in the future. We collectively state: Canadian new media artist Erin Gee is a specialist in affective technologies, emergent technologies such as quantum computing and AI, and vocal performance inspired by ASMR. American composer Erin Gee is a professor at Brandeis University and an expert in non-semantic vocal performance and composition techniques.

This is of course an imperfect and improvisational solution: I would never want to prevent a peer from exploring new technology, nor is it logical for me to avoid non-semantic vocal content in future works. Rather, this strategy speaks to the disciplinary situatedness from which our sensibilities emerge. If you are a professional artist or curator working in our fields, please share this story in your network as a means of preventing further confusion. As more peers learn of this issue, and of our two distinct practices and achievements in our respective fields, we hope that this error will not reproduce itself.

Festival Inmersiva – Mexico City

FESTIVAL DELAYED DUE TO COVID-19 – FEBRUARY 2021

The Centro de Cultura Digital in Mexico City will present the most ambitious version to date of the telematic artwork Presence (2020, Erin Gee and Jen Kutler).

Presence – Erin Gee & Jen Kutler (Quebec / USA)
Telematic sound performance

Presence is a telematic sound work in which two networked performers (Gee and Kutler) send and receive each other’s heart, sweat, and respiration data between Canada and the USA; this data in turn triggers strong electric pulses in devices on their skin in response to one another. The two performers listen to a whispered roleplay of verbal suggestions on the topic of impossible and imaginary touch as music is generated live from their embodied reactions, in live feedback over the network.
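Roughly, the signal path is: each performer’s biosignals are digitized, streamed over the network, and used at the far end to drive the on-skin devices. Purely as an illustrative sketch (the artists’ actual system is their own design), a receiving microcontroller might fire one brief, isolated pulse per incoming heartbeat event forwarded over serial by a network bridge:

```cpp
// Hypothetical receiving end, not the artists' actual system: a laptop
// bridges the network and forwards the remote performer's heartbeat
// events over serial; each event fires one brief, isolated pulse.
const int STIM_PIN = 5;             // drives an optically isolated
                                    // stimulation circuit (never skin
                                    // directly from a logic pin)
const unsigned long PULSE_MS = 40;  // illustrative pulse width

void setup() {
  Serial.begin(115200);
  pinMode(STIM_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'B') {                 // 'B' = one heartbeat event received
      digitalWrite(STIM_PIN, HIGH);
      delay(PULSE_MS);
      digitalWrite(STIM_PIN, LOW);
    }
  }
}
```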

The video and audio livestream will be received at the CCD in Mexico City, where a subject will also receive the live electric pulse signals from Jen’s and Erin’s bodies on both arms.

For more information on this hybrid streaming/real-life event: Click here for a preview article published in Cronica MX (Spanish language).



LEV festival Matadero

My interactive biodata-driven VR work Project H.E.A.R.T., made in collaboration with Alex M. Lee, will be on view at LEV Festival Matadero in Madrid, Spain from September 24–27, 2020.



ABOUT LEV

L.E.V. (Laboratorio de Electrónica Visual) is a platform specialized in the production and promotion of electronic sound creation and its relationship with the visual arts. A European pioneer in this field, for more than 13 years it has worked to bring together the natural synergy between image and sound and new artistic trends, with a special emphasis on live performance.

LEV develops the L.E.V. Festival (in Gijón) and specific, delocalized shows called LEVents. Through both, the platform pursues its goal: to provide an eclectic, panoramic vision of the current state of creation and all its connections, in an ever-evolving environment. That is why LEV focuses its work both on international artists who are leaders in audiovisual creativity and on local artists, both pioneers and new talents.


Darling Foundry Montreal

Erin Gee and Jen Kutler, Presence (2020), with Xuan Ye, What lets lethargy dream produces lethargy’s surplus value (2020)

August 13, 2020 – online performances for Darling Foundry, Montreal 

I have been invited to participate in a project by curator Laurie Cotton-Pigeon called Allegorical Circuits for Human Software, a cyberfeminist exploration of Marshall McLuhan’s writing on technology that includes performances and virtual interventions spanning several months, from June 11 to August 20, 2020 (5 pm to 10 pm).

I’m very happy to be sharing the performance evening with Xuan Ye, a great Canadian artist working across code, sound, and performance. The programming also includes:

MÉGANE VOGHELL

AVALON

NADÈGE GREBMEIER FORGET

ANNA EYLER & NICOLAS LAPOINTE                           

XUAN YE


ERIN GEE & JEN KUTLER

FABIENNE AUDÉOUD

ILEANA HERNANDEZ

NINA VROEMEN & ERIN HILL

EMMA-KATE GUIMOND


Cotton-Pigeon writes of our work:

“The notion of mediated connectivity is also present in the performative work of artists Erin Gee and Jen Kutler. As the two artists live in two different places (Gee is based in Canada and Kutler in the United States), they developed a system of sensorial connection without ever meeting in person, which has allowed them to overcome the constraints associated with geographical distance and concretize the “virtuality” of the Internet. Interested in the unconscious and autonomous nature of bodily sensations and their associated emotions, the artists simulate touch by combining an ASMR relaxation technique with the use of DIY devices (Touch Simulation Units) that work similarly to transcutaneous electrical nerve stimulation (TENS).”


Allegorical Circuits for Human Software has been conceived in dialogue with the collective exhibition FEEDBACK, Marshall McLuhan and the Arts, which will be presented in summer 2021 at Fonderie Darling.



Digifest Toronto

Thu, 04/26/2018 –
Sat, 04/28/2018

CORUS QUAY

25 Dockside Dr
Toronto, ON M5A 1B6


Presented by the Goethe-Institut Toronto
Curated by Tina Sauerländer (Berlin) and Erandy Vergara (Montreal)

Project H.E.A.R.T. by Erin Gee and Alex M. Lee
Enter Me Tonight by Li Alin
At the invitation of the Goethe-Institut, curators Tina Sauerländer and Erandy Vergara have selected VR works for this year’s Toronto Digifest, including two recent pieces by Berlin-based Canadian artist Li Alin and by Montreal-based artist Erin Gee in collaboration with South Korean-born, US-based artist Alex M. Lee. The artists use humor and irony to engage with controversial topics: emotions in first-person shooter video games and war in the case of Gee, and a futuristic exploration of human reproduction in technology-oriented times in the case of Alin.

The audience explores Gee’s Project H.E.A.R.T., a virtual work in which you must control your emotions to control the leading character in a war-related VR game, as well as Alin’s Enter Me Tonight, a VR environment engaged with issues of human reproduction, economy, biology, pornography, and technology.

In a contextualizing event, the curators will speak about the history of VR and current trends and critical perspectives on this technology.

Digifest 2018 website

Event information courtesy of the Goethe-Institut Toronto

Rhode Island College

“// lonely avatar” is an exhibition that investigates the use, meaning, and expressive potential of avatars in the contemporary digital landscape. “Lucid Dreaming” ruminates on the emptiness of the virtual avatar, whilst “Project H.E.A.R.T.” involves filling that empty avatar with your emotion through a specially designed biosensor. Both projects follow a trajectory of thought regarding the metaphorical potential of avatars in virtual space. Curated by Frank Yefeng Wang, this show features works by Alex M. Lee commissioned by Trinity Square Video in Toronto, ON, and a project made in collaboration with Canadian artist Erin Gee.

Opening reception: 5-8pm
Artist Lecture: 7-7:30pm

The Chazan Family Gallery
Alex & Ani Hall
Rhode Island College
600 Mt. Pleasant Ave
Providence, RI 02908

Affective VR Workshop

Welcome

Welcome to the Emotional Data and Unity/VR workshop! This workshop is hosted by Erin Gee, a Canadian artist and researcher who has worked with emotional biodata in art since 2012. She has created works for emotional biodata and robotics (Swarming Emotional Pianos, 2014), children’s choir (Song of Seven, 2017), and now for VR with her latest work, Project H.E.A.R.T. (2017).

She is an active promoter of open-source and feminist culture, and publishes all of her technical work (Arduino code/Unity code) under the GNU GPL 3.0.

What is the GNU GPL?

The GNU General Public License is a free, copyleft license for software and other kinds of works.

TLDR: You may use, modify, and redistribute this code for free. If you redistribute, you must acknowledge my original authorship, and you must always allow other people to modify and redistribute for free as well. Any violation of this agreement means that you are breaking my copyright! If you do modify aspects of this code, please share the love and contribute to the GitHub.
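For example, if you redistribute a modified version of the Arduino code, one common way to honour these terms is a license header at the top of each source file; the names and dates below are placeholders:

```cpp
/*
 * Example GPL header only; adapt names and dates to your fork.
 *
 * Based on BioSynth code, Copyright (C) Erin Gee.
 * Modifications Copyright (C) <your name>, <year>.
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version. This program is distributed
 * WITHOUT ANY WARRANTY. See <https://www.gnu.org/licenses/> for details.
 */
```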

If you are not sure how to contribute to a GitHub project, feel free to contact me at erin dot marie dot gee at gmail dot com (or just the contact form on my website) and I’ll set you up!

For the full documentation of GNU GPL v3, click here.

Contextual Resource Section

BIOSENSORS IN GAMES

April 13th, 2011

Jacob Aron first reported on a variety of games that were taking advantage of biosensing technologies in an article published in New Scientist.

Aron, Jacob. (2011). “Emotional Video Gaming Makes the Action Real.” New Scientist. Accessed November 15th, 2017.


October 2016 – BfB Labs’ emotionally responsive game Champions of the Shengha depends on a user’s emotional control — measured by a heart rate sensor — for success.


October 2016 – Nevermind is an adventure game where you explore strange worlds and solve puzzles to unlock a mystery that lurks within each “patient’s” inner psyche. The Windows and Mac versions of Nevermind use biofeedback technology to detect your feelings of stress while playing, dynamically responding to those feelings to affect gameplay. You can also play the game without this technology. http://nevermindgame.com/


BIOSENSORS IN CINEMA

Published by The Verge on July 18, 2017.

Lauren Goode goes inside Dolby’s little-known biophysical labs, where the company has been embarking on a five-year project to track people’s emotional responses as they watch movies and TV shows.

Dolby uses biosensors to study viewers’ emotional responses to:

  • Aural frequency ranges
  • Dynamic color ranges
  • Audio volume as well as screen brightness
  • Music and sound effects

VR and Empathy – A reading list

Ebert, Roger. (June 23, 2005). “Ebert’s Hall of Fame Remarks.” Roger Ebert’s Journal. Accessed November 15th, 2017.

Bye, Kent. (January 31, 2017). “VR as the Ultimate Empathy Machine with Gabo Arora.” Voices of VR Podcast. Accessed November 13th, 2017.

Hamilton, Kathryn. (February 23, 2017). “Voyeur Reality.” The New Inquiry. Accessed November 15th 2017.

Yang, Robert. (April 5, 2017). “‘If you walk in someone else’s shoes, then you have taken their shoes’: empathy machines as appropriation machines.” Radiator design blog. Accessed November 15th, 2017.

Scientific Resources

In this section you will find information on the science behind how emotion is generated by the brain, and how it can be “read” by sensing instruments.

What is Emotion?

Emotion is a multi-component response to an emotionally potent event, causing changes in subjective feeling quality (psychological dimension), expressive social behavior (behavioral dimension), and physiological activation (physiological dimension) of the subject. (Kreibig 2010)

Psychological/neurological frameworks for understanding emotion itself are articulated very well by Dr. Lisa Feldman Barrett, a scientist at Northeastern University. She speaks particularly of the complexity of the human body, which may be experiencing a host of physiological effects from the autonomic nervous system (ANS), and of how these are interpreted, and perhaps also constructed, by language.

On her website you will find many plain-language resources for emotional understanding from a contemporary neuroscientific point of view under the heading “Articles.”

How Scientists detect Emotion

The physiological techniques presented at this workshop are based on psychologist Dr. Sylvia Kreibig’s 2010 review of 134 publications dating from the 1970s to the 2000s. While the techniques surveyed are not weighted for the “success” of the cited studies, and the literature clearly shows that there is no single “technique” for applying these skills, the review has been cited over 1000 times in the scientific literature since its publication, including over 200 times in 2017.

Source:

Kreibig, Sylvia D. (2010). “Autonomic nervous system activity in emotion: A review.” Biological Psychology 84, 394–421.


Continued research
If you would like to continue keeping up with scientific research in this area, an academic journal search using the databases PsycINFO, PsycARTICLES, and PubMed is recommended with the following search terms:

[emotion] and [autonomic nervous system or cardiovascular or cardiac or heart or respiration or respiratory or electrodermal or skin conductance]


Technical Resource Section

Unity Tutorials

This section is dedicated to technical resources, tutorials and code to get you started playing around with the biosensors.

Mirza VFX teaches amazing tutorials in particle creation. Plug your biosensors in to manipulate beautiful particle-driven environments.

(Inspiration: Mufson, Beckett. (Aug 29, 2014) “A Team Of Artists Are 3D-Printing Their Emotions As Abstract House Decorations.”  Creators Project.  Accessed November 15th 2017.)

GitHub

Hardware Resources

Materials list for Erin Gee’s BiodataTrio PCB.

Don’t forget to prepare your PulseSensor before you use it! Click here for instructions on how to apply the protective decal and provide a glue-gun seal. The rest of the advice you can take or leave, but these two steps are ESSENTIAL to the longevity and accuracy of your device!

When buying cables to build your sensors, look for cable that features two shielded signal conductors. The biosensors are picking up very, very sensitive electrical information from the body – any disruption in the electric field could make your readings less accurate, especially if the sensor cables are placed near one another!

To prevent this, you can either buy shielded cable (like this) and solder the silver shielding to your ground connection (pick whatever you like for the other two conductors, maybe red for power and black for signal?)

Or, if you’re in a pinch, you can simply twist the ground wire around the signal wire that you are trying to protect from outside interference.

Here’s a link to my Respiration Belt Instructable. After a few years I found the respiration belt less interesting, because the belts are awkward for the average person to strap into, but if you’d like to go for it, here is a simple way to make it happen! This signal is perhaps best amplified, and you might need to calculate the relative drift of the elastic and account for it as an offset in order to capture things like someone holding their breath; see the sketch below.
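A minimal sketch of that drift compensation, assuming the belt reads as a 0–1023 value on an analog pin: a very slow exponential moving average tracks the elastic’s drift, and subtracting it leaves breathing oscillating around zero, so a held breath shows up as the corrected signal flattening out.

```cpp
// Illustrative drift compensation for a stretchy respiration belt
// (assumes the belt reads as a 0–1023 value on an analog pin).
const int BELT_PIN = A2;

float baseline = 0.0;
const float ALPHA = 0.001;  // tiny = baseline follows slow elastic drift,
                            // not the faster breathing oscillation

void setup() {
  Serial.begin(9600);
  baseline = analogRead(BELT_PIN);       // start at the first reading
}

void loop() {
  int raw = analogRead(BELT_PIN);
  baseline += ALPHA * (raw - baseline);  // slow exponential moving average
  float breath = raw - baseline;         // breathing now oscillates around 0
  // If breath stays near zero for several seconds, the wearer may be
  // holding their breath (or the belt has slipped).
  Serial.println(breath);
  delay(20);                             // ~50 Hz sample rate
}
```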

Project H.E.A.R.T.

Project H.E.A.R.T. (2017)

2017

A biodata-driven VR game where militainment and pop music fuel a new form of emotional drone warfare.

A twist on popular “militainment” shooter video games, Project H.E.A.R.T. invites the viewer to place their fingers on a custom biodata device and summon their enthusiasm to engage their avatar, Yowane Haku, in “combat therapy.” Fans of Vocaloid characters may recognize Haku as the “bad copy” of Japanese pop celebrity Hatsune Miku, a holographic personage who invites her fans to pour their content and songs into her virtual voice.

The biosensing system features a pulse sensor and a skin conductance sensor of Gee’s design. Through principles of emotional physiology and affective computing, the device gathers data on heart rate and blood flow from the index finger, and skin conductance from the middle and ring fingers. The biodata is read by a microcontroller and transferred to Unity VR, facilitating emotional interactivity: a user’s enthusiasm (spikes in skin conductance amplitude, elevated heart rate, and shifts in the amplitude of the pulse signal) stimulates the holographic pop star to sing in the virtual warzone, inspiring military fighters to continue the war and create more enemy casualties. At the end of the experience the user is confronted with their “score” of traumatized soldiers vs. enemies killed, with no indication of whether this means they won or lost the “game.”
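The project’s actual Arduino and Unity code is published open source (see the workshop notes elsewhere on this page). As a simplified, hypothetical sketch of the sensing side only, enthusiasm can be approximated by scaling phasic skin-conductance spikes above a slow-moving baseline:

```cpp
// Hypothetical sketch of the sensing side of Project H.E.A.R.T.:
// phasic skin-conductance spikes above a slow baseline are scaled to a
// 0.0–1.0 "enthusiasm" value that a game engine could read over serial.
const int GSR_PIN = A1;  // skin conductance (middle + ring fingers)

float baseline = 0.0;    // slow tonic level of the signal

void setup() {
  Serial.begin(115200);
  baseline = analogRead(GSR_PIN);
}

void loop() {
  int gsr = analogRead(GSR_PIN);
  baseline += 0.005 * (gsr - baseline);  // track the tonic level slowly
  float spike = gsr - baseline;          // phasic response above baseline
  if (spike < 0) spike = 0;

  // Heart-rate features (beat detection on the pulse sensor) would be
  // blended in here; omitted to keep the sketch short.
  float enthusiasm = spike / 100.0;
  if (enthusiasm > 1.0) enthusiasm = 1.0;
  Serial.println(enthusiasm);            // e.g. read by Unity over serial
  delay(20);                             // ~50 Hz
}
```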

The user is thus challenged to navigate soldiers’ emotional anxieties and summon their positivity to activate Haku’s singing voice as soldiers battle not only a group of enemies, but also their own lack of confidence in times of global economic instability.

The landscape of Project H.E.A.R.T. was built from geopolitically resonant sites found on Google Maps, creating a dreamlike background for the warzone. In-game dialogue wavers between self-righteous soldier banter typical of video games, and self-help, bringing the VR participant to an interrogation of their own emotional body in a virtual space that conflates war, pop music, drone technology, and perhaps movement-induced VR nausea.

As Kathryn Hamilton pointed out in her 2017 essay “Voyeur Reality” for The New Inquiry,

“VR’s genesis and development is in the military, where it has been used to train soldiers in “battle readiness,” a euphemism for: methods to overcome the innate human resistance to firing at another human being. In the last few years, VR’s usage has shifted 180 degrees from a technology used to train soldiers for war, to one that claims to “amplify” the voices afflicted by war, and to affect “world influencers” who might be able to stop said wars.”

Credits

Narrative Design: Sofian Audry, Roxanne Baril-Bédard, Erin Gee
3D Art: Alex Lee and Marlon Kroll
Animation and Rigging: Nicklas Kenyon and Alex Lee
VFX: Anthony Damiani, Erin Gee, Nicklas Kenyon
Programming: Sofian Audry, Erin Gee, Nicklas Kenyon, Jacob Morin
AI Design: Sofian Audry
Sound Design: Erin Gee, Austin Haughton, Ben Hinckley, Ben Leavitt, Nicolas Ow
BioSensor Hardware Design: Erin Gee and Martin Peach
BioSensor Case Design: Grégory Perrin
BioSensor Hardware Programming: Thomas Ouellet Fredericks, Erin Gee, Martin Peach
Featuring music by Lazerblade, Night Chaser and Austin Haughton
Yowane Haku character designed by CAFFEIN
Yowane Haku Cyber model originally created by SEGA for Hatsune Miku: Project DIVA 2nd (2010)
Project H.E.A.R.T. also features the vocal acting talents of Erin Gee, Danny Gold, Alex Lee, Ben McCarthy, Gregory Muszkie, James O’Callaghan, and Henry Adam Svec.

Thanks to the support of the Canada Council for the Arts and AMD Radeon, this project was commissioned by Trinity Square Video for the exhibition Worldbuilding, curated by John G. Hampton and Maiko Tanaka.

This project would not have been possible without the logistical and technical support of the following organizations:

Technoculture Art and Games Lab (Concordia University)

Concordia University

ASAP Media Services (University of Maine)

Exhibition history

November-December 2017  Worldbuilding @ Trinity Square Video, Toronto

February-March 2018 Future Perfect @ Hygienic Gallery, New London, Connecticut

April 26-28, 2018 @ Digifest, Toronto

June 7-17, 2019 @ Elektra Festival, Montreal

January 2020 @ The Artist Project, Toronto

October 2020 @ Festival LEV Matadero, Spain

Links

Project H.E.A.R.T. official website
Worldbuilding Exhibition Website
Review in Canadian Art
My research blog: Pop and Militainment
Featured on Radiance VR

Video

Project H.E.A.R.T. (2017)
Installation and Gameplay

Gallery