Tag: biodata

Festival Inmersiva – Mexico City

FESTIVAL DELAYED DUE TO COVID-19 – FEBRUARY 2021

In a very exciting presentation, the Centro de Cultura Digital in Mexico City will present the most ambitious version to date of the telematic artwork Presence (2020, Erin Gee and Jen Kutler).

Presence – Erin Gee & Jen Kutler (Quebec / USA)
Telematic sound performance

Presence is a telematic sound work in which two networked performers (Gee and Kutler) send and receive each other’s heart, sweat, and respiration data between Canada and the USA; this data triggers strong electric pulses in devices worn across their skin in response to one another. The two performers listen to a whispered roleplay of verbal suggestions on the topic of impossible and imaginary touch as music is generated in real time from their embodied reactions, fed back live over the network.

The video and audio livestream will be received at the CCD in Mexico City, where a subject will also receive, on both arms, the live electric pulse signals from Jen’s and Erin’s bodies.

For more information on this hybrid streaming/real-life event: click here for a preview article published in Crónica MX (Spanish language).

LEV festival Matadero

My interactive biodata-driven VR work Project H.E.A.R.T., made in collaboration with Alex M. Lee, will be on view at LEV Festival Matadero in Madrid, Spain, from September 24-27, 2020.

ABOUT LEV

L.E.V. (Laboratorio de Electrónica Visual) is a platform specializing in the production and promotion of electronic sound creation and its relationship with the visual arts. A European pioneer in this field, for more than 13 years it has worked to bring together the natural synergy between image and sound and new artistic trends, with a special emphasis on live performance.

LEV develops the L.E.V. Festival (in Gijón) and specific, delocalized shows called LEVents. Through both initiatives, the platform pursues its goal: to provide an eclectic, panoramic vision of the current state of creation and all its connections, in an ever-evolving environment. That is why LEV focuses its work both on international artists who are leaders in audiovisual creativity and on local artists, both pioneers and new talents.

 

Darling Foundry Montreal

Erin Gee and Jen Kutler, Presence (2020), with Xuan Ye, What lets lethargy dream produces lethargy’s surplus value (2020)

August 13, 2020 – online performances for Darling Foundry, Montreal 

I have been invited to participate in a project by curator Laurie Cotton-Pigeon called Allegorical Circuits for Human Software, a cyberfeminist exploration of Marshall McLuhan’s writing on technology that includes performances and virtual interventions spanning several months, from June 11 to August 20, 2020 (5 pm to 10 pm).

I’m very happy to be sharing the performance evening with Xuan Ye, a great Canadian artist working across code, sound, and performance. The programming also includes:

MÉGANE VOGHELL

AVALON

NADÈGE GREBMEIER FORGET

ANNA EYLER & NICOLAS LAPOINTE

XUAN YE

ERIN GEE & JEN KUTLER

FABIENNE AUDÉOUD

ILEANA HERNANDEZ

NINA VROEMEN & ERIN HILL

EMMA-KATE GUIMOND

Cotton-Pigeon writes of our work:

“The notion of mediated connectivity is also present in the performative work of artists Erin Gee and Jen Kutler. As the two artists live in two different places (Gee is based in Canada and Kutler in the United States), they developed a system of sensorial connection without ever meeting in person, which has allowed them to overcome the constraints associated with geographical distance and concretize the “virtuality” of the Internet. Interested in the unconscious and autonomous nature of bodily sensations and their associated emotions, the artists simulate touch by combining an ASMR relaxation technique with the use of DIY devices (Touch Simulation Units) that work similarly to transcutaneous electrical nerve stimulation (TENS).”

 

Allegorical Circuits for Human Software has been conceived in dialogue with the collective exhibition FEEDBACK, Marshall McLuhan and the Arts, which will be presented in summer 2021 at Fonderie Darling.

 

 

Digifest Toronto

Thursday, April 26 – Saturday, April 28, 2018

CORUS QUAY

25 Dockside Dr
Toronto, ON M5A 1B6


Presented by the Goethe-Institut Toronto
Curated by Tina Sauerländer (Berlin) and Erandy Vergara (Montreal)

Project H.E.A.R.T. by Erin Gee and Alex M. Lee
Enter Me Tonight by Li Alin
 
At the invitation of the Goethe-Institut, curators Tina Sauerländer and Erandy Vergara have selected VR works for this year’s Toronto Digifest, including two recent pieces by Berlin-based Canadian artist Li Alin and by Montreal-based artist Erin Gee in collaboration with South Korean-born, US-based artist Alex M. Lee. The artists use humor and irony to engage with controversial topics: emotions in first-person shooter video games and war in the case of Gee, and a futuristic exploration of human reproduction in technology-oriented times in the case of Alin.

Audiences themselves explore Gee’s Project H.E.A.R.T., a virtual work in which you must control your emotions in order to control the leading character in a war-themed VR game, as well as Alin’s Enter Me Tonight, a VR environment engaging with issues of human reproduction, economy, biology, pornography, and technology.

In a contextualizing event, the curators will speak about the history of VR and current trends and critical perspectives on this technology.

Digifest 2018 website

Event information courtesy of the Goethe-Institut

Rhode Island College

“// lonely avatar” is an exhibition that investigates the use, meaning, and expressive potential of avatars in the contemporary digital landscape. “Lucid Dreaming” ruminates on the emptiness of the virtual avatar, whilst “Project H.E.A.R.T.” involves filling that empty avatar with your emotion through a specially designed biosensor. Both projects follow a trajectory of thought regarding the metaphorical potential of avatars in virtual space. Curated by Frank Yefeng Wang, the show features works by Alex M. Lee commissioned by Trinity Square Video in Toronto, ON, as well as a project made in collaboration with Canadian artist Erin Gee.

Opening reception: 5-8pm
Artist Lecture: 7-7:30pm

The Chazan Family Gallery
Alex & Ani Hall
Rhode Island College
600 Mt. Pleasant Ave
Providence, RI 02908

Affective VR Workshop

Welcome

Welcome to the Emotional Data and Unity/VR workshop! This workshop is hosted by Erin Gee, a Canadian artist and researcher who has worked with emotional biodata in art since 2012. She has created works for emotional biodata and robotics (Swarming Emotional Pianos, 2014), children’s choir (Song of Seven, 2017), and now VR with her latest work Project H.E.A.R.T. (2017).

She is an active promoter of open-source and feminist culture, and publishes all of her technical work (Arduino code/Unity code) under the GNU GPL 3.0.

What is the GNU GPL?

The GNU General Public License is a free, copyleft license for software and other kinds of works.

TLDR: You may use, modify, and redistribute this code for free. If you redistribute, you must acknowledge my original authorship, and you must always allow other people to also modify and redistribute for free. Any violation of this agreement means that you are breaking my copyright! If you do modify aspects of this code, please share the love and contribute back to the GitHub repository.

If you are not sure how to contribute to a GitHub project, feel free to contact me at erin dot marie dot gee at gmail dot com (or just the contact form on my website) and I’ll set you up!

For the full documentation of GNU GPL v3, click here.

Contextual Resource Section

BIOSENSORS IN GAMES

April 13th, 2011

Jacob Aron first reported on a variety of games that were taking advantage of biosensing technologies in an article published in New Scientist.

Aron, Jacob. (2011). “Emotional Video Gaming Makes the Action Real.” New Scientist.  Accessed November 15th 2017.


October 2016 – BfB Labs’ emotionally responsive game “Champions of the Shengha” depends on a user’s emotional control, measured by a heart rate sensor, for success.


October 2016 – Nevermind is an adventure game where you explore strange worlds and solve puzzles to unlock a mystery that lurks within each “patient’s” inner psyche.  The Windows and Mac versions of Nevermind use biofeedback technology to detect your feelings of stress while playing, dynamically responding to those feelings to affect gameplay.  You can also play the game without this technology. http://nevermindgame.com/


BIOSENSORS IN CINEMA

Published by The Verge on July 18, 2017.

Lauren Goode goes inside Dolby’s little-known biophysical labs, where the company has embarked on a five-year project to track people’s emotional responses as they watch movies and TV shows.

Dolby uses biosensors to study viewers’ emotional responses to:

  • Aural frequency ranges
  • Dynamic color ranges
  • Audio volume as well as screen brightness
  • Music and sound effects

VR and Empathy – A reading list

Ebert, Roger. (June 23, 2005). “Ebert’s Hall of Fame Remarks.” Roger Ebert’s Journal. Accessed November 15th 2017.

Bye, Kent. (January 31, 2017). “VR as the Ultimate Empathy Machine with Gabo Arora.” Voices of VR Podcast. Accessed November 13th 2017.

Hamilton, Kathryn. (February 23, 2017). “Voyeur Reality.” The New Inquiry. Accessed November 15th 2017.

Yang, Robert. (April 5, 2017). “‘If you walk in someone else’s shoes, then you have taken their shoes’: empathy machines as appropriation machines.” radiator design blog. Accessed November 15th 2017.

Scientific Resources

In this section you will find information on the science behind how emotion is generated by the brain, and how it can be “read” by sensing instruments.

What is Emotion?

Emotion is a multi-component response to an emotionally potent event, causing changes in the subject’s subjective feeling quality (psychological dimension), expressive social behavior (behavioral dimension), and physiological activation (physiological dimension). (Kreibig 2010)

Psychological and neurological frameworks for understanding emotion itself are articulated very well by Dr. Lisa Feldman Barrett, a scientist at Northeastern University. She speaks particularly of the complexity of the human body, which might be experiencing a host of physiological effects from the autonomic nervous system (ANS), and of how these are interpreted and perhaps also constructed by language.

On her website you will find many plain-language resources for emotional understanding from a contemporary neuroscientific point of view under the heading “Articles.”

How Scientists Detect Emotion

The physiological techniques presented at this workshop were developed according to psychologist Dr. Sylvia Kreibig’s 2010 review of 134 publications dating from the 1970s to the 2000s. The techniques in the review are not weighted for the “success” of the cited studies, and the literature clearly shows that there is no single “technique” for applying these skills; nevertheless, the review has been cited over 1000 times in the scientific literature since its publication, including over 200 times in 2017.

Source:

Kreibig, Sylvia D. “Autonomic nervous system activity in emotion: A review.” Biological Psychology 84 (2010): 394–421.

 

Continued research
If you would like to keep up with scientific research in this area, an academic journal search using the databases PsycINFO, PsycARTICLES, and PubMed is recommended with the following search terms:

[emotion] and [autonomic nervous system or cardiovascular or cardiac or heart or respiration or respiratory or electrodermal or skin conductance]

 

Technical Resource Section

Unity Tutorials

This section is dedicated to technical resources, tutorials and code to get you started playing around with the biosensors.

Mirza VFX offers amazing tutorials on particle creation. Plug your biosensors in to manipulate beautiful particle-driven environments; a minimal sketch of one way to wire this up follows below.

(Inspiration: Mufson, Beckett. (Aug 29, 2014) “A Team Of Artists Are 3D-Printing Their Emotions As Abstract House Decorations.”  Creators Project.  Accessed November 15th 2017.)
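
Here is a minimal sketch of one way to pipe a biosignal into a Unity particle system. The class name, port name, baud rate, and 0–1023 value range are placeholders to adjust for your own setup, and it assumes the microcontroller prints one integer per line over serial. Note that Unity’s SerialPort class requires the .NET 4.x API compatibility level in Player Settings.

using System.IO.Ports;
using UnityEngine;

// Sketch: reads one biosensor value per line from a serial port and maps it
// to the emission rate of a particle system. Attach this to a GameObject and
// assign a ParticleSystem in the Inspector.
public class BiodataParticles : MonoBehaviour
{
    public string portName = "COM3";   // e.g. "/dev/tty.usbmodem14101" on macOS
    public int baudRate = 9600;
    public ParticleSystem particles;

    private SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, baudRate);
        port.ReadTimeout = 10;   // milliseconds; don't stall the frame waiting for data
        port.Open();
    }

    void Update()
    {
        try
        {
            // Expect one integer (0-1023) per line from the microcontroller.
            if (float.TryParse(port.ReadLine(), out float raw))
            {
                float normalized = Mathf.Clamp01(raw / 1023f);

                // Drive particle emission with the normalized biosignal.
                var emission = particles.emission;
                emission.rateOverTime = Mathf.Lerp(5f, 200f, normalized);
            }
        }
        catch (System.TimeoutException)
        {
            // No new sample arrived this frame; keep the last emission rate.
        }
    }

    void OnApplicationQuit()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}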

GitHub

Hardware Resources

Materials list for Erin Gee’s BiodataTrio PCB.

Don’t forget to prepare your PulseSensor before you use it! Click here for instructions on how to apply the protective decal and provide a glue-gun seal. The rest of the advice you can take or leave, but these two steps are ESSENTIAL to the longevity and accuracy of your device!

When buying cable to build your sensors, look for cable that carries two shielded signal conductors. The biosensors pick up very, very sensitive electrical information from the body, and any disruption in the electric field could make your readings less accurate, especially if the sensor cables are placed near one another!

To prevent this, you can either buy shielded cable (like this) and solder the silver shielding to your ground connection (pick whatever you like for the other two conductors, maybe red for power and black for signal), or, if you’re in a pinch, you can simply twist the ground wire around the signal wire that you are trying to protect from outside interference.

Here’s a link to my Respiration Belt Instructable. After a few years I found the respiration belt less interesting, because the belts are awkward for the average person to strap into, but if you’d like to go for it, here is a simple way to make it happen! This signal is perhaps best amplified, and you might need to calculate the relative drift of the elastic and account for it as an offset in order to capture things like when someone is holding their breath.
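
Here is a sketch of one possible approach to that offset (an illustration, not code from the Instructable): track a slow exponential moving average of the raw signal as a baseline and output the deviation from it. The alpha value is a placeholder to tune per belt; once the drift is absorbed, a sustained reading near zero after a stretch of oscillation reads as a held breath rather than a sagging elastic.

// Sketch: remove slow elastic drift from a respiration signal by tracking a
// slow exponential moving average as the baseline and reporting the
// deviation from it.
public class DriftCompensator
{
    private double baseline;
    private bool initialized;
    private readonly double alpha;   // smaller = slower baseline; tune per belt

    public DriftCompensator(double alpha = 0.001)
    {
        this.alpha = alpha;
    }

    // Call once per sample; returns the drift-corrected signal.
    public double Process(double sample)
    {
        if (!initialized)
        {
            baseline = sample;
            initialized = true;
        }
        // The baseline creeps toward the raw signal, absorbing slow stretch
        // in the elastic while breathing oscillations remain in the output.
        baseline += alpha * (sample - baseline);
        return sample - baseline;
    }
}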

Project H.E.A.R.T.

A biodata-driven VR game where militainment and pop music fuel a new form of emotional drone warfare.

A twist on popular “militainment” shooter video games, Project H.E.A.R.T. invites the viewer to place their fingers on a custom biodata device and summon their enthusiasm to engage their avatar, Yowane Haku, in “combat therapy.” Fans of Vocaloid characters may recognize Haku as the “bad copy” of Japanese pop celebrity Hatsune Miku, a holographic personage who invites her fans to pour their content and songs into her virtual voice.

The biosensing system features a pulse sensor and a skin conductance sensor of Gee’s design. Drawing on principles of emotional physiology and affective computing, the device gathers data related to heart rate and blood flow from the index finger, and skin conductance from the middle and ring fingers. The biodata is read by a microcontroller and transferred to Unity VR, enabling emotional interactivity: a user’s enthusiasm (spikes in skin conductance signal amplitude, elevated heart rate, and shifts in the amplitude of the pulse signal) stimulates the holographic pop star to sing in the virtual warzone, inspiring military fighters to continue the war and create more enemy casualties. At the end of the experience the user is confronted with their “score” of traumatized soldiers vs. enemies killed, with no indication of whether this means that they won or lost the “game”.
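
As a simplified illustration only (not the project’s actual code), the three cues named above might be blended into a single enthusiasm value along these lines; the class name, ranges, and weights are all hypothetical.

// Hypothetical sketch of blending three physiological cues into one
// "enthusiasm" value in the 0..1 range.
public static class EnthusiasmMeter
{
    public static double Evaluate(double scrSpike,      // phasic skin conductance spike, scaled 0..1
                                  double heartRateBpm,  // current heart rate in beats per minute
                                  double pulseAmpShift) // relative shift in pulse amplitude, 0..1
    {
        // Map heart rate into 0..1 across a hypothetical 60-120 bpm range.
        double hr = Clamp01((heartRateBpm - 60.0) / 60.0);

        // Weighted blend, with skin conductance spikes weighted most heavily.
        return 0.5 * Clamp01(scrSpike) + 0.3 * hr + 0.2 * Clamp01(pulseAmpShift);
    }

    private static double Clamp01(double x) => x < 0.0 ? 0.0 : (x > 1.0 ? 1.0 : x);
}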

The user is thus challenged to navigate the soldiers’ emotional anxieties and summon their positivity to activate Haku’s singing voice as the soldiers battle not only a group of enemies, but also their own lack of confidence in times of global economic instability.

The landscape of Project H.E.A.R.T. was built from geopolitically resonant sites found on Google Maps, creating a dreamlike background for the warzone. In-game dialogue wavers between the self-righteous soldier banter typical of video games and self-help, bringing the VR participant to an interrogation of their own emotional body in a virtual space that conflates war, pop music, drone technology, and perhaps movement-induced VR nausea.

As Kathryn Hamilton pointed out in her 2017 essay “Voyeur Reality” for The New Inquiry,

“VR’s genesis and development is in the military, where it has been used to train soldiers in “battle readiness,” a euphemism for: methods to overcome the innate human resistance to firing at another human being. In the last few years, VR’s usage has shifted 180 degrees from a technology used to train soldiers for war, to one that claims to “amplify” the voices afflicted by war, and to affect “world influencers” who might be able to stop said wars.”

Photography by Toni Hafkenscheid.  Images of Worldbuilding exhibition courtesy of Trinity Square Video, 2017.

Exhibition history:

November-December 2017  Worldbuilding @ Trinity Square Video, Toronto

February-March 2018 Future Perfect @ Hygienic Gallery, New London, Connecticut

April 26-28, 2018 @ Digifest, Toronto

June 7-17, 2019 @ Elektra Festival, Montreal

January 2020 @ The Artist Project, Toronto

October 2020 @ Festival LEV Matadero, Spain

Credits

Narrative Design: Sofian Audry, Roxanne Baril-Bédard, Erin Gee

3D Art: Alex Lee and Marlon Kroll

Animation and Rigging: Nicklas Kenyon and Alex Lee

VFX: Anthony Damiani, Erin Gee, Nicklas Kenyon

Programming: Sofian Audry, Erin Gee, Nicklas Kenyon, Jacob Morin

AI Design: Sofian Audry

Sound Design: Erin Gee, Austin Haughton, Ben Hinckley, Ben Leavitt, Nicolas Ow

BioSensor Hardware Design: Erin Gee and Martin Peach

BioSensor Case Design: Grégory Perrin

BioSensor Hardware Programming: Thomas Ouellet Fredericks, Erin Gee, Martin Peach

Featuring music by Lazerblade, Night Chaser and Austin Haughton

Yowane Haku character designed by CAFFEIN

Yowane Haku Cyber model originally created by SEGA for Hatsune Miku: Project DIVA 2nd (2010)

Project H.E.A.R.T. also features the vocal acting talents of Erin Gee, Danny Gold, Alex Lee, Ben McCarthy, Gregory Muszkie, James O’Callaghan, and Henry Adam Svec.

This project was commissioned by Trinity Square Video for the exhibition Worldbuilding, curated by John G Hampton and Maiko Tanaka, with the support of the Canada Council for the Arts and AMD Radeon.

This project would not have been possible without the logistical and technical support of the following organizations:

Technoculture Art and Games Lab (Concordia University)

Concordia University

ASAP Media Services (University of Maine)