Author: Erin Gee

RadianceVR

Project H.E.A.R.T. (2017) has joined an interesting collection of VR works on the website radiancevr.co

If you find yourself looking for great examples of VR art, I’d highly recommend browsing the works on this website!

Founded by curators Philip Hausmeier and Tina Sauerlaender.

“Radiance is a research platform and database for VR art. Its mission is to present artists working with VR from all over the world to create visibility and accessibility for VR art and for faster adoption of virtual technologies. The platform works closely with artists, institutions and independent curators to select the highest quality of virtual art for public institutional exhibitions.”

Review in Canadian Art

I really appreciate this article by Tatum Dooley for Canadian Art on the Worldbuilding exhibition curated by John G Hampton and Maiko Tanaka at Trinity Square Video. My work Project H.E.A.R.T., made with Alex M Lee, which highlights VR and emotions, is featured among other great works by Jeremy Bailey, Kristen D Schaffer, Eshrat Erfanian, and Yam Lau. The following is an excerpt from the article:

“The gamification of our bodies renders the physical form void, replaced by screens where our bodies and emotions can be morphed and manipulated. Perhaps the only way to create art with technology as advanced and recent as VR is to reckon with its potential consequences.

Gee’s project, the most realized out of the four artists in the exhibition, masters this reckoning. I spoke with Gee in the lead-up to the exhibition, and she explained the conceptual backbone of the piece. “I’m working through questions of emotional sincerity when it comes to self-help. In theory, if you can technologically master your emotions, if you can just make yourself excited, then you can make yourself a better, happier person. I don’t know how sincere that is…”

Click on the link below for the full article.

VR and the Failure of Self-Help Technology

In general, I feel very proud of this work but also very exhausted by it. Through the project I’ve been working through the relationship between pop music and war, self-help and sincerity, and ultimately through these issues of technique and technology in how life and trauma come to us. During the panel for the exhibition, there was a question of whether I was “pro-war,” one that I have also received a few times in Facebook messages from curious friends far away. The project is complex and difficult to read because I think it has to be. It reflects my own mediatized understanding of international conflict, and maybe my own frustration at my lack of understanding.

The best I can understand war is how it is mediated to me: through video games and news cycles, through abstract discussions on the radio. The goal of this project was never to address the terror and complexity of geopolitical conflict, but rather to propose a psychedelic pop-culture mirror, imagining a video game ruled not by characters that espouse self-righteous violence and grit, but by technologically manipulated empathy and enthusiasm. This game fails to address war in the same way that all technologically mediated attempts fail to address war. I am also dissatisfied with the idea of an artistic protest that makes a cartoonish, morally didactic utopia where rainbows and love shoot out of guns instead of flesh-tearing bullets. I think the answer about the politics of this game lies in the end screen: an abstract screen that confronts you with statistics of death and trauma resulting from the battle itself. I don’t think there is a way to win the game.

VR Empathy Workshop

Welcome

Welcome to the Emotional Data and Unity/VR workshop!  This workshop is hosted by Erin Gee, a Canadian artist and researcher who has worked with emotional biodata in art since 2012.  She has created work for emotional biodata and robotics (Swarming Emotional Pianos, 2014), children’s choir (Song of Seven, 2017), and now for VR with her latest work Project H.E.A.R.T. (2017).

She is an active promoter of open-source and feminist culture, and publishes all of her technical work (Arduino code/Unity code) under the GNU GPL 3.0.

This workshop website itself is standard copyright to Erin Gee (2017).

What is the GNU GPL?

The GNU General Public License is a free, copyleft license for software and other kinds of works.

TL;DR: You may use, modify, and redistribute this code for free.  If you redistribute, you must acknowledge my original authorship, and you must always allow other people to also modify and redistribute for free.  Any violation of these terms is an infringement of my copyright!  If you do modify aspects of this code, please share the love and contribute back on GitHub.

If you are not sure how to contribute to a GitHub project, feel free to contact me at erin dot marie dot gee at gmail dot com (or just the contact form on my website) and I’ll set you up!

For the full documentation of GNU GPL v3, click here.

Contextual Resource Section

BIOSENSORS IN GAMES

April 13th, 2011

Jacob Aron first reported on a variety of games that were taking advantage of biosensing technologies in an article published in New Scientist.

Aron, Jacob. (2011). “Emotional Video Gaming Makes the Action Real.” New Scientist.  Accessed November 15th 2017.


October 2016 – BfB Labs’ emotionally responsive game “Champions of the Shengha” depends on a user’s emotional control (measured by a heart rate sensor) for success.


October 2016 – Nevermind is an adventure game where you explore strange worlds and solve puzzles to unlock a mystery that lurks within each “patient’s” inner psyche.  The Windows and Mac versions of Nevermind use biofeedback technology to detect your feelings of stress while playing, dynamically responding to those feelings to affect gameplay.  You can also play the game without this technology. http://nevermindgame.com/


BIOSENSORS IN CINEMA

Published by The Verge on 18 Jul 2017.

Lauren Goode goes inside Dolby’s little-known biophysical labs, where the company has been embarking on a five-year project to track people’s emotional responses as they watch movies and TV shows.

Biosensors are used by Dolby to study viewers’ emotional responses to

  • Aural frequency ranges
  • Dynamic color ranges
  • Audio volume as well as screen brightness
  • Music and sound effects

VR and Empathy – A reading list

Ebert, Roger. (June 23, 2005.) “Ebert’s Hall of Fame Remarks.” Roger Ebert’s Journal. Accessed November 15th 2017.

Bye, Kent. (January 31, 2017). “VR as the Ultimate Empathy Machine with Gabo Arora.”  Voices of VR Podcast.  Accessed November 13th 2017.

Hamilton, Kathryn. (February 23, 2017). “Voyeur Reality.” The New Inquiry. Accessed November 15th 2017.

Yang, Robert. (April 5, 2017). ““If you walk in someone else’s shoes, then you have taken their shoes”: empathy machines as appropriation machines.”  radiator design blog.  Accessed November 15th 2017.

Scientific Resources

In this section you will find information on the science behind how emotion is generated by the brain, and how it can be “read” by sensing instruments.

What is Emotion

Emotion is a multi-component response to an emotionally potent event, causing changes in subjective feeling quality (psychological dimension), expressive social behavior (behavioral dimension), and physiological activation (physiological dimension) of the subject. (Kreibig 2010)

Psychological/neurological frameworks for understanding emotion itself are articulated very well by Dr. Lisa Feldman Barrett, a scientist at Northeastern University.  She speaks particularly of the complexity of the human body, which might be experiencing a host of physiological effects from the autonomic nervous system (ANS), and of how these are interpreted and perhaps also constructed by language.

On her website you will find many plain-language resources for emotional understanding from a contemporary neuroscientific point of view under the heading “Articles.”

How Scientists Detect Emotion

The physiological techniques presented at this workshop were developed according to psychologist Dr. Sylvia Kreibig‘s 2010 review of 134 publications dating from the 1970s to the 2000s. While the review does not weight the cited studies by their success, and the literature clearly shows that there is no single technique for measuring emotion physiologically, the document has been cited over 1,000 times in the scientific literature since its publication, including over 200 times in 2017.

Source:

Kreibig, Sylvia D. “Autonomic Nervous System Activity in Emotion: A Review.” Biological Psychology 84 (2010): 394–421.

 

Continued research
If you would like to continue keeping up with scientific research in this area, an academic journal search using the databases PsycINFO, PsycARTICLES, and PubMed is recommended with the following search terms:

[emotion] and [autonomic nervous system or cardiovascular or cardiac or heart or respiration or respiratory or electrodermal or skin conductance]

 

Technical Resource Section

Unity Tutorials

This section is dedicated to technical resources, tutorials and code to get you started playing around with the biosensors.

Mirza VFX makes amazing tutorials on particle creation.  Plug in your biosensors to manipulate beautiful particle-driven environments.

(Inspiration: Mufson, Beckett. (Aug 29, 2014) “A Team Of Artists Are 3D-Printing Their Emotions As Abstract House Decorations.”  Creators Project.  Accessed November 15th 2017.)

GitHub

Erin Gee’s BioData Duo Github

Erin Gee’s UnityBridge

  • Windows only, but you could probably modify the code for other operating systems
  • Currently, this script only allows sensors to write to Unity via the serial port, so you can’t yet “write back” to the Arduino from Unity.

Prossel’s UnitySerialPort GitHub (Mac OS)

  • This script worked well for me on macOS but froze the screen when I used it on Windows.
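To give a sense of what the serial side of such a bridge involves, here is a hypothetical Python sketch that parses biosensor readings arriving as text lines over a serial port. The message format shown (comma-separated name:value pairs) is an assumption for illustration only; the actual BioData/UnityBridge firmware format may differ.

```python
# Hypothetical sketch: parsing biosensor readings arriving over a serial port.
# Assumed message format (NOT necessarily the real firmware's): one line per
# sample, comma-separated "label:value" pairs, e.g. "heart:72,gsr:512,resp:340".

def parse_biodata_line(line: str) -> dict:
    """Turn one serial line into a {sensor_name: int} dict, skipping junk."""
    readings = {}
    for field in line.strip().split(","):
        name, _, value = field.partition(":")
        if value.strip().isdigit():
            readings[name.strip()] = int(value.strip())
    return readings

# In a real setup you would wrap this around pyserial, e.g.:
#   import serial
#   with serial.Serial("COM3", 9600, timeout=1) as port:
#       readings = parse_biodata_line(port.readline().decode("ascii", "ignore"))
```

Tolerating junk lines matters here: serial streams from microcontrollers routinely drop or garble characters, so the parser simply skips any field it cannot read rather than crashing.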

Hardware Resources

Materials list for Erin Gee’s BiodataTrio PCB board.

Don’t forget to prepare your PulseSensor before you use it!  Click here for instructions on how to apply the protective decal and provide a glue-gun seal.  The rest of the advice you can take or leave, but these two steps are ESSENTIAL to the longevity and accuracy of your device!

When buying cables to build your sensors, look for cable that features two shielded signal conductors.  The biosensors pick up very, very sensitive electrical information from the body, and any disruption in the electric field could make your readings less accurate, especially if the sensor cables are placed near one another!

To prevent this, you can buy shielded cable (like this) and solder the silver shielding to your ground connection (pick whatever you like for the other two, maybe red for power and black for signal).

Or, if you’re in a pinch, you can simply twist the ground wire around the signal wire you are trying to protect from outside interference.

Here’s a link to my Respiration Belt Instructable.  After a few years I found the respiration belt less interesting because belts are awkward for the average person to strap into, but if you’d like to go for it, here is a simple way to make it happen!  This signal is perhaps best amplified, and you may need to calculate the relative drift of the elastic and account for it as an offset in order to capture things like when someone is holding their breath.
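The drift-offset idea mentioned above can be sketched as follows: subtract a slow moving baseline from the stretch-sensor signal, so that the elastic’s gradual relaxation doesn’t mask events like a held breath. This is a minimal illustration in Python, not the code from the Instructable.

```python
# Minimal sketch (not the Instructable's code): compensate for the slow drift
# of a respiration-belt elastic by subtracting a trailing moving average.
# After detrending, normal breathing shows up as oscillation around zero,
# while a held breath sits flat near zero for an unusually long stretch.

def moving_baseline(samples, window=50):
    """Slow baseline: mean of the trailing `window` samples at each point."""
    baseline = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        baseline.append(sum(chunk) / len(chunk))
    return baseline

def detrend(samples, window=50):
    """Return the signal with the elastic's slow drift removed."""
    base = moving_baseline(samples, window)
    return [s - b for s, b in zip(samples, base)]
```

The window length is the key tuning parameter: it should be much longer than one breath cycle (so breathing survives the subtraction) but short enough to track the elastic’s relaxation.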

Ideas Lab Denmark

I will be giving a unique and in-depth workshop hosted by Emotional Data Lab (Aarhus University), Interactive Denmark and Ideas Lab in Aarhus, Denmark from November 21-23.  The workshop consists of 3 three-hour sessions where I will share my materials and experiences with incorporating physiological markers of emotion into the VR-compatible Unity environment.

Participants will be placed into “teams” in order to work together, experiment, and discuss the promises, problems and potential of using biosensors to capture a user’s emotional experience through digital tools.

WorldBuilding: TSV Toronto

November 3rd – December 9th 2017

Trinity Square Video, 401 Richmond, Toronto Canada.

Project H.E.A.R.T. (2017), my work for VR and emotional biosensors made in collaboration with 3D artist Alex M. Lee, debuted on November 5th at Trinity Square Video, Toronto.

This project was commissioned by TSV curators John Hampton and Maiko Tanaka, thanks to the support of the Canada Council for the Arts. The exhibition also features amazing works by Canadian artists Jeremy Bailey and Kristen Schaffer, Eshrat Erfanian, and Yam Lau.

Visit the Worldbuilding website by clicking here.

 

KidzLab Montreal

KIDZLAB September 28-29 2017

Perte de Signal is happy to announce the launch of its first edition of KidZlab, a 4-day digital arts festival for young creators: “Un laboratoire d’innovation pour l’imaginaire” (an innovation laboratory for the imagination).

For this first edition of KidZlab, I presented a workshop entitled “Strange Theremin,” teaching teams of young people to assemble a circuit that allows them to manipulate musical tones with their skin conductance.  This new musical instrument allows students to explore touch, sweat, and emotional engagement as potential musical material.

Here’s what my young students had to say:

 

The event also featured very interesting workshops by artists:

Eric Cariat (BE) – Stephanie Castonguay – Maxime Damecour – Erin Gee – Alice Jarry – Roby Provost-Blanchard – Alexandre Quessy

at Perte de Signal 5445 De Gaspé – Espace 107 (RDC) Montréal.

With thanks to:

Conseil des arts et des lettres du Québec
Wallonie-Bruxelles International
KIKK Festival 2017
Les Journées de la culture
Le Fab Lab du PEC

For more information (in French):
http://perte-de-signal.org/kidzlab-festival-dart-numerique-pour-le-jeune-public/

KidZlab Laboratoire d’innovation pour l’imaginaire from PERTE DE SIGNAL on Vimeo.

William Basinski @ Pop Montreal

September 15th, 2017 – 17h POP Box (3450 St Urbain, Montreal)

In the context of this year’s Pop Montreal Festival Symposium, I have been invited to engage in a public conversation with avant-garde composer William Basinski.

Click here for more information on the Pop Symposium, taking place September 14-17, 2017.

Creator of the widely acclaimed album set Disintegration Loops (2002), Basinski is an intuitive composer of ambient electronic music who works with magnetic tape loops to access dreamlike acoustic spaces.  He once described himself as investing incredible amounts of meditative energy towards improvisation and locating the “timeless, amniotic bubble” of sound one could float within. A bubble is an apt metaphor for these sounds: expansive, swirling voids that physically emanate from thin slips of magnetic tape.

Among other topics, I’m looking forward to this opportunity to speak with Basinski about the physicality of sound, both in the sound producing bodies (the magnetic devices he charms into circles and feedback-song) and the receptive media bodies (us leaky humans).

Take a listen below to Basinski’s SoundCloud account to experience his processes in tape loops and delay systems, found sounds, feedback, and shortwave radio static.

of the soone

A disembodied voice invites the listener to partake in a speculative audio treatment that promises to awaken underdeveloped neural passageways through exposure to the non-human processes of neural network language acquisition.

of the soone is the first in a body of work Erin Gee made in collaboration with artist Sofian Audry that explores the material and authorial agencies of a deceased author, an LSTM algorithm, and an ASMR performer.

The work in this series transmits the aesthetics of an AI “voice”: machine-generated text rendered in Gee’s softly spoken human vocals, using the human body as a relatively low-tech filter for processes of machine automation.

In of the soone, Gee welcomes the listener to a speculative neural treatment called “language processing and de-processing,” preparing the listener as a subject by dressing them in a fluffy terry robe and an EEG cap to monitor brainwaves. She introduces the listener to the many benefits of this “treatment,” as sonic exposure to machine learning processes allows one to subliminally reinvigorate underdeveloped neuro-linguistic pathways in their own human mind.

During the aural treatment, the subject listens to Gee’s voice reading out the results of a process enacted by a deep recurrent neural network agent known as “long short-term memory” (LSTM). The algorithm “reads” Emily Brontë’s Wuthering Heights character by character, familiarizing itself with the syntactical universe of the text. As it reads and re-reads the book, it attempts to mimic Brontë’s style within the constraints of its own artificial “body,” hence finding its own alien voice.
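A real LSTM like the one used in this piece needs a deep-learning library, but the basic idea of a model that “reads” a text character by character and then mimics its style can be illustrated with a deliberately low-fi stand-in: a character-level Markov chain. This sketch is only an analogy for the process described above, not the network used in the work.

```python
import random
from collections import defaultdict

# NOT the LSTM used in the piece -- a character-level Markov chain as a
# deliberately low-fi stand-in, illustrating the idea of a model that reads
# a text character by character and then continues it in a similar style.

def train_char_model(text, order=3):
    """Map each `order`-character context to the characters that follow it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context].append(text[i + order])
    return model

def generate(model, seed, length=80, rng=None):
    """Continue `seed` one character at a time, sampling from the model."""
    rng = rng or random.Random(0)  # fixed seed for reproducible output
    order = len(seed)
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:
            break  # context never seen in training text
        out += rng.choice(choices)
    return out
```

Trained on a passage of Wuthering Heights and seeded with a few characters, this produces text that is locally Brontë-like but globally incoherent, which is roughly the flavour of a neural network’s early attempts at mimicry.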

The reading of this AI-generated text by a human speaker allows the listener to experience the neural network agent’s linguistic journey, and to experience the augmentation of this machine-speech through vocalization techniques adapted from Autonomous Sensory Meridian Response (ASMR). ASMR involves the use of acoustic “triggers” such as gentle whispering, fingers scratching or tapping, in an attempt to induce tingling sensations and pleasurable auditory-tactile synaesthesia in the user. Through these autonomous physiological experiences, the work aims to reveal the listener’s own cyborgian qualities as part of the hybrid system in place.

Exhibition History:

April 2018: NRW Forum, Düsseldorf, Germany

March 2018: XXFiles Radio @ Nuit Blanche, Montreal

January 2018: Her Environment @ TCC Gallery, Chicago

text 2018. Courtesy of artists.

Jury member for Equitable Bank EDAA

I am pleased to have been selected to join the jury of the 2017 competition for the Equitable Bank Emerging Digital Artist Award.  Equitable Bank’s Emerging Digital Artist Award celebrates early-career artists doing exemplary works in digital media, reflecting their interest in creating opportunities for digital innovation.

Award Program Description

The Emerging Digital Artists Award (EDAA) is one of the only corporately funded digital art awards in Canada, designed to foster experimentation in the work of emerging artists and build on funding opportunities currently available to those working in digital media.

The Equitable Bank Collection

Equitable Bank began collecting art in the early 90s and currently holds over 150 artworks in its collection. Our collection focuses on modern and contemporary Canadian art, with a particular interest in modern painting. Our contemporary collection also includes video animation, an area of continued growth concurrent with the growth of the EDAA.

For more information,

http://edaa.equitablebank.ca/

Project H.E.A.R.T.

A biodata-driven VR game where militainment and pop music fuel a new form of emotional drone warfare.

A twist on popular “militainment” shooter video games, Project H.E.A.R.T. invites the viewer to place their fingers on a custom biodata device, and summon their enthusiasm to engage their avatar, Yowane Haku, in “combat therapy.” Fans of the Vocaloid characters may recognize Haku as the “bad copy” of Japanese pop celebrity Hatsune Miku, a holographic personage who invites her fans to pour their content and songs into her virtual voice.

The biosensing system features a pulse sensor and a skin conductance sensor of Gee’s design. Through principles of emotional physiology and affective computing, the device gathers data on heart rate and blood flow from the index finger, and skin conductance from the middle and ring fingers. The biodata is read by a microcontroller and transferred to Unity VR, thus facilitating emotional interactivity: a user’s enthusiasm (spikes in skin conductance amplitude, elevated heart rate, and shifts in the amplitude of the pulse signal) stimulates the holographic pop star to sing in the virtual warzone, inspiring military fighters to continue the war and create more enemy casualties. At the end of the experience the user is confronted with their “score” of traumatized soldiers versus enemies killed, with no indication of whether this means they won or lost the “game.”
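As a rough illustration of the kind of mapping described above, here is a hypothetical Python sketch that combines skin-conductance spikes and heart-rate elevation into a single “enthusiasm” value. All thresholds and weights here are invented for illustration; the actual game logic in Project H.E.A.R.T. may differ.

```python
# Hypothetical sketch of an "enthusiasm" metric along the lines described
# above: count skin-conductance spikes (sudden rises over a threshold) and
# combine them with heart-rate elevation over a resting baseline. All
# thresholds and weights are invented for illustration.

def count_sc_spikes(sc_samples, rise_threshold=0.05):
    """Count sample-to-sample rises larger than the threshold."""
    return sum(
        1 for a, b in zip(sc_samples, sc_samples[1:])
        if b - a > rise_threshold
    )

def enthusiasm_score(sc_samples, heart_rate, resting_rate=70.0,
                     spike_weight=0.1, hr_weight=0.02):
    """Combine SC spikes and heart-rate elevation into a 0..1 score."""
    spikes = count_sc_spikes(sc_samples)
    hr_delta = max(0.0, heart_rate - resting_rate)
    return min(1.0, spikes * spike_weight + hr_delta * hr_weight)
```

In a game loop, a score like this could be recomputed over a sliding window of recent samples and used to drive the pop star’s singing intensity.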

The user is thus challenged to navigate the soldiers’ emotional anxieties and summon their positivity to activate Haku’s singing voice as the soldiers battle not only a group of enemies, but also their own lack of confidence in times of global economic instability.

The landscape of Project H.E.A.R.T. was built from geopolitically resonant sites found on Google Maps, creating a dreamlike background for the warzone. In-game dialogue wavers between self-righteous soldier banter typical of video games, and self-help, bringing the VR participant to an interrogation of their own emotional body in a virtual space that conflates war, pop music, drone technology, and perhaps movement-induced VR nausea.

 

 

As Kathryn Hamilton pointed out in her 2017 essay “Voyeur Reality” for The New Inquiry,

“VR’s genesis and development is in the military, where it has been used to train soldiers in “battle readiness,” a euphemism for: methods to overcome the innate human resistance to firing at another human being. In the last few years, VR’s usage has shifted 180 degrees from a technology used to train soldiers for war, to one that claims to “amplify” the voices afflicted by war, and to affect “world influencers” who might be able to stop said wars.”

Photography by Toni Hafkenscheid.  Images of Worldbuilding exhibition courtesy of Trinity Square Video, 2017.

Exhibition history:
November-December 2017 Worldbuilding @ Trinity Square Video, Toronto
February-March 2018 Future Perfect @ Hygienic Gallery, New London, Connecticut
April 26-28, 2018 @ Toronto Digifest, Toronto

Credits

Narrative Design: Sofian Audry, Roxanne Baril-Bédard, Erin Gee

3D Art: Alex Lee and Marlon Kroll

Animation and Rigging: Nicklas Kenyon and Alex Lee

VFX: Anthony Damiani, Erin Gee, Nicklas Kenyon

Programming: Sofian Audry, Erin Gee, Nicklas Kenyon, Jacob Morin

AI Design: Sofian Audry

Sound Design: Erin Gee, Austin Haughton, Ben Hinckley, Ben Leavitt, Nicolas Ow

BioSensor Hardware Design: Erin Gee and Martin Peach

BioSensor Case Design: Grégory Perrin

BioSensor Hardware Programming: Thomas Ouellet Fredericks, Erin Gee, Martin Peach

Featuring music by Lazerblade, Night Chaser and Austin Haughton

Yowane Haku character designed by CAFFEIN

Yowane Haku Cyber model originally created by SEGA for Hatsune Miku: Project DIVA 2nd (2010)

Project H.E.A.R.T. also features the vocal acting talents of Erin Gee, Danny Gold, Alex Lee, Ben McCarthy, Gregory Muszkie, James O’Callaghan, and Henry Adam Svec.

This project was commissioned by Trinity Square Video for the exhibition Worldbuilding, curated by John G Hampton and Maiko Tanaka, thanks to the support of the Canada Council for the Arts and AMD Radeon.

This project would have not been possible without the logistical and technical support of the following organizations:

Technoculture Art and Games Lab (Concordia University)

Concordia University

ASAP Media Services (University of Maine)