VR Tag

RadianceVR

Project H.E.A.R.T. (2017) has joined an interesting collection of VR works on the website radiancevr.co

If you find yourself looking for great examples of VR art, I’d highly recommend browsing the works on this website!

Founded by curators Philip Hausmeier and Tina Sauerlaender.

“Radiance is a research platform and database for VR art. Its mission is to present artists working with VR from all over the world to create visibility and accessibility for VR art and for faster adoption of virtual technologies. The platform works closely with artists, institutions and independent curators to select the highest quality of virtual art for public institutional exhibitions.”

Review in Canadian Art

I really appreciate this article by Tatum Dooley for Canadian Art on the Worldbuilding exhibition curated by John G. Hampton and Maiko Tanaka at Trinity Square Video. My work Project H.E.A.R.T., which highlights VR and emotion and was made with Alex M. Lee, is featured among other great works by Jeremy Bailey, Kristen D. Schaffer, Eshrat Erfanian, and Yam Lau. Following is an excerpt from the article:

“The gamification of our bodies renders the physical form void, replaced by screens where our bodies and emotions can be morphed and manipulated. Perhaps the only way to create art with technology as advanced and recent as VR is to reckon with its potential consequences.

Gee’s project, the most realized out of the four artists in the exhibition, masters this reckoning. I spoke with Gee in the lead-up to the exhibition, and she explained the conceptual backbone of the piece. “I’m working through questions of emotional sincerity when it comes to self-help. In theory, if you can technologically master your emotions, if you can just make yourself excited, then you can make yourself a better, happier person. I don’t know how sincere that is…”

Click on the link below for the full article.

VR and the Failure of Self-Help Technology

In general, I feel very proud of this work but also very exhausted by it. Through the project I've been working through the relationship between pop music and war, self-help and sincerity, and ultimately the question of technique and technology in how life and trauma come to us. During the panel for the exhibition, there was a question of whether I was "pro-war," one that I have also received a few times in Facebook messages from curious friends far away. The project is complex and difficult to read because I think it has to be. It reflects my own mediatized understanding of international conflict, and maybe my own frustration at my lack of understanding.

The best I can understand war is how it is mediated to me: through video games and news cycles, through abstract discussions on the radio. The goal of this project was never to address the terror and complexity of geopolitical conflict, but rather to propose a psychedelic pop-culture mirror, imagining a video game ruled not by characters that espouse self-righteous violence and grit, but by technologically manipulated empathy and enthusiasm. This game fails to address war in the same way that all technologically mediated attempts fail to address war. I am also dissatisfied with the idea of an artistic protest that makes a cartoonish, morally didactic utopia where rainbows and love shoot out of guns instead of flesh-tearing bullets. I think the answer about the politics of this game lies in the end screen: an abstract screen that confronts you with statistics of the death and trauma that resulted from the battle itself. I don't think there is a way to win the game.

Affective VR Workshop

Welcome

Welcome to the Emotional Data and Unity/VR workshop! This workshop is hosted by Erin Gee, a Canadian artist and researcher who has worked with emotional biodata in art since 2012. She has created works for emotional biodata and robotics (Swarming Emotional Pianos, 2014), children's choir (Song of Seven, 2017), and now for VR with her latest work, Project H.E.A.R.T. (2017).

She is an active promoter of open-source and feminist culture, and publishes all of her technical work (Arduino code/Unity code) under the GNU GPL 3.0.

What is the GNU GPL?

The GNU General Public License is a free, copyleft license for software and other kinds of works.

TL;DR: You may use, modify, and redistribute this code for free. If you redistribute, you must acknowledge my original authorship, and you must always allow other people to also modify and redistribute for free. Any violation of this agreement means that you are infringing my copyright! If you do modify aspects of this code, please share the love and contribute back to the GitHub repository.

If you are not sure how to contribute to a GitHub project, feel free to contact me at erin dot marie dot gee at gmail dot com (or just use the contact form on my website) and I'll set you up!

For the full documentation of GNU GPL v3, click here.

Contextual Resource Section

BIOSENSORS IN GAMES

April 13th, 2011

Jacob Aron first reported on a variety of games that were taking advantage of biosensing technologies in an article published in New Scientist.

Aron, Jacob. (2011). “Emotional Video Gaming Makes the Action Real.” New Scientist.  Accessed November 15th 2017.


October 2016 – BfB Labs' emotionally responsive game "Champions of the Shengha" depends on a user's emotional control, measured by a heart rate sensor, for success.


October 2016 – Nevermind is an adventure game where you explore strange worlds and solve puzzles to unlock a mystery that lurks within each “patient’s” inner psyche.  The Windows and Mac versions of Nevermind use biofeedback technology to detect your feelings of stress while playing, dynamically responding to those feelings to affect gameplay.  You can also play the game without this technology. http://nevermindgame.com/


BIOSENSORS IN CINEMA

Published by The Verge on July 18, 2017.

Lauren Goode goes inside Dolby’s little-known biophysical labs, where the company has been embarking on a five-year project to track people’s emotional responses as they watch movies and TV shows.

Biosensors are used by Dolby to study viewers’ emotional responses to

  • Aural frequency ranges
  • Dynamic color ranges
  • Audio volume as well as screen brightness
  • Music and sound effects

VR and Empathy – A reading list

Ebert, Roger. (June 23, 2005). "Ebert's Hall of Fame Remarks." Roger Ebert's Journal. Accessed November 15th 2017.

Bye, Kent. (January 31, 2017). “VR as the Ultimate Empathy Machine with Gabo Arora.”  Voices of VR Podcast.  Accessed November 13th 2017.

Hamilton, Kathryn. (February 23, 2017). “Voyeur Reality.” The New Inquiry. Accessed November 15th 2017.

Yang, Robert. (April 5, 2017). "'If you walk in someone else's shoes, then you have taken their shoes': empathy machines as appropriation machines." Radiator design blog. Accessed November 15th 2017.

Scientific Resources

In this section you will find information on the science behind how emotion is generated by the brain, and how it can be “read” by sensing instruments.

What is Emotion?

Emotion is a multi-component response to an emotionally potent event, causing changes in subjective feeling quality (psychological dimension), expressive social behavior (behavioral dimension), and physiological activation (physiological dimension) of the subject. (Kreibig 2009)

Psychological/neurological frameworks for understanding emotion itself are articulated very well by Dr. Lisa Feldman Barrett, a scientist at Northeastern University. She speaks particularly of the complexity of the human body, which might be experiencing a host of physiological effects from the autonomic nervous system (ANS), and of how these are interpreted and perhaps also constructed by language.

On her website you will find many plain-language resources for emotional understanding from a contemporary neuroscientific point of view under the heading “Articles.”

How Scientists Detect Emotion

The physiological techniques presented at this workshop are drawn from psychologist Dr. Sylvia Kreibig's 2010 review of 134 publications dating from the 1970s to the 2000s. While the techniques surveyed in that document are not weighted for the "success" of the cited studies, and the literature clearly shows that there is no single "technique" for applying these skills, the review has been cited over 1,000 times in scientific literature since its publication, including over 200 times in 2017.

Source:

Kreibig, Sylvia D. "Autonomic nervous system activity in emotion: A review." Biological Psychology 84 (2010): 394–421.

 

Continued research
If you would like to continue keeping up with scientific research in this area, an academic journal search using the databases PsycINFO, PsycARTICLES, and PubMed is recommended with the following search terms:

[emotion] and [autonomic nervous system or cardiovascular or cardiac or heart or respiration or respiratory or electrodermal or skin conductance]

 

Technical Resource Section

Unity Tutorials

This section is dedicated to technical resources, tutorials and code to get you started playing around with the biosensors.

Mirza VFX offers amazing tutorials on particle creation. Plug your biosensors in to manipulate beautiful particle-driven environments.

(Inspiration: Mufson, Beckett. (August 29, 2014). "A Team of Artists Are 3D-Printing Their Emotions as Abstract House Decorations." Creators Project. Accessed November 15th 2017.)
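If you want to experiment with this idea in Unity, here is a minimal sketch of a normalized biosensor value driving a particle system's emission rate. This is illustrative only: the class, the field names, and the GetSensorValue() placeholder are my own assumptions, not part of any tutorial above; swap in however your sensor data actually reaches Unity (serial, OSC, etc.).

```csharp
using UnityEngine;

// Minimal sketch: map a 0–1 biosensor reading onto a particle system's emission rate.
// GetSensorValue() is a placeholder for however your sensor data arrives in Unity.
public class BioParticleDriver : MonoBehaviour
{
    public ParticleSystem particles;   // assign in the Inspector
    public float minRate = 5f;         // emission rate when the signal is low
    public float maxRate = 120f;       // emission rate when the signal is high
    public float smoothing = 2f;       // higher = snappier response

    private float smoothed;            // low-pass filtered sensor value

    void Update()
    {
        float raw = GetSensorValue();  // placeholder: normalized 0–1 biosensor reading
        smoothed = Mathf.Lerp(smoothed, raw, smoothing * Time.deltaTime);

        var emission = particles.emission;
        emission.rateOverTime = Mathf.Lerp(minRate, maxRate, smoothed);
    }

    float GetSensorValue()
    {
        // Replace with your actual data source (serial port, OSC, etc.).
        return 0.5f;
    }
}
```

The light smoothing is a deliberate choice: raw biodata is noisy, and a gently lagging value usually looks better in a particle environment than frame-by-frame jitter.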

GitHub

Hardware Resources

Materials list for Erin Gee’s BiodataTrio PCB board.

Don't forget to prepare your PulseSensor before you use it! Click here for instructions on how to apply the protective decal and seal it with a glue gun. The rest of the advice you can take or leave, but these two steps are ESSENTIAL to the longevity and accuracy of your device!

When buying cable to build your sensors, look for cable that carries two shielded signal conductors. The biosensors pick up very, very sensitive electrical information from the body, and any disruption in the electric field could make your readings less accurate, especially if the sensor cables are placed near one another!

To prevent this, you can buy shielded cable (like this) and solder the silver shielding to your ground connection (pick whatever you like for the other two conductors, maybe red for power and black for signal).

Or, if you're in a pinch, you can simply twist the ground wire around the signal wire that you are trying to protect from outside interference.

Here's a link to my Respiration Belt Instructable. After a few years I found the respiration belt less interesting to me, because the belts are awkward for the average person to strap into, but if you'd like to go for it, here is a simple way to make it happen! This signal is perhaps best amplified, and you may need to calculate the relative drift of the elastic and account for it as an offset in order to capture things like someone holding their breath; a rough sketch of that drift compensation follows below.
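Here is one minimal, untested way that offset could be handled in code, assuming your belt value arrives as a float each sample. The class name, the baselineRate constant, and the update scheme are all illustrative assumptions, not part of the Instructable.

```csharp
// Rough sketch of drift compensation for a stretch-sensor respiration belt.
// A very slow-moving baseline tracks the elastic's gradual drift; subtracting
// it leaves the breathing oscillation, and during a held breath the corrected
// signal settles near zero, which is itself easy to detect.
public class RespirationDriftFilter
{
    private float baseline;
    private bool initialized;

    // Should be much slower than the breathing rate:
    // roughly 0.01–0.05 per sample depending on your sample rate.
    public float baselineRate = 0.02f;

    public float Process(float rawSample)
    {
        if (!initialized)
        {
            baseline = rawSample;
            initialized = true;
        }

        // Slowly chase the raw signal to estimate the elastic's drift.
        baseline += (rawSample - baseline) * baselineRate;

        // Drift-corrected respiration signal.
        return rawSample - baseline;
    }
}
```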

Ideas Lab Denmark

I will be giving a unique and in-depth workshop hosted by Emotional Data Lab (Aarhus University), Interactive Denmark and Ideas Lab in Aarhus, Denmark from November 21-23.  The workshop consists of 3 three-hour sessions where I will share my materials and experiences with incorporating physiological markers of emotion into the VR-compatible Unity environment.

Participants will be placed into “teams” in order to work together, experiment, and discuss the promises, problems and potential of using biosensors to capture a user’s emotional experience through digital tools.

WorldBuilding: TSV Toronto

November 3rd – December 9th 2017

Trinity Square Video, 401 Richmond, Toronto, Canada.

My work Project H.E.A.R.T. (2017), made in collaboration with 3D artist Alex M. Lee for VR and emotional biosensors, debuted on November 5th at Trinity Square Video, Toronto.

This project was commissioned by TSV curators John Hampton and Maiko Tanaka, thanks to the support of the Canada Council for the Arts. The exhibition also features amazing works by Canadian artists Jeremy Bailey and Kristen Schaffer, Eshrat Erfanian, and Yam Lau.

Visit the Worldbuilding website by clicking here.

 

Project H.E.A.R.T.

Project H.E.A.R.T. (2017)

2017

A biodata-driven VR game where militainment and pop music fuel a new form of emotional drone warfare.

A twist on popular "militainment" shooter video games, Project H.E.A.R.T. invites the viewer to place their fingers on a custom biodata device and summon their enthusiasm to engage their avatar, Yowane Haku, in "combat therapy." Fans of Vocaloid characters may recognize Haku as the "bad copy" of Japanese pop celebrity Hatsune Miku, a holographic personage who invites her fans to pour their content and songs into her virtual voice.

The biosensing system features a pulse sensor and a skin conductance sensor of Gee's design. Drawing on principles of emotional physiology and affective computing, the device gathers data on heart rate and blood flow from the user's index finger, and skin conductance from the middle and ring fingers. The biodata is read by a microcontroller and transferred to Unity VR, facilitating emotional interactivity: a user's enthusiasm (spikes in skin conductance amplitude, elevated heart rate, and shifts in the amplitude of the pulse signal) stimulates the holographic pop star to sing in the virtual warzone, inspiring the military fighters to continue the war and create more enemy casualties. At the end of the experience the user is confronted with their "score" of traumatized soldiers versus enemies killed, with no indication of whether this means they won or lost the "game".
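For readers curious how such a mapping could look in code, the following Unity C# fragment is a purely illustrative sketch, not the project's actual implementation, of an "enthusiasm" score blended from the three cues described above. Every name, threshold, and weight is an invented placeholder.

```csharp
using UnityEngine;

// Illustrative only: one way an "enthusiasm" value could be derived from
// skin conductance spikes, elevated heart rate, and shifts in pulse amplitude.
public class EnthusiasmEstimator
{
    public float restingHeartRate = 70f;   // assumed per-user baseline (BPM)

    private float lastConductance;
    private float lastPulseAmplitude;

    // Returns a 0–1 enthusiasm score for the current sample.
    public float Estimate(float skinConductance, float heartRate, float pulseAmplitude)
    {
        // Sudden rises in skin conductance read as arousal spikes.
        float conductanceSpike = Mathf.Clamp01((skinConductance - lastConductance) * 10f);
        lastConductance = skinConductance;

        // Heart rate above a resting baseline contributes as well.
        float heartRateBoost = Mathf.Clamp01((heartRate - restingHeartRate) / 30f);

        // Shifts in pulse amplitude act as a third, weaker cue.
        float amplitudeShift = Mathf.Clamp01(Mathf.Abs(pulseAmplitude - lastPulseAmplitude) * 5f);
        lastPulseAmplitude = pulseAmplitude;

        // Weighted blend; all weights are invented placeholders.
        return Mathf.Clamp01(0.5f * conductanceSpike + 0.35f * heartRateBoost + 0.15f * amplitudeShift);
    }
}
```

A per-user resting baseline matters in any scheme like this, since absolute heart rate and skin conductance vary widely between people.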

The user is thus challenged to navigate the soldiers' emotional anxieties and summon positivity to activate Haku's singing voice as the soldiers battle not only a group of enemies, but also their own lack of confidence in times of global economic instability.

The landscape of Project H.E.A.R.T. was built from geopolitically resonant sites found on Google Maps, creating a dreamlike background for the warzone. In-game dialogue wavers between self-righteous soldier banter typical of video games, and self-help, bringing the VR participant to an interrogation of their own emotional body in a virtual space that conflates war, pop music, drone technology, and perhaps movement-induced VR nausea.

As Kathryn Hamilton pointed out in her 2017 essay "Voyeur Realism" for The New Inquiry,

“VR’s genesis and development is in the military, where it has been used to train soldiers in “battle readiness,” a euphemism for: methods to overcome the innate human resistance to firing at another human being. In the last few years, VR’s usage has shifted 180 degrees from a technology used to train soldiers for war, to one that claims to “amplify” the voices afflicted by war, and to affect “world influencers” who might be able to stop said wars.”

Credits

Narrative Design: Sofian Audry, Roxanne Baril-Bédard, Erin Gee
3D Art: Alex Lee and Marlon Kroll
Animation and Rigging: Nicklas Kenyon and Alex Lee
VFX: Anthony Damiani, Erin Gee, Nicklas Kenyon
Programming: Sofian Audry, Erin Gee, Nicklas Kenyon, Jacob Morin
AI Design: Sofian Audry
Sound Design: Erin Gee, Austin Haughton, Ben Hinckley, Ben Leavitt, Nicolas Ow
BioSensor Hardware Design: Erin Gee and Martin Peach
BioSensor Case Design: Grégory Perrin
BioSensor Hardware Programming: Thomas Ouellet Fredericks, Erin Gee, Martin Peach
Featuring music by Lazerblade, Night Chaser and Austin Haughton
Yowane Haku character designed by CAFFEIN
Yowane Haku Cyber model originally created by SEGA for Hatsune Miku: Project DIVA 2nd (2010)
Project H.E.A.R.T. also features the vocal acting talents of Erin Gee, Danny Gold, Alex Lee, Ben McCarthy, Gregory Muszkie, James O’Calloghan, and Henry Adam Svec.

Thanks to the support of the Canada Council for the Arts and AMD Radeon, this project was commissioned by Trinity Square Video for the exhibition Worldbuilding, curated by John G Hampton and Maiko Tanaka.

This project would have not been possible without the logistical and technical support of the following organizations:

Technoculture Art and Games Lab (Concordia University)

Concordia University

ASAP Media Services (University of Maine)

Exhibition history

November-December 2017  Worldbuilding @ Trinity Square Video, Toronto

February-March 2018 Future Perfect @ Hygienic Gallery, New London, Connecticut

April 26-28, 2018 @ Digifest, Toronto

June 7-17, 2019 @ Elektra Festival, Montreal

January 2020 @ The Artist Project, Toronto

October 2020 @ Festival LEV Matadero, Spain

Links

Project H.E.A.R.T. official website
Worldbuilding Exhibition Website
Review in Canadian Art
My research blog: Pop and Militainment
Featured on Radiance VR

Video

Project H.E.A.R.T. (2017)
Installation and Gameplay

Gallery

VR Commission Update

Here's a sneak peek at some of the art developed last summer in a residency at the Technoculture Art and Games Lab at Concordia University with lead 3D artist Alex Lee, AI designer Sofian Audry, art assistant Marlon Kroll, and research assistant Roxanne Baril-Bédard. Alongside holographic pop stars who may or may not have their own consciousness to begin with, the project includes rhetorical analysis of post-9/11 counterterrorist video games, reality television, startup culture, and self-help manuals for improving one's emotional state.

I am implementing the biosensor control system this winter and plan to finalize the game's art, music and sounds this summer for a launch towards the end of 2017, in an exhibition at Trinity Square Video in Toronto.


In the future, weapons of war possess advanced AI systems, systems that guarantee successful automated combat on behalf of the soldiers wielding the technology. The military still trains its soldiers in case of equipment failure, but at this point fighters function more as passive operators. The terrorist threat has nothing like this technology in its ranks, and our systems are swift and deadly in their effectiveness. Never before have the soldiers manning our machines witnessed violence and devastation at this scale: the largest threat today to the soldiers defending our nation's values is Post-Traumatic Stress Disorder.

To address this unfortunate state of affairs, the military developed a startup fund open to the public to resolve the issue through technological innovation. Significant scholarships and research funding were provided for researchers interested in devoting time to creating a means of mitigating the psychological crisis. A risky but intriguing proof of concept was eventually presented: the creation of a revolutionary entertainment for the troops as they fought the terrorist threat.

Yowane Haku became the face of this entertainment: a mobile holographic pop star engineered specifically for psychological distraction on the battlefield.  

The world's most talented engineers, design consultants, and pop writing teams were assembled to endow Haku with every aesthetic and technical element needed to impress not only the troops, but the world, with her next-generation technology. However, the initial test run of this mobile holographic pop medium in combat trials was… a failure.

On the battlefield, Haku's perfect body glowed faintly amongst the dust and screams, bullets and explosions passing through her ineffectually, her dance moves precise, her vocalizations on point. But ultimately her pop music performance lacked resonance with the battle. Instead of the soldiers being emboldened by this new entertainment, which was intended to distract them from their gruesome tasks or to inspire them, their adverse psychological symptoms… flourished. Some of the men went mad, laughing maniacally in tune with the holographic presence smiling sweetly at them. It was only due to the superiority of our AI weaponry and automated drone operation that the morally corrupt foreign threat, with their violent and technologically crude methods, was stopped that day. The minds of our soldiers were lost.

Months later, a young pool of startup professionals would provide another solution. This vocal minority of engineers… though others called them crazy… had a hunch: for the hologram pop star to "work," her systems needed access to pure emotion, to link a human element with the trauma of the human soldiers. But it was not clear who, or what, could best provide this emotional link… and what amount of embodied "disruption" this might entail…

This enthusiastically crowdfunded group of millennials completed their groundbreaking research without the strings of ethics funding or institutional control. Human emotions and consciousness now flow directly to Haku via experimental trials in VR technology. Haku rises again on the battlefront.

Simultaneously, a new reality television show has been born of these first human trials. The star of this reality show could be… you.

Could you be the next American Sweetheart?  Do you have what it takes to provide 110% Best Emotional Performance?  Join us through advanced VR technologies, Live and Direct on the battlefield, to find out if you could be fit to man the ultimate weapon of war: Our Next Holographic Idol.

This project is supported by the Canada Council for the Arts and Trinity Square Video's AMD VR lab.

Musicworks #126 Interview

Click here to read my interview with Alex Varty.  “ERIN GEE SINGS THE BODY ELECTRONIC”

Fresh on the heels of my return from the premiere of Echo Grey in Vancouver (my newest composition for vocal quartet, feedback soloist and tape), I've received my physical copy of Musicworks, a publication released three times a year featuring experimental sounds from across Canada.

Amidst a really massive transition phase right now, I find that teaching full time has really changed what I can do as an artist.  Pushing myself to learn entirely new skillsets in organization and pedagogical performance (sidenote: yes, everything is a performance) has left me with little time or energy to invest in building new technologies.

Music composition has been something that I can invest time into, as all I need is a few moments, a microphone, my laptop, a notepad with pencil scribbles, my imagination.

This interview with Musicworks magazine was very interesting for me, as recently my opportunities have been coming from music composition.  The whole issue is actually very interesting, with a full feature on music and sound revolution in VR spaces, as well as some features on other very energetic and productive electroacoustic artists.

Musicworks #126 is available now with a specially curated CD of sounds included in the physical magazine. On this CD you can find a track from my Voice of Echo (2011) series.

New VR artwork commission from Trinity Square Video

I'm thrilled to announce that Trinity Square Video will be presenting new artworks for virtual reality interfaces in 2016–2017, including a new commissioned work by me! The work will explore pop music's potential military applications in a first-person shooter style video game – expect autotuned voices, virtual pop stars, and new embodiments of my emotional biosensor hardware to take shape in this new work.

The project will feature Alex M. Lee as head artistic designer, as well as work by Marlon Kroll and Roxanne Baril-Bédard. I'll continue to post teasers, hardware updates and more throughout summer 2016!