Tag: human-computer interaction

Ideas Lab Denmark

I will be giving a unique and in-depth workshop hosted by Emotional Data Lab (Aarhus University), Interactive Denmark and Ideas Lab in Aarhus, Denmark, from November 21-23. The workshop consists of three three-hour sessions in which I will share my materials and experience incorporating physiological markers of emotion into the VR-compatible Unity environment.

Participants will be placed into “teams” in order to work together, experiment, and discuss the promises, problems and potential of using biosensors to capture a user’s emotional experience through digital tools.

Project H.E.A.R.T.

Project H.E.A.R.T. is the code name for the Holographic Empathy Attack Robotics Team, a biosensor-driven virtual reality artwork developed by Erin Gee in collaboration with 3D artist Alex M. Lee for use with the Oculus Rift.  The game was commissioned by Trinity Square Video for the exhibition Worldbuilding in November 2017.

A twist on popular “militainment” shooter video games, Project H.E.A.R.T. invites the viewer to place their fingers on a biodata-gathering device and summon their enthusiasm in order to direct their avatar, Yowane Haku, in “combat therapy.” The biosensor device gathers the human player’s positivity and energy, driving Haku’s voice forward to boost morale as soldiers battle not only a group of enemies but also their own lack of confidence and rising anxiety.
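
As a purely hypothetical illustration of this kind of mapping (not the project’s actual Unity code, which is credited below), a noisy skin-conductance stream from the biosensor device could be smoothed into a single “enthusiasm” value between 0 and 1 that drives Haku’s morale boost. The smoothing factor and conductance range in the sketch are invented for the example.

```python
# Hypothetical sketch: turning a raw skin-conductance stream into one
# "enthusiasm" value that could drive an in-game morale parameter.
# The smoothing factor and normalisation range are invented for illustration;
# this is not Project H.E.A.R.T.'s actual implementation.

class EnthusiasmMeter:
    def __init__(self, alpha=0.05, gsr_min=0.5, gsr_max=15.0):
        self.alpha = alpha      # exponential-smoothing factor (assumed)
        self.gsr_min = gsr_min  # assumed resting skin conductance (microsiemens)
        self.gsr_max = gsr_max  # assumed highly aroused skin conductance
        self.smoothed = gsr_min

    def update(self, gsr_sample):
        """Smooth the noisy raw signal, then normalise it to 0..1."""
        self.smoothed += self.alpha * (gsr_sample - self.smoothed)
        span = self.gsr_max - self.gsr_min
        return max(0.0, min(1.0, (self.smoothed - self.gsr_min) / span))


meter = EnthusiasmMeter()
for sample in [1.0, 2.5, 4.0, 7.5, 9.0]:
    print(f"morale boost: {meter.update(sample):.2f}")
```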

Fans of Vocaloid characters may recognize Haku as the “bad copy” of Japanese pop celebrity Hatsune Miku, a holographic persona who invites her fans to pour their content and songs into her virtual voice.

As Kathryn Hamilton pointed out in her 2017 essay “Voyeur Realism” for The New Inquiry:

“VR’s genesis and development is in the military, where it has been used to train soldiers in “battle readiness,” a euphemism for: methods to overcome the innate human resistance to firing at another human being. In the last few years, VR’s usage has shifted 180 degrees from a technology used to train soldiers for war, to one that claims to “amplify” the voices afflicted by war, and to affect “world influencers” who might be able to stop said wars.”

The colorful landscape of the game was built from geopolitically resonant sites found on Google Maps, creating a dreamlike background for the warzone. In-game dialogue wavers between the self-righteous soldier banter typical of video games and self-help talk, bringing the VR participant to an interrogation of their own emotional body in a virtual space that conflates war, pop music, video games, emotional investment, and virtual-movement-induced nausea.

Photography by Toni Hafkenscheid.  Images of Worldbuilding exhibition courtesy of Trinity Square Video, 2017.

Credits

Narrative Design: Sofian Audry, Roxanne Baril-Bédard, Erin Gee

3D Art: Alex Lee and Marlon Kroll

Animation and Rigging: Nicklas Kenyon and Alex Lee

VFX: Anthony Damiani, Erin Gee, Nicklas Kenyon

Programming: Sofian Audry, Erin Gee, Nicklas Kenyon, Jacob Morin

AI Design: Sofian Audry

Sound Design: Erin Gee, Austin Haughton, Ben Hinckley, Ben Leavitt, Nicolas Ow

BioSensor Hardware Design: Erin Gee and Martin Peach

BioSensor Case Design: Grégory Perrin

BioSensor Hardware Programming: Thomas Ouellet Fredericks, Erin Gee, Martin Peach

Featuring music by Lazerblade, Night Chaser and Austin Haughton

Yowane Haku character designed by CAFFEIN

Yowane Haku Cyber model originally created by SEGA for Hatsune Miku: Project DIVA 2nd (2010)

Project H.E.A.R.T. also features the vocal acting talents of Erin Gee, Danny Gold, Alex Lee, Ben McCarthy, Gregory Muszkie, James O’Callaghan, and Henry Adam Svec.

This project was commissioned by Trinity Square Video for the exhibition Worldbuilding, curated by John G. Hampton and Maiko Tanaka, with the support of the Canada Council for the Arts and AMD Radeon.

This project would not have been possible without the logistical and technical support of the following organizations:

Technoculture Art and Games Lab (Concordia University)

Concordia University

ASAP Media Services (University of Maine)

Algorithms that Matter Residency: Austria

I’ve been selected as a featured artist in residence at the Institut für Elektronische Musik und Akustik (IEM) in Graz, Austria, as part of the Algorithms that Matter Residency. The residency will take place from April to June 2018, and even though it’s a year away I want to share the incredible news!

From the ALMAT website:

“Algorithms that Matter is an artistic research project by Hanns Holger Rutz and David Pirrò.  It aims at understanding the increasing influence of algorithms, translating them into aesthetic positions in sound, building a new perspective on algorithm agency by subjecting the realm of algorithms to experimentation.

Almat is grounded in the idea that algorithms are agents that co-determine the boundary between an artistic machine or “apparatus” and the object produced through this machine. The central question is: How do algorithmic processes emerge and structure the praxis of experimental computer music? The hypothesis is that these processes, instead of being separated from the composer—as generators and transformers of infinite shapes—exhibit a specific force that retroacts and changes the very praxis of composition and performance.”

Exploring these algorithms as unique electronic voices will extend my reach into new territory. I am excited to play at the IEM and make new emotional sounds that combine physiological markers of emotion with algorithmic agencies.

VR Commission Update

Here’s a sneak peek at some of the art developed last summer during a residency at the Technoculture Art and Games Lab at Concordia University with lead 3D artist Alex Lee, AI designer Sofian Audry, art assistant Marlon Kroll, and research assistant Roxanne Baril-Bédard. Alongside holographic pop stars who may or may not have a consciousness of their own to begin with, the project draws on rhetorical analysis of post-9/11 counterterrorist video games, reality television, startup culture, and self-help manuals for improving one’s emotional state.

I am implementing the biosensor control system this winter and plan to finalize the game’s art, music and sound this summer, for a launch toward the end of 2017 in an exhibition at Trinity Square Video in Toronto.


In the future, weapons of war possess advanced AI systems that guarantee successful automated combat on behalf of the soldiers wielding the technology. The military still trains its soldiers in case of equipment failure, but at this point fighters function more as passive operators. The terrorist threat has nothing comparable to this technology in its ranks, and the effectiveness of our systems is swift and deadly. Never before have the soldiers manning our machines witnessed violence or devastation at this scale: the largest threat facing the soldiers who defend our nation’s values today is Post-Traumatic Stress Disorder.

To address this unfortunate state of affairs, the military created a startup fund, open to the public, to resolve the issue through technological innovation. Significant scholarships and research funding were provided for researchers willing to devote time to mitigating the psychological crisis. A risky but intriguing proof of concept was eventually presented: a revolutionary form of entertainment for the troops as they fought the terrorist threat.

Yowane Haku became the face of this entertainment: a mobile holographic pop star engineered specifically for psychological distraction on the battlefield.  

The world’s most talented engineers, design consultants, and pop songwriting teams were assembled to endow Haku with every aesthetic and technical element needed to impress not only the troops but the world with her next-generation technology. However, the initial test run of this mobile holographic pop medium in combat trials was… a failure.

On the battlefield, Haku’s perfect body glowed faintly amongst the dust and screams, bullets and explosions passing through her ineffectually, her dance moves precise, her vocalizations on point. But ultimately her pop performance lacked resonance with the battle. Instead of the soldiers being emboldened by this new entertainment, which was intended to distract and inspire them amid their gruesome tasks, their adverse psychological symptoms… flourished. Some of the men went mad, laughing maniacally in tune with the holographic presence smiling sweetly at them. It was only due to the superiority of our AI weaponry and automated drone operation that the morally corrupt foreign threat, with its violent and technologically crude methods, was stopped that day. The minds of our soldiers were lost.

Months later, a young pool of startup professionals would provide another solution. This vocal minority of engineers… though others called them crazy… had a hunch. For the hologram pop star to “work,” her systems needed access to pure emotion, to link a human element with the trauma of the human soldiers. But it was not clear who, or what, could best provide this emotional link… or what amount of embodied “disruption” this might entail…

This enthusiastically crowdfunded group of millennials completed their groundbreaking research without the strings attached to ethics funding or institutional control. Human emotions and consciousness now flow directly to Haku via experimental trials in VR technology. Haku rises again on the battlefront.

Simultaneously, a new reality television show has been born of these first human trials. The star of this reality show could be… you.

Could you be the next American Sweetheart?  Do you have what it takes to provide 110% Best Emotional Performance?  Join us through advanced VR technologies, Live and Direct on the battlefield, to find out if you could be fit to man the ultimate weapon of war: Our Next Holographic Idol.

This project is supported by the Canada Council for the Arts and Trinity Square Video’s AMD VR lab.

Musicworks #126 Interview

Click here to read my interview with Alex Varty, “Erin Gee Sings the Body Electronic.”

Fresh from the premiere of Echo Grey in Vancouver (my newest composition for vocal quartet, feedback soloist and tape), I’ve received my physical copy of Musicworks, a triannual publication featuring experimental sound from across Canada.

I’m in the midst of a massive transition right now, and teaching full time has really changed what I can do as an artist. Pushing myself to learn entirely new skillsets in organization and pedagogical performance (sidenote: yes, everything is a performance) has left me with little time or energy to invest in building new technologies.

Music composition is something I can still invest time in, as all I need is a few moments, a microphone, my laptop, a notepad with pencil scribbles, and my imagination.

This interview with Musicworks magazine came at an interesting time for me, as my recent opportunities have been coming from music composition. The whole issue is well worth reading, with a full feature on the music and sound revolution happening in VR spaces, as well as features on other energetic and productive electroacoustic artists.

Musicworks #126 is available now with a specially curated CD of sounds included in the physical magazine. On the CD you can find a track from my Voice of Echo (2011) series.

New VR artwork commission from Trinity Square Video

I’m thrilled to announce that Trinity Square Video will be presenting new artworks for virtual reality interfaces in 2016-2017, including a newly commissioned work of my own! The work will explore pop music’s potential military applications in a first-person-shooter-style video game – expect autotuned voices, virtual pop stars, and new embodiments of my emotional biosensor hardware to take shape in this new work.

The project will feature Alex M. Lee as lead artistic designer, along with work by Marlon Kroll and Roxanne Baril-Bédard. I’ll continue to post teasers, hardware updates and more throughout summer 2016!

Swarming Emotional Pianos

(2012 – ongoing)

Aluminium tubes, servo motors, custom mallets, Arduino-based electronics, iCreate platforms

Approximately 27” x 12” x 12” each

Swarming Emotional Pianos is an installation that features a large, looming projection of a human face surrounded by a set of six musical chime robots.

The projected face is that of an actor (Laurence Dauphinais or Matthew Keyes) who, for 20 minutes, moves between the extreme emotional states of surprise, fear, anger, sadness, sexual arousal, and joy in five-minute intervals. During the performance, Gee hooks the performer up to a series of biosensors that monitor how heart rate, sweat, and respiration change between emotional states.

The music that the robots surrounding the projection screen play as the actress moves between emotional states reacts to these physiological responses: the musical tones and rhythms shift and intensify as her heart rate, sweat bursts, blood flow and respiration change. While the musical result is almost alien to assumptions of what emotional music might sound like, one might encounter the patterns as an abstracted lie-detector test that displays the unique internal fluctuations moving beneath the surface of her large, projected face. Does emotion lie in the visibility of facial expression, or somewhere in the inaudible made audible, the patterns of sensation in her body? Is the actor sincere in her performance if the emotion is felt as opposed to displayed? Micro-bursts of emotional sentiment are thus amplified by the robots, providing an intimate and abstract soundtrack for this “emotional movie”.
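
The biosignal-to-sound mapping is the heart of the piece. As a purely illustrative sketch (not the installation’s actual Max/MSP patch, which is credited below), smoothed sensor readings could be scaled into simple musical parameters for the chime robots; the signal ranges, parameter names, and scaling here are assumptions made for the sake of the example.

```python
# Illustrative sketch of a biosignal-to-music mapping, not the actual
# Max/MSP patch used in Swarming Emotional Pianos. All ranges are assumed.

def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

def map_to_music(heart_rate, gsr, respiration_rate):
    """Turn smoothed biosignal values into simple musical parameters."""
    # Faster heart rate -> denser rhythms (chime strikes per second).
    strikes_per_second = scale(heart_rate, 50, 120, 0.5, 8.0)
    # Higher skin conductance (arousal) -> higher chime pitch (MIDI note number).
    midi_pitch = int(scale(gsr, 0.5, 15.0, 48, 84))
    # Faster breathing -> harder strikes (MIDI-style velocity).
    velocity = int(scale(respiration_rate, 8, 30, 40, 120))
    return {"strikes_per_second": strikes_per_second,
            "midi_pitch": midi_pitch,
            "velocity": velocity}

print(map_to_music(heart_rate=92, gsr=6.2, respiration_rate=18))
```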

Emotional-physical outputs are extended through robotic performers as the human actors focus on their internal states, and in fact activate their emotions mechanistically as a means of creating change in their bodies, thus instrumentalizing emotion.

Custom open-source biosensors that collect heart rate and signal amplitude, respiration rate and amplitude, and galvanic skin response (sweat) have been in development by Gee since 2012.  Click here to access her GitHub page if you would like to try the technology for yourself or contribute to the research.
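
The hardware and firmware details live in the GitHub repository linked above. As a rough, hedged illustration of working with this kind of sensor, the sketch below reads comma-separated heart-rate, respiration and GSR values from an Arduino-style board over USB serial using Python and pyserial; the port name, baud rate and line format are assumptions for the example, not the repository’s actual protocol.

```python
# Minimal sketch: log biosignal values streamed from an Arduino-style board.
# Assumptions (not the protocol of the actual GitHub project): the board prints
# one line per sample as "heart,respiration,gsr" at 115200 baud on /dev/ttyUSB0.
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # hypothetical port name
BAUD = 115200          # hypothetical baud rate

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # timeout or empty line
        try:
            heart, respiration, gsr = (float(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed lines
        print(f"heart={heart:.1f}  respiration={respiration:.1f}  gsr={gsr:.2f}")
```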

Credits

Thank you to the following for your contributions:

  • Martin Peach (my robot teacher)
  • Sébastien Roy (lighting circuitry)
  • Peter van Haaften (tools for algorithmic composition in Max/MSP)
  • Grégory Perrin (Electronics Assistant)
  • Matt Risk, Tristan Stevans, Simone Pitot, and Jason Leith for their hours of dedicated studio help
  • Concordia University, the MARCS Institute at the University of Western Sydney, Innovations en Concert Montréal, Conseil des Arts de Montréal, Thought Technology, and AD Instruments for their support.

Swarming Emotional Pianos (2012-2014) Machine demonstration March 2014 – Eastern Bloc Lab Residency, Montréal

Erin Gee - Formants - Image courtesy of InterAccess Gallery

Formants

(2008)

Fiberglass, plexiglas, hair, copper, wood, electronics

20” x 49” x 27.5”

Formants is an interactive audio sculpture featuring the heads of two female figures that sing when their hair is brushed: a musing on desire, vanity, absent bodies, morality, intimacy and touch.

Credits

  • (version 1) Pure Data Programming: Michael Brooks
  • (version 2) Electronics technician and programmer: Martin Peach
  • Vocalists: Lynn Channing and Christina Willatt
  • Made with the support of Soil Digital Media Suite