
Embodiment
An immersive collaborative experience using biofeedback
How would it feel if you and another person shared the same body?
Through immersive VR we attempt to mix the senses of two people and encourage them to coordinate their bodies in order to survive.
Embodiment
Have you ever wondered what it would be like to share the same body with another being? What if your body and another person’s fused into one? What if your senses were mixed and you had to coordinate both of your organs to survive?
We proposed an experience/experiment in which we attempt to merge two people into one.
To do this, we created a journey inside VR and the CAVE in which the players go through three basic challenges to complete the first step of this merge.
We titled these three challenges, in order: body, heart, and mind.
This is not based on the religious separation of mind and body; rather, it reflects the three layers of the brain: reptilian, limbic, and neocortex.
In the first challenge the participants try to synchronize their breathing; in the second, their heart rates; and in the third, they combine their thoughts.
The experience measures the users’ biofeedback in real time. The biofeedback from the two users is transmitted to their environment and affects their respective perceptions.
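The exact mappings are not detailed above; as a rough illustration, a per-frame update like the one below could take the partner’s readings and turn them into perceptual parameters for the other player’s view (a minimal C++ sketch; all names, units, and mappings are hypothetical):

```cpp
#include <cmath>

// Hypothetical biofeedback sample from one player.
struct Biofeedback {
    float breathing;   // normalized 0..1 from the flex-sensor belt
    float heartRate;   // beats per minute from a pulse sensor
};

// Hypothetical perceptual parameters fed into the other player's rendering.
struct Perception {
    float fogDensity;       // e.g. the water thickens as the partner exhales
    float pulseBrightness;  // e.g. the scene flashes in time with the partner's heart
};

Perception updatePerception(const Biofeedback& partner, float timeSeconds) {
    Perception p;
    // Low breathing value (exhaled) -> denser fog in the other player's view.
    p.fogDensity = 0.2f + 0.6f * (1.0f - partner.breathing);
    // A sine pulse at the partner's heart rate drives a brightness flicker.
    float beatsPerSecond = partner.heartRate / 60.0f;
    p.pulseBrightness = 0.5f + 0.5f * std::sin(2.0f * 3.14159f * beatsPerSecond * timeSeconds);
    return p;
}
```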

Collaboration
- Two users collaboratively control one shared body underwater.
- The two players have to synchronize their breathing to stabilize their host animal (a possible synchronization measure is sketched after this list).
- Together they can steer the body through the depths of the ocean.
- Biofeedback is collected through sensors placed on the body of each participant and used to:
  - control the whale,
  - connect the players’ perceptions and alter them based on the sensor data from the other person, e.g. breathing or heartbeat,
  - affect and interact with the environment or landscape.
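How “stabilizing the host animal” is scored is not spelled out above; one plausible sketch (all names and constants are hypothetical) compares the two normalized breathing signals and eases a stability value toward their agreement:

```cpp
#include <algorithm>
#include <cmath>

// 1.0 = perfectly synchronized breathing, 0.0 = fully opposed.
float syncScore(float breathA, float breathB) {
    return 1.0f - std::fabs(breathA - breathB);
}

// Ease the whale's stability toward the current sync score, so the animal
// only calms down when the players keep breathing together for a while.
float updateStability(float stability, float breathA, float breathB, float dt) {
    float target = syncScore(breathA, breathB);
    return stability + (target - stability) * std::min(1.0f, dt * 0.5f);
}
```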
Navigation in our experience
- Players control either the left or the right “side” of the shared body.
- Sensor input (breathing) is mapped to the tilt of the flippers.
- The movement direction is calculated from a “whale model” (see the sketch below).
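The “whale model” itself is not documented above; the rough sketch below assumes each player’s breathing tilts one flipper, the difference between the two tilts turns the whale, and their sum pitches it up or down (all constants are placeholders):

```cpp
#include <cmath>

struct WhaleState {
    float heading;  // radians; turning left/right
    float pitch;    // radians; diving/surfacing
    float speed;    // forward speed
};

// breathLeft/breathRight: each player's breathing, 0 = exhaled, 1 = inhaled.
void steerWhale(WhaleState& whale, float breathLeft, float breathRight, float dt) {
    // Breathing mapped to flipper tilt, roughly -0.5..+0.5.
    float tiltLeft  = breathLeft  - 0.5f;
    float tiltRight = breathRight - 0.5f;

    // Asymmetric tilt turns the body; symmetric tilt pitches it.
    whale.heading += (tiltRight - tiltLeft) * dt;
    whale.pitch   += (tiltRight + tiltLeft) * 0.5f * dt;

    // Any flipper motion keeps the whale swimming forward.
    whale.speed = 1.0f + 0.5f * (std::fabs(tiltLeft) + std::fabs(tiltRight));
}
```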


I DIYed this belt and attached a flex sensor to it. The flex sensor bends and straightens as the belly inflates and deflates, sending breathing data through the Arduino to the application, which compares it with the biofeedback from the other player and controls the navigation of the third body.
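A minimal Arduino sketch for such a belt could look like the following; the pin, calibration range, and serial format are assumptions rather than the exact values used:

```cpp
// The flex sensor sits in a voltage divider on an analog pin; its reading
// rises and falls with the belly and is streamed over serial to the
// application that compares the two players' breathing.
const int FLEX_PIN = A0;
const int RAW_MIN  = 300;   // reading with the belly fully deflated (calibrate per person)
const int RAW_MAX  = 700;   // reading with the belly fully inflated (calibrate per person)

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(FLEX_PIN);
  // Normalize to 0..100 so the application can compare the two players directly.
  int breathing = constrain(map(raw, RAW_MIN, RAW_MAX, 0, 100), 0, 100);
  Serial.println(breathing);
  delay(50);  // ~20 samples per second is plenty for breathing
}
```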


Raymarching shaders
I experimented with raymarching and wrote a few shaders.
A cube and a sphere, each representing one of the players, separate and merge dynamically in real time based on the sensor data from both players, depending on how synchronized their breathing is.
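The shaders themselves are not reproduced here; the core idea can be sketched as a signed-distance-field smooth minimum in which a synchronization value from the breathing sensors controls both the separation of the two shapes and how strongly they blend. It is written below as plain C++ for readability; the actual shader evaluates the same math per ray:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Signed distance to a sphere of radius r centered at the origin.
float sdSphere(Vec3 p, float r) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - r;
}

// Signed distance to an axis-aligned cube with half-extent b.
float sdBox(Vec3 p, float b) {
    float qx = std::fabs(p.x) - b, qy = std::fabs(p.y) - b, qz = std::fabs(p.z) - b;
    float ox = std::max(qx, 0.0f), oy = std::max(qy, 0.0f), oz = std::max(qz, 0.0f);
    return std::sqrt(ox * ox + oy * oy + oz * oz)
         + std::min(std::max(qx, std::max(qy, qz)), 0.0f);
}

// Quadratic smooth minimum; larger k melts the two shapes together more.
float smin(float a, float b, float k) {
    float h = std::max(k - std::fabs(a - b), 0.0f) / k;
    return std::min(a, b) - h * h * k * 0.25f;
}

// Scene distance: the shapes drift apart when sync is low (0) and
// merge into one blob when the players breathe in unison (sync = 1).
float sceneSDF(Vec3 p, float sync) {
    float offset = (1.0f - sync) * 1.5f;        // separation between the shapes
    Vec3 ps = { p.x - offset, p.y, p.z };       // sphere for player A
    Vec3 pb = { p.x + offset, p.y, p.z };       // cube for player B
    float k = 0.1f + 0.9f * sync;               // blend strength grows with sync
    return smin(sdSphere(ps, 0.5f), sdBox(pb, 0.5f), k);
}
```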




