Can we create a social VR experience, within the defined language of The Waldorf Project, that generates and controls a depth of user empathy to match real-world experiences?
The Waldorf Project has been exploring empathy engineering as a creative concept for the past eight years, initially focusing on intimate performances for 40 guests in industrial spaces before moving on to larger and more ambitious experiments involving 5,000 people. This is now a rare opportunity to test our empathy engineering algorithm in a vast digital landscape using VR and to explore a Waldorf Project experiment on a massive global scale, one that will demonstrate that a true social digital community can be formed, moulded and rewarded in VR without gamification. With relevance to the future of our project and of XR, we are examining the potential power of what we believe will be one of the “commodities” of the Metaverse: empathy.
We are blending audience participation with machine learning to produce an adaptive social VR experience, with an ML agent creating a personalised and unique journey, guided by emotional cues, and controlling a depth of user empathy that aims to match our real-world experiences of The Waldorf Project.
By combining sensor data with machine learning, we can now create experiences that adapt to each user in the moment, with their natural responses driving the experience as it unfolds.
The final deployed experience works with an emotional quotient (EQ) score calculated in real time by the EQ-ML model from user bio-signals (a proprietary system based on the circumplex model of emotion). The WP-ML Agent takes this score, along with the current environmental conditions, as an observation; it then acts by changing the environment, and receives a reward according to whether the emotional quotient has moved closer to its known goal.
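As an illustration of how such an observe-act-reward loop could be wired together, here is a minimal sketch using the Unity ML-Agents toolkit. It is not the production WP-ML Agent: the class name, the (valence, arousal) goal point, the three environmental parameters and the way the EQ estimate is pushed into the agent are all assumptions made for the example.

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Illustrative stand-in for the WP-ML Agent. Its Behavior Parameters would be
// configured with a 5-value vector observation and 3 continuous actions.
public class WaldorfAgent : Agent
{
    // Assumed goal point on the circumplex model: (valence, arousal) in [-1, 1].
    [SerializeField] private Vector2 goalEmotion = new Vector2(0.6f, 0.3f);

    // Assumed environmental parameters the agent can modulate, each in [0, 1].
    private float lightIntensity = 0.5f;
    private float soundLevel = 0.5f;
    private float sceneDensity = 0.5f;

    // Latest emotional estimate, stubbed here as a (valence, arousal) pair;
    // in the real system this would come from the proprietary EQ-ML model.
    private Vector2 currentEmotion = Vector2.zero;
    private float previousDistance;

    // Assumed entry point for the bio-signal pipeline to push new estimates.
    public void SetEmotionEstimate(Vector2 valenceArousal)
    {
        currentEmotion = valenceArousal;
    }

    public override void OnEpisodeBegin()
    {
        previousDistance = Vector2.Distance(currentEmotion, goalEmotion);
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // Observation: the user's emotional state plus the current environment.
        sensor.AddObservation(currentEmotion); // 2 values
        sensor.AddObservation(lightIntensity);
        sensor.AddObservation(soundLevel);
        sensor.AddObservation(sceneDensity);
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        // Action: small continuous adjustments to the environment.
        var act = actions.ContinuousActions;
        lightIntensity = Mathf.Clamp01(lightIntensity + 0.1f * act[0]);
        soundLevel = Mathf.Clamp01(soundLevel + 0.1f * act[1]);
        sceneDensity = Mathf.Clamp01(sceneDensity + 0.1f * act[2]);

        // Reward: positive when the measured emotion moves closer to the goal.
        float distance = Vector2.Distance(currentEmotion, goalEmotion);
        AddReward(previousDistance - distance);
        previousDistance = distance;
    }
}
```

In practice a DecisionRequester component (or manual RequestDecision() calls) would set the cadence of this loop, and the reward, the reduction in circumplex distance to the goal, is what steers the policy toward environmental changes that draw the user's emotion toward the target.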
Continual user input improves the WP-ML Agent, ultimately enhancing the overall experience and emotional journey. The agent is built using the Unity ML-Agents toolkit.
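Under the same assumptions as the sketch above, continual improvement could map each user session to a training episode: when a session ends the episode ends, and successive participants' data keeps refining the shared policy. The SessionManager class and OnSessionEnded hook below are hypothetical.

```csharp
using UnityEngine;

// Hypothetical session hook, continuing the sketch above.
public class SessionManager : MonoBehaviour
{
    [SerializeField] private WaldorfAgent agent;

    // Assumed to be called when a user's journey ends.
    public void OnSessionEnded()
    {
        agent.EndEpisode(); // resets per-episode state; the trained policy persists
    }
}
```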