Can we create and enhance a social VR experience, within the defined language of The Waldorf Project, generating and controlling a depth of user empathy to match real-world experiences?
The Waldorf Project has been exploring empathy engineering as a creative concept for the past eight years, initially focusing on intimate performances for 40 guests in industrial spaces before moving on to larger and more ambitious experiments involving 5,000 people. This is now a rare opportunity to test our empathy engineering algorithm in a vast digital landscape using VR and to explore a Waldorf Project experiment on a massive global scale, one which will demonstrate that a true social digital community can be formed, moulded and rewarded in VR without gamification. With relevance to the future of our project and XR, we are examining the potential power of what we believe will be one of the “commodities” of the Metaverse: empathy.
We are blending audience participation with machine learning to produce an adaptive social VR experience, with an ML agent creating a personalised and unique journey, guided by emotional cues and controlling a depth of user empathy that aims to match our real-world experiences of The Waldorf Project.
Combining sensor data with machine learning, we can now create experiences that adapt to each user like never before, with users’ natural responses driving their experience in the moment.
Stage 1 Research & Development
The final deployed experience works with an emotional quotient score calculated in real time by the EQ-ML model from user bio-signals (a proprietary system based on the circumplex model of emotion). This score, along with the current environmental conditions, is taken as an observation by the WP-ML Agent, initiating an action cycle of environmental change; the agent is then rewarded according to whether the emotional quotient has moved closer to its known goal.
Continual user input improves the WP-ML Agent, ultimately enhancing the overall experience and emotional journey. The agent is built using the Unity ML-Agents framework.
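The observation–action–reward cycle described above can be sketched in simplified form. This is only an illustrative sketch, not the deployed system: the real agent is built with the Unity ML-Agents framework and the real EQ score comes from the proprietary EQ-ML model, so the goal value, action list and EQ simulation below are all hypothetical stand-ins.

```python
import random

# Hypothetical target emotional-quotient score the agent steers towards.
EQ_GOAL = 0.8

# Hypothetical environmental changes the agent could make each cycle.
ACTIONS = ["brighten_light", "dim_light", "raise_sound", "lower_sound"]

def simulated_eq(eq: float, action: str) -> float:
    """Stand-in for the EQ-ML model: returns the user's new EQ score
    after an environmental change (random drift, for illustration only)."""
    return min(1.0, max(0.0, eq + random.uniform(-0.1, 0.1)))

def reward(prev_eq: float, new_eq: float) -> float:
    """Positive reward when the EQ score moves closer to the known goal,
    negative when it moves further away."""
    return abs(EQ_GOAL - prev_eq) - abs(EQ_GOAL - new_eq)

def run_episode(steps: int = 10, seed: int = 0) -> float:
    """One observation -> action -> environmental change -> reward cycle,
    repeated for a fixed number of steps."""
    random.seed(seed)
    eq = 0.5  # initial EQ observation from bio-signals
    total_reward = 0.0
    for _ in range(steps):
        action = random.choice(ACTIONS)  # a trained policy would choose here
        new_eq = simulated_eq(eq, action)
        total_reward += reward(eq, new_eq)
        eq = new_eq
    return total_reward

if __name__ == "__main__":
    print(f"total reward: {run_episode():.3f}")
```

In the deployed system the random action choice would be replaced by the trained WP-ML Agent's policy, and each user session contributes further training signal, which is how continual user input improves the agent over time.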
Core Team
Sean Rogg | Director & Founder of 'The Waldorf Project'
British artist working in the field of experimental performance whose work has been exhibited globally. Since 2012 he has been leading a team of future-thinking artists, designers and scientists to explore the full potential of the Waldorf Project experiment.
Greg Shaw | Creative Director/Experiential Designer
Founder & CEO of Depicted Studios, specialising in real-time/immersive technologies and world building within narrative design, with 20+ years' experience as a Production Designer for film & TV.
Solomon Rogers | Founder and CEO of REWIND
Chairman of BAFTA's Immersive Entertainment Advisory Group, Chairman of IMMERSE UK and co-founder of the non-profit created to champion 'VR for Good'.
Carl H Smith | Director of the Learning Technology Research Centre (LTRC)  
Principal Research Fellow at Ravensbourne University. Background in Computer Science and Artificial Intelligence; his research includes Distributed Cognition, Spatial Literacy and Hyperhumanism.
DeepQ | Healthcare division of HTC, advancing artificial intelligence and virtual reality technologies to promote medical progress.