Out of the Ordinary

VR Opera made in Unreal Engine for Meta Quest 2 - 2022
Project Overview
Out of the Ordinary is a virtual reality community opera, created through a series of workshops with communities around Ireland, in collaboration with Irish National Opera.

It leverages AI image generation, in some of its earliest implementations, to deliver a deliberately surreal VR experience with spatial audio.
My Contributions
  • Unreal Engine Programming
  • Spatial Audio Pipeline
  • VR Implementation & Testing
Project Highlights
When I joined this project with Algorithm Productions it was already in production. That ended up being a valuable asset, as I was able to bring an outside perspective to the team. Because the development team was quite small, I had the opportunity to help develop a pipeline and structure for some of the trickier components, such as the spatial audio delivery.

One of the biggest technical challenges was performance. I'm certain this is true for anyone developing for VR, but especially when, like us, you are targeting a standalone build for the Meta Quest 2 (then the Oculus Quest 2). Figuring out the best techniques to push the visual spectacle we wanted was a serious challenge: the hardware was roughly equivalent in power to a moderately capable mobile phone, and we as a studio were used to pushing Unreal Engine to its highest-fidelity results.

While there was lots of number crunching, compression and load management involved in getting where we wanted, the real beauty was exploring all the ways we could 'smoke and mirrors' a desired effect into existence by using weird scale, lighting or particle system tricks.
The goal was to deliver a visual experience that would match the tone, cadence and narrative of a beautiful Irish-language opera. What made this unique was having characters and environments move and change throughout. Working with planes and simple geometry, we were able to use textures, shaders and particle systems to give the impression of depth and space in our environments without taxing the hardware.
This project was intensely experimental. We decided we wanted to push the envelope in terms of what people could expect from this kind of event. We added hand tracking and simple interactions to deepen the sense of immersion, even going as far as to use motion capture data from dancers to drive animations. Seeing people immersed in the experience was especially rewarding.