When I joined this project with Algorithm Productions, it was already in production. This ended up being a valuable asset, as I was able to bring an outside, unbiased perspective to the team. Because the development team was quite small, I had the opportunity to help develop a pipeline and structure for some of the trickier components, such as the spatial audio delivery.
One of the biggest technical challenges on this project was performance. I'm certain this is true for anyone developing for VR, but especially when, like us, you're targeting a standalone build for the Meta Quest 2 (then still branded the Oculus Quest 2). Figuring out the best techniques to deliver the visual spectacle we wanted was a serious challenge, given that the hardware was roughly equivalent in power to a moderately capable mobile phone and we as a studio were used to pushing Unreal Engine to its highest-fidelity results.
While there was plenty of number crunching, compression, and load management involved in getting where we wanted to be, the real beauty was in exploring all the ways we could 'smoke and mirrors' a desired effect into existence using weird scale, lighting, or particle system tricks.
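To give a flavour of the kind of scale trick I mean, here is a minimal, purely illustrative Unreal C++ sketch (hypothetical class and parameter names, not the project's actual code): instead of rendering real far-field geometry, a small, cheap stand-in mesh is rescaled each frame with its distance to the camera, so its silhouette stays constant and it reads as something enormous on the horizon.

```cpp
// FakeDistantProp.h -- illustrative sketch only (hypothetical names).
// A low-poly stand-in that fakes a huge distant object by scaling with
// camera distance, keeping its apparent on-screen size constant.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Kismet/GameplayStatics.h"
#include "FakeDistantProp.generated.h"

UCLASS()
class AFakeDistantProp : public AActor
{
    GENERATED_BODY()

public:
    AFakeDistantProp() { PrimaryActorTick.bCanEverTick = true; }

    // Apparent size is roughly proportional to Scale / Distance, so holding
    // Scale = Distance * ApparentSizeFactor keeps the silhouette steady
    // while the mesh itself stays tiny and cheap to render.
    UPROPERTY(EditAnywhere)
    float ApparentSizeFactor = 0.02f;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        if (APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(this, 0))
        {
            const float Dist = FVector::Dist(Cam->GetCameraLocation(), GetActorLocation());
            SetActorScale3D(FVector(Dist * ApparentSizeFactor));
        }
    }
};
```

The appeal of tricks like this on standalone hardware is that the cost is a single tiny mesh and one transform update per frame, rather than the draw calls, overdraw, and memory a genuinely distant, full-scale asset would demand.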