Google Earth Day Calculator

Promotional Web Application to Showcase Chrome OS Flex Launch - 2022
Project Overview
Algorithm, in collaboration with Cogs & Marvel, developed an interactive web experience for Google to promote the launch of Chrome OS Flex and demonstrate its sustainability benefits. The experience launched at an event for Google's business clients.
My Contributions
  • Web Development & Network features
  • Project Timeline and Task Allocation (SCRUM)
  • Animation Framework (PIXIJS and Node.js)
This project was a great opportunity for me to implement a more typical technical project structure and to introduce non-technical team members to Scrum methodology and version control systems.

One interesting facet was that, because we were working for Google, we were given very clear and concise guidelines, not just for the visual deliverables but also for the technology we should be using.

The characters and certain graphics were pre-conceptualized, but we worked closely with Google's team to establish an animation that lived up to both teams' expectations. This was my first time using Google Firebase rather than AWS, and instead of a raw canvas-driven framework we used PixiJS, which was much more performant for animation and allowed loading of vector graphics that could be scaled losslessly across any mobile device.
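The payoff of vector assets is that the scene can be rescaled to any viewport instead of swapping raster sizes. As a rough sketch of that idea (the `fitToViewport` helper and the 1920x1080 design resolution are illustrative, not the project's actual code):

```javascript
// Sketch: compute one uniform scale factor so a scene authored at a fixed
// reference resolution fits any viewport without cropping. With PixiJS the
// result would typically be applied to the stage (e.g. app.stage.scale.set(s)),
// and SVG-sourced textures stay crisp because vectors re-rasterize cleanly
// at the new size rather than stretching pixels.
const DESIGN_WIDTH = 1920;  // illustrative design resolution
const DESIGN_HEIGHT = 1080;

function fitToViewport(viewportWidth, viewportHeight) {
  // "Contain" fit: take the smaller of the two ratios so nothing is cut off.
  return Math.min(viewportWidth / DESIGN_WIDTH, viewportHeight / DESIGN_HEIGHT);
}
```

The same factor would be recomputed on every resize event, which is cheap enough to run continuously on mobile.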

Finally, we transposed and implemented a data-driven calculator, developed by Google's research team, to determine the CO2 emissions Chrome OS Flex could save.
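At its core, a calculator like this reduces to a small pure function over per-device emissions figures. A minimal sketch of the shape of such a function, where every coefficient is a placeholder and not a value from Google's research data:

```javascript
// Sketch of a data-driven CO2-savings estimate: extending a device's life
// with Chrome OS Flex avoids part of the embodied (manufacturing) emissions
// of a replacement machine. Both constants below are PLACEHOLDERS, not the
// figures from Google's research team.
const EMBODIED_KG_CO2_PER_LAPTOP = 250; // placeholder manufacturing footprint
const EXPECTED_LIFE_YEARS = 4;          // placeholder typical refresh cycle

function estimatedSavingsKgCo2(deviceCount, extraYears) {
  // Avoided embodied emissions, prorated by the extra years of use gained
  // relative to a normal replacement cycle.
  const avoidedPerDevice =
    EMBODIED_KG_CO2_PER_LAPTOP * (extraYears / EXPECTED_LIFE_YEARS);
  return deviceCount * avoidedPerDevice;
}
```

Keeping the figures in data rather than code meant the research team's numbers could be updated without touching the front end.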
The goal was to deliver a visual experience that would match the tone, cadence and narrative of a beautiful Irish-language opera. What made this unique was having characters and environments move and change throughout. Working with planes and simple geometry, we were able to use textures, shaders and particle systems to give the impression of depth and space in our environments without taxing the hardware.
This project was intensely experimental. We decided we wanted to push the envelope in terms of what people could expect from this kind of event. We added hand tracking and simple interactions to deepen the sense of immersion, even going as far as to use motion capture data from dancers to drive animations. Seeing people immersed in the experience was especially rewarding.
Project Highlights
When I joined this project with Algorithm Productions it was already in production. That ended up being a valuable asset, as I was able to bring an outside, unbiased perspective to the team. As the development team was quite small, I had the opportunity to help develop a pipeline and structure for some of the trickier components, such as the spatial audio delivery.

One of the biggest technical challenges on this project was performance. I'm certain this is true for anyone developing for VR, but especially when, like us, you are targeting a standalone build for the Meta Quest 2 (then the Oculus Quest 2). Figuring out the best techniques to push the visual spectacle we wanted was a serious challenge, given that the hardware was roughly equivalent in power to a moderately capable mobile phone and we as a studio were used to pushing Unreal Engine to its highest-fidelity results.

While there was plenty of number crunching, compression and load management involved in getting where we wanted, the real beauty was exploring all the ways we could 'smoke and mirrors' a desired effect into existence using weird scale, lighting or particle-system tricks.
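One classic example of such a scale trick is forced perspective: an object pushed far from the camera looks identical on screen to a near one if it is scaled up by the ratio of the distances (similar triangles). A tiny sketch of that relationship, purely illustrative rather than anything from the project's Unreal Engine code:

```javascript
// Forced-perspective "scale trick": apparent size falls off linearly with
// distance from the camera, so to keep the same on-screen size after moving
// an object from nearDistance to farDistance, scale it by the distance ratio.
// Large, cheap geometry placed far away then reads as a vast environment.
function forcedPerspectiveScale(nearDistance, farDistance) {
  return farDistance / nearDistance;
}
```

In practice this is why a skybox-sized plane a hundred meters out can cost no more to render than a prop two meters from the player.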