Our recent project for BBC Earth – ‘Life in VR’ – presented us with an amazing opportunity to realise a natural history story within VR. This article highlights some of the technical challenges we faced and the solutions we arrived at.

An opportunity like this comes with a lot of responsibility. For the experience to be a success it needed to capitalise on the real-time technical affordances of the platform, while still putting great emphasis on ease of use and overall comfort for the audience.

We also felt strongly that the experience must support everything from the oldest Daydream device right up to the latest hardware, including the new 6DOF standalone headsets. Achieving this while finding that balance was exciting, highly iterative, and taught us a great deal. Here’s a handful of those lessons. If you want more detail (or more explanation!), see the full case study here, or get in touch: hi@preloaded.com

REAL-TIME CREATURE BEHAVIOURS

What makes Life in VR different to the documentary format from which it’s born is that users don’t just watch these authentic stories unfold; they experience them in real time – as interactive VR ‘stories’. Our ambition was to create a living, breathing ecosystem where each visit revealed a new visual and interactive experience.

So once each creature had been modelled and rigged, the next step was to bring everything to life. After rolling up our sleeves and making tea, we soon came to the conclusion that hand-animating every possible encounter would have taken vast resources away from different parts of the project. Most importantly, we would have ended up with a very linear, ‘staged’ experience.

Leaning on PRELOADED’s gaming background, an AI system was developed to power the creatures’ swim movements and blend animations. Traits such as curiosity, hunger, nervousness and more could be handled within the creatures’ AI, requesting and blending animations as required. Humboldt squid would dart along their migration paths back to the relative safety of the ocean depths, while sea otters would hunt for urchins on the ocean floor before returning to the surface.

This level of AI behaviour means the player’s presence is recognised – the sea otter looks over and shows its curiosity, for example. By investing time in the development of this system, emergent behaviours were created naturally, helping to emphasise the living, breathing ecosystem our audience were immersed in.
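The production system was built in Unity, but the core idea can be sketched in a few lines: each creature carries a set of trait values, and those traits (plus context, such as whether the player is nearby) are mapped to normalised blend weights for its animation layers. The trait and layer names below are hypothetical, purely for illustration:

```python
# Illustrative sketch of trait-driven animation blending.
# Trait and layer names are hypothetical, not the production API.
from dataclasses import dataclass

@dataclass
class Traits:
    curiosity: float    # 0..1
    hunger: float       # 0..1
    nervousness: float  # 0..1

def animation_weights(traits: Traits, player_nearby: bool) -> dict:
    """Map a creature's traits to blend weights for its animation layers."""
    weights = {
        "idle_swim": 1.0,  # always some base swimming
        "look_at_player": traits.curiosity if player_nearby else 0.0,
        "forage": traits.hunger,
        "flee": traits.nervousness if player_nearby else 0.0,
    }
    # Normalise so the blend weights always sum to 1.
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}
```

Because the weights respond continuously to trait values and player proximity, behaviours emerge from the mix rather than being staged by hand.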


Creature AI behaviours

BUMPER CARS

With the creatures now swimming around and performing their daily routines, the next step was to make sure they moved through and around the environment naturally and didn’t bump into things, as we were unable to rely on standard collision geometry.

After weeks of testing across multiple devices, we reached a solution that voxelised the scenery, storing collision data in arrays. For example, if a voxel cube contained a rock, it would store a true value; if it was empty, false. With this data in place, the engine takes the player’s and creatures’ positions in the environment, looks up the relevant location (x, y, z) within the array and checks whether it is occupied (true or false).
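The lookup described above can be sketched as follows. This is an illustrative Python sketch, not the production Unity code: world positions are quantised to integer voxel coordinates, and a flat boolean array answers the occupancy query. The class and method names are assumptions:

```python
# Sketch of a voxel occupancy grid: scenery is baked into a flat
# boolean array, then queried by world position at runtime.

def voxel_index(x, y, z, size):
    """Flatten integer voxel coordinates into an array index."""
    sx, sy, sz = size
    return (z * sy + y) * sx + x

class VoxelGrid:
    def __init__(self, size, voxel_scale=1.0, origin=(0.0, 0.0, 0.0)):
        self.size = size          # (sx, sy, sz) in voxels
        self.scale = voxel_scale  # world units per voxel
        self.origin = origin      # world position of voxel (0, 0, 0)
        self.occupied = [False] * (size[0] * size[1] * size[2])

    def mark(self, x, y, z):
        """Bake scenery: mark one voxel as containing geometry."""
        self.occupied[voxel_index(x, y, z, self.size)] = True

    def is_occupied(self, world_pos):
        """Is the voxel containing this world position solid?"""
        coords = [int((p - o) / self.scale)
                  for p, o in zip(world_pos, self.origin)]
        if any(c < 0 or c >= s for c, s in zip(coords, self.size)):
            return False  # outside the grid counts as empty
        return self.occupied[voxel_index(*coords, self.size)]
```

The same query serves both purposes mentioned below: steering creatures away from solid voxels, and noticing when a 6DOF player’s head enters one.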

We worked out we could use this data for all kinds of magical things, like detecting if the player had pushed their head through rocks (on 6DOF headsets) so the environment could react accordingly, or if creatures were colliding with rocks and kelp branches.


Voxel debugger view, pretending to be Minecraft

WE NEED TO TALK ABOUT KELP (AND CUSTOM BATCHING)

If I were to say ‘kelp’ in the studio, you would probably see a sea of people pinching the bridges of their noses or reaching for bottles of whiskey. Early in development, we discovered that the quantity of kelp branches needed to make up the rich underwater environment just wasn’t going to be possible, even using billboarding techniques. Some of our first attempts yielded results of 20fps (frames per second). In essence, using transparent textures broke batching, hurting performance. The obvious solution of using 20 branches of kelp, as opposed to the 600+ we ended up using, wasn’t going to cut the mustard!

We opened up PRELOADED’s ‘Visual tricks and optimisation for mobile’ playbook, making sure we had ticked off all the usual suspects: reducing the poly count on kelp models; implementing shaders to simulate sea currents flowing through the branches; GPU instancing. They all helped, but we needed more juice. A few months later, a custom mesh batching system was developed. It could be fed different settings at runtime, allowing us to cater for lower-end Daydream devices.
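The core idea behind a custom batcher is to merge many small meshes that share a material into as few buffers as possible, so hundreds of kelp branches cost a handful of draw calls rather than one each. A minimal sketch of the grouping step, assuming 16-bit index buffers (which cap a batch at 65,535 vertices) and a simple dict-based mesh representation:

```python
# Illustrative sketch of mesh batching: greedily group meshes into
# batches that stay under a vertex budget. The runtime budget is the
# kind of setting that could be tuned down for lower-end devices.

def batch_meshes(meshes, max_vertices=65535):
    """Group meshes (dicts with a 'vertices' list) into batches whose
    combined vertex count fits a 16-bit index buffer."""
    batches, current, count = [], [], 0
    for mesh in meshes:
        n = len(mesh["vertices"])
        if count + n > max_vertices and current:
            batches.append(current)  # close the full batch
            current, count = [], 0
        current.append(mesh)
        count += n
    if current:
        batches.append(current)
    return batches
```

In an engine, each batch would then be welded into one vertex buffer and drawn in a single call; lowering `max_vertices` (or batching fewer branches) trades draw calls against memory and rebuild cost.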

Because of our continued optimisation, we had enough grunt in 6DOF to allow us to swap out static meshes for cloth physics – allowing players to push their head through kelp on 6DOF headsets.

To pull everything together and provide a visual grade, we developed a custom fogging system that allowed you to ramp through colour not only in depth, but also in height. This helped reinforce story beats through the experience as you journeyed deeper into the ocean.
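The two-axis ramp can be sketched as a pair of colour gradients, one indexed by view depth and one by world height, blended together. This is an illustrative sketch only; the production version was a shader, and the gradient keys and the 50/50 blend here are assumptions:

```python
# Sketch of a fog colour ramp driven by both depth and height.
# Gradients and the equal blend weight are illustrative choices.

def lerp(a, b, t):
    """Linearly interpolate between two colour tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def fog_colour(depth, height, depth_ramp, height_ramp,
               max_depth, max_height):
    """depth_ramp / height_ramp: (near_colour, far_colour) pairs."""
    t_d = min(max(depth / max_depth, 0.0), 1.0)    # clamp to 0..1
    t_h = min(max(height / max_height, 0.0), 1.0)
    by_depth = lerp(*depth_ramp, t_d)
    by_height = lerp(*height_ramp, t_h)
    # Average the two ramps; a shader would expose the blend as a knob.
    return tuple((d + h) * 0.5 for d, h in zip(by_depth, by_height))
```

Keying the gradients per chapter is what lets the grade shift as the player descends, reinforcing story beats through colour alone.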

Digging deep into our arsenal of visual tricks, we finally rolled out a custom light baker that allowed us to quickly zone areas of scenery within the Unity editor and export to Maya to allow us to render high-quality bakes.


Slowing down the screen construction per frame

PAINTING WITH MUSIC

Sound effects and music pulled everything together to give it the BBC Natural History stamp of approval. Spatial sound was implemented through Google’s Resonance Audio API and music was commissioned based on a thematic feel for each of the chapters.

Each composition was created in a collection of styles such as Neutral, Happy and Danger. The goal was to create a dynamic music engine that would mix music based on the player’s encounters and position within the scene. Our music engine allowed us to paint areas using each channel of an RGB image, including its alpha channel, on a plan view of the environment. Each colour channel was assigned a different style of music and, based on the player’s location in the scene, the music’s volume was blended between the styles, with gradients between the colours acting as volume mixers.
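The lookup amounts to sampling the painted control image at the player’s position on the plan view and treating each channel as a stem volume. An illustrative Python sketch, with the stem names and flat-list image format as assumptions:

```python
# Sketch of the 'painting with music' lookup: an RGBA control image
# is a plan view of the scene; each channel drives one music stem.

def music_mix(control_image, width, height, player_x, player_z, bounds):
    """control_image: flat row-major list of (r, g, b, a) tuples, 0..255.
    bounds: (min_x, min_z, max_x, max_z) of the playable area."""
    min_x, min_z, max_x, max_z = bounds
    # Map world position to normalised UV on the plan view.
    u = (player_x - min_x) / (max_x - min_x)
    v = (player_z - min_z) / (max_z - min_z)
    px = min(int(u * width), width - 1)
    pz = min(int(v * height), height - 1)
    r, g, b, a = control_image[pz * width + px]
    # Channel value -> stem volume, normalised to 0..1.
    return {"neutral": r / 255, "happy": g / 255,
            "danger": b / 255, "ambient": a / 255}
```

Because painted gradients produce intermediate channel values, walking between two regions naturally crossfades the stems rather than hard-cutting between them.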


Painting with numbers

AND NOT TO MENTION...

Some other final technical highlights include the entire experience having no loading between chapters, creating a seamless journey. GPU instancing also allowed us to push some 16,000+ krill(!) around in a bait ball, and the Unium asset, created by an old Preloader, let us turn geometry on and off in real time while running on target devices from a desktop.

GPU instancing on 16,000 krill

In summary, developing Life in VR levelled up our understanding of VR and of just how far mobile devices can be pushed. At every challenge, thinking outside the box and strong collaboration were critical. An iterative process between art, design and tech to come up with creative solutions was one of the key factors in delivering such an exciting and successful project.

Jon Caplin

Jon is Associate Creative Director at PRELOADED. He's a fine purveyor of pixel art, a spare-time indie game developer and a milk-before-water kinda guy.