
The gaming and movie industries are inspiring each other

IN 2012, Guy Ritchie directed a trailer for Call Of Duty: Black Ops 2. The 60-second TV spot took 12 weeks to shoot and involved a crack team of 30-plus computer graphics (CG) artists to render a decimated downtown LA of 2025, complete with razed buildings, zombies and Robert Downey Jr… in a jet.

Innovative: Tallulah Haddon stars as Leila in new thriller Kiss Me First

As a piece of sheer visual entertainment it was more meh than masterpiece. However, it marked an early example of the cross-pollination of visual effects between Hollywood and the video-game industry — and in its wake the crossover has gone from strength to strength. Though Steven Spielberg’s nostalgia-infused Ready Player One and Channel 4’s virtual-reality drama Kiss Me First are examples of the way mainstream cinema and television have been tapping into video-game culture in 2018, it’s not just the stories on screen that are learning from the realm of joysticks and pixels.

Press Play

Just like the movies: Destiny 2

Axis Studios has worked on video games including Destiny 2, Halo 5 and League Of Legends. It also provided animation for new thriller Kiss Me First. Richard Scott, Axis CEO, says the worlds of animation, video games and television have long been following similar paths. Each area has been influenced by — and has borrowed from — the others across technology and techniques, as well as creative execution.

‘Blockbuster games have looked at movies and television to guide them on what is considered high production values, from casting Hollywood talent as lead characters [Liam Neeson, Charles Dance and Kiefer Sutherland have all been in video games], to creating massive action-heavy set pieces that rival any movie made by directors like James Cameron or Michael Bay,’ says Scott. ‘The influences have always been there and probably always will.’

Role play: Hollywood actor Liam Neeson starred in the game Fallout 3

The difference today, he adds, is that this generation of games technology and consoles can produce visuals and sound that feel more like these movies — and, as gamers demand higher-end visuals and cinema-standard effects, the technology crosses between the two worlds. The most obvious area is the software used to create VFX for film and games, such as Autodesk’s tools, which the company brands as being for ‘the entertainment industry’ for exactly that reason, says Scott.

Another technology crossover is the use of motion capture, where an actor’s movements are tracked by hundreds of cameras in a 3D space. These movements can then be applied to CG characters and creatures, and can be further augmented by animators to create even more exaggerated movement. For Kiss Me First the actors’ performances had to cross seamlessly from the live-action world to the virtual and back. Performance capture was the perfect solution for this.

‘We used full performance capture in Kiss Me First,’ says Scott. ‘This means we captured the actor’s body and facial performance together.’

These are the same techniques used to create the apes in the Planet Of The Apes films and the Hulk in The Avengers, as well as being utilised in just about every blockbuster video game from Call Of Duty to Assassin’s Creed, he says.
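For readers curious about what ‘applying’ captured movement to a CG character can look like, here is a minimal, illustrative sketch of that retargeting step in Python. The data structure, joint names and exaggeration factor are hypothetical examples, not taken from any studio’s actual pipeline.

```python
# Illustrative sketch only: one frame of captured joint rotations is copied
# onto a CG character's rig, then scaled by an animator-chosen factor to
# exaggerate the performance. Joint names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class JointRotation:
    joint: str            # e.g. "left_elbow"
    angle_degrees: float  # rotation captured from the actor for this frame

def retarget_frame(captured: list[JointRotation],
                   exaggeration: float = 1.0) -> dict[str, float]:
    """Map one frame of captured rotations onto the rig's joints,
    optionally exaggerating the motion for a stylised character."""
    rig_pose = {}
    for sample in captured:
        rig_pose[sample.joint] = sample.angle_degrees * exaggeration
    return rig_pose

# One hypothetical frame of capture data, pushed 20% further by an animator.
frame = [JointRotation("left_elbow", 42.0), JointRotation("jaw_open", 8.5)]
print(retarget_frame(frame, exaggeration=1.2))
```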

Build a jungle

Virtual set: The Jungle Book (2016)

At the National Film and Television School, which recently won an Outstanding British Contribution to Cinema Bafta, games students collaborate with film students — and head of digital effects John Rowe agrees with Scott that the crossover starts with the software. However, he also believes there’s an artistic trade-off between the disciplines.

‘Film students constantly think about ways to make things look more real,’ he says. ‘Games people constantly think about ways to make things go faster.’

The Jungle Book (2016) is the perfect example. Rowe explains how it was made with a technique known as virtual cinematography, where a team of CG artists built a jungle in 3D, took it to the studio and filmed the actor on a green screen as they rendered the jungle behind him on a computer.

‘Essentially, it’s a virtual set,’ Rowe says. ‘If you look on screen, Mowgli appears to be walking through a jungle and the only way you do that is to use a gaming engine to play back the jungle on the set and film the actor in real time.’

Rowe says that a computer game wouldn’t be able to make as good-looking a jungle because there’s too much information to render all at once. Games systems just aren’t that powerful — yet.

‘Computers and the engines driving the games are becoming more sophisticated [the Xbox One X, for example, runs 4K at 60 frames per second],’ he says. ‘They’re getting up to a level where they can create similar work. The two worlds are getting closer and closer together.’
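To put that real-time constraint into perspective, here is a back-of-the-envelope sketch assuming the 4K, 60 frames-per-second target quoted above, plus a purely hypothetical two hours per frame for an offline film render.

```python
# Rough arithmetic for the real-time constraint: a game engine must finish
# every frame within a fixed time budget, while an offline film renderer can
# take hours over a single frame. The offline figure is an assumption.

FRAME_RATE = 60             # frames per second
WIDTH, HEIGHT = 3840, 2160  # 4K resolution

frame_budget_ms = 1000 / FRAME_RATE   # time available to draw one frame
pixels_per_frame = WIDTH * HEIGHT
pixels_per_second = pixels_per_frame * FRAME_RATE

print(f"Per-frame budget: {frame_budget_ms:.1f} ms")
print(f"Pixels per frame: {pixels_per_frame:,}")
print(f"Pixels per second: {pixels_per_second:,}")

# An offline renderer spending two hours on one frame has roughly
# 430,000 times longer to work with.
offline_seconds_per_frame = 2 * 60 * 60
ratio = offline_seconds_per_frame / (frame_budget_ms / 1000)
print(f"Offline vs real-time ratio: {ratio:,.0f}x")
```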

New Skins

Full performance capture: The technique was used for the Hulk in The Avengers

The gaming VFX invasion isn’t just about looks either. Bryan Elsley, writer and executive producer of Kiss Me First, explains that the crossover of industries has enabled creative freedom too.

‘The coming together of the worlds of drama, gaming and VFX means there are new perspectives and fresh ideas available on creativity because of the experience of artists and conceptualizers meeting, sometimes for the first time, across forms,’ says the writer of seminal show Skins.

An example of this in Kiss Me First is a fantasy ice-skating sequence Elsley says would never work in a live-action setting. The sequence was keyframed (animated frame by frame, without a motion-capture process) with the backgrounds changing colour constantly — and Elsley says the game-based animators were relaxed about the idea of mood being expressed through colour and a constantly shifting backdrop.
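As a rough illustration of what keyframing means in practice, here is a short sketch in which a hypothetical background colour is set at a handful of hand-placed keyframes and blended linearly in between; the frame numbers and colours are invented for the example.

```python
# Illustrative keyframing sketch: rather than capturing motion, an artist sets
# values at chosen frames and the software fills in the frames between them.
# Here the animated value is a background colour (RGB); all data is invented.

def lerp(a, b, t):
    """Linear interpolation between two values for 0 <= t <= 1."""
    return a + (b - a) * t

def colour_at(frame, keyframes):
    """Return the interpolated RGB colour for a frame number.
    `keyframes` is a sorted list of (frame, (r, g, b)) pairs."""
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    for (f0, c0), (f1, c1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return tuple(round(lerp(a, b, t)) for a, b in zip(c0, c1))
    return keyframes[-1][1]

# Hypothetical keyframes: an icy blue backdrop shifting to violet, then amber.
keys = [(0, (40, 80, 200)), (48, (120, 40, 180)), (96, (230, 150, 40))]
for frame in (0, 24, 48, 72, 96):
    print(frame, colour_at(frame, keys))
```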

Battle Of Shadow And Light: Axis Studios worked on video games such as Halo 5

‘The making of a parallel world so different from the real one was the tangible add-on to live drama,’ he says. ‘Gamers have an ease with the creation of alternative environments, which is so much more open-minded than our sometimes earthbound views in the drama field.’

So will we see more convergence between these two disciplines? Elsley believes so. ‘It’s certainly an exciting time ahead with a world of new young contributors with whom to work.’

Show me the light

On reflection: The new tech gets an airing in San Francisco

LAST month at the annual Game Developers Conference in San Francisco, video game and software development company Epic Games showcased the kind of cinematic effects that could be achieved thanks to improvements to its graphics engine and ray tracing, a rendering technique designed to produce realistic lighting effects that is typically associated with hours of PC processing time.
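The core idea behind ray tracing is easier to see in miniature. The sketch below is a deliberately stripped-down illustration rather than anything resembling Epic’s engine: it fires one ray per pixel at a single sphere and shades each hit by a single light. Production ray tracing follows many more rays per pixel for shadows and reflections, which is where the hours of processing time have traditionally gone.

```python
# Minimal ray tracing sketch: for each pixel, fire a ray from the camera into
# the scene, find what it hits and shade the hit point by the light direction.
# The scene (one sphere, one directional light) is purely illustrative.

import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along the ray to the sphere, or None on a miss."""
    oc = [origin[i] - centre[i] for i in range(3)]
    b = 2 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # the quadratic's 'a' term is 1 for a unit direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width=40, height=20):
    centre, radius = (0.0, 0.0, 3.0), 1.0
    light_dir = (-0.577, 0.577, -0.577)  # unit vector pointing toward the light
    shades = ".:-=+*#%@"                 # darkest to brightest
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel to a ray direction through a simple pinhole camera.
            dx = (x / width) * 2 - 1
            dy = 1 - (y / height) * 2
            length = math.sqrt(dx * dx + dy * dy + 1)
            d = (dx / length, dy / length, 1 / length)
            t = ray_sphere_hit((0.0, 0.0, 0.0), d, centre, radius)
            if t is None:
                row += " "
                continue
            # Lambertian shading: brightness from the angle between the
            # surface normal and the light direction.
            hit = [t * d[i] for i in range(3)]
            normal = [(hit[i] - centre[i]) / radius for i in range(3)]
            brightness = max(0.0, sum(normal[i] * light_dir[i] for i in range(3)))
            row += shades[int(brightness * (len(shades) - 1))]
        print(row)

render()
```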

Powered by Nvidia’s next-gen graphics tech, and with help from a bunch of digitally animated Star Wars Stormtroopers, the company demonstrated real-time, photorealistic reflections, shadows and lighting rendered in Unreal Engine 4, which powers many of the highest-quality video games.

What this ultimately means is that the technology and techniques driving gaming graphics forward could one day see video game engines such as Unreal used to create impressive visuals for film, a huge step forward in the convergence of film and games.