The Evolution of Motion Capture at Deep Silver Dambuster Studios
23 Jul 2020

Deep Silver Dambuster Studios has an in-house Animation Department that was responsible for creating the in-game and cinematic animations required in Homefront: The Revolution and is currently working on the next instalment in the Dead Island franchise.

An integral part of this team is Performance Capture Lead, Rich Holleworth. He ensures that our Animators are constantly receiving the high-quality Motion Capture data they need to bring the Player Character, NPCs, and of course hordes of zombified enemies to life (or back to life in the latter’s case).

An Introduction to Performance Capture Lead, Rich Holleworth

Rich first became involved with Motion Capture after completing a Degree in Electronic Imaging and Media at the University of Bradford. It was whilst commencing a Postgraduate Degree in Interaction Design in 2004 that he did his first professional Motion Capture work.

‘I was doing Video Editing and CG work, and sort of fell into helping out at the MoCap facility which the University had put together; they had bought a fairly advanced system for the time.

Traditionally, Motion Capture would be part of a University’s research portfolio for Sports Science or Biomechanics and Gait Analysis, but this new system was instead based in the Media department. I think it was the first time a MoCap setup had been bought by a UK University specifically for animation purposes.

The University was using the equipment for both teaching purposes as well as commercial jobs and had just picked up its first major contract for a game – which turned out to be the fourth game in the Driver series, Driver: Parallel Lines.

The researcher who was running the system asked me if I wanted to be a part of the project – since I had a lot of stage experience and knew 3D Graphics well – and so I ended up helping him run the shoot and process the data on the back end. I’ve been involved in MoCap ever since.’

– Rich Holleworth, Performance Capture Lead at Deep Silver Dambuster Studios

Working as a roadie for the Students’ Union gave Rich a unique advantage when it came to Performance Capture, as it taught him stage awareness and how to quickly assemble complex technical setups under pressure.

Rich has now been working in motion capture for over 15 years and has held positions at several companies including being the Co-Founder and CTO of Andy Serkis’ Imaginarium Studios. The Imaginarium pioneered many advanced Performance Capture processes in the UK, and Rich is bringing these techniques to Dambuster’s own in-house Studio.

During this time, he has also been a Course Leader and Visiting Lecturer at the University of West London, and developed their Games, Design, and Animation Undergraduate Degree.

[Rich Holleworth, Performance Capture Lead at Deep Silver Dambuster Studios]

Joining Deep Silver Dambuster Studios

Rich also worked on Homefront: The Revolution whilst at Imaginarium Studios before officially joining Deep Silver Dambuster Studios in 2018.

Since then, he has worked side by side with our Animation Department, from capturing performance data through to its processing and final implementation in the game engine.

Here’s what Rich had to say about working with the Animation Department at Deep Silver Dambuster Studios:

‘It’s unusual for an Animation team to be so involved in Motion Capture, but as they are all highly capable, the quality of data we are getting is so high, and our pipeline is so robust, they can pretty much serve themselves without the need for a large tracking team.

I have developed a lot of Tools and aids in the MoCap processing software, Vicon Shōgun, and similarly, I have built an entire Pipeline inside MotionBuilder that simplifies MoCap specialist skills into single mouse clicks. This way, the Animators can make the creative choice as to whether to edit within Shōgun, MotionBuilder, or Maya before bringing the animation into the Engine.’
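
As a rough illustration of what one of those single-click steps might look like, the sketch below uses MotionBuilder’s Python scripting layer (pyfbsdk) to plot a solved take onto the character’s skeleton and export an FBX for Maya or the engine importer. This is a hypothetical example, not Dambuster’s actual tooling: the character selection, plot settings, and export path are all assumptions made for illustration.

    # Hypothetical "one-click" step for MotionBuilder's Python layer (pyfbsdk).
    # Plots the solved MoCap take onto the character's skeleton, then exports
    # an FBX for Maya or the engine importer. Paths and settings are
    # illustrative assumptions, not the studio's actual pipeline.
    from pyfbsdk import (
        FBApplication,
        FBSystem,
        FBPlotOptions,
        FBTime,
        FBCharacterPlotWhere,
    )

    def plot_and_export(export_path):
        app = FBApplication()
        scene = FBSystem().Scene

        if not scene.Characters:
            raise RuntimeError("No character in the scene to plot.")
        character = scene.Characters[0]

        # Bake (plot) the animation onto the skeleton, one key per frame,
        # with no key reduction, so the Animators receive the full solve.
        options = FBPlotOptions()
        options.PlotAllTakes = False
        options.PlotOnFrame = True
        options.PlotPeriod = FBTime(0, 0, 0, 1)
        options.UseConstantKeyReducer = False
        character.PlotAnimation(
            FBCharacterPlotWhere.kFBCharacterPlotOnSkeleton, options
        )

        # Export the whole scene as FBX, ready for Maya or the engine.
        app.FileExport(export_path)

    plot_and_export(r"D:/MoCap/Exports/take_001.fbx")

Wrapping a batch like this behind a single menu entry or toolbar button is, in spirit, how a specialist MoCap step becomes something an Animator can trigger with one click.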

[Rich worked on Homefront: The Revolution at the Imaginarium Studios before joining Deep Silver Dambuster Studios]

Motion Capture Practices in the Past

Prior to 2020, Motion Capture requirements at Dambuster were handled in a few different ways. The studio owned a basic MoCap setup of 16 two-megapixel OptiTrack cameras, which gave the team a few metres of capture space – ideal for movement sets for the Player, NPCs, and enemies.

This tends to be the approach many studios have adopted, as having a small in-house system lets Animators record for reference or blocking, as well as capture moves to be used in the game.

If a cutscene involved a large number of actors and needed significant space, however, it was typically shot externally, either at the Imaginarium or at the old Carlton TV studios in Nottingham, which are now owned by the University of Nottingham.

The University has kept two of the Carlton studios functional, and we have used Studio Seven – which is a full-scale television set – frequently on Homefront: The Revolution and our current projects. A problem with using this studio, however, was that it had a weekly rolling booking at the weekends, so the team was always operating to a strict timescale.

This meant that they would need to be on location on Monday to set up and ready to shoot from Tuesday through to Thursday, with everything packed up by Friday for the booking to come in at the weekend.

Planning in advance was critical to executing these shoots and ensuring that we got everything required within the allotted time. Being off-site required us to be highly self-sufficient and to bring everything with us, from the cameras to the coffee machine.

Motion Capture at Deep Silver Dambuster Studios Today

In February this year, we opened our very own space in Nottingham, which is dedicated to Performance Capture.

Since mid-2018, we have operated a Vicon Vantage system that consists of 46 cameras with a mix of resolutions between five and eight megapixels. We also have three Vicon Cara HMC systems and Standard Deviation head rigs, to which the cameras are attached along with professional-grade wireless microphones.

The video-based facial capture system gives performers the freedom of a 12x12 metre capture volume, plus high-resolution images of the actor’s face for Animation reference.

‘If you look at the price of buying a system versus using a service provider for tens of days of capture on a large project, it makes logistical and financial sense to bring that in house.

When you have the facilities available, you also start to use them more and more, because it becomes less of a logistical task. You don’t have to co-ordinate actors, location and systems just to try something out; you can simply say, “next week someone from the team put the suit on and let’s try something” – it gives you a lot of power.’

– Rich Holleworth, Performance Capture Lead at Deep Silver Dambuster Studios

[In February, Deep Silver Dambuster Studios opened its very own space dedicated to performance capture]

How Has COVID-19 and WFH Impacted Motion Capture?

Our studio is in Nottingham and most of the actors we employ are based in London, which has been an issue due to previous travel restrictions.

Since the partial lifting of lockdown, and when the studio has been confident that people can travel safely, we’ve conducted about one MoCap session a week.

Compared to the film and television industries where you need around five or six people just to operate a camera – let alone dozens of Riggers, Grips, and Lighting Technicians – we are very lucky that the entire MoCap system can be run by just one skilled operator.

If we didn’t have our own system and studio available, we would have another logistical hurdle in needing to send our team to an external studio to work with the actors.

Having a dedicated space has also allowed us to carry out extensive pre-shoot preparation to ensure things run as smoothly as possible and the correct safety measures are in place.

Whilst recording in the studio, only Rich (operating the MoCap and Video systems), the actor, and a MoCap Director are present.

Other stakeholders can safely be off-site – monitoring proceedings through video conferencing technologies – and can talk directly to the actor through a PA channel without ever needing to be on stage. We have markings and other safety procedures in place to ensure everyone stays safe on stage. We are constantly monitoring industry best-practice guides to ensure we operate safely and can restart Performance Capture with multiple actors.

[Performance Capture during COVID-19 has required new safety measures]

Extra Information: What’s the difference between MoCap and pCap?

The term Performance Capture was widely popularized by Andy Serkis following the release of King Kong (2005), and grew in public awareness after the 2009 release of James Cameron’s Avatar. Performance Capture is specifically the technique of recording an actor’s voice, body motions, and facial performance at once and in perfect synchronization. Previously, Motion Capture of body-only performances would have facial animation applied later – sometimes provided by an entirely different performer – or be for Animation sets with no facial performance at all.

What differentiates a Performance Capture shoot from Motion Capture is the integration of all these different recordings, and the more Film-like procedures on set. Once only available at studios such as Giant or WETA, these technologies and practices are gradually being adopted by high-end games studios.