Skydance Interactive shows how it injected VR physicality in “The Walking Dead: Saints & Sinners”


Disclaimer: The following tech blog displays graphic imagery of zombie dismemberment. 

Unreal Engine –


Hi, my name is Peter T. Akemann and I’m the co-founder and chief technologist at Skydance Interactive. In this tech blog, I’ll be talking about our recently released VR game, The Walking Dead: Saints & Sinners.

Physicality has long been seen as the holy grail of VR games, where VR could give gamers a next-level experience over conventional gaming platforms. But in practice, VR physics has been a mixed bag: great moments combined with a lot of jank and unpredictable results. That trade-off works when the novelty of the physics sandbox is the essence of the game, but it has so far limited VR physics gaming to a narrower audience.

The Walking Dead: Saints & Sinners is part of the vanguard of physics-based VR games, which have honed the “physics experience” into a set of intuitive and reliable control mechanics that let the player focus on the larger game. We believe we have begun to deliver on the promise of next-level VR gaming, especially for immersive melee combat.

This blog post will give an overview of the player-versus-Walker combat system, with highlights of some of the tricks, techniques, and key insights that made our system come together.


“Visual haptics” – seeing is believing

Motion controllers offer a high-dimensional and intuitive input apparatus compared to standard game controllers, but they are still an abstraction far removed from the full physical “reality” of your game. Player hand positions, taken literally, would pass right through physical surfaces, feel no weight or inertia from held objects, and generally give no sense of true interaction with the virtual environment.

What is missing is something called haptics. The dictionary defines haptics as technology to “reproduce in remote operation or computer simulation the sensations that would be felt by a user interacting directly with physical objects.” Of course, such technology remains well out of reach of current commercial reality.

Fortunately, our minds are very good at making sense of what we see, even in defiance of our other senses. We can happily reconcile deviation of our virtual hand from its real-world position if we see it encountering obstacles, reacting to weight or impact from virtual objects, and those objects responding to the hand in kind. When the natural feedback loops you expect are convincingly represented, you can “feel” them and intuitively accept them.

Thus, we have adopted the term “visual haptics” to describe the collection of physics and animation techniques that make the VR player’s hand motion predictable, believable, satisfying, and effective.
Fig. 1: Objects of varying mass

Physical hands

To get started, we separate the game hand from the player hand. The game hand is visually a skeletal mesh, but physically, it’s a single PhysX rigid body. Hand pose logic keeps the fingers in a cosmetically valid position when in contact with surfaces. This allows us to respect collision and keep the game hand in the world. 

The player hand is like a marionette control for the game hand. It determines the set points for two PID controllers—a linear controller and a quaternion controller—which apply restitutive forces to the game hand. Because the target position and orientation are constantly in motion, our integral components are zero, leaving proportional and derivative components – effectively a critically damped oscillator.
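As a rough illustration of that critically damped behavior, here is a minimal one-dimensional sketch. The gains, mass, and timestep are ours, purely for illustration; the game’s actual system runs a linear and a quaternion controller against PhysX rigid bodies.

```python
import math

def pd_force(target_pos, pos, vel, kp, mass):
    """Proportional-derivative controller with the derivative gain chosen
    for critical damping: fastest settling with no overshoot."""
    kd = 2.0 * math.sqrt(kp * mass)   # critical damping condition
    return kp * (target_pos - pos) - kd * vel

# Integrate a 1 kg "hand" chasing a target at x = 1.0 (semi-implicit Euler).
pos, vel, mass, dt = 0.0, 0.0, 1.0, 0.001
for _ in range(10000):
    accel = pd_force(1.0, pos, vel, kp=100.0, mass=mass) / mass
    vel += accel * dt
    pos += vel * dt
print(round(pos, 3))  # hand settles on the target without oscillating
```

With the integral term at zero, raising or lowering `kp` directly models a stronger or weaker (or more tired) player hand.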

PID gain coefficients are scaled to model the player’s in-game strength and energy level. This provides both realism and useful in-game stats to affect gameplay.

Grip is modeled as a physical constraint between the physics hand and physical objects in the world. The visual player hand is posed to show a firm grasp on the held object.

The result is a realistic sensation of weight, linear and angular momentum, and collision of held objects.

This physicality leads players to adapt their real motions to whatever they are holding, from rapid movements with small objects to larger, more deliberate swings with heavy ones, enhancing the physical experience.

Where we had to cheat 

We found no universal set of PID gains that made the hands feel responsive across the full range of potential held objects. We needed more strength for longer objects – such as axes and swords – without over-dialing the simulation when empty-handed. Thus, we scale our PID controller strength via a curve in proportion to the moment of inertia of a held object.
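A sketch of how such scaling might look, using a hypothetical piecewise-linear curve. The breakpoints and multipliers below are illustrative assumptions, not Skydance’s shipped data.

```python
import bisect

# (moment of inertia in kg*m^2, gain multiplier) -- illustrative values only.
CURVE = [(0.0, 1.0), (0.05, 1.5), (0.2, 2.5), (1.0, 4.0)]

def gain_multiplier(inertia):
    """Piecewise-linear curve lookup, clamped at the endpoints."""
    xs = [x for x, _ in CURVE]
    if inertia <= xs[0]:
        return CURVE[0][1]
    if inertia >= xs[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(xs, inertia)
    (x0, y0), (x1, y1) = CURVE[i - 1], CURVE[i]
    t = (inertia - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

print(gain_multiplier(0.0))   # empty hand: baseline PID strength
print(gain_multiplier(0.5))   # long axe: substantially boosted strength
```

The resulting multiplier would be applied to the PD gains before computing the restitutive force each frame.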

We would also cheat mass on certain objects, such as guns, where players are accustomed to a high (and unrealistic) degree of control. The takeaway: Player experiences always trump “realism.”
Fig. 2: Two-handed weapon grips

More on grips and object manipulation

To enjoy the physicality of objects, how you hold them is everything. We needed to make it easy for players to enjoy the full range of one- and two-handed grips without getting into awkward positions.

So, we instrument each object with multiple grip points and grip splines, which also specify the degree of movement or angular displacement allowed at each grip point.
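Hypothetically, the authored grip data might look something like this. The field names and values are ours, purely for illustration of the idea of per-grip freedom limits.

```python
from dataclasses import dataclass

@dataclass
class GripPoint:
    """Illustrative authoring data for one grip location on an object."""
    position: tuple       # local-space attach point on the object
    slide_range: float    # allowed travel along the grip spline, in meters
    max_twist_deg: float  # allowed angular displacement at this grip

# A two-handed axe: a fixed grip near the head plus a sliding grip
# along the haft (made-up dimensions).
axe_grips = [
    GripPoint((0.0, 0.0, 0.70), slide_range=0.0, max_twist_deg=5.0),
    GripPoint((0.0, 0.0, 0.10), slide_range=0.45, max_twist_deg=15.0),
]
print(len(axe_grips))
```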

Objects are easily passed from one hand to the other or transitioned from one-handed to two-handed grips. We’ve added some special actions for flipping between orientations at the same grip point (see the knife in Fig. 1 above).

Two-handed interactions produce a surprising range of natural outcomes that feel intuitive and physical. Long objects are easier to handle with grip points far apart. Varying the distance between your hands lets you trade heft against control, all from the underlying physics. Different grips make different actions natural to execute.

This gives the player freedom to engage in flashy weapon play, making for great theater and the thrill of being a weapons master.

Melee combat

Any blunt object, held or thrown, can cause melee damage or knockbacks: the system interprets physical impacts as potential damage events.

Melee weapons are equipped with special collider geometry that specifies which parts of a weapon do damage, and from what angles, giving bladed and spiked weapons their distinct damage properties.
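One way to gate damage by angle is a dot-product test between the impact direction and the edge’s damage direction. The 30-degree cone below is an assumption for illustration, and both vectors are assumed to be unit length.

```python
import math

def edge_damage(impact_dir, edge_dir, cone_deg=30.0):
    """A bladed collider deals cutting damage only when the impact arrives
    roughly along the edge's damage direction (hypothetical 30-degree cone).
    Both arguments are unit vectors."""
    dot = sum(a * b for a, b in zip(impact_dir, edge_dir))
    return dot >= math.cos(math.radians(cone_deg))

# Swinging straight into the edge cuts; hitting with the flat does not.
print(edge_damage((0.0, -1.0, 0.0), (0.0, -1.0, 0.0)))
print(edge_damage((1.0, 0.0, 0.0), (0.0, -1.0, 0.0)))
```

Impacts that fail the test could still fall through to the blunt-damage path described above.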

Good hit reactions and a frighteningly realistic wound shader (combining a meat layer reveal, decal fringe splashes, and vertex deformation to express the depth and angle of wounds) make for satisfying and varied damage from weapons of all shapes and sizes.

It’s all pretty standard stuff, but the real gems of the system are the stabbing and grappling systems.
Fig. 3: The stab

Melee attacks and embedded weapons: The stabbing system

The stabbing system is a signature feature of The Walking Dead: Saints & Sinners’ combat: weapons can become embedded in an enemy and must then be yanked out with enough force to dislodge them.

A successful stab consists of the following phases:

1. Initial detection
We use a greatly enlarged collision volume around the stabbing weapon’s tip to detect an incoming stabbing action. If the stab volume comes into contact with a stabbable surface—and has sufficient velocity normal to that surface—a stab is initiated.
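The velocity test might be sketched like this. The speed threshold is a made-up value, and the surface normal is assumed to be a unit vector.

```python
def stab_detected(tip_velocity, surface_normal, min_speed=2.0):
    """A stab begins when the enlarged tip volume overlaps a stabbable
    surface AND the velocity component into the surface exceeds a threshold.
    min_speed (m/s) is illustrative, not the shipped value."""
    # Velocity component along the inward surface direction (against the
    # outward normal).
    into_surface = -sum(v * n for v, n in zip(tip_velocity, surface_normal))
    return into_surface >= min_speed

# Fast thrust into a surface whose outward normal faces +X: stab begins.
print(stab_detected((-3.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
# Glancing sideways motion: no stab, fall back to normal collision.
print(stab_detected((0.0, 3.0, 0.0), (1.0, 0.0, 0.0)))
```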

2. Stab initiation
If needed, the player character is given a small burst of forward motion to close range so the player arm can reach the stab point. It feels great, and players generally don’t even notice they’ve moved.

In parallel, the held weapon and player hand are snapped into the nearest allowable position and orientation that represent a valid stab into the stabbable collision surface.  The blade’s tip or edge is placed just inside the stabbed surface at an appropriate orthogonal orientation. A constraint is created at the impact point, allowing only motion along the axis of the weapon. A small damage event is delivered from the initial impact.

3. Stab push/pull
Player hand displacement along the stab direction is interpreted as force, which drives the blade deeper into the stabbed surface. The blade will penetrate up to its maximum depth. It will not deliver its full damage payload until it has been driven in all the way. A different curve is used in reverse for drawing the blade out again.

These curves give the sensation of subsurface resistance to the blade, which players feel palpably. The displacement model also makes the required physical actions easy to master.
Fig. 4: Stab penetration and withdrawal curves
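As a rough sketch of how such asymmetric curves might work: hand displacement along the stab axis acts as a force run through a depth-dependent resistance curve on the way in, and a gentler, near-constant curve on the way out. The curve shapes, gains, and depths below are illustrative assumptions, not Skydance’s shipped data.

```python
def advance_blade(depth, hand_push, max_depth=0.12):
    """Return the new blade depth (meters) after one frame of hand motion
    along the stab axis. Positive hand_push drives the blade in."""
    if hand_push >= 0.0:
        gain = 0.5 / (1.0 + 8.0 * depth)  # pushing: resistance grows with depth
    else:
        gain = 0.8                        # pulling: easier, near-constant
    return min(max(depth + gain * hand_push, 0.0), max_depth)

depth = 0.0
for _ in range(5):                  # repeated small pushes sink the blade
    depth = advance_blade(depth, 0.03)
print(round(depth, 4))              # partial penetration, still short of max
depth = advance_blade(depth, -0.2)  # one strong pull frees the blade
print(round(depth, 4))
```

Full damage would be delivered only once `depth` reaches `max_depth`, matching the rule described above.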

The stabee also plays a critical role by entering a stunned state and/or complying with grapples to provide enough resistance so the stab can be cleanly completed. More on this below.

NPC melee behavior

At their core, our Walkers and NPCs are conventionally animated with a blend of mocap and hand-authored animation data controlled by Animation Blueprints, including IK for head look, aiming, and foot placement; partial- and whole-body hit reactions in response to attacks; and a ragdoll state for death.

However, the physicality of Saints & Sinners required several additional systems to take the physical experience over the top.
Fig. 5: Active ragdoll with partial stab

Active ragdoll

We implemented an active ragdoll system on our NPCs and Walkers using a UPhysicalAnimationComponent, which runs on their upper body during unconstrained movement (for instance, when there is no grapple or stab connection to the player). This allows characters to react to small stimuli, such as player hands, thrown objects, or player-held objects not being wielded with enough force to elicit a hit-reaction animation.
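Under the hood, physical animation amounts to per-bone motors driving the simulated bodies back toward their animated pose. A toy one-bone sketch of that idea (the gains are illustrative, not actual UPhysicalAnimationComponent profile values):

```python
def physical_animation_torque(anim_angle, sim_angle, sim_ang_vel,
                              stiffness=50.0, damping=8.0):
    """Spring-damper motor torque pulling a simulated bone back toward its
    animated target angle. Gains are made-up illustration values."""
    return stiffness * (anim_angle - sim_angle) - damping * sim_ang_vel

# A shove knocks a bone 0.4 rad off its animated pose; the motor recovers it.
angle, vel, inertia, dt = 0.4, 0.0, 1.0, 0.005
for _ in range(2000):
    vel += physical_animation_torque(0.0, angle, vel) / inertia * dt
    angle += vel * dt
print(round(angle, 3))  # bone has returned to the animated pose
```

Small stimuli perturb `angle` briefly and visibly, while the animation target remains authoritative, which is exactly the “reacts but doesn’t collapse” feel described above.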

This system provides natural and varied reactions to a wide range of stimuli that fall short of a full hit reaction or knockdown event. In a VR environment, the player has so much freedom to interfere with NPC movement in arbitrary ways that this was a necessity. Without it, the characters felt unrealistic, like animatronic statues.

Active ragdoll also greatly improves the sense of contact during a stab operation. This allows the stabbed character to react physically to the presence of the partially embedded blade, greatly increasing both the realism and the challenge of pulling off a successful stab.
Fig. 6: “The Grapple”


Grappling

Saints & Sinners supports grapples in both directions between the player and the Walker.
Players can grapple a Walker by grabbing its head (see examples in the GIF montage above). Here, we need to synchronize the animation of the Walker and the player, so we switch to an animation model: physics hand motion is disabled, and the hand is locked by IK to the head of the Walker. The Walker is then animated via a three-dimensional blendspace of offsets based on player hand motion relative to the grab points. This retains the sensation of physical interaction, but with controlled outcomes. When the player releases the grappled Walker, it is thrown into a stagger animation matching the direction and speed of motion before release.
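The mapping from hand motion to blendspace input might be sketched as a normalized, clamped offset per axis. The axis scaling and clamp range below are assumptions for illustration.

```python
def blendspace_params(hand_pos, grab_pos, max_offset=0.3):
    """Map the hand's offset from the grab point into normalized blendspace
    coordinates in [-1, 1] per axis (max_offset in meters is illustrative)."""
    return tuple(
        max(-1.0, min(1.0, (h - g) / max_offset))
        for h, g in zip(hand_pos, grab_pos)
    )

# Hand pushed 15 cm right and well past the clamp downward:
print(blendspace_params((0.15, -0.45, 0.0), (0.0, 0.0, 0.0)))
```

The blendspace then selects among authored struggle poses, so the outcome always stays on animator-approved motion.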

Walkers can also grab the player by either arm (including two Walkers at once). The player and Walkers are locked into a common animation state. The player hand is again taken out of physics, and hand movement drives a 2D animation Blendspace to simulate the struggle to escape. It ends with either the Walker being dislodged and staggering back (if the player breaks free) or the player being bitten (goodnight, player!).

These two modes can coexist: left arm grappled by a Walker while you grab that Walker’s head with the other hand.

Grappling also dovetails perfectly with the stabbing system. Grabbing a Walker’s head makes stabbing it much easier, because you are holding the target still. Moreover, pulling the blade out is easier, because your two hands work together to separate the blade from the body.

More cheating required

Once you’ve stabbed someone (or something) in the head and not withdrawn your knife, you find you are dragging the body around (see Fig. 3). Physically, in addition to the weight of the dragged physics chain, we constrain player movement to the dragged object, which strongly reinforces the sense of weight.

Here, we encountered the problem of stability. Long constraint chains between objects of dissimilar mass are notoriously difficult to resolve. And this is perhaps our worst case: the chain of hand → shiv → Walker head → Walker torso (and remaining limbs, and so on) has a very heavy, irregular object on one end, the player hand with strong restitutive forces applied to it on the other, and the very light shiv in the middle. The result was constant jitter at the hand-knife and knife-head constraint points.

Our solution was to cheat the masses of the objects while in the stabbed state. We temporarily scale down the overall mass of the ragdoll and add that same mass to the held object (in this case, the shiv). The player’s hand drags the same total mass, but its distribution is much more simulation-friendly.
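The mass shuffle can be sketched in a few lines. The scale factor and masses here are illustrative, not the game’s tuned values.

```python
def redistribute_mass(ragdoll_mass, weapon_mass, scale=0.2):
    """Stabbed-state cheat: shrink the ragdoll's mass and move the
    difference onto the embedded weapon, keeping total dragged mass
    unchanged so the constraint chain stays stable."""
    new_ragdoll = ragdoll_mass * scale
    new_weapon = weapon_mass + (ragdoll_mass - new_ragdoll)
    return new_ragdoll, new_weapon

rag, shiv = redistribute_mass(80.0, 0.3)
print(rag, shiv, rag + shiv)  # distribution changes; the total does not
```

On withdrawal, the original masses would simply be restored.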

In summary

With a modest team of Unreal Engine veterans, the power of VR, and a singular focus on creating a great combat experience, we at Skydance believe we have raised the bar for intuitive physical combat in video games.

Moreover, we believe we have barely begun to uncover the potential of what can be achieved. As the economics of VR continue to improve, we look forward to pushing VR combat further and exploring new heights of physical gameplay.

We hope we are blazing a trail that many will follow.

If you want to be a part of it, check out our career opportunities.