A deep dive into the wind, weather, and interactive particles that make Jin’s journey come alive.
The island of Tsushima is a beautiful place to explore, made of many tiny details lovingly crafted by our team. It’s also considerably different from our last game, Infamous: Second Son, which was full of superpowered visual effects. I’m Matt Vainio, lead visual effects artist at Sucker Punch. I like to describe my job as solving art and design problems with technology, something I’ve been doing at Sucker Punch since Infamous 2. Today I’ll be covering how we made the transition from flashy superpowers to a grounded but beautiful game filled with mud, blood and steel – as well as the methods we used to create the visual effects in Ghost of Tsushima.
When we started Ghost I identified a couple of major areas of improvement, based on what the project needed. The first was that I wanted to push the levels of interactivity in our particle systems. We had a huge investment in expression-driven particle systems from Second Son, where we created a wide array of magical superpowers. For Ghost of Tsushima, one of my main goals was to take that system and pivot it towards high levels of interactivity. We knew from the very beginning that wind was an important element we had to incorporate across particle systems and beyond. We also wanted to add animal life, provide epic vistas with environmental ambience and really nail the “mud, blood, & steel” direction by dynamically muddying and bloodying characters as they fight and move across the environment. There are a lot of ways we made particles more interactive, but it all required more data from the game world. Here are some examples of the kind of data I mean: global wind, player-created wind from movement, character displacement, terrain and water position info, weather info like wetness, time of day, and much more.
The second major goal was that we had to build at a large scale. The world of Ghost of Tsushima is significantly larger than Second Son, but the Visual Effects team at Sucker Punch was only two people for most of development. This meant we needed to update our content through automated processes as much as possible. We also had to support large vistas across dynamic weather and a moving 24-hour clock. Lastly, we hand-placed a small number of elements to help guide the player during their exploration of the island of Tsushima.
One of the key art direction goals from the very beginning was that everything needed to move. This was an obvious area where particles could help by adding floating leaves and pollen in the air, but there are many systems that are all working in concert to provide the illusion of the wind actually blowing. Aside from the particles, there are trees, grass, cloth, and ropes that all move with the wind. All of these elements were tuned together so that they all moved appropriately under different wind conditions. We integrated this global wind direction into nearly every single effect in the game; when a bomb goes off or a campfire is lit, the smoke drifts in the correct wind direction. This is true for fires, sparks, smoke – pretty much everything. We also sample the speed of the wind to add additional turbulence as the wind speed increases.
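To make the idea concrete, here is a minimal sketch of what "sampling the wind" might look like for a particle each frame: drift along the global wind direction, plus turbulence whose amplitude grows with wind speed. The function name, parameters, and constants are all illustrative, not Sucker Punch's actual expression API.

```python
import math
import random

def wind_drift(global_wind, turbulence_scale=0.5, rng=None):
    """Per-particle velocity from the global wind: the wind vector itself
    plus random turbulence that scales with the wind's speed."""
    rng = rng or random.Random(0)
    wx, wy, wz = global_wind
    speed = math.sqrt(wx * wx + wy * wy + wz * wz)
    amp = speed * turbulence_scale  # stronger wind -> more turbulence
    return tuple(c + rng.uniform(-amp, amp) for c in (wx, wy, wz))
```

Because the turbulence amplitude is derived from the sampled wind speed, calm scenes stay calm while storms automatically get more chaotic motion, with no per-effect tuning.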
We didn’t originally intend for our global wind system to become a navigation mode until the Art Director for Ghost, Jason Connell, came to one of our lead engineers, Adrian Bentley, and me to ask whether we could make particles the direction indicator for quests. Thus, the “windicator” was born. Our approach was not to build a pathing system that constantly changed direction to avoid obstacles, but to aim directly at the objective and leave the pathfinding to players. This was in large part because we wanted players to explore, engaging their minds in the navigation process rather than following some sort of game UI without thinking. This direct-path approach was only possible because our particle system can access terrain information, which allowed the particles to be aware of the landscape around the player.
In my first attempts I made particles hug the terrain, going up and down, flowing with the hills and valleys. This was problematic for a few reasons. The first is that our mountains have a lot of rock models for cliff faces; these cliff models are not part of the terrain information, only the underlying dynamically tessellated mesh is. This meant that the particles would clip right into the rock faces and disappear, making the windicator confusing to follow. It also looked a little unnatural hugging the terrain so perfectly, so my next attempt was to treat the terrain as a floor: particles would be pushed upward when moving uphill, and drift flat when the terrain sloped downhill away from them. I added upward velocity when hills are in the way by running several tests ahead along the path of the particle’s motion. At each of these tests, the particle checks how close it would be to the terrain at that point, or whether it would even be buried. If the particle is too close or buried at any of these look-ahead points, it is given upward velocity. Lastly, the windicator has different elements as the environment changes: pampas fluffs and grass in fields, leaves in forests, ash in burnt areas, and others. I’ll talk more about how that system works a little later in this post.
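The look-ahead test described above can be sketched in a few lines. This is a simplified model, assuming a `terrain_height(x, z)` query standing in for the game's terrain data; the probe count, spacing, and lift values are made-up tuning numbers.

```python
def windicator_lift(pos, vel, terrain_height, num_probes=3, step=2.0,
                    min_clearance=0.5, lift=1.5):
    """Probe several points ahead along the particle's motion.
    If any probe would be too close to (or buried in) the terrain,
    return an upward velocity to add; otherwise let the particle drift flat."""
    x, y, z = pos
    vx, _, vz = vel
    for i in range(1, num_probes + 1):
        # Project the particle's horizontal motion forward by i steps.
        px, pz = x + vx * step * i, z + vz * step * i
        ground = terrain_height(px, pz)
        if y - ground < min_clearance:  # uphill terrain in the way
            return lift
    return 0.0
```

Note the asymmetry this creates: particles climb ahead of rising terrain but never get pulled down into valleys, which is exactly the floor-like behavior described above.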
With foliage, the plants are all rigged with joints that respond to the local wind speed. We worked with the coding and environment teams to create separate controls for trunks from the branches, which allowed us the flexibility to create the wide variety of trees and shrubs needed for Tsushima. On top of the larger trunk and branch movement we layered noise that causes the smaller ripples across the leaf surfaces.
The grass and pampas fields were a particularly iconic element for the game and one that required a lot of back and forth work between the rendering, environment, and VFX teams. The fields are a mix of procedurally-generated triangles for the grass and modeled assets for the pampas stalks and tufts. Initially, we tried using particles to achieve the wave-like motion by displacing the grass in overlapping arcs. The visual quality on the grass was good, but it looked wrong for trees and bushes plus the performance cost was a bit higher than we could afford. For our second attempt, we added two layers of procedural gusts to the environment. The first is a large noise pattern that scrolls with the wind direction; layered on top is a texture that scrolls across the terrain for smaller details on the grass. The advantage of this approach (aside from cost improvements) was that we were able to use the coarser wind gust noise on the trees and bushes as well, which made the grass and other foliage match in a more seamless way.
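A rough sketch of the two-layer gust idea: a coarse pattern scrolled along the wind direction, plus a finer detail layer scrolled across the terrain, blended with the coarse layer dominating. The sine-based pseudo-noise here is a stand-in for the game's actual noise textures, and every constant is an assumption.

```python
import math

def gust_strength(x, z, t, wind_dir=(1.0, 0.0), wind_speed=1.0):
    """Gust intensity in [0, 1] at world position (x, z) and time t,
    built from two scrolling pseudo-noise layers."""
    dx, dz = wind_dir
    # Coarse layer: large features, scrolled along the wind direction.
    u = (x - dx * wind_speed * t) * 0.05
    v = (z - dz * wind_speed * t) * 0.05
    coarse = 0.5 + 0.5 * math.sin(u * 2.1 + math.cos(v * 1.7))
    # Detail layer: smaller, faster-scrolling features for the grass.
    u2 = (x - dx * wind_speed * t * 2.0) * 0.4
    v2 = (z - dz * wind_speed * t * 2.0) * 0.4
    detail = 0.5 + 0.5 * math.sin(u2 * 3.3 + math.cos(v2 * 2.9))
    return 0.7 * coarse + 0.3 * detail  # coarse layer dominates
```

The key property is that the coarse layer alone can drive the trees and bushes while the full blend drives the grass, which is what keeps the foliage moving in unison.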
We did actually end up using the particle grass displacement tech, but for pushing the grass around as the player and horse run through it. This has been done before in games, but one of the key improvements we made here was that by using the particle system to control the displacement, we were able to leverage our expressions for more advanced behavior. One of these behaviors was to make the grass bounce back in a realistic manner by applying a damped wave to the strength of the displacement, which prevents the grass from snapping back to its rest position in a linear and unnatural fashion. The video below is an example of the grass displacement and a debug view of the particle system controlling it. The green parts of the trail represent displacement and the red parts are where the displacement fades. If you look carefully you can see that the grass loses and then regains displacement values in smaller and smaller amounts as the trail gets further from the hero. This creates a natural reverberation of motion, making the grass feel more lifelike.
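The "smaller and smaller amounts" of bounce-back is the classic damped-wave shape, which might be modeled as an exponentially decaying cosine applied to the displacement strength. The constants below are illustrative, not the shipped tuning values.

```python
import math

def displacement_strength(t, initial=1.0, damping=3.0, frequency=12.0):
    """Damped wave applied to grass displacement after a character passes:
    the grass overshoots its rest pose and settles with ever-smaller
    bounces instead of snapping back linearly."""
    return initial * math.exp(-damping * t) * math.cos(frequency * t)
```

Sampled over time this alternates between positive and negative lobes of shrinking magnitude, matching the alternating green (displaced) and red (fading) bands visible in the debug trail.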
Last but not least, the character tech art team took the lead on adding dynamic cloth and ropes to the game due to their high use on the characters themselves. Each of the rope and cloth simulations uses the same wind inputs as the foliage and particles, adding to the illusion that there is actual wind moving through the scene.
We assumed early on that any animals the VFX team created would only be seen far away, but as the project evolved we realized they could play a bigger role. For this feature we teamed up with the same rendering coder who wrote the particle system during Second Son production, Bill Rockenbeck. His first step was to let particles spawn fully-rigged and animated mesh objects. Once that was up and running, using the same terrain position information we used on the windicator we could match the orientation of the models to the terrain and have them collide with it when necessary.
We added animals such as frogs, birds/cranes, fish, crabs, and bugs – these creatures all have reactivity to Jin and other characters in the world. By using a very low-value wind sphere around characters (including Jin), the animal particle effects are able to detect when people are nearby. We used a new conditional event system to change particle motion when this condition was met, causing the animals to scatter away from people. This was also used on projectiles and impacts, which meant that arrows and daggers would also scare animals away.
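In spirit, the conditional event reduces to a radius check against whatever carries the wind sphere, followed by a switch into a flee state. This sketch uses invented names and numbers; the real system works through wind sampling and particle expressions rather than an explicit distance test.

```python
def scatter_velocity(animal_pos, source_pos, wind_radius=4.0, flee_speed=3.0):
    """If a character or projectile (the wind-sphere source) is inside the
    sphere's radius, return a flee velocity directed away from it;
    otherwise return None and let the animal keep idling."""
    dx = animal_pos[0] - source_pos[0]
    dz = animal_pos[1] - source_pos[1]
    dist = (dx * dx + dz * dz) ** 0.5
    if dist >= wind_radius or dist == 0.0:
        return None  # condition not met: no event fires
    # Event triggered: run directly away from whatever got too close.
    return (dx / dist * flee_speed, dz / dist * flee_speed)
```

Because the trigger is the wind sphere rather than a hard-coded character check, anything that carries wind – Jin, NPCs, arrows, bombs – scares animals for free.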
In the following video you can see some early iteration on the interactive crab effects scattered across the beaches. We started with some static meshes (probably my finest modeling efforts) and iterated on the behavior. I wanted the crabs to maintain some minimum range from Jin to prevent him standing directly on top of them and make them feel appropriately skittish. To help me debug behaviors, the crabs in the video are colored white when they are moving and orange when they are still.
Leaves and duels
Because one of the hallmarks of Ghost of Tsushima is the wind, we knew that we would need to add a ton of leaves and make them feel really stylish, but also naturalistic. This is why there can be tens of thousands of leaves on screen at any moment in Ghost and they all interact with the wind, environment, and characters.
In order to make the leaves feel appropriately realistic, we spent a lot of effort on making them land correctly based on the terrain. Each leaf is modeled as a disc that rotates appropriately in response to the torques applied as it contacts the ground. Beyond just landing on terrain, we modeled some other advanced behavior: leaves in Ghost land on water surfaces and flow with the current, and will fall down a waterfall and sink over time as they drift.
Near the end of development on Ghost, a QA tester came to me with a strange bug: leaves would fall into campfires and sit unburnt on the terrain under or next to the fire. I agreed this was pretty weird and decided to resolve this by placing a wind emitter in campfires that the falling leaves would react to. This upward draft kept leaves out of the fire for the most part and was a nice detail.
Duels became a well-loved element in Ghost from their first tests and from the very beginning the VFX team was given the reins by the game directors to help create scenarios that best showed off the interactivity in our game and particle systems.
In many of these duels, we created persistent leaves that would react to player and AI movements. For this, we repurposed the grass displacement tech we built and allowed particles to sample this information. By doing this, we were able to make the leaves part in a stylish way when characters moved rapidly through them. The goal here for duels was not perfect realism, but stylish beauty in motion. We used the same event system described above to look for times when the player displacement values were above a threshold; we then created a falloff so that the leaves would settle from the initial motion after a short time and could then be pushed again. We also used a noise pattern that temporarily allowed leaves to be picked up from the ground by the passing wind, helping to simulate gusts and the unstable nature of leaves.
Other types of duels also make heavy use of other dynamic particle systems, such as floating lanterns, candles that blow out, heavy fog parting at your feet, lightning strikes, and more. We used techniques similar to the leaves on the lanterns, where each lantern is actually a set of particles that move and bob on the water based on player displacement. For the dynamic candles in the second Ryuzo duel we added wind emitters to the sword swings and sampled those on the candles. If the wind passed a threshold, we used our event system to snuff out the fire and light while spawning up a new smoke wisp and ceiling smoke effect. Additionally, players generate wind while rolling and dodging, so quick movements past these special duel candles will also put them out.
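The candle-snuffing logic is another instance of the event system: sample the wind at each candle, and when it crosses a threshold, kill the flame and spawn replacement effects. This is a sketch with placeholder effect names and a made-up threshold, not the actual asset names or values.

```python
def update_candle(lit, sampled_wind_speed, threshold=2.5):
    """Event-system sketch for the duel candles: wind from a sword swing
    or dodge that exceeds the threshold snuffs the flame and spawns
    smoke effects. Returns (new_lit_state, effects_to_spawn)."""
    if lit and sampled_wind_speed > threshold:
        return False, ["smoke_wisp", "ceiling_smoke"]  # flame out
    return lit, []  # below threshold (or already out): nothing happens
```

The same sampled-wind input covers both sword swings and player rolls, because both emit wind into the world rather than messaging candles directly.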
Mud and blood
When characters run, roll, slide, and fall on the ground they dynamically add mud to their meshes. Having our characters get dirty as they live in our world was an early goal from art direction to set the tone of the game. During animations, we add mud “sources” in varying amounts to places like knees, elbows, and shoulders so that Jin and others look appropriately battle-dirty. If you look at famous fight scenes from samurai movies, you can see the inspiration for Ghost here: the characters push mud with their feet and are muddy from all of their interactions with the ground.
Like mud, we create blood sources that dynamically add blood to the character meshes during combat, showing their wounded state. Each impact added blood near the sword strike location and a bit beyond. This was used in cutscenes and scripted moments in addition to systemic combat.
Besides adding blood to character models, every attack creates a particle effect with thousands of blood droplets and blood strings. Each of the droplets lands on nearby geometry, and in the water it even disperses into clouds that move with the current.
Build at a large scale
A key component of building at scale was to make our VFX match up with environments and not have to hand place them. We knew that with our small team size and a giant world to fill that we would have to create some form of procedural placement because the environment biomes were changing very frequently. We didn’t want to spend all of our time just maintaining content, so it had to be self-correcting.
Growth and biomes
Here’s a simple example: we wanted particle leaves to fall in only the forested areas and not in the grasslands. The first approach we tried was very brute force: we placed an effect in the trees that were grown procedurally around the island. This failed pretty quickly because the environment was instancing hundreds of trees in small spaces, which, while fine for geometry, was too expensive for particle systems. We had some small success by setting our particles to only appear when very close to the trees (~50 feet), but given how far our view distances were, this felt like a failure. The other problem with this approach was that when we dropped a leaf from the canopy 40 feet above and the player was riding a horse, it might not even fall into view by the time they passed the tree – so the leaves were often not visible.
Our second approach was to use the growth system in the same way as the environment, procedurally placing the particle systems alongside the trees by using the same masks and expressions. For this approach we needed to place overlapping circles of leaves within an occasionally complex mask shape. The circles were pushed inward from the edge of the mask by their radius, which meant there would often be large gaps with no leaves present due to the growth rules and expressions. We could balance this by creating more, smaller emitters, but at the cost of performance. It was also difficult to fix specific areas: if we changed the growth rules to fix the density or holes in an area we were looking at, we would often break other clusters of forest that were previously working well. Despite all of this, the technique mostly worked, and even though there were some flaws we nearly shipped it.
Image 1 demonstrates using our growth tool to add particles based on the rules that determine where the trees are. There are major gaps, so this ended up being below our bar. For image 2, I’ve increased the number of particle emitters grown substantially. This nearly works, but takes up valuable performance we could use elsewhere. Image 3 shows our third approach, the biome map. One of our rendering programmers was working on a feature for the lighting team that would make data from the environmental growth masks accessible in real-time. I asked if we could piggy-back on this technology and luckily enough we were able to do so. What this got us was the ability for the particle system to know what biome any specific particle was created in. Image 4 has a debug particle system that shows how the particles read this data. The blue areas are grasslands and the green is the forest.
The first pass of this was a little blocky, but once I used a noise pattern to modify the sampled information it became much more organic and usable. This became the core of our new environmental ambience system when combined with other features like sampling the terrain position, material, and wind direction. In any average location, we were able to remove hundreds of grown particle systems from the environment and replace them with a single system that followed the camera and was much more accurate to the environmental biomes.
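The noise-softened biome lookup might look something like this: jitter the sample position with noise before querying the biome map, so the hard mask edges dissolve into organic transitions. `biome_at(x, z)` stands in for the real-time biome query, and the biome names, effect names, and constants are all placeholders.

```python
import math

def ambient_effect(x, z, biome_at, noise_amp=6.0):
    """Pick an ambient particle type by sampling the biome map at a
    noise-jittered position instead of the exact particle position,
    turning blocky mask boundaries into soft, organic transitions."""
    # Cheap deterministic jitter standing in for a noise texture.
    jx = noise_amp * math.sin(x * 0.37 + z * 0.91)
    jz = noise_amp * math.sin(x * 0.73 - z * 0.19)
    biome = biome_at(x + jx, z + jz)
    return {"forest": "leaves", "grassland": "pampas_fluff",
            "burnt": "ash"}.get(biome, "pollen")
```

A single camera-following emitter can call this per particle, which is how one system replaces hundreds of individually grown ones.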
Even though we moved a good deal of our effects from the growth system into a real-time biome system, we did utilize the growth system for some particle effects in Ghost. Some examples include birds placed at the edges of forests, crabs and seagulls in the beach areas, herons in the rice paddy areas and cranes in the marshes. Lastly, we used the growth system to add fog banks at the edges of the forest. In all of these cases, whenever the environment team changed where the forests, fields, and beaches were located, the VFX came along automatically.
Vistas and navigation hints
In Ghost, the vistas are important both to gameplay, helping you find objectives, and to setting the tone and art direction. We felt that finding quests should be as natural as possible, so we created a variety of effects that could be used to indicate quest locations and challenge content. Some examples include different types and scales of smoke at mission or challenge content, birds that circle a haiku opportunity, steam rising from an onsen hot spring bath, and more.
Putting it all together
The visual effects in Ghost of Tsushima are an integral part of the environment, with every frame containing many separate systems working together to enhance the dynamic look of the game. From the foreground biome and animal elements to the background weather and content markers – the visual effects helped bring the world to life and make it a joy to explore Tsushima.
Thanks for reading and I hope you’ve enjoyed your journey through the visual effects of Tsushima. If you’re still curious to learn more about some of our underlying technology, you can watch a talk I gave for the Game Developers Conference in 2014 that focuses on our expression-driven particle systems.