
Avatar: Frontiers of Pandora - the big developer tech interview

A deep dive into Ubisoft Massive's phenomenal, revised Snowdrop engine.

Avatar: Frontiers of Pandora almost came out of nowhere to become one of the best-looking games of the year, with the movie series' iconic setting meshing beautifully with almost Crysis-style gameplay and a newly upgraded Snowdrop engine - which was itself first announced at E3 ten years ago with The Division. Best of all, Avatar is a technical triumph not only on PC, where it pushes the boundaries of graphics technology, but also on consoles - where it holds up surprisingly well on PlayStation 5, Xbox Series X and even Series S.

The studio behind the title is Ubisoft Massive, and recently Digital Foundry's Alex Battaglia had a chance to interview two key figures in its technical development: Nikolay Stefanov, the game's technical director, and Oleksandr Koshlo, the render architect of the Snowdrop engine.

The interview that follows is a fascinating look behind the scenes at how Massive were able to develop the Snowdrop engine, realise the world of Pandora in video game form and deliver a game of a type and quality that isn't as common as it once was.

As usual, both the questions and answers have been lightly edited for clarity. Enjoy!

This is the full PC tech review of Avatar: Frontiers of Pandora, which goes into detail about its many systems and how they succeed - or fall short. Watch on YouTube

Digital Foundry: The first thing I noticed playing the game is that you're using an entirely new global illumination (GI) system. Ever since RT-capable GPUs came out in 2018, we've seen a bunch of different techniques for achieving hardware RT lighting, so I'd love to hear how it's done in this version of Snowdrop and what hand you had in its development.

Oleksandr Koshlo: I'm a rendering architect here at Snowdrop, so my job is to set the overall direction for graphics renderer development. I've spent quite a bit of time on the BVH (bounding volume hierarchy) management part of our ray tracing, with other members of our team spending time on the actual lighting, the "rays" part of it... We have a lower-detail representation of the geometry, plus average materials, in our ray tracing world. It's a combination of screen-space traces, world-space hardware traces, and probes that are also ray traced to get the proper lighting in question.

So the process is to do a screen-space trace first. If we hit something, we do the lighting of that hit; if we didn't, we continue from there with a hardware ray into the ray tracing world. Depending on the effect, which is either diffuse GI or specular reflections, the length of the ray is different. If the ray didn't hit anything within its length, we fall back to the probe result - so we get the result from probes on a miss. If we do hit something, we light it with our local lights, sunlight, and also feedback from the probes. The probes are both a fallback for missed rays and a source of secondary light. This is how we get feedback and multi-bounce.
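
As a rough illustration of the fallback chain Koshlo describes, here's a minimal C++ sketch of a single trace - all type and function names are our own stand-ins rather than Snowdrop's, and the real system runs as GPU passes:

```cpp
#include <array>
#include <optional>

// Illustrative stand-ins, not Snowdrop's actual types.
using Vec3 = std::array<float, 3>;
struct Ray      { Vec3 origin, dir; float maxT; };  // maxT differs per effect
struct Hit      { Vec3 position, normal; int material; };
struct Radiance { float r = 0, g = 0, b = 0; };

// Stub stages; in the engine these are GPU passes, not CPU calls.
std::optional<Hit> traceScreenSpace(const Ray&) { return std::nullopt; } // depth-buffer march
std::optional<Hit> traceWorldSpace(const Ray&)  { return std::nullopt; } // BVH trace vs lower-LOD world
Radiance sampleProbes(const Vec3&)              { return {0.2f, 0.3f, 0.4f}; } // ray-traced probe cache
Radiance shadeHit(const Hit&)                   { return {1, 1, 1}; } // local lights + sun + probe feedback

// One GI/reflection ray following the fallback chain described above:
// screen-space trace, then a hardware world-space ray, then the probe result
// on a miss. Diffuse GI uses shorter world rays than specular reflections.
Radiance traceOneRay(const Ray& ray)
{
    if (auto hit = traceScreenSpace(ray)) return shadeHit(*hit);
    if (auto hit = traceWorldSpace(ray))  return shadeHit(*hit);
    return sampleProbes(ray.dir); // probes double as miss fallback and secondary-bounce source
}
```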

Nikolay Stefanov: I think one of the things that you will see with us is that - as we are all Swedes - we are pretty bad at naming things. So we don't have a catchy name for it. But I think it's a really cool, versatile system that allows us to basically take advantage of all of the different techniques that we've done so far. So, for The Division, we had the probe-based lighting, and this is continued now where we are using it as sort of a cache for the secondary bounces, screen-space GI and ray tracing. Of course we're also taking advantage of hardware ray tracing. But also one of the things that I think we should mention is that we also have a compute shader fallback for this, for graphics cards that don't support hardware-accelerated RT.

Oleksandr Koshlo: It's a bit hard to distinguish between screen-space and world-space rays, because I tend to call world-space rays "hardware rays", but these are also possible to do in software and when we're talking about probes I'd like to emphasise that these are real-time ray-traced probes. There is nothing baked.

Digital Foundry: That's what I was wondering too, because I imagine you use the probes for transparencies as well - and if you fall back to the PRT (precomputed radiance transfer) system that you had earlier, they would definitely not look as high quality as they do here.

Nikolay Stefanov: Exactly. Probes in this way are more of a radiance cache, so to speak, rather than something baked in. And speaking of "baked in", that's one of the things that is also really great about this system - it allows us to skip all the expensive baking, since the world of Pandora is so much more detailed and has so many more regions than what we had in both The Division games, and it's a very different environment. We started the game originally using the PRT system, which was taking days to bake over the entire level; just doing iterations on the world took ages. So it's really good to have a system that allows us to move stuff around and see the changes in real-time, especially when it comes to interiors.

Digital Foundry: Yeah, the game starts off in interiors, and you can already notice it with the flashlights, from the characters that are walking around inside and actually relighting the world as you go through it. As mentioned in pre-release materials, there's ray tracing for shadows. Can you explain how that works?

Oleksandr Koshlo: As I mentioned before, we have a lower-detail geometry representation for our BVH, which means we cannot use RT for precise shadows... but we do use it in two ways. One is contact shadows, so short rays towards the light to see if we hit any surface to get a contact hard shadow. And kind of the opposite of that is long-range shadows. So anything behind the range of our shadow cascade is ray-traced, and that's how we get shadows at long distance.
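
Here's a hedged sketch of those two shadow-ray uses - the range values are invented for illustration, and the occlusion query is a stand-in for the engine's trace against its lower-detail RT world:

```cpp
#include <array>
using Vec3 = std::array<float, 3>;

// Hypothetical occlusion query against the lower-detail RT world; stubbed here.
bool rayOccluded(const Vec3& origin, const Vec3& toLight, float tMin, float tMax)
{
    (void)origin; (void)toLight; (void)tMin; (void)tMax;
    return false;
}

// Contact shadows: a short ray towards the light catches fine occlusion near
// the surface that shadow maps miss. The range value is invented.
bool contactShadow(const Vec3& point, const Vec3& toLight)
{
    const float kContactRange = 0.3f; // metres
    return rayOccluded(point, toLight, 0.01f, kContactRange);
}

// Long-range shadows: everything beyond the last shadow-map cascade is ray
// traced instead, which is how distant arches and islands cast shadows.
bool distantShadow(const Vec3& point, const Vec3& toLight, float cascadeEnd)
{
    return rayOccluded(point, toLight, cascadeEnd, 1e6f);
}
```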

Nikolay Stefanov: This is used for things like the big stone arches or the floating islands that are in the vistas and it's super important for us to get that detail. We also do traces against the terrain, if I remember correctly.

Oleksandr Koshlo: We trace against terrain, and we also added imposters into the ray tracing world. So those are hardware-traced boxes, and once we hit the boxes, we software ray march against the baked imposter data for that tree.
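
A minimal sketch of that hybrid imposter trace, with hypothetical names throughout - a hardware ray finds the imposter's bounding box, then a software ray march resolves the baked tree data:

```cpp
#include <array>
#include <optional>
using Vec3 = std::array<float, 3>;

struct BoxHit { int treeId; Vec3 entry, exit; }; // hardware hit on an imposter's box
struct Hit    { Vec3 position, normal; };

// Stubbed stages with hypothetical names.
std::optional<BoxHit> traceImposterBoxes(const Vec3&, const Vec3&) { return std::nullopt; }
std::optional<Hit> rayMarchBakedImposter(int /*treeId*/, const Vec3&, const Vec3&)
{
    // Step through the baked per-tree imposter data between the box entry and
    // exit points, returning the first opaque sample (details omitted).
    return std::nullopt;
}

// Hybrid trace: hardware rays find the box, software ray marching resolves the tree.
std::optional<Hit> traceTreeImposter(const Vec3& origin, const Vec3& dir)
{
    if (auto box = traceImposterBoxes(origin, dir))
        return rayMarchBakedImposter(box->treeId, box->entry, box->exit);
    return std::nullopt;
}
```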

Nikolay Stefanov: You'll see that with a lot of our technology, where it's a combination of the best parts of existing techniques that we can combine in order to get the best result possible.

Digital Foundry: I've only been playing on PC, but I'm very curious actually how you scale this to get the GI to run well on Xbox Series X, Series S and PlayStation 5, because there are obviously limitations to how much you can push a certain amount of hardware.

Here's Tom's video on how the game runs on PS5, Series X and Series S. The answer: surprisingly well, given the level of graphical fidelity. Watch on YouTube

Oleksandr Koshlo: It's been challenging, I'd say, but there's a bunch of knobs to crank to scale it across different quality levels and hardware: the number of rays, the resolution of the result, the quality of the denoising, the precision of the results and the length of the rays can all vary. We have certain trade-offs - we can trace faster if it's less precise, so let's use that one. The tweaks range from large things, such as resolution, down to very small ones.

Nikolay Stefanov: I would also say that besides performance on the GPU, one of the things where we've had to scale has been memory, especially on Series S where there is less memory available than on the other target platforms. So for example, we load the ray tracing world at a short distance, so some of the distant shadows are not going to be as accurate as they are on the other platforms. Some of the geometry that we use for the BVH, for the ray tracing, it is at a lower LOD (level of detail) than it is on other platforms. Things like that.

Digital Foundry: All sensible scaling, so that applies to the GI. Any of the other tracing that you mentioned, with regards to terrain shadows, or hard contact shadows, is that also found on the consoles?

Oleksandr Koshlo: Yes. Actually, all the effects are the same between platforms, though on PC we expose more options to disable or enable things.

Digital Foundry: So you said it's a simplified world in the BVH? Which aspects of foliage or skinned characters are included?

Nikolay Stefanov: I think most of the geometry is included by default inside the ray tracing world; we tend to remove very, very small things. So for example, not all the grass is going to be in the ray tracing world, not all of the little micro details are going to be in the ray tracing world. Characters are there, or at least most of them should be. In general, it's down to the technical artists to make a decision about whether they want certain things to be there or not.

There are certain considerations that I think you're going to find a little bit interesting. So, for example, if you have a super-small and bright thing, it might actually be better to remove it from the ray tracing just to reduce the noise. It's also important to not have every single bit of geometry in the ray tracing world due to memory constraints. To summarise, by default everything is in the RT world but there are certain things that the technical artists can decide to switch off individually and remove them from the RT world.

Digital Foundry: So you mentioned small, bright things. Along those lines, what was really noticeable in the game for me was the support for emissive lighting. And could you tell me how that plugs in? Does it "just work"?

Oleksandr Koshlo: It just works. This works as a part of GI, where if a ray cast hits an emissive surface, it's going to contribute to lighting. But obviously, if it's a small surface, and we cast rays randomly throughout the scene, we risk rays hitting a small surface which introduces lots of noise. So, we do recommend to our artists to remove small emissive surfaces from the ray tracing world.

Nikolay Stefanov: If I can explain it briefly in path tracing terms, what we do is a specific technique called "guided paths". Basically on the first ray that you hit, you evaluate the lighting analytically. So you don't just do complete Monte Carlo path tracing. But this is only there for the analytical lights. For the emissive surfaces, as Sasha was saying, we actually rely on the randomness of the rays. So that's why this can introduce more noise than analytic lights. But emissives do work and we fully support them.
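
To make that distinction concrete, here's a minimal sketch of first-hit shading as described - analytic lights evaluated directly, emissives picked up only when a random bounce ray happens to strike them. All names are illustrative, not Snowdrop's:

```cpp
#include <array>
#include <random>
using Vec3 = std::array<float, 3>;
struct Radiance { float r = 0, g = 0, b = 0; };
struct Hit      { Vec3 position, normal; Radiance emissive; };

// Stubs: analytic lights (sun, local lights) are evaluated directly at the hit,
// while the bounce ray goes in a random direction.
Radiance evalAnalyticLights(const Hit&)          { return {}; }
Hit traceRandomBounce(const Hit&, std::mt19937&) { return {}; }

// First-hit shading: the analytic term is sampled deterministically, so it is
// low-noise; emissive surfaces only contribute when the random bounce happens
// to strike them, which is why small bright emitters add variance.
Radiance shadeFirstHit(const Hit& hit, std::mt19937& rng)
{
    Radiance out = evalAnalyticLights(hit);   // direct, "guided" term
    Hit bounce = traceRandomBounce(hit, rng); // stochastic term
    out.r += bounce.emissive.r;
    out.g += bounce.emissive.g;
    out.b += bounce.emissive.b;
    return out;
}
```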

Digital Foundry: So you're talking about ray guidance? Did the project ever look into using ReSTIR or ReGIR, to make the results of that randomness a little bit better?

Oleksandr Koshlo: We did. And we definitely did a lot of research in RT techniques, denoising techniques. We did not end up using ReSTIR specifically. We are still evaluating, and will be evaluating all of the advancements in RT. But I think we have really good people working on the denoising side, tirelessly, as it is a really hard problem to solve, and we are really satisfied with the end result.

Nikolay Stefanov: I think that if you want to target Xbox Series S, the combination of techniques that we use is roughly where you are going to end up: some sort of first-bounce GI plus some sort of caching in probes, etc, etc. I think ReSTIR and other techniques, while they are super promising, are hard to make run and perform well on consoles, and at 60fps as well.

Digital Foundry: When did you actually start giving the artists the ability to play with the ray tracing? Was it at the initial stages of the project? Or did it come in midway through replacing the old PRT system?

Nikolay Stefanov: A little bit earlier than midway through, but as I say, we started out with PRT, or actually, we even went further back to the Dunia engine at one point, just to cut down on the baking times. So the switch was actually pretty easy for us, just because the quality of the RT is so much better than the pre-baked [approach]. That was done some time during pre-production. The impact was pretty low on the visual side.

One of the things that is interesting is that when you're building for ray tracing, there are actually different rules for setting up the assets. So, for example, one of the things that you need to do is to make sure that the interiors are watertight, otherwise you're going to get light bleeding from the outside. We also need objects to be double-sided. So there are things that you normally wouldn't have done before - where you would only have one-sided polygons for the walls, now you actually have to make them double-sided, to make sure that all the probe stuff and everything else works correctly.

Oleksandr Koshlo: The geometry now needs to represent the real world much more closely.

Digital Foundry: Regarding the probes, how are they placed in the world? Is it just a grid? Or is it selective to a degree?

Oleksandr Koshlo: It's a grid that is a bit selective [everyone laughs]. So we still have some heuristics, on which level to place the grid and where to bias it based on if we're indoors or outdoors, what kind of things we have there, what the dimensions are of the place we're in. But it's a cascaded grid - so it is four cascades, with the same resolution for each but each one subsequently covers a much greater distance.
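
A minimal sketch of how a cascaded probe grid like that might be indexed - the base extent and the per-cascade doubling are our assumptions, since the interview only says each cascade covers a much greater distance at the same resolution:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
using Vec3 = std::array<float, 3>;

// Four cascades at the same grid resolution, each covering a greater distance.
// The doubling factor and base extent are assumptions for illustration only.
constexpr int   kCascades   = 4;
constexpr float kBaseExtent = 32.0f; // metres covered by cascade 0

int pickProbeCascade(const Vec3& camera, const Vec3& sample)
{
    float d = 0.0f;
    for (int i = 0; i < 3; ++i)
        d = std::max(d, std::fabs(sample[i] - camera[i])); // box distance suits a grid
    float extent = kBaseExtent;
    for (int c = 0; c < kCascades; ++c, extent *= 2.0f)
        if (d <= extent) return c;
    return kCascades - 1; // clamp to the outermost cascade
}
```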

Digital Foundry: Regarding transparency shading, obviously you have the reflections on water, and the GI is propagating onto transparent surfaces and taking from them - but what about glass? How is the shading done there?

Oleksandr Koshlo: We still have cube maps, and we still rely on those for these kinds of glass surfaces. There's also local refraction, which you can see in the water or on fully transparent glass surfaces, and that is screen-space based. So there's no support for ray-traced refractions or reflections from semi-transparent objects, as of now.

In addition to Tom's full video, Oliver and Rich discussed the console versions of Avatar on DF Direct Weekly, reproduced in the DF Clips video here. Watch on YouTube

Digital Foundry: On consoles are you using your own, pre-built BVH that you're loading in? How are you building the BVH?

Oleksandr Koshlo: We have a custom solution for the BVH on consoles. Since we are not relying on the platform APIs, we pre-build the bottom-level BVH for meshes offline to get higher quality. We then built our own custom solution that allows us to build the top-level BVH on the CPU - whereas with DXR and existing APIs, you send all of your instances to the GPU and the GPU creates an acceleration structure. We rely on caching a lot, and we only rebuild things that have changed. This allows us to efficiently build the top level on the CPU and saves us some GPU time.
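
Here's a hedged sketch of that caching idea - a top-level structure that skips untouched instances each frame. The types and the flat map are our simplification, not Massive's actual data structures:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Only instances flagged dirty since the last frame are re-inserted into the
// top level; everything else keeps its cached node, so a typical frame touches
// a small fraction of the tree.
struct Instance     { std::uint64_t id; bool dirty; /* transform, BLAS index... */ };
struct TopLevelNode { /* bounds, children... */ };

class CpuTopLevelBVH {
public:
    void update(std::vector<Instance>& instances)
    {
        for (auto& inst : instances) {
            if (!inst.dirty && cache_.count(inst.id))
                continue;       // cached node still valid: skip entirely
            reinsert(inst);     // rebuild only what changed
            inst.dirty = false;
        }
    }
private:
    void reinsert(const Instance& inst) { cache_[inst.id] = {}; }
    std::unordered_map<std::uint64_t, TopLevelNode> cache_;
};
```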

Digital Foundry: That is fascinating as that is usually done in async compute on the GPU. So what is done asynchronously on the GPU? It's probably platform-specific in the end, but I'm very curious about what things are done asynchronously there.

Oleksandr Koshlo: It's actually a lot of things. We use async compute a lot - we love it. We only ship DX12 on PC, so we don't actually have platform differences in terms of what uses async. The volumetrics run completely on async; the probe ray tracing and lighting run on async as well. While the g-buffer tracing part runs on the graphics queue, the probe tracing part runs on async. The GPU culling also runs on async compute, and a bunch of other smaller things as well, so the async queue is loaded quite well.

Digital Foundry: On PC with the DXR ray tracing API, you've got 1.0 and 1.1 inline variants. How are you doing it on PC?

Oleksandr Koshlo: We're using 1.1 inline. This was really crucial for us, as we decided early on that ray tracing is going to work for us and we can ship with it by avoiding all the shading divergence. So DXR 1.1 allows us to do it in a very similar fashion to how we're doing it on the consoles. It's basically just changing instructions. With the average materials, that is definitely enough for us.

Digital Foundry: That's like one material per object, or...?

Oleksandr Koshlo: It's one material per mesh. Often our objects consist of a bunch of meshes, so you still get some variation within an object.

Digital Foundry: So, what are the modes on console and how do they shake out?

Nikolay Stefanov: So we support a 60fps "prefer performance" mode on PS5 and Xbox Series X. Players can also select a "prefer quality" mode which targets 30fps. Here we sort of crank things up a little bit more, and we output at a higher resolution internally. On Series S, we target 30fps and there is no 60fps mode for that particular console.

[Tom has since published our full Avatar console tech breakdown, which goes into detail about how the modes compare in terms of graphics and performance, including how the Series S holds up.]

Digital Foundry: So in the past, Snowdrop was one of the few engines that kind of popularised temporal upscaling, so how are you doing that this time around?

Nikolay Stefanov: We are using FSR on consoles for upscaling and temporal anti-aliasing, same as on PC, where it's the default - I think you've probably noticed that it's FSR. On PC, we also support DLSS. We're also working with Intel to support the latest version of XeSS, which is going to come as an update - hopefully soon.

Digital Foundry: Does the game use dynamic resolution scaling on consoles? I don't actually recall if the Division did?

Oleksandr Koshlo: The Division did use dynamic resolution scaling and we do use it for Avatar as well.

Nikolay Stefanov: That's one of the differences between the prefer quality and prefer performance modes. So in the 60fps performance mode, we allow the internal resolution to drop a little bit more. That's one of the major differences you're going to see.

Digital Foundry: So on PC, there's an option next to the resolution scaler that talks about biasing the resolution. Can you explain what that does?

Nikolay Stefanov: Yeah, absolutely. There's a PC features deep-dive article that talks about this and many other things... the VRAM meter, the PC benchmark, etc etc. It's basically controlling what internal resolution you render at and the quality of the upscaling.

[The scaling is based on the current display resolution: sub-4K resolutions are biased towards higher rendering resolutions, at 4K it's the same as fixed scaling, and above 4K it's biased towards lower rendering resolutions.]

Avatar: Frontiers of Pandora showcased running at 4K DLSS performance mode on PC under ultra settings - at the time these shots were taken, Ubisoft Massive had yet to reveal the hidden 'unobtanium' settings.

Digital Foundry: One thing I noticed is that the world density is incredibly high in terms of just how much vegetation there is. Did you leverage any of the newer DX12 features and/or things brought about by RDNA, like primitive shading or mesh shading?

Oleksandr Koshlo: We do ship with mesh shading on consoles. So, there are two things that contribute to the high density of our geometry in the world. One is the GPU geometry pipeline, which is new to Avatar, and it supports our procedural placement pipeline. So this brings a lot of geometry instances and we use the GPU to cull them away, to only render what is on the screen. And then we also chunk geometry into what we call meshlets, and we use native hardware features like primitive shading and mesh shading to then render them on screen. We use an additional culling pass to discard the meshes that aren't on screen. These things really improve performance for geometry rendering.

Digital Foundry: Is there a mesh shading path on the PC version?

Oleksandr Koshlo: No, we decided against that at some point, as the technology is fresh and there's a certain challenge in supporting the variety of GPUs and hardware available on PC. So for now, we went with the simpler path of fully supporting it first on consoles.

Nikolay Stefanov: But on all PCs, we still use the GPU-driven pipelines for culling and more. So it's just the meshlets path which is not there.

Digital Foundry: Could you go into this GPU-driven pipeline and how it works? The first time I remember reading about it was in Seb Aaltonen's presentation for AC Unity - what is it like exactly?

Nikolay Stefanov: So, as you said, the density of detail in the world is something that we wanted to excel at, especially since Pandora is the star of the show in the movies, right? We started by developing systems to define placement rules for how a particular biome should look. There are rule-based systems that tell us: when you're close to water, what kind of specific plants live there; when you have this kind of tree, what kind of other flora is it surrounded by, etc, etc. These operate in near real-time, so you can change the rules and have the world repopulated within a couple of seconds.
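
As a rough illustration of how such rule-based placement might be structured, here's a hedged C++ sketch - the context fields, rule parameters and asset names are all invented, not Snowdrop's actual data model:

```cpp
#include <functional>
#include <string>
#include <vector>

// Each rule is a predicate over local context plus assets to scatter when it
// matches; editing the rules just re-runs the evaluation over the area.
struct PlacementContext { float distanceToWater; std::string dominantTree; };
struct PlacementRule {
    std::function<bool(const PlacementContext&)> condition;
    std::vector<std::string> assetsToScatter;
};

std::vector<std::string> evaluateRules(const std::vector<PlacementRule>& rules,
                                       const PlacementContext& ctx)
{
    std::vector<std::string> result;
    for (const auto& rule : rules)
        if (rule.condition(ctx))
            result.insert(result.end(), rule.assetsToScatter.begin(),
                          rule.assetsToScatter.end());
    return result;
}

// Example rules: reeds near water, ferns beneath a particular tree species.
const std::vector<PlacementRule> kBiomeRules = {
    { [](const PlacementContext& c) { return c.distanceToWater < 5.0f; }, {"reed_a", "reed_b"} },
    { [](const PlacementContext& c) { return c.dominantTree == "sapwood"; }, {"fern_small"} },
};
```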

There were two challenges with all of this detail. One is that we have nearly ten times as much detail as our previous titles. The other is that we need to show this detail at a great distance with the vista systems that we've developed. So the only way for us to handle this amount of detail was to move to a GPU-based pipeline - and there's nothing super complex about GPU pipelines.

Basically, what they do is that, rather than operating on a per-asset basis, they operate on big chunks of geometry, sectors that are 128x128 metres. What the GPU pipeline does is it takes the entire sector, first goes through a specific path that culls the sector instance, where it basically says "is this sector at all visible?", then it does the individual culling process for the instances, including the meshlets for the specific mesh parts.

Then this builds a list of things for the GPU to do vertex shading for - which is quite complex vertex shading, I must say. You would be surprised by the stuff that our technical artists are doing in vertex shaders. We basically render these to the G-buffers and light them, etc, etc. But it's important for us to still maintain the flexibility that vertex shading gives us, because it's used for all of the interactive plants that you see in the game: things that turn, things that bend, the way those yellow plants move...

Digital Foundry: Oh, yeah, the weird, conical plants that scrunch up when you touch them.

Nikolay Stefanov: So all of this is actually done in vertex shaders. And if you were just to run these for everything, then the performance would tank. So that's why it's important to have meshlet support for this. So that's roughly how our culling works.
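
Pulling the culling description together, here's a minimal CPU-side rendition of the sector, instance and meshlet levels - in Snowdrop these are GPU compute passes writing compact draw lists, so treat this strictly as control-flow illustration:

```cpp
#include <vector>

// Hypothetical CPU rendition of the three cull levels.
struct Meshlet  { /* bounds, cone, vertex/index ranges */ };
struct Instance { std::vector<Meshlet> meshlets; };
struct Sector   { std::vector<Instance> instances; }; // 128x128 metre chunk

bool sectorVisible(const Sector&)     { return true; } // coarse visibility test
bool instanceVisible(const Instance&) { return true; }
bool meshletVisible(const Meshlet&)   { return true; } // frustum + cone tests

std::vector<const Meshlet*> buildDrawList(const std::vector<Sector>& sectors)
{
    std::vector<const Meshlet*> drawList;
    for (const auto& sector : sectors) {
        if (!sectorVisible(sector)) continue;     // "is this sector at all visible?"
        for (const auto& inst : sector.instances) {
            if (!instanceVisible(inst)) continue; // per-instance cull
            for (const auto& m : inst.meshlets)
                if (meshletVisible(m)) drawList.push_back(&m); // per-meshlet cull
        }
    }
    return drawList; // what survives goes on to the (expensive) vertex shading
}
```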

Oleksandr Koshlo: With regard to the GPU instance culling pipeline specifically, we don't have any distinction on the asset side. So when an asset is created, it has no knowledge of whether it is going to be procedurally placed then GPU culled, or if it is going to be hand-placed and go through another system, so it's all transparent in that aspect.

Nikolay Stefanov: Another thing that we've done for this project is the vista system. So basically, we have a couple of stages. Things that are close to you, at a reasonable distance, are full-detail geometry; eventually those get unloaded from memory. After that we switch into our imposter representation at the second distance stage, which again is completely GPU-driven for entire sectors. The imposters are your standard imposters with normal maps, though we support shadowing on them as well. And then as you move out even further you have the third stage: even the imposters get unloaded, and we're left with the representation of the super big things - the arches, the floating islands, and so on. Again, all of this is driven by the GPU: culling, rendering, etc.
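
The three vista stages, sketched as a simple distance-band switch - the thresholds here are pure invention, since the interview only gives the ordering of the stages:

```cpp
// Distance bands for the vista system; threshold values are invented.
enum class VistaStage { FullGeometry, Imposters, MegaStructuresOnly };

VistaStage pickVistaStage(float distanceMetres)
{
    if (distanceMetres < 500.0f)  return VistaStage::FullGeometry;  // streamed-in detail
    if (distanceMetres < 4000.0f) return VistaStage::Imposters;     // normal-mapped, shadowed
    return VistaStage::MegaStructuresOnly; // arches, floating islands and the like
}
```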

Digital Foundry: I'm genuinely shocked that it even runs without mesh shading on PC based on your description there, so it must be pretty optimised even without.

Oleksandr Koshlo: To be fair, getting mesh shading to be faster than non-mesh shading was actually quite a big challenge for me. I've spent quite a lot of time on it and still the vanilla rasterisation is really fast and works really well.

Digital Foundry: You talked about rules-based placement of assets, but how is the terrain actually generated?

Nikolay Stefanov: So for this project, as with any high-quality open world, I think the key thing is to make sure that you've got a good ratio of hand-placed content - where you actually have a designer who sits down and decides what the level is going to look like and what the terrain is going to be - and then you leave the detail to the computer, which can place stuff and do erosion a lot faster than a human.

For us, the world is built with something we call level templates. Take, for example, the home tree in the game: this is one specific level template that has a lot of hand-placed detail inside of it, and also has artists doing the terrain around it by hand. What our level editor in Snowdrop allows us to do is to take that level template and move it around the world, so that hand-made terrain is blended with the larger terrain that is the base plate of the level.

In general that's how we do it: we have a base plate that is created by a designer, through the use of procedural systems but also with a lot of hand-crafting to guide the player. We have systems for erosion, for how bounce plants are spreading... And on top of that, we place level templates, some of which are placed by hand in an exact location - the terrain is blended with them and everything is aligned. We also have level templates which are procedurally placed, or scattered around the level, to simplify the lives of the designers, who probably don't want to be placing a rock formation by hand thousands of times.

A look at how Avatar scales across the current-gen consoles. It's interesting to note that even Series S produces a good-looking experience, albeit one limited to 30fps with resolution and feature cutbacks.

Digital Foundry: I was a little surprised by how big the world was, because I got to the home tree, I looked at the map and thought, "Oh, I'm not even a quarter of the way through this world."

Nikolay Stefanov: We do have three distinct regions. I think you're still in the first biome then. Each one of them is a little bit bigger than the size of the map in The Division 2.

Digital Foundry: One thing I noticed on PC specifically is something that I have been talking about for years now. It makes me sad that I have to talk about it at all, but I want to know about PSO compilation. I'm curious how the game handles it for the PC platform, because the game doesn't stutter as we see in far too many other PC releases.

Nikolay Stefanov: We basically pre-built the PSOs and we shipped... I think around 3GB of PSOs on PC, something like that. It's a little bit crazy.

Oleksandr Koshlo: It's just a lot of variation. We also handle the loading of objects differently. I don't know if I should show all the cards here [everyone laughs].

Oleksandr Koshlo: Stutter should not be experienced in the game by design. If a PSO compilation needs to occur, that means the object will stream in later. We treat the compilation step as part of loading the object. Now technically, there can be bugs in the code which cause PSO stutter, but we look out for that. That's reported internally and we catch it. But that is not the norm. We take that very, very seriously.
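
A hedged sketch of that "compile as part of loading" idea - the names are hypothetical, and a real streaming system would run this on job threads rather than std::async:

```cpp
#include <future>
#include <string>

// An object only becomes visible once both its data and its pipelines exist,
// so a slow compile delays stream-in instead of stalling a frame mid-render.
struct GpuPipeline  {};
struct RenderObject { bool visible = false; };

GpuPipeline  compilePso(std::string key)    { (void)key; return {}; }  // slow, off the render thread
RenderObject loadMeshData(std::string path) { (void)path; return {}; }

RenderObject streamInObject(const std::string& path, const std::string& psoKey)
{
    auto pso = std::async(std::launch::async, compilePso, psoKey);
    RenderObject obj = loadMeshData(path); // disk/decompression work overlaps the compile
    pso.get();                             // wait during streaming, never during rendering
    obj.visible = true;                    // the object appears only when everything is ready
    return obj;
}
```

The trade-off Koshlo describes falls out naturally here: a missing pipeline can only ever delay an object's appearance, never block the frame.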

Digital Foundry: One thing I noticed while looking in the config file is that there's VRS (variable rate shading) listed - does the game actually support this?

Oleksandr Koshlo: Yes, it does. I need to check for the setting specifically, but the support is there.

Digital Foundry: Is it used on the Xbox Series consoles?

Nikolay Stefanov: I don't think it is.

Oleksandr Koshlo: I don't think we're using it at this time on the Series consoles.

Digital Foundry: Are there any particular parts of the project you're especially proud of?

Nikolay Stefanov: One of the things that I want to bring your attention to is the sound implementation of the game. This is something that we are all collectively very proud of. We use ray tracing for sound propagation. When a [sound] emitter is occluded or a sound reflects [off a surface], this is all simulated through our ray tracing world. I do hope we're going to get an opportunity to talk about it at GDC next year - it's a really cool system.

One of the other crazy things is that every single individual plant that you see on the ground actually has a little bit of "trigger volume". So when the player character or a land animal walks through them, they will make a localised sound emitter. So basically, when you hear something rustling, that means that there is actually an animal that is going through the plants there, it's not just a looping ambience that is "faked". So if you have a good pair of headphones, then you can really enjoy that.

Another thing I am proud of is the PC benchmark. It has very, very detailed graphs that I think you'll find interesting. We have profiling tags in our game that tell us how much time the ray tracing pass takes on the GPU, how long the G-buffer pass took, how long the post-processing pass took, etc. And there is a detail page where you'll be able to see all these things individually as part of the benchmark. We also support automation of the benchmark, so you can launch it through the command line and then it will give you all of these details in a CSV file. The benchmark will also go into CPU usage. So it will tell you how much time it took us to process agents, collision detection, etc, etc. So if you'd like stats and graphs, I think this one is going to be for you.

Oleksandr Koshlo: I think in general I am just proud of how it all came together - and that we managed to cram it all into consoles at 60fps. Our philosophy for a long time has been not to rely on some "hot thing". We do ray tracing here, but only for the things we care about, things that improve visual quality a lot with the proper performance for us. We care about things with a high bang for your buck. And we try to work on, not just the hard things, but on basic things and then do it right so that everything comes together well. I think we did that again. And I definitely hope you like the results.

Nikolay Stefanov: I have a question for you, Alex. Did you take a look at the motion blur?

Digital Foundry: [Laughs] Yes. I did take a look at the motion blur. It is much better than in the trailer. [Everyone laughs]

Digital Foundry: This is just a bit of feedback - would you mind implementing a motion blur slider? The game currently just has a binary switch for motion blur. It'd be nice to turn the exaggeration of the effect up or down based on personal preference. A lot of the motion blur disappears at higher frame-rates, and some people might like greater smoothing, especially in a game like this one that has cinematic ambitions.

Nikolay Stefanov: I think it's a good idea. We can discuss with the designers and see if this is something that we can implement later. I think some people really enjoy motion blur. The funny thing about the motion blur is that our creative director, Magnus Jansén, he's a big fan of Digital Foundry, so the moment he saw you talking about the motion blur he came to us.

The motion blur discussion in this section of the interview references Alex's initial reaction to the Avatar: Frontiers of Pandora trailer, shown above. Watch on YouTube

Digital Foundry: You mentioned as part of the benchmark that you are recording data on CPU usage that you are exposing to the users. Can you go into how you're taking advantage of multi-core CPUs and multi-threading in a good way? Because it's something that is still a big, big problem area in PC games.

Nikolay Stefanov: Absolutely, we can definitely go into a little more detail. So with Snowdrop and Avatar, we work with something that's called a task graph. Rather than having a more traditional single gameplay thread, we actually split the work up into individual tasks that have dependencies, and that allows us to use multi-core CPUs in a much more efficient way. Actually, the game doesn't run that well if you don't have many cores.

The way we do it is that we utilise all of the cores except one, which we leave for the operating system. For the rest, we run a bunch of tasks on them, depending on the load. One of the good things about Snowdrop is that it allows us the flexibility to run this kind of stuff and one of the things that we spend a lot of time on is just breaking up dependencies to make sure that for example, the NPCs can update in parallel, that the UI can update in parallel, that the physics can update in parallel as well. So hopefully you'll see good CPU optimisation.
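
Here's a minimal task-graph sketch along those lines - tasks track unfinished dependencies and become ready when the count reaches zero. A real scheduler would distribute ready tasks over a worker pool sized to cores minus one, as Stefanov describes; this toy just uses two raw threads:

```cpp
#include <atomic>
#include <functional>
#include <thread>
#include <vector>

// Tasks carry a count of unfinished dependencies and run once it hits zero.
struct Task {
    std::function<void()> work;
    std::vector<Task*> dependents;   // tasks waiting on this one
    std::atomic<int> pendingDeps{0};
};

void runTask(Task& t)
{
    t.work();
    for (Task* d : t.dependents)
        if (d->pendingDeps.fetch_sub(1) == 1)
            runTask(*d);             // last dependency finished: dependent is ready
}

int main()
{
    Task physics { []{ /* update physics */ },    {}, {0} };
    Task npcs    { []{ /* update NPCs */ },       {}, {0} };
    Task render  { []{ /* build draw lists */ },  {}, {2} }; // waits on both
    physics.dependents = { &render };
    npcs.dependents    = { &render };
    std::thread a(runTask, std::ref(physics)); // physics and NPCs update in parallel
    std::thread b(runTask, std::ref(npcs));
    a.join();
    b.join();
}
```

The atomic countdown is what lets independent systems - NPCs, UI, physics - update in parallel while work that depends on them still starts only when its inputs are ready.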

Digital Foundry: I definitely did, right away. Just very briefly: you support FSR 3 frame generation, and you're going to be supporting XeSS, hopefully in the future. Are you looking at DLSS 3 Frame Generation at all?

Nikolay Stefanov: We don't have any concrete plans for DLSS 3 Frame Generation... but we are working with Nvidia quite closely, so hopefully you're going to hear more related to that in the future.

Digital Foundry: There are some plants that are breakable in the world. How are they done?

Nikolay Stefanov: It's a continuation of the systems that we used for The Division. While most of the objects have support for destruction in one way or another, the most basic form of destruction is to switch to a destroyed version of the shaders. You can see it when you go near a polluted area, for example - you can see the destroyed plants and so on - then when you defeat the bases, nature is cleansed and it switches back to the original plant look.

Certain bigger plants support something that we call "mesh cutting", and I think most of them are "pre-cut". In DCCs (digital content creation applications) such as Maya or 3DS Max, you define how they're supposed to be cut. Then, when we detect a hit, we take that particular plant instance out of the GPU-driven pipeline and transform it into a more traditional CPU-driven object that is then split up and destroyed. Then we do physics simulation on the bits that fall off it. If you do this too much, you're probably going to start seeing some frame-rate drops.
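
A minimal sketch of that promotion from GPU-driven instance to pre-cut CPU object - every name here is hypothetical:

```cpp
#include <vector>

// A hit promotes the GPU-driven plant instance to a CPU object whose
// DCC-authored cut pieces are then handed to the physics simulation.
struct CutPiece  { /* geometry, mass... */ };
struct CpuObject { std::vector<CutPiece> pieces; };

void removeGpuInstance(unsigned instanceId)  { (void)instanceId; }        // leave the GPU-driven pipeline
CpuObject spawnPreCutObject(unsigned meshId) { (void)meshId; return {}; } // cuts authored in Maya/3DS Max
void addToPhysics(const CutPiece&) {}

void onPlantCut(unsigned instanceId, unsigned meshId)
{
    removeGpuInstance(instanceId);             // no longer batch-rendered
    CpuObject obj = spawnPreCutObject(meshId); // traditional CPU-driven object
    for (const CutPiece& piece : obj.pieces)
        addToPhysics(piece);                   // simulate the bits that fall off
}
```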

Digital Foundry: With regards to the sound system and the ray tracing of it, is that ray tracing done on the CPU? Or is it done in hardware on the GPU?

Nikolay Stefanov: It's the GPU, where it is available in hardware. We use the exact same ray tracing worlds and the exact same ray tracing queries as the rest of the system.

Digital Foundry: The sound does seem like it's propagating in a way that is highly realistic, it's very well done.

Nikolay Stefanov: Yeah, absolutely. I think one of the other things that you'll see is that it's also interactive. So if you try to fire your weapon, you will see that certain bird sounds disappear, because they're scared of you. That's not going to happen if you fire your bow. It's all based around the interactivity.

That's one of the things that I'm always in two minds about as a technical director: you want to keep a limit on how ambitious a particular system is. But this time the audio team have had to rein in their own ambition - they are doing so much. There are even moments where they place procedural seeds where you're going to hear the wind whistling through certain geometry - they figure out where you're likely to have wind-whistling sounds based on different assets - and when a storm comes it will have unique aspects; you will be hearing all of this with 3D-positioned, propagated sounds.

Digital Foundry: The terrain itself on the ground is rather tessellated. How is that done?

Oleksandr Koshlo: It's pre-tessellated on the CPU. So we just send more detailed grids in the places that we need those.

Nikolay Stefanov: In terms of terrain, we didn't really invest that much in the technology this time around, because a lot of times it's completely covered in stuff!

Digital Foundry: Yeah it is usually covered! One of the things that I always loved about The Division was the volumetric rendering of the lighting itself and the particle lighting. Have things changed for Avatar here?

Oleksandr Koshlo: Yes. In terms of volumetrics, we only had the volume around and in front of the player in The Division games. We now have a volume and ray marching past it, so we can support much larger distances; it would completely fall apart without it. We also have volumetric clouds now. We uniformly ray march through fog and clouds. Clouds can also be a part of the close-up volume as well, because we can actually fly into that now with the flying mount. It is a unified system.
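
As a rough illustration of marching a unified fog-plus-cloud medium, here's a hedged sketch of a transmittance march - the densities, step model and early-out threshold are all invented:

```cpp
#include <array>
#include <cmath>
using Vec3 = std::array<float, 3>;

// Fog and clouds sampled as one medium along the ray.
float sampleFogDensity(const Vec3&)   { return 0.01f; }
float sampleCloudDensity(const Vec3&) { return 0.00f; }

float marchTransmittance(const Vec3& origin, const Vec3& dir,
                         float start, float end, int steps)
{
    const float dt = (end - start) / steps;
    float t = start, transmittance = 1.0f;
    for (int i = 0; i < steps; ++i, t += dt) {
        Vec3 s{ origin[0] + dir[0] * t, origin[1] + dir[1] * t, origin[2] + dir[2] * t };
        float density = sampleFogDensity(s) + sampleCloudDensity(s); // unified medium
        transmittance *= std::exp(-density * dt);                    // Beer-Lambert step
        if (transmittance < 0.01f) break;                            // early out when opaque
    }
    return transmittance;
}
```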

Nikolay Stefanov: As Sasha says, you can actually fly above the clouds now. This leads to interesting situations where, on the ground, you might have a thunderstorm, but you can fly the banshee through the volumetric ray-marched clouds and actually get above it. It looks pretty cool.

In terms of particles, they receive lighting from ray tracing, and we have full support for GPU particles now. In The Division games we used GPU particles for snow and rain, if I remember correctly, but now it's all completely integrated with Snowdrop's node graphs. So the majority of particle effects go through the GPU, with collision detection and all of the lighting. That's one of the big things that we've done. All of these swirling, small things that you see - those are just GPU particles.

Digital Foundry: A lot of information to digest here. Thank you so much, Sasha. Thank you so much, Nikolay. Thank you for your time. I hope I'll talk to you both at some point in the future again. I hope there's a GDC presentation about everything you've done!
