
The making of Senua's Saga: Hellblade 2 - the Ninja Theory interview

Breaking down an Unreal Engine 5 technological showcase.

Senua's Saga: Hellblade 2 stands as one of the most impressive games we've seen this year - and, in fact, this entire generation. It's a game that pushes the limits of what Unreal Engine 5 can offer at this point in time, taking full advantage of its many features. Powerful acting, amazing character models, stunning landscapes and truly exceptional lighting all come together to create a moody, beautiful, atmospheric experience.

We recently had the chance to sit down with three key members from the Hellblade 2 development team at Ninja Theory: VFX director Mark Slater-Tunstill, environment art director Dan Atwell and audio director David García Díaz. We spoke at length about the tech behind the game, the challenges and opportunities of being an early Unreal Engine 5 release, and some key development decisions that defined the title.

As usual, the interview text below has been lightly edited for length and clarity, with the full discussion available via the video embed below.

Here's the full Hellblade 2 tech interview, featuring Mark Slater-Tunstill, Dan Atwell and David García Díaz. Watch on YouTube

Digital Foundry: I want to start by going back to the beginning. When the project first got underway, it was announced prior to the arrival of the new consoles - the first next-generation game announced. And as we know now, it ended up shipping on Unreal Engine 5, using many of its key features. When you announced it, UE5 didn't yet exist publicly, and you didn't actually have any information on it either. Can you talk about the point where you first switched over to the new version of Unreal Engine, what your first thoughts on the new tools and technologies were, and the transitional period when you moved from UE4 to UE5?

Mark Slater-Tunstill: The initial Senua's Saga was actually UE4 for quite a while. And I remember we all saw the tech demos for UE5 come out and we said "is that real? Are we going to be able to use that many triangles?" So there was a little bit of trepidation. Actually looking into it, we then started to think about how we could use that to make the vision of a really immersive cinematic experience.

Dan Atwell: So I'm trying to think when we transitioned. It was after the Troll demo, actually... it's probably a little-known fact that the Troll demo was actually UE4. And so we transitioned after that, which was pretty smooth, really. It wasn't too bad - there weren't many parts of moving over to UE5 that were that alien. There was the excitement of going to Nanite and Lumen, but it was pretty easy, I would say.

Mark Slater-Tunstill: I think as a developer though, it's so easy to get super excited. You see all these new features and say "we're going to use them all! We want everything!" But then you come back down to reality and go, OK, we've got this. The GPU limits, the CPU limits, certain memory limits. So actually, how do we balance this stuff?

Dan Atwell: We weren't totally sure that we were going to use Nanite even in the beginning, or even Lumen, because of the overheads and that stuff.

Mark Slater-Tunstill: Then virtual shadow maps came in really late, but it was one of those things where, when we saw them on the characters, we said "we really need this - it's just going to sell those performances so much better."

Digital Foundry: So one of the key features then, of course, is Nanite. And Hellblade is a game that features a lot of dense, granular geometry across its world. It's extremely detailed. But this was also true of the Unreal Engine 4 version you guys showed off prior to switching to Unreal Engine 5. And I'm curious how this impacted the art pipeline, the workflow, and generally what it was like transitioning from the more traditional static mesh design to using Nanite in your game.

Dan Atwell: Yeah, so moving to Nanite from a static mesh point of view, it's all or nothing. It's more expensive to have a non-Nanite mesh, so we had to go all in. In terms of how it changed our pipeline, you remove a big chunk of processing time because you're effectively taking the raw data - we moved over to doing as much photogrammetry as possible as well. So first of all, you're removing the chunk of the pipeline where you're making stuff by hand, having to reduce it down, bake it and then get it in-engine. A big bit of that is just gone. We front-load a lot of work by going out and capturing things on location or in the studio, then it's a case of processing and post-processing that in a lot of automated ways and getting it straight in at the highest fidelity we can. It then leaves you a lot more time for set dressing, lighting, composition and all that kind of stuff. It was quite a big change, really - the biggest thing for us in terms of environment art.

Mark Slater-Tunstill: I think it's really good from a workflow perspective, cutting out the "I'm making just a hundred rocks" process. Actually, you can then think "what will make this scene look good? How can we adjust the light to really pick out the detail?" And it frees you up to actually be a bit more artistic and do less manual labour that actually isn't as enjoyable as the artistic part of it.

Going back to where it all began - the original Senua's Saga: Hellblade 2 reveal trailer. Watch on YouTube

Digital Foundry: Which version of the engine did you end up shipping on? There was a point where Nanite couldn't handle things such as vegetation, foliage and the like.

Mark Slater-Tunstill: We shipped on 5.3, and that was actually a really important upgrade for us, because we did want moving foliage and that kind of stuff - versions 5.1 and 5.2 didn't have that, and we were really restricted in that sense.

Dan Atwell: Foliage was a huge bottleneck for a long time because of the expense of non-Nanite meshes for grass, etc. And then you had a Nanite mesh, but with a mask material on it that was super expensive. So we did a lot of networks in Houdini where we did bulk conversion of assets to basically cut out geometry where the masking was. Which is a bit of a pain when you're having to retrofit it in, but what we ended up with obviously works pretty well. For all of the grass meshes, all of the bushes and trees and things like that, we had to do a lot of post-processing on those when we moved over to the fully Nanite foliage system.
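As an aside for Unreal developers curious about that conversion: the idea is to trade expensive masked-material overdraw for real geometry by deleting triangles the opacity mask would have hidden anyway. Below is a minimal Python sketch of that core operation - the mesh and mask formats are assumed, and it illustrates the idea rather than reproducing Ninja Theory's actual Houdini network.

import numpy as np

def cull_masked_triangles(faces, uvs, mask, threshold=0.05, samples=8):
    """faces: (F, 3) vertex indices; uvs: (V, 2) per-vertex UVs in [0, 1];
    mask: 2D float array where 1 = opaque leaf, 0 = transparent card."""
    h, w = mask.shape
    kept = []
    for tri in faces:
        a, b, c = uvs[tri]
        opaque = False
        # Sample a few random barycentric points in the triangle's UV footprint.
        for _ in range(samples):
            r1, r2 = np.random.rand(2)
            if r1 + r2 > 1.0:                     # reflect back into the triangle
                r1, r2 = 1.0 - r1, 1.0 - r2
            u, v = a + r1 * (b - a) + r2 * (c - a)
            x = min(int(u * (w - 1)), w - 1)
            y = min(int(v * (h - 1)), h - 1)
            if mask[y, x] > threshold:            # any opaque texel keeps the tri
                opaque = True
                break
        if opaque:
            kept.append(tri)
    return np.array(kept)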

Mark Slater-Tunstill: It looks cool when you go into photo mode though, and like zoom right in and you realise "oh, that's actually modelled."

Digital Foundry: When you switched over to using Nanite, was there ever a moment where you went from making discrete, distance-based LODs to Nanite, where you realised after the switch that you could increase the quality of what would have been LOD 0 before, because now Nanite would render it with a greater efficiency?

Dan Atwell: Yeah, so for the equivalent of LOD 0 stuff, there's definitely a bump. It's probably not as high a poly count as you would think in a lot of instances - we got away with a 50 percent reduction in poly count on a lot of assets' Nanite meshes with no visible difference. The big bottleneck with Nanite is memory; that's the new poly count, as it were. When we hit the memory too hard, with too many super-high-poly Nanite meshes, you get an issue. So you do have to selectively go in there and start reducing polys on the Nanite meshes, but in a lot of instances you can get away with a huge amount of reduction without seeing any sort of visual fidelity drop.
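To make "memory is the new poly count" concrete, here's a back-of-the-envelope sketch of flagging the heaviest meshes for that kind of 50 percent reduction when a scene blows its budget. The bytes-per-triangle constant and the budget are invented for illustration; they are not Epic's actual Nanite figures.

BYTES_PER_TRI = 16              # assumed average for compressed geometry
BUDGET_MB = 512                 # assumed per-scene geometry budget

def flag_for_reduction(meshes, reduction=0.5):
    """meshes: list of (name, triangle_count) pairs. Returns meshes to
    decimate, heaviest first, until the estimated total fits the budget."""
    total_mb = sum(t for _, t in meshes) * BYTES_PER_TRI / 2**20
    to_reduce = []
    for name, tris in sorted(meshes, key=lambda m: -m[1]):
        if total_mb <= BUDGET_MB:
            break
        total_mb -= tris * reduction * BYTES_PER_TRI / 2**20
        to_reduce.append((name, int(tris * (1 - reduction))))
    return to_reduce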

Digital Foundry: Ah, so that explains something I noticed: in photo mode, if you zoom the camera in really far, you can see it break down a little bit, but from a normal gameplay perspective it looks really good. The detail is there, so it makes sense to limit the mesh density to save on memory resources.

Dan Atwell: And it's interesting that you say that because, when you go into photo mode, you're kind of breaking it in a way, and you've always got to remember when developing a game that it's easy to always be in the debug camera, flying around and putting in detail where you don't need it. As soon as you go into the game camera, you're missing 90 percent of that detail because of the distance you are away from the surface.

Here's the full DF tech review of Hellblade 2, with John Linneman charting its technical achievements. Watch on YouTube

Digital Foundry: Since you weren't using Lumen in that original Troll demo, I was curious which aspects of indirect lighting you were bringing over from Unreal Engine 4. Was it baked lighting with light maps, or something else?

Mark Slater-Tunstill: It was fully real-time lighting in that one, but lots of placed individual point lights, non-shadow casting lights, just to fake the bounce.

Dan Atwell: It's also worth saying, actually, that even with Lumen, we were going in and hand art-directing - hand-placing bounce cards and lights to supplement what you got from the real-time GI.

Mark Slater-Tunstill: Because our aesthetic internally was always called "cinematic realism" rather than "realism" - the way a movie set would be lit. We had extra lighting as well as the bounce lighting, because it's so important for the feeling of certain scenes and sections.

Digital Foundry: That's really interesting. When you were making that Troll demo in UE4, was the intent to handle lighting across the entire game in the more traditional method - placing a lot of very carefully hand-adjusted lights throughout the scene to get the desired effect?

Dan Atwell: Well, for us it was a continuation of Senua's Sacrifice, exactly the same way we lit a lot of that stuff as well. So we weren't really doing anything new in that respect.

Mark Slater-Tunstill: I think once we did move to Lumen, actually it freed us up a bit more, because that took care of a lot of the hard work.

Digital Foundry: You would light the scene and then go back and sort of artistically adjust things, rather than doing all the work first?

Dan Atwell: On Sacrifice, we were using real-time GI middleware called Enlighten, so it's very similar to what we ended up doing with Lumen. It was a lot slower in terms of updating, but you were getting that precomputed real-time bounce.

Digital Foundry: We'll get more into Lumen shortly, but I'm curious if you considered using the hardware path for Lumen, because as far as I know, every version of the game - PC, console, whatever - is limited to the software path. It still looks great, but it does have some negative ramifications, such as the areas around water, where you're using screen-space reflections and there are disocclusion artefacts. It's the one visual element that falls a little short of everything else in the game.

Mark Slater-Tunstill: Yeah, I totally agree with you. It was a learning process in how we balance everything in terms of performance. On console, it just wasn't an option this time. But we're still learning, and we've made some incredible profiling tools for this project. We had a self-play system that would do a run of the game overnight, where design could basically make the character go and play the game, and that would record all of the data in terms of memory, GPU and CPU. In the morning, we had a little team looking at it and we could see spikes in certain things. At some point, we'll probably look back into hardware Lumen and see if there's a way we can balance that kind of cost. But we have a limited team size, right?
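That overnight self-play setup is easy to picture as a log-scanning job: record per-frame metrics, then flag anything anomalous for the morning triage. As a rough illustration - the CSV format and column names here are invented, not Ninja Theory's tooling - a simple spike finder might look like this:

import csv

def find_spikes(path, column="gpu_ms", factor=1.5, warmup=30):
    """Flag frames whose value exceeds `factor` times the running average."""
    spikes, avg, n = [], 0.0, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            v = float(row[column])
            if n > warmup and v > factor * avg:   # skip the warm-up frames
                spikes.append((int(row["frame"]), v))
            avg, n = (avg * n + v) / (n + 1), n + 1
    return spikes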

Digital Foundry: Along those same lines, I'm wondering about the settings choices on Xbox Series S, where you don't get the SDF-style reflections, you just get SSR and everything below is cube maps. Is that a memory limitation or a GPU limitation?

Dan Atwell: That was a GPU limitation, so that was why we only used SSR on Series S. It was a tricky balance doing Series S and Series X!

Digital Foundry: From an art design perspective, I'm curious if the game was always designed around being cinemascope from the very first day of its development, or if that was something that came along over a period of time.

Mark Slater-Tunstill: It was quite an early decision, and we knew at that time it was probably going to be a little controversial. But there were very specific reasons for choosing that format. Obviously, there were various cinematographers and movies that we were referencing and looking at. But also, due to the nature of the Icelandic landscape itself, we wanted that wide format because it lent itself to showing that kind of environment much better. And it obviously wasn't just the black bars: we modelled the lens, so you have the correct barrel distortion, the correct kind of depth of field, lens flares and that kind of stuff. It's those little details that people might not notice, but subconsciously - because you're so used to watching movies and TV in that format - your brain fills in a lot of the gaps and you look at it and go, "oh my god, this really does look a lot more like a movie than some other games might." But it's a bit of a bold creative risk, because it is different to what people expect from a video game.

Digital Foundry: I really love the lens simulation; it's one of the most distinctive visual elements in the game's presentation, especially the depth of field, because it behaves more like an actual camera lens - a real piece of glass - which you don't often see in video games. It feels very authentic here.

Mark Slater-Tunstill: That was definitely a key thing for us, kind of making it behave like a real-world lens.

Digital Foundry: So one tech art detail that I'm very curious about is how the running water is done in the game. It's really obvious in the first level with the little pitter-patter and dripping of water going down the landscape. So could you tell me how that was achieved?

Mark Slater-Tunstill: So we built a tool in Houdini, actually, that would export meshes from the Unreal editor, bring them into Houdini, and then generate the flow maps down them. Then we could chunk that up into surface and side flows, and it was all rendered with the single-layer water render pass. Initially we had one giant export, and we quickly became aware that the GPU cost of this was way too big, so we had to go back into Houdini and have it spit out more chunks that wouldn't be rendered when you don't need them. But yeah, it was a Houdini-based simulation, with a mix of material and decal layers and shaders to get the scrolling right. It feels pretty nice when the water looks as though it's flowing correctly down the rock surfaces - it's not just a single direction.
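For context, a common way to derive flow maps from geometry - plausibly the core of a setup like this, though the studio's Houdini tool is certainly more sophisticated - is to point the flow down the surface's height gradient and pack the directions into a texture's red/green channels:

import numpy as np

def heightfield_to_flowmap(height):
    """height: 2D array of surface heights. Returns an (H, W, 2) array of
    flow directions remapped from [-1, 1] to [0, 1] for texture storage."""
    gy, gx = np.gradient(height)
    flow = np.stack([-gx, -gy], axis=-1)            # water runs downhill
    mag = np.linalg.norm(flow, axis=-1, keepdims=True)
    flow = np.where(mag > 1e-6, flow / mag, 0.0)    # normalise to unit vectors
    return flow * 0.5 + 0.5                         # encode for RG channels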

Digital Foundry: A different visual effect occurs in the section when you first approach a village where you'll meet your first comrade. In that section, there's a beautiful village shrouded in fog and darkness, with bursts of light and flame that light up the volumetric fog. It's gorgeous, and it's a proper dynamic light that spills onto the scenery. Can you talk about how you built this scene, because it is one of the best lit scenes I've ever seen in a game.

Mark Slater-Tunstill: That's where Lumen comes into its own, the fact that volumetric fog can be lit by individual light sources. I think we referenced Temple of Doom for that section. We wanted a really dramatic scene, and we always wanted the audio and VFX to sync together, because that's powerful. Our teams love working together and interconnecting those things. But being able to light particles and volumetrics with the same lights that are lighting the environment is quite a big thing. And you can do scenes like that now, which is fantastic from an artistic/creative point of view.

Here's the PC version of Hellblade 2, with Alex Battaglia focusing on what Unreal Engine game developers can learn from this early UE5 title. Watch on YouTube

Digital Foundry: We should talk about the audio, because that's a huge portion of the game's presentation. The timing of the audio to the gameplay beats is really, really special. I want to talk about the binaural stuff in a bit, but let's start with the workflow perspective: how do you work with the graphics team to ensure that scenes like this are possible, in terms of synchronising what's happening on screen to the sound, so that it feels like a truly crafted experience?

David García Díaz: This in particular is actually really complicated in audio. The music that's happening before and during the battle is half in-world and half crafted by us. That means that if a drum is placed in the world, the acoustic space is taken into consideration. For this, we use a tool created by Microsoft called Project Acoustics. It's complicated, but it provides the deep immersion we need, where you believe you are in a space and sounds are connected to that space.

We create a box and map all of the static meshes and the landscape, then send this to the cloud to do a lot of calculations, which gives us information about the sound emitter and listener and what's in between; it creates obstruction, occlusion and all these sonic modifiers. Then that mixes with the VFX: when there is a drum, the fire goes up, and the same in the battle. It creates this sense of connection, which is actually very important for the character we are portraying - how her brain tries to connect everything. So I think the whole scene is really powerful.
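To give a rough sense of what those baked parameters drive at runtime: occlusion typically pulls a sound's level down, while obstruction additionally muffles it. The sketch below is a generic illustration with invented parameter ranges and a simple one-pole low-pass - it is not Project Acoustics' actual DSP.

import numpy as np

def apply_acoustics(signal, occlusion, obstruction):
    """signal: mono float array; occlusion and obstruction in [0, 1]."""
    out = signal * (1.0 - 0.9 * occlusion)   # occlusion: level drop
    a = 1.0 - 0.95 * obstruction             # obstruction: darker filter
    y = np.empty_like(out)
    acc = 0.0
    for i, x in enumerate(out):              # one-pole low-pass filter
        acc += a * (x - acc)
        y[i] = acc
    return y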

Mark Slater-Tunstill: Yes, you were able to put events on particular drum beats - whether it's a light drum or a heavy drum in the level blueprint - and I could take that and say "I want a huge first burst, or little quick bursts." That was very prevalent during that combat scene at the end. The lighting on the small drum beats cycled around the arena, so you got almost a strobe effect, and then on big beats a different light lit up the entire arena.

David García Díaz: Also, there's dynamism, because if you listen to a piece of music and the drum is always placed in the same spot at the same loudness, it gets boring. In this moment, you're circling around, and your ear gets stimulated because the sound is further away, it's quieter, it's different. It creates a sense of movement, and I think it helps make the audio in that scene feel alive.
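The beat-driven lighting Mark describes maps naturally onto a small event handler: light beats strobe around the arena, heavy beats flood it. This sketch is an invented stand-in for the level blueprint logic, not the shipped code.

from dataclasses import dataclass, field

@dataclass
class Light:
    name: str
    pulses: list = field(default_factory=list)

    def pulse(self, intensity, duration, start):
        self.pulses.append((start, intensity, duration))

def on_drum_beat(kind, lights, t):
    """kind: 'heavy' or 'light'; lights: ring of arena lights; t: beat time."""
    if kind == "heavy":
        for light in lights:              # big beat: light the whole arena
            light.pulse(intensity=50.0, duration=0.4, start=t)
    else:                                 # small beat: cycle for a strobe effect
        lights[int(t * 4) % len(lights)].pulse(20.0, 0.1, start=t)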

Digital Foundry: Part of the immersion, then, stems from the use of binaural audio. That's something I've been fascinated with for decades - the whole head mic with microphones inserted in the ear canals. Is this stuff actually recorded with a real head mic, is it digitally reproduced, or are you using a combination of the two techniques to create the effect?

David García Díaz: Both! For the voices, we used a Neumann KU 100 microphone, and a lot of content is crafted by us with different tools that nowadays are incredible. There's this plugin called DLBR, for example, where you can shape movement and the width of a sound; before, that was very complicated. A big sound has volume, and translating that inside a world in a video game is very complicated, even with one emitter. Now we have things like Ambisonics, which gives you big spheres that you can rotate; they can cover a big distance and give you details of different areas without needing 250 emitters - you can just place one. And we went out to record ambiences with the binaural head because, although digitally-created binaural sounds are good, they aren't as good as ones recorded with a real microphone.

And yet, every ear is different. I don't think we have a perfect solution for binaural sound that works as well as how we hear. We hear perfectly in three dimensions, but every human is different, every ear is different, every head is different - and these microphones are normally modelled on an average head. I don't know how many tests they do to create these heads, but I know that some people just don't hear binaural audio at all. For our game specifically, I think the choice of the voices in the head is very natural, and I can't think of any other technology that can give us that effect, especially when we want the game to sound good on headphones specifically.
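For readers curious what digitally reproduced binaural audio involves at its simplest: the two strongest localisation cues are the interaural time difference (the far ear hears a sound slightly later) and the interaural level difference (it hears it slightly quieter). Real HRTF processing, as in the plugins David mentions, also filters the spectrum per ear; this numpy sketch keeps only the two basic cues, using the classic Woodworth delay model.

import numpy as np

def simple_binaural(mono, azimuth_deg, sr=48000, head_radius=0.0875):
    """Pan a mono float signal to stereo via ITD (delay) and ILD (level).
    azimuth_deg: 0 = front, +90 = hard right, -90 = hard left."""
    az = np.radians(azimuth_deg)
    itd = head_radius / 343.0 * (az + np.sin(az))   # Woodworth ITD model
    delay = int(abs(itd) * sr)
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    far = far * 10 ** (-6 * abs(np.sin(az)) / 20)   # ~6dB quieter at the side
    left, right = (far, near) if az > 0 else (near, far)
    return np.stack([left, right], axis=-1)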

How does Hellblade 2 hold up on Series S - and what about PC handhelds like the Steam Deck and ROG Ally? Oliver Mackenzie investigates. Watch on YouTube

Digital Foundry: How do you tailor the audio to sound good on a home theatre versus headphones? I prefer an Atmos setup myself, but I did end up using headphones to experience the binaural audio at its best. How do you tune the audio to work well in both cases?

David García Díaz: It's a nightmare, really. We are a very small team, so we decided to make the game sound best on headphones. When you use binaural sound with speakers, it tends to sound much louder, so we did a quick mix to align more or less with what we think is a good place. There are a number of different factors that we cannot control, and it requires a lot of time to make it work perfectly. Hopefully one day we can achieve this, but right now we decided to put all our efforts into something that we can control, which is headphones. It's a very good question, and a very hard problem to solve.

Digital Foundry: Another water detail I'm curious about is how all the shoreline rendering was done in the game. Similarly, when Senua interacts with water in cinematics, how was that done?

Mark Slater-Tunstill: We actually used an awesome marketplace plugin called Fluid Flux, which has a load of different options, including shoreline rendering and real-time 2D fluid sims - the scene where the lake drains, for example, uses Fluid Flux. It's effectively a 2D real-time fluid sim that takes a height map capture of your area, and then you can basically inject or remove water, time, all that kind of stuff. It's really awesome. We made a lot of adjustments to get it to work within our game, but the plugin was made by a guy called Christian - an amazing achievement, and something other Unreal developers can have a look at as well!

For the shoreline, it does a height map capture. You work out where the shoreline bits are, and then you can change parameters on wave speeds, wave heights, that kind of thing. And you can also make Niagara particle systems that read that same fluid data, so if there's wood debris or whatever, it reacts to all the waves perfectly. And actually, even the audio used that. So we had little blueprints placed all along the shoreline, so we knew when the wave was coming in or going out and that played the correct sound. It's not just a random motion sound, it actually is correct for where you're standing at that point.

David García Díaz: And it changed the materials of the floor, so when the water comes in, it starts being wet sand and you can play the perfect sound. The system works super well.
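The shoreline logic described here - wave direction picking the sound, water coverage switching the sand material - can be sketched as a single probe watching the sim's water height over time. The event names below are invented for illustration.

def shoreline_events(heights, wet_threshold=0.01):
    """heights: per-frame water heights sampled at one shoreline blueprint.
    Yields (frame, event) tuples for the audio and material systems."""
    prev_h, rising, wet = None, None, False
    for i, h in enumerate(heights):
        if prev_h is not None:
            now_rising = h > prev_h
            if rising is not None and now_rising != rising:
                yield (i, "wave_in" if now_rising else "wave_out")
            rising = now_rising
        if (h > wet_threshold) != wet:    # water arrived or receded
            wet = not wet
            yield (i, "wet_sand" if wet else "dry_sand")
        prev_h = h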

Digital Foundry: Speaking of effects there, I wanted to ask about a discrepancy between the gameplay reveal and the final game: the fire sequence in the giant's cave. In that original reveal trailer, it's extremely smooth and fluid, but in the final game I feel there was a little bit of degradation in terms of visual quality. I wonder if you can share why this effect in particular changed?

Mark Slater-Tunstill: I ran out of memory [laughter]. I can go a bit more in depth on that. In the original gameplay demo, I basically did a Houdini fire simulation, baked it out and we were streaming it frame by frame. Unfortunately, that way of doing it tanks your CPU - it just cannot stream it in quickly enough and pass the data over to the GPU. So I ended up having to do a version where I was interpolating multiple flipbooks, timed to switch on and off in sequence. Honestly, it was a bit of a headache getting that working, and it's one of the scenes I keep looking at going, "argh, if I had a bit more time, I probably could have done something a bit better there." But it's a learning process, and we might make different decisions next time.
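The flipbook workaround trades disk streaming for GPU blending: keep a sparse set of baked frames resident and crossfade between the two nearest on playback. A minimal sketch of that sampling, with numpy arrays standing in for texture pages:

import numpy as np

def sample_flipbook(frames, t, fps=8.0):
    """frames: (N, H, W, C) baked simulation pages; t: time in seconds.
    Linearly blends the two pages straddling t, looping at the end."""
    pos = (t * fps) % len(frames)
    i = int(pos)
    j = (i + 1) % len(frames)
    alpha = pos - i
    return (1.0 - alpha) * frames[i] + alpha * frames[j]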

Digital Foundry: Another thing I wanted to ask about is the strength of the facial expressions in Hellblade 2. The facial expressions used in cutscenes are impressive, but I was surprised by their quality during actual gameplay too, like during the fight on the beach against Thórgestr, which is something we don't see often - if at all.

Mark Slater-Tunstill: That was actually quite a big moment for us as well, because we had a bunch of the cinematics in the game by that point, and we knew that the 3Lateral face rigs with the Cubic Motion processing looked great, but we didn't yet have a pipeline for doing it in-game for Senua and Thórgestr's faces in that fight. We actually used the MetaHuman Animator pipeline. None of the faces in the game are MetaHuman Creator faces - they're all custom - but we were able to use a mix of the head-mounted cameras and the iPhone technique to get the actors to come in, watch bits of the gameplay and recreate all the facial expressions. We found that when we mapped it back on in-game, it actually looked great. Not having to go to a stage and do a full head-mounted performance, and instead being able to put that stuff on afterwards, is awesome. That fight really benefits from the interactions they have, so that MetaHuman Animator pipeline was really good for us.

Hellblade 2 faced a polarised reaction on launch, which we discussed in depth in this snippet from DF Direct Weekly. Watch on YouTube

Digital Foundry: You did mention photogrammetry earlier and I assume that applies to the individual mesh assets of the game (eg a rock) that you place in the level, but how did you generate and make the base level geometry, the terrain?

Dan Atwell: Each location is based on a real location - with some exceptions - so we started off with DEM data for the whole of the Arctic Circle. We cut out Iceland; that was maybe seven pixels per metre or something, which gives you a ballpark background level. On the play space, we did drone-based photogrammetry, using a service called DroneDeploy where we mapped out grid-based captures. We took that photogrammetry data, generated height fields, post-processed them in Houdini and stitched them into the low-resolution overworld DEM data. That was exported out as height maps and brought into Unreal Engine as the base level, which gave us essentially the topography of the whole area. On top of that, you make changes for gameplay reasons, but it gives you a good slate to work on. So it's a two or three-tier approach.
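The stitching step - feathering a high-resolution drone-capture height patch into the low-resolution DEM base so the seam doesn't show - is the glue of that multi-tier approach, and it's simple enough to sketch. Resolutions and blend width here are assumed.

import numpy as np

def stitch_patch(base, patch, y, x, feather=32):
    """Blend `patch` into `base` at offset (y, x) with a feathered border,
    so the high-res capture fades smoothly into the surrounding DEM."""
    h, w = patch.shape
    ramp_y = np.minimum(np.arange(h), np.arange(h)[::-1]) / feather
    ramp_x = np.minimum(np.arange(w), np.arange(w)[::-1]) / feather
    mask = np.clip(np.minimum.outer(ramp_y, ramp_x), 0.0, 1.0)
    out = base.copy()
    region = out[y:y + h, x:x + w]
    out[y:y + h, x:x + w] = mask * patch + (1.0 - mask) * region
    return out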

Mark Slater-Tunstill: We wanted the locations to feel believable. They're not all exactly one-to-one, but you kind of believe they could be a real place. I think that process definitely helped.

Digital Foundry: That sounds like a really powerful system. And I imagine it could create a much more detailed and realistic shoreline, for example, than a more handmade system.

Dan Atwell: Well, you do, and it takes out a big chunk of working it out. It's the same as doing photogrammetry with a rock: you're doing a big, wide brush stroke of the whole area that gives you a good base to work off. You can't do stuff like that by hand. You could lay a Google Maps screenshot down and trace it out, but it's never going to have the same level of nuance and interest.

Digital Foundry: Thinking of terrain, I really liked the sequences where you do a time jump - the camera moves across the landscape, points up towards the clouds, then swoops through them and comes right back down in a different portion of the map. How does this work within the editor, and how is it set up? I assume you're doing some sort of data streaming here, but it'd be great if you could discuss how it actually works.

Mark Slater-Tunstill: Smoke and mirrors. Streaming, yeah. We had a custom tool that allowed us to load chunks of chapters in, so you could visualise everything you needed, but for those transitions we had to stream out and stream in the next scenes we needed, which is tricky. As you've pointed out in your videos, garbage collection and unloading of assets can easily cause hitches. That's something we ran into very quickly, so a lot of code resource was put on trying to mitigate that - how do we chunk stuff up so it doesn't start hitching?

Effectively, the storage in the new consoles is so fast that you can stream stuff in pretty quickly. We actually had a slow hard drive mode for testing though, because if you're on PC, this is potentially an issue. We found that when we were doing run-throughs on slower hard drives on PC, it didn't load quickly enough, and we needed to change streaming volumes and that kind of thing.

Setting ourselves the challenge of having no cuts just made our lives really, really hard [laughter], so working out how to do these transitions - going into clouds - that kind of stuff is the obvious way. We had a load of ideas of other ways to do it, but we might save that for another day [laughter].
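The slow hard drive concern comes down to simple arithmetic: a chunk has to start loading early enough that it's resident before the player reaches it, which is why the streaming volumes had to grow for slower drives. A toy illustration, with all numbers invented:

def prefetch_distance(chunk_mb, drive_mb_per_s, player_speed_m_s, margin=1.5):
    """Distance in metres ahead of the player at which a chunk load must
    begin so it finishes in time, with a safety margin."""
    load_time_s = chunk_mb / drive_mb_per_s
    return load_time_s * player_speed_m_s * margin

# e.g. a 400MB chunk at a 4m/s walking pace: ~24m of warning on a
# 100MB/s hard drive, versus ~1m on a 2.4GB/s console SSD.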

Microsoft's own Developer_Direct gave a voice to Ninja Theory as it neared the end of Hellblade 2 development work. Watch on YouTube

Digital Foundry: What was the impetus for the PC settings menu? It does some interesting things, like a proper preview window, live CPU/GPU/VRAM metrics, and settings that you adjust one by one - and we don't see that often.

Mark Slater-Tunstill: People in the team cared about that stuff. There was an effort to give those options for PC, and it's useful to see, when you toggle something from low to high, what difference it makes, how much memory you have left, how it affects the GPU. There's definitely more we can do, and it is cool, but it requires resources to be put into it - it doesn't just naturally fall out at the end, so you have to plan that stuff.

Digital Foundry: One thing the audience always likes to talk about is frame-rate. I can understand why Hellblade 2 targets 30fps, going for that cinematic look, but I'm wondering: at any point during the development process, were you considering higher frame-rates on consoles, or was 30fps always the target, given that it isn't a fast, twitchy game?

Mark Slater-Tunstill: From the start, we always intended it to be really cinematic, and we wanted to push the characters and the lighting and everything as far as we could. The reality is that does impose some constraints. As in your analysis of the game, we pushed to have a really solid 30 rather than an uneven 60, because something consistent feels better than chasing higher frame-rates. We're still learning UE5 - it's still relatively new to us - and we'll keep learning, improving and seeing where we can make cost savings, but a consistent 30fps was a key target for console.

Digital Foundry: As you mentioned earlier, I'm curious if there was any specific moment you can think of in the game where you were fighting against production time or where you realised that Unreal has certain characteristics that you had to work against. I'm just curious what some of the challenges were on the project.

Mark Slater-Tunstill: Time is always an issue in game dev... I think for every game developer in the world, time is your biggest enemy. From a technical point of view, character rendering was a difficult one, because there's no Nanite option for characters at the moment. When we enabled virtual shadow maps, the biggest cost was the characters - non-Nanite geometry is still very expensive compared to the environments. A lot of work went into making LODs for the characters and deciding whether to turn the peach fuzz off on lower LODs. One of the patch notes said "fixed peach fuzz on Senua" [laughter].

Digital Foundry: Hellblade 2 uses Epic's TSR for upsampling and anti-aliasing, but a lot of UE5 games use FSR 2 instead - which works OK with high input resolutions, but in console games lower base resolutions can produce results that are fizzly and noisy. Can you talk a bit about the performance implications of TSR versus FSR 2, and why you ended up with TSR?

Mark Slater-Tunstill: A lot of it is just comparing the options, right? You look at it and make a call on what's going to give you the best look. All the temporal techniques introduce a bit of artefacting, and it's about making a call on what works best for your game. If you were making something that needed really sharp lines, maybe other options would work better, but for us TSR works quite well, especially for smoothing out some of the Lumen refresh artefacts you might see. Obviously on PC we give you a bunch more options from Nvidia and AMD, and you can figure out what works best on your system, but you've got to make a call on console.

Thanks to Mark Slater-Tunstill, Dan Atwell and David García Díaz for their time.
