Temporal anti-aliasing: a blessing or a curse?
TAA is a crucial tool for developers - but is the impact to image quality too great?
For good or ill, temporal anti-aliasing - or TAA - has become a defining element of image quality in today's games, but is it a blessing, a curse, or both? Whichever way you slice it, it's here to stay - so what is it, why do so many games use it and what's with all the blur? And since TAA didn't always exist, what methods of anti-aliasing came before it, and why aren't they used any more?
For around a decade, from the late 90s until circa 2010, the best anti-aliasing you could get was SSAA - super-sample anti-aliasing. The principle is remarkably straightforward: to remove the jaggies, you deploy GPU resources to render the image at a much higher resolution, then downscale. 4x SSAA on a 1080p screen effectively rendered internally at 4K, while 8x SSAA doubled the sample count again. It's the brute force method, delivering a stable image with little sub-pixel break-up and pristine edges. When the PS4 Pro and Xbox One X arrived, Sony and Microsoft didn't just sell gamers on the 4K dream, but also on the image quality benefits of super-sampling for 1080p screen owners, with games rendering at a higher resolution and then downscaling.
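To make the idea concrete, here's a minimal sketch of the SSAA resolve step in Python with NumPy - the function name and the simple box-filter average are my own illustration, not any particular engine's resolve:

```python
import numpy as np

def ssaa_resolve(hi_res: np.ndarray, factor: int) -> np.ndarray:
    """Box-filter downscale: average each factor x factor block of
    super-sampled pixels into a single output pixel."""
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# 4x SSAA for a 1080p target: render at 3840x2160, average 2x2 blocks.
frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered frame
frame_1080p = ssaa_resolve(frame_4k, 2)    # -> (1080, 1920, 3)
```

Every output pixel is built from multiple fully shaded samples, which is exactly why the results are so clean - and why the cost is so brutal.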
However, the brute force approach demands a tremendous amount of GPU resources. Using Crysis 3 on an RTX 3070, native 1080p runs at nearly 200 frames per second without hitting a CPU limit. However, 4x SSAA takes that down to 70fps, dropping to just 19fps at 8x SSAA - which tracks the brute force maths, as eight times the pixels shaded means roughly an eighth of the throughput.
The solution? MSAA, or multi-sample anti-aliasing. You can think of MSAA as similar to SSAA with one profound difference: only geometry edges are super-sampled - the rasteriser tests coverage and depth at multiple sub-pixel sample points, but the expensive pixel shading runs just once per pixel. To the best of our knowledge, Nvidia's 8800 GTX was the first GPU to support 8x MSAA, and looking at an example from back in the day - The Elder Scrolls 4: Oblivion - it lost around 35 percent of its performance running with 8x MSAA. That's significant, but nowhere near the hit in our RTX 3070 Crysis 3 example, which lost 90 percent of its throughput. Like SSAA, there are quality modes too: 2x or 4x deliver less smoothing of jagged edges but a smaller hit to performance. However, over time MSAA disappeared, and only a scant few titles use it today.
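A hedged sketch of the difference, with illustrative names rather than any real graphics API: in MSAA, only the cheap coverage test runs per sample, while the shader result is reused across all samples the triangle covers.

```python
import numpy as np

SAMPLES = 8  # 8x MSAA: eight coverage/depth samples per pixel

def msaa_pixel(tri_color, bg_color, coverage_mask):
    """One pixel on a triangle edge. The expensive shading ran ONCE
    (tri_color); only the cheap coverage test ran eight times."""
    covered = np.count_nonzero(coverage_mask)
    return (tri_color * covered + bg_color * (SAMPLES - covered)) / SAMPLES

# A pixel where the triangle covers three of the eight sample points:
edge_pixel = msaa_pixel(np.array([1.0, 0.0, 0.0]),      # shaded once
                        np.array([0.0, 0.0, 0.0]),      # background
                        np.array([1, 1, 1, 0, 0, 0, 0, 0]))
# -> a 3/8 blend: smooth geometric edges, but any aliasing inside the
#    shaded colour itself (specular, texture shimmer) goes untouched.
```

That last comment is the seed of MSAA's downfall, as we'll see shortly.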
There are several reasons for this. The first is deferred rendering, a technique that decouples lighting cost from scene complexity, allowing much richer scenes at playable performance. For example, games using deferred lighting could greatly increase the number of visible lights on screen - Killzone 2 from the PS3 era wasn't the first game to use it, but is often cited as a major example. MSAA could be integrated into a deferred pipeline, but the cost to developers was great - as was the memory footprint.
Crysis 3 does support deferred rendering with MSAA, but it highlights several problems that spelled the end of the technique's general use in games. Turning MSAA up to 8x nearly halves the frame-rate at 1080p, and at 4K it's even worse... and the actual results of its anti-aliasing are unconvincing. MSAA works on geometry, remember, which became less important as pixel shading came to define more of a game's visual make-up. Elements like normal maps, as popularised by Doom 3, showed that game lighting and object detail could be driven more by textures than by geometry. Effects like volumetric lighting and screen-space ambient occlusion (SSAO) also didn't play nicely with MSAA. Going back to Crysis 3, edge-smoothing was still present but specular aliasing (the shiny bits!) was not addressed - it looked poor.
Stop-gap measures appeared, such as FXAA and SMAA - post-process solutions that look for visible edges after the scene has already been rendered, then blur and soften those edges. It's an improvement in many scenarios and has a relatively tiny footprint on GPU resources, but it definitely softens the image and has no knowledge of previous frames, meaning each frame can receive a different treatment, leading to visual discontinuities in motion. And it's here where we see the beginnings of TAA.
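The post-process idea fits in a few lines. The sketch below is a deliberately simplified, FXAA-flavoured filter of my own invention, not the real algorithm: detect edges from luma contrast, then blend across them.

```python
import numpy as np

def post_aa(img: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Crude FXAA-style pass over a float RGB image in [0, 1]: where
    luma contrast with the neighbourhood is high, blend the pixel with
    a small neighbourhood average."""
    luma = img @ np.array([0.299, 0.587, 0.114])   # perceived brightness
    out = img.copy()
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            hood = luma[y-1:y+2, x-1:x+2]
            if hood.max() - hood.min() > threshold:          # an edge
                out[y, x] = img[y-1:y+2, x-1:x+2].mean(axis=(0, 1))
    return out

# Cheap and single-frame: edges soften, but with no history each frame
# is treated independently - hence the shimmer in motion.
```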
Temporal anti-aliasing is another form of super-sampling, but instead of downscaling from a much larger image, data from prior frames is reprojected into the current one. When it's working well, it behaves like SSAA: it cleans up jaggies of all types. Where MSAA breaks down on shader aliasing, temporal AA cleans it up quite nicely, producing a much smoother image. And since it works like super-sampling, TAA can also smooth elements that aren't made of geometry at all: in Control, the game's ray tracing effects get proper edge-smoothing and clean-up, which would never happen with MSAA. The ultimate form of TAA right now is Nvidia's DLAA (effectively DLSS rendering at native resolution), which can actually look significantly better even than standard super-sampling.
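Here's the heart of the technique as a hedged, per-pixel sketch - real implementations add sub-pixel jitter and history rejection, and the names here are mine: reproject last frame's resolved output along the motion vectors, then blend a small fraction of the new frame into that history.

```python
import numpy as np

ALPHA = 0.1  # weight of the current frame; 90% of the history survives

def taa_resolve(current, history, motion_vectors):
    """Exponential history blend: each output pixel accumulates samples
    from many previous frames, super-sampling spread over time."""
    h, w, _ = current.shape
    ys, xs = np.indices((h, w))
    # Fetch where each pixel was last frame (nearest-neighbour for
    # brevity; real TAA uses bilinear/bicubic history sampling).
    py = np.clip(ys - motion_vectors[..., 1].round().astype(int), 0, h - 1)
    px = np.clip(xs - motion_vectors[..., 0].round().astype(int), 0, w - 1)
    reprojected = history[py, px]
    return ALPHA * current + (1 - ALPHA) * reprojected
```

With a blend weight of 0.1, any single frame's aliasing contributes only ten percent of the final pixel - which is why jaggies of every kind melt away, and also a clue as to where the softness comes from.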
This impressive super-sampling property of TAA was quickly exploited by developers, often to improve performance. Effects like reflections, volumetric lighting, shadow filtering, ambient occlusion and hair rendering are rendered at lower resolutions and only really become 'whole' once TAA aggregates data from prior frames to 'finish off' the effect - see the sketch below. Developers use TAA not only to clean up jagged edges, but to optimise effects and hide where they cut corners on quality. The downside is obvious, though: if you don't want to use TAA and you can turn it off, the lower-resolution effects are easily noticeable and don't look right.
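A hedged illustration of the trick - one interleaving pattern among many, not any specific engine's method: shade only one pixel of every 2x2 quad per frame, rotating the position each frame, and let temporal accumulation reassemble the full-resolution effect.

```python
import numpy as np

def accumulate_quarter_rate(effect_samples, frame_index, accum):
    """Each frame shades only one pixel out of every 2x2 quad (a
    different one each frame); over four frames the accumulation
    buffer converges on the full-resolution effect."""
    offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
    oy, ox = offsets[frame_index % 4]
    accum[oy::2, ox::2] = effect_samples[oy::2, ox::2]
    return accum

# A quarter of the shading cost per frame - but freeze the history
# (i.e. disable TAA) and only a sparse dot pattern remains visible.
```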
However, the wins can be extraordinary, and even in pure anti-aliasing terms, the cost of TAA is relatively tiny: in Deus Ex: Mankind Divided, for example, just three percent of frame-time is occupied by TAA calculations. The technique sounds incredible when you look at its positives in isolation - better performance and better anti-aliasing than the previous tech - but there are issues, and image clarity is the obvious one.
A key issue with TAA is that it resolves a softened image. Part of this is simply the nature of anti-aliasing - even SSAA technically has a softening factor. But TAA is inherently softer still, because it integrates information from previous frames, jittering the camera ever so slightly each frame to gather new sample positions. TAA typically resolves softer than other anti-aliasing techniques by default - and you may find this pleasingly soft or annoyingly blurry.
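The jitter mentioned above is typically a sub-pixel camera offset cycling through a low-discrepancy sequence - a Halton (2,3) pattern is a common choice in shipping engines, sketched here:

```python
def halton(index: int, base: int) -> float:
    """Low-discrepancy sequence value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# Per-frame sub-pixel offsets in [-0.5, 0.5): over eight frames the
# camera effectively samples eight positions inside each pixel -
# super-sampling spread across time rather than across a bigger buffer.
jitter = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
```

Because the samples never all land dead-centre in the pixel, the converged result is slightly softer than a single centred sample would be - that's the baseline blur, before motion enters the picture.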
A big aspect of this is how far you sit from your screen and how high a resolution your game is running at. Further away, you may not notice the softness at all and can simply appreciate the macro-level benefits of TAA. Closer in, the softness is amplified and becomes more obvious - and more obvious still as screen resolution drops. 1080p display users fare much worse than those at 4K, to the point where I consider the softening at 4K to be a non-issue.
This interplay of viewing distance and resolution is key to the general problem with TAA. Console players often sit a few metres from a high-resolution television, where the softness is not so visible. On PC, however, many people play much closer to their screens, and often at lower resolutions. So if a game's developers balance their image quality options around the console experience, a large number of PC users - closer to their screens, at lower resolutions - may get a subjectively softer experience in which the flaws of TAA are much more noticeable.
Another issue with TAA is how it blurs in movement - something I show more easily in the video. Halo Infinite's TAA is one of the blurrier ones out there when the screen moves, but blurring while the camera is in motion will be found in nearly all types of TAA - it really is not a bug so much as a feature. How much you notice it is, once again, subjective, but the lower your resolution and the closer you are to your screen, the more you will see it. It also scales with motion speed, so I believe a mouse user will be more affected than a joypad user.
Another aspect of TAA is that it is frame-rate dependent. The higher the frame-rate, the closer together the sampled frames, meaning less error in the reprojection calculations and therefore higher image integrity. If you play games at a high frame-rate, this could be viewed as a positive, but I would also say it biases TAA's usefulness, making it less applicable to certain game types and graphical ambitions. Another less than favourable aspect of TAA is how it can create visible jitter in an image, but the most egregious artefact is ghosting. It's not TAA's constant state, usually, but nearly every TAA variant is capable of it, and I have come across such behaviour hundreds of times in my years of reviewing games for Digital Foundry, at any and every resolution. Even the apex of TAA - Nvidia's DLSS - can have this issue.
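Ghosting is usually stale history surviving the blend - a disoccluded pixel pulls colour from an object that's no longer there. A common counter-measure (my sketch, names illustrative) is neighbourhood clamping: constrain the reprojected history to the colour range of the current frame's 3x3 neighbourhood before blending.

```python
import numpy as np

def clamp_history(history_px, current, y, x):
    """Reject stale history by clamping it into the colour bounding box
    of the current frame's 3x3 neighbourhood. Where the clamp bites
    hard, the blend falls back towards the current frame - less
    ghosting, but also less accumulated detail, which is one reason
    TAA tuning is such a balancing act."""
    hood = current[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
    lo = hood.min(axis=(0, 1))
    hi = hood.max(axis=(0, 1))
    return np.clip(history_px, lo, hi)
```

Tune the rejection too loose and trails follow moving objects; too tight and the image shimmers and sharp accumulated detail is lost - which is why even mature implementations still ghost on occasion.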
So, is TAA a blessing or a curse? Let's put it this way: without TAA, real-time graphics could not have progressed to the point we're at now. Achievements like real-time path tracing in AAA games like Cyberpunk 2077 would not exist without the concepts TAA brought to the table. On the console side, the differences between generations would look far smaller without it: a great deal of computational power would be spent cleaning up jagged edges rather than improving lighting and geometry. The Matrix Awakens demo running on Xbox or PlayStation would not be possible without Epic's TSR solution - effectively its own take on DLSS.
For all the issues, I see TAA as a great thing - but I am merely one voice, and there is a great plurality of voices that should be heard beyond mine. For example, I play a lot of games at 4K, and that forms the basis of my experience: most of TAA's key issues are less visible there. Play at lower resolutions like 1080p, though, and the downsides of TAA are much more obvious. This matters because, per the Steam hardware survey, a huge number of users play games at 1080p.
And since TAA's downsides like ghosting and blur are more obvious at 1080p, I think the industry would do well to treat TAA like other options that add blur to games. Users should be able to turn TAA off, just as motion blur should always be toggleable - an accessibility consideration for people who suffer from motion sickness.
Furthermore, I really think there should be simple, standard alternatives available for users who don't want TAA. Take the Spider-Man PC ports by Nixxes: those games were clearly designed around TAA on PlayStation, but you can turn it off or use simple post-process anti-aliasing if you want. Sure, FXAA and SMAA aren't great at cleaning up jagged edges, but they're better than nothing and simple to offer as an alternative.
The last reason why TAA should be toggleable is future scaling. Right now, TAA exists primarily as a convenience, but in 10 years' time, users might prefer to simply super-sample their games. In putting together this piece, I loaded up a number of older titles that I could run beautifully with SSAA on my RTX 4090. So perhaps developers should allow TAA to be turned off to future-proof their games for much more capable hardware.
In the short term, however, TAA is here to stay. In a world where scaling up performance via hardware is delivering ever-diminishing returns, the emphasis shifts to software solutions to deliver more from the generational leaps we do get - and TAA is a potent tool in elevating the state of the art.