The Anti-Aliasing Effect
How developers are battling the jaggies.
In the wake of the discussion stirred up by the release of The Saboteur, the results of these tests were shared with the Beyond3D community, leading to this response from Johan Andersson, aka repi, rendering architect at Battlefield developer DICE:
"I did an experimental implementation of MLAA in Frostbite 2 a few weeks ago just because I wanted to see how it looks like on moving pictures," he wrote. "On still pictures it looks amazing but on moving pictures it is more difficult as it is still just a post-process. So you get things like pixel popping when an anti-aliased line moves one pixel to the side instead of smoothly moving on a sub-pixel basis. Another artifact, which was one of the most annoying is that aliasing on small-scale objects like alpha-tested fences can't (of course) be solved by this algorithm and quite often turns out to look worse as instead of getting small pixel-sized aliasing you get the same, but blurry and larger, aliasing which is often even more visible."
Sub-pixel edges are something this algorithm simply can't deal with. Take, for example, a pylon structure in the distance: there is no continuous edge to analyse, because parts of the edge aren't rendered at all, or are rendered in a different colour from the rest. In these cases the blending serves only to highlight the problem rather than smooth it away. Sub-pixel edges can be suppressed with LODs, fog and depth-of-field blur (they aren't immediately apparent in The Saboteur, for example), but that's the theory - it's not an approach that can be applied universally to every game.
"[I] also had some issues with the filter picking up horizontal and vertical lines in quite smoothly varying textures and anti-aliased that," Andersson continued. "Again when the picture is moving this change is amplified and looks a lot worse than relying on just the texture filtering. Though I think for this case one can tweak the threshold more to skip most of those areas. I still think the technique has promise, especially in games that don't have as much small-scale aliasing sources. But the importance of MSAA remains as you really want sub-pixel rendering to stably improve on small-scale aliasing in moving pictures. Or some kind of temporal coherence in the MLAA filter, which could be an interesting topic for future research. In games we aren't primarily interested in perfect anti-aliasing of still pictures, but flicker-free rendering of pictures in motion."
We ran a further test on another DICE game: Mirror's Edge. Again, the PS3 version runs with no anti-aliasing at all, and we expected that its stark colour schemes and hard edges would make for an instant improvement. The results are somewhat mixed - the sub-pixel edge issue still doesn't look great, and the edge-blurring is perhaps excessive.
It is worth bearing in mind that while the Intel code we ran produces similar results to the technique seen in The Saboteur, there are a number of crucial differences, even though both are based on the same principle of a screen-space (effectively 2D) filter. The Intel code is fairly heavy-handed in both its edge-detection and its blurring, and the artifacts Andersson points out appear far more pronounced than anything seen in The Saboteur. Pandemic's method of blending has less of an impact on texture quality, as these comparison shots - the processed 360 version up against the PS3 version - demonstrate.
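For the curious, the second stage - turning detected edges into blend weights - works roughly along these lines. This is a heavily simplified C++ sketch of the general span-and-blend idea rather than Intel's or Pandemic's actual code: real implementations classify L-, Z- and U-shaped edge patterns and compute exact coverage areas, whereas this handles only the simplest case and assumes a crossing edge at the left-hand end of each run.

```cpp
// Much-simplified sketch of the second MLAA stage: walk each run of edge
// pixels, estimate where the 'true' silhouette line would cross it, and blend
// each pixel with its neighbour across the edge in proportion to the area the
// line cuts off. Real implementations classify L-, Z- and U-shaped patterns
// and compute exact coverage; this only handles the simplest case, with a
// crossing edge assumed at the left end of the run.
#include <cstddef>
#include <vector>

struct Color { float r = 0, g = 0, b = 0; };

static Color lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// 'row' and 'rowBelow' are one scanline each; 'isEdge' flags pixels where a
// horizontal edge separates the two rows. Weights fall linearly from 0.5 at
// the start of the run to 0 at its end, mimicking a revectorised edge.
void blendHorizontalRuns(std::vector<Color>& row,
                         const std::vector<Color>& rowBelow,
                         const std::vector<bool>& isEdge)
{
    std::size_t x = 0;
    while (x < row.size()) {
        if (!isEdge[x]) { ++x; continue; }
        const std::size_t start = x;
        while (x < row.size() && isEdge[x]) ++x;   // find the end of the run
        const float length = static_cast<float>(x - start);
        for (std::size_t i = start; i < x; ++i) {
            const float t = (static_cast<float>(i - start) + 0.5f) / length;
            const float weight = 0.5f * (1.0f - t); // crude coverage estimate
            row[i] = lerp(row[i], rowBelow[i], weight);
        }
    }
}
```

How steep that weight curve is, and how readily the first stage flags edges, is broadly where implementations differ - which may go some way to explaining why the Intel sample blurs more heavily than Pandemic's version appears to.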
However, the principles involved in the Intel sample certainly give some inkling of what the future may bring in terms of improved edge-smoothing, and it's clear that the further developers push the games consoles, the more customised their solutions will become for all manner of effects within the frame-rendering pipeline, not just anti-aliasing. We already have a case where the hardware inside the GPU designed for handling MSAA is being repurposed to provide new and different effects (see the Digital Foundry interview with Sebastian Aaltonen on Trials HD).
Quite how the process of edge-smoothing will work out for the next generation remains to be seen. However, the usual technique of throwing enough power and silicon at the problem to effect a solution won't suffice going forward: part of the reason games developers are moving away from hardware anti-aliasing is that it consumes too much GPU power and bandwidth and carries some fairly heavy memory requirements. The same concerns extend to the hardware design phase of a next-gen console.
PlayStation 3 and Xbox 360 were power-hungry, hot-running designs at launch, with the platform holders relying on smaller, more tightly integrated fabrication processes to make their chips cheaper and more efficient as each console's lifespan progressed. Moving to the next generation, these so-called die-shrinks won't be so readily available. The challenge will be to provide smarter, more efficient solutions that make the silicon budget stretch that much further.
GPU makers have long been looking into custom solutions of their own. Indeed, going back to the Radeon 8500 days you can see ATI experimenting with edge-detect and luminance-based AA, using similar principles to those seen in The Saboteur. More recently, its 12x Edge Detect technology, usable today, has the memory footprint of 4x MSAA but delivers results similar to 8x MSAA - sometimes better, sometimes worse, depending on the scene - unfortunately with much the same hit on GPU performance.
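The basic idea behind an edge-detect resolve can be sketched in a few lines of C++ - again purely illustrative, and nothing like AMD's actual driver or hardware logic. Every pixel keeps its 4x MSAA samples; interior pixels get a standard box resolve, while pixels whose own samples disagree pull in their neighbours' samples too, which is how a 4x memory footprint can behave like a much wider filter along edges.

```cpp
// Hedged sketch of an edge-detect resolve in the spirit of ATI's custom-filter
// AA. The luminance-range test and the 12-sample wide filter are illustrative
// assumptions, not AMD's actual approach.
#include <vector>

struct Sample { float r = 0, g = 0, b = 0; };

struct MsaaBuffer {
    int width = 0, height = 0, samplesPerPixel = 4;
    std::vector<Sample> samples; // width * height * samplesPerPixel, row-major
    const Sample* pixel(int x, int y) const {
        return &samples[(y * width + x) * samplesPerPixel];
    }
};

static float luma(const Sample& s) { return 0.299f * s.r + 0.587f * s.g + 0.114f * s.b; }

Sample resolvePixel(const MsaaBuffer& buf, int x, int y, float edgeThreshold = 0.05f)
{
    const Sample* own = buf.pixel(x, y);

    // Edge test: do this pixel's own samples disagree in luminance?
    float lo = luma(own[0]), hi = lo;
    for (int i = 1; i < buf.samplesPerPixel; ++i) {
        const float l = luma(own[i]);
        lo = l < lo ? l : lo;
        hi = l > hi ? l : hi;
    }
    const bool onEdge = (hi - lo) > edgeThreshold;

    // Interior pixels: standard box resolve over the pixel's own samples.
    // Edge pixels: widen the filter to the horizontal neighbours' samples too,
    // paying extra resolve cost only where it is needed.
    Sample sum; int count = 0;
    const int reach = onEdge ? 1 : 0;
    for (int dx = -reach; dx <= reach; ++dx) {
        const int nx = x + dx;
        if (nx < 0 || nx >= buf.width) continue;
        const Sample* p = buf.pixel(nx, y);
        for (int i = 0; i < buf.samplesPerPixel; ++i) {
            sum.r += p[i].r; sum.g += p[i].g; sum.b += p[i].b;
            ++count;
        }
    }
    return { sum.r / count, sum.g / count, sum.b / count };
}
```

In a scheme like this the extra cost lives entirely in the resolve pass, which is why the memory footprint stays at 4x while the performance hit grows with how much of the scene is made up of edges.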
Here and now on the current-generation consoles, custom anti-aliasing solutions with the potential to outperform their hardware equivalents are a great example of how the console marketplace is maturing, and of the confidence developers have in pushing the hardware in new directions. Long may it continue.