Call of Duty: Advanced Warfare - PC performance done right?
Can entry-level enthusiast gaming hardware match the PlayStation 4 experience?
We're going to be reading - and indeed, probably writing - a lot about PC optimisation over the next few days, but for every negative there tends to be a positive, and we wanted to show that key developers are pushing the boat out to bring us some decent PC work. When we looked at the PC version of Call of Duty: Advanced Warfare in last week's Face-Off, we were impressed by the ease with which we could attain the 'proper' Call of Duty experience - 1080p at a highly consistent frame-rate with lavish quality settings. A typical enthusiast gaming PC - featuring something like a Core i5 quad-core processor paired with a £150 graphics card like the GTX 760 - is capable of excellent results.
All of which made us wonder - just how low can we go with PC hardware to get a 1080p experience equivalent to that offered by the PlayStation 4? In terms of raw computational power, a wealth of PC components get the better of the new wave of consoles, but actually finding multi-platform games that make the most of those parts isn't easy. We're pleased to say that Advanced Warfare bucks the trend. We put together a budget gaming PC based on Intel's Core i3 4130 processor, paired with Nvidia's entry-level enthusiast graphics card, the GTX 750 Ti. We matched quality settings as closely as we could to PS4 (FXAA, low-quality texture filtering, medium shadows, depth of field and motion blur) but left everything else at max. As you can see below, the PC output turned out to be uncannily similar to the PS4 version of the game, and performance is mostly on par, if not better.
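For readers minded to benchmark along at home, comparisons like this boil down to per-frame render times. Here's a minimal sketch of the kind of summary we look at, assuming a simple FRAPS-style log of frame times in milliseconds, one per line - the filename and format are our illustration, not anything shipped with the game:

```python
# Minimal sketch: summarise a capture of per-frame render times (milliseconds),
# as produced by FRAPS-style logging tools. Filename/format are assumptions.

def summarise(frametimes_ms):
    """Return average fps and the share of frames hitting 60fps or better."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    # A frame rendered in 16.7ms or less sustains 60fps.
    at_60 = sum(1 for ft in frametimes_ms if ft <= 1000.0 / 60.0)
    return avg_fps, 100.0 * at_60 / len(frametimes_ms)

with open("frametimes.csv") as f:
    times = [float(line.strip()) for line in f if line.strip()]

avg, pct60 = summarise(times)
print(f"Average: {avg:.1f}fps, {pct60:.1f}% of frames at 60fps or better")
```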
However, the beauty of PC gaming is the ability to ramp up quality levels, creating the experience you want within the capabilities of the hardware you own. Could we extract higher-quality visual effects from our budget PC without compromising the gameplay? Raising depth of field and motion blur quality doesn't actually change overall image quality that much, but it definitely impacts performance. However, the PS4's FXAA and texture filtering issues are areas ripe for improvement, so we selected SMAA T2x anti-aliasing and ramped up anisotropic filtering as high as it would go. Performance dropped a little, but the overall experience remained similar - and overclocking the GTX 750 Ti (one of the easiest, safest and most power-efficient overclocks on any graphics card) clawed back the difference, for the most part.
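As a rough rule of thumb - and it is only that - a GPU-bound game scales close to linearly with core clock, which is why a modest overclock can claw back the cost of nicer settings. A back-of-the-envelope sketch, with an illustrative clock offset and a hypothetical baseline frame-rate (the reference boost clock is the card's published figure; nothing else here is measured data):

```python
# Back-of-the-envelope estimate of fps recovered by a core overclock,
# assuming a purely GPU-bound scene that scales linearly with clock speed.
reference_boost_mhz = 1085   # GTX 750 Ti reference boost clock
overclock_offset_mhz = 135   # illustrative offset - not a recommendation
baseline_fps = 55            # hypothetical fps after enabling SMAA T2x/16x AF

scale = (reference_boost_mhz + overclock_offset_mhz) / reference_boost_mhz
print(f"Estimated fps: {baseline_fps * scale:.1f}")  # ~61.8fps - back over 60
```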
It's all a far cry from last year's Call of Duty: Ghosts - a game that we could never run at anything approaching a locked 1080p60, not even with an overclocked Core i7 working in combination with a powerful Radeon R9 290X. In fact, even an overclocked six-core Intel workstation paired with the 11.5-teraflop Radeon R9 295X2 failed to maintain a minimum of 60fps when running at 2560x1440 - a pretty remarkable state of affairs. Advanced Warfare is different. Sledgehammer's recommended specs are a 3.3GHz Core i5, 8GB of RAM and a 4GB version of the GTX 760. That should work quite nicely for running the game fully maxed (as long as you steer clear of the super-sampling anti-aliasing options) but, as we've demonstrated, even a dual-core processor and a significantly weaker GPU still provide creditable results.
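For the curious, that teraflop figure is simple arithmetic: shader count, times two operations per clock (a fused multiply-add), times clock speed. A quick check against the R9 295X2's published specifications:

```python
# Theoretical single-precision throughput: shaders x 2 ops (FMA) x clock.
shaders = 2 * 2816          # two Hawaii GPUs, 2816 stream processors each
clock_ghz = 1.018           # up to 1018MHz
tflops = shaders * 2 * clock_ghz / 1000.0
print(f"{tflops:.1f} TFLOPS")  # ~11.5 TFLOPS
```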
Obviously, there are good points and bad points to this. On the plus side, there's a great degree of scalability - smooth frame-rates at much higher resolutions are possible, and those benefits filter back to those with standard 1080p displays via the in-built SSAA options. It also means that a good COD experience is possible on 2560x1440 and 4K monitors without requiring multi-GPU set-ups. On the minus side, maxing the game out with improved shadow, depth of field and motion blur quality represents a clear case of diminishing returns, while the extreme 120/144fps frame-rate options are only available in single-player (multiplayer has a 90fps cap).
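The reason super-sampling is such a brute-force option comes down to simple pixel arithmetic - the game renders internally at a multiple of your display resolution, then downsamples to your screen. A quick illustration (the 4x factor here is our example, not a claim about the game's exact menu options):

```python
# Super-sampling cost in raw pixel terms: 4x SSAA (2x per axis) on a 1080p
# display means rendering a full 4K image internally, then downsampling.
display = (1920, 1080)
internal = (display[0] * 2, display[1] * 2)
pixels_native = display[0] * display[1]
pixels_internal = internal[0] * internal[1]
print(f"Internal resolution: {internal[0]}x{internal[1]}")  # 3840x2160
print(f"Shading cost multiplier: {pixels_internal / pixels_native:.0f}x")  # 4x
```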
In our E3 preview of the campaign we had some concerns about performance, most of which have been resolved in the final code. However, it looks like Sledgehammer quashed its frame-rate issues through a combination of optimisation and subtly paring back features - the motion blur seen below, for example, is absent from the final game. It's interesting to note that ramping the PC's motion blur setting up to the max doesn't return this feature, which is a bit of a shame.
There's the sense that the latest Call of Duty has been built to a specific spec, and image quality improvements beyond brute-force options are fairly limited. The fact that a GTX 750 Ti can match - and indeed exceed - the visual quality set by PS4 in certain areas may also lead many to once again question the next-gen credentials of the new wave of consoles. However, to do so would be a disservice to Sledgehammer's achievements with the new Call of Duty engine, which delivers a state-of-the-art visual experience within the confines of a very tight render budget. In short, the developer has seemingly worked miracles to do much, much more with console hardware, while bringing those new technologies and optimisations across to PC owners too.
What we have here is a title with a performance profile that's a great base for console, while also making for a more inclusive experience on PC. On top of that, the scalability of the tech is impressive. We tried the game on an i7 system with a GTX 980 and found that with small tweaks to the quality levels, we could sustain frame-rates north of 50fps (often hitting the magic 60) when playing at 4K resolution. Bearing in mind that many demanding titles require dual high-end GPUs to get anywhere close to that, it's quite the achievement.
Genuine problems with the PC version are few and far between, but one area we pointed out in the Face-Off deserves more exploration - our observation that AMD graphics cards seem to create a much higher CPU load than their Nvidia equivalents do. This week, AMD released a new beta driver optimised for Call of Duty. We saw a boost in frame-rates compared to our measurements taken last week, but the underlying problem remains - and you can see the phenomenon in action in the video below.
If you have a Core i3 - or an alternative offering much the same processing performance - paired with an enthusiast AMD card, the game often hits CPU limits during both cut-scenes and gameplay, resulting in a collapse in frame-rates. It's a state of affairs only amplified by playing with v-sync engaged. In the here and now, we'd rank the AMD R9 280 as the best 'bang for your buck' GPU on the market. It outperforms its Nvidia equivalent - the generally excellent GTX 760 - and it offers an extra 1GB of onboard VRAM on top. It's usually cheaper too. However, owing to the CPU load issue, its performance is considerably throttled when paired with the i3 in Advanced Warfare, to the point where demanding scenarios actually see it bested by Nvidia's sub-£100 GTX 750 Ti.
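Why does v-sync make matters worse? With traditional double-buffered v-sync on a 60Hz screen, a frame that misses a 16.7ms refresh window has to wait for the next one, so a CPU spike that would otherwise register as 55fps gets quantised down to 30fps. A simplified model - assuming double buffering and a 60Hz display, with hypothetical frame times:

```python
import math

# Simplified model of double-buffered v-sync on a 60Hz display:
# a frame that misses a refresh window waits for the next one.
REFRESH_MS = 1000.0 / 60.0

def vsynced(frame_ms):
    """Displayed frame time after snapping up to the next refresh interval."""
    return math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS

for raw in (15.0, 18.0, 25.0):  # hypothetical CPU-limited frame times
    shown = vsynced(raw)
    print(f"{1000/raw:5.1f}fps raw -> {1000/shown:5.1f}fps displayed")
# 66.7fps -> 60.0fps, 55.6fps -> 30.0fps, 40.0fps -> 30.0fps
```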
We saw the same results both on our Core i3 gaming PC and when we disabled two cores on our i7 and downclocked it to simulate an i3's capabilities. In both cases, the Nvidia cards barely saw any difference in performance at all. Basically, if you're using an AMD card, you'll need a more powerful CPU too - the i5 2500K in the recommended spec, for example.
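For the record, we disabled cores at BIOS level, but a rough software-side approximation is to pin the game's process to two logical cores - sketched below using Python's psutil library, with the caveats that this doesn't simulate the i3's lower clocks or cache, and that the process name here is purely illustrative:

```python
import psutil

# Rough approximation of a dual-core CPU: restrict a running game process to
# two logical cores. This does not simulate lower clocks or cache differences -
# we disabled cores and downclocked for our actual tests. The process name is
# illustrative, not the game's real executable name.
TARGET = "advancedwarfare.exe"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET:
        proc.cpu_affinity([0, 1])  # pin to the first two logical cores
        print(f"Pinned PID {proc.pid} to cores 0-1")
```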
However, it's one sticking point in what is otherwise a creditable game. PC-specific features that genuinely elevate the experience may be absent, but in an age where next-gen game engines are transitioning across to PC with wildly varied spec recommendations and obvious optimisation issues, Sledgehammer's efforts here are worthy of commendation.