Call of Duty: Black Ops Cold War - what PC hardware do you need to match PS5?
And how does PC improve over consoles?
The arrival of a new gaming generation always shakes up the status quo - not just in the console space, but for PC owners too. With new Xbox and PlayStation hardware effectively resetting the baseline, how does your existing gaming rig stack up? Do you need to upgrade? What kind of PC kit is now required to match the console experience? We've already tested Assassin's Creed Valhalla, finding that relatively high-end PC kit is required to get the job done, while Watch Dogs Legion lowered expectations - provided you had an Nvidia RTX card. Call of Duty: Black Ops Cold War is our next port of call.
You can watch our entire process of matching PlayStation 5's visual make-up on PC in the embedded video below, but through careful testing it was possible to produce a very close match, up to and including ray tracing features, so let's get straight into the nitty-gritty. The volumetric lighting setting controls the resolution of lit volumetric fog in the game, where PS5 is closest to PC's low setting. Water tessellation controls the displacement of water, offering up more detail, and this is turned off on PlayStation 5. Other features are engaged though: motion blur on PS5 is equivalent to the PC game's 'all' setting, yet uses fewer samples in motion than the closest PC quality level, high. Meanwhile, texture quality on PS5 is equivalent to PC's maxed-out setting, as long as the high quality texture pack is installed.
The object detail setting is interesting - it controls the draw distance of odd one-off bits of environmental detail, usually only noticeable in the far distance. Through a process of elimination, PlayStation 5 looks closest to PC at medium. Other settings reveal the developers erring more towards the high end: screen-space reflections, for example, are an important aspect of the game's visual make-up, and PS5 runs at what looks like a match for PC's high preset, while the model quality option (which is self-explanatory) is again a match for PC at its best. Order-independent transparency improves sorting quality for transparent effects, with some very odd artefacting when it's disabled - however, I could find no difference between the low and high settings, so PS5 could be equivalent to either of them. Meanwhile, sub-surface scattering more realistically renders the way light interacts with skin on character models. It's an on/off setting and it's definitely enabled on consoles.
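To pull all of that together, here's how our findings condense into a PS5-matching settings sheet. This is a summary of our observations rather than any official preset, and the option names below are paraphrased from the in-game menu:

```python
# PS5-equivalent PC settings for Black Ops Cold War, as derived from
# our side-by-side comparisons. Names are paraphrased from the menu;
# values reflect what we observed, not developer-confirmed presets.
ps5_equivalent_settings = {
    "volumetric_lighting": "low",
    "water_tessellation": "off",
    "motion_blur": "all",            # sample count sits below PC's 'high'
    "texture_quality": "max",        # with the high quality texture pack
    "object_detail": "medium",
    "screen_space_reflections": "high",
    "model_quality": "max",
    "order_independent_transparency": "low or high",  # indistinguishable
    "subsurface_scattering": "on",
}
```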
Before we talk about ray tracing, the final setting to discuss is the effects preset, which throws up a certain level of ambiguity. This controls the fidelity of transparent effects like fire, and it's closest to PC's medium, though the age-old console performance trick of using lower precision buffers is in play. This demonstrates something we often find in console builds: while they have much in common with the PC versions, developers can make customised renditions of specific effects, designed to offer more bang for the buck. And often, these compromises are not present in the PC rendition of the game, even if there's no technical reason they could not be implemented. Typically, rendering transparencies into lower resolution buffers is an easy performance win, drastically reducing bandwidth consumption.
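To put rough numbers on that, here's a sketch of the bandwidth saving from halving a transparency buffer's resolution. The buffer format and per-pixel cost are illustrative assumptions on our part - we don't know the actual formats the game uses:

```python
# Rough bandwidth arithmetic for a transparency buffer at 4K.
# The RGBA16F format (8 bytes per pixel) is an assumption for
# illustration, not a figure measured from the game.
bytes_per_pixel = 8
full_res = 3840 * 2160
half_res = 1920 * 1080   # half resolution on each axis

full_mb = full_res * bytes_per_pixel / (1024 ** 2)
half_mb = half_res * bytes_per_pixel / (1024 ** 2)
print(f"full res: {full_mb:.1f}MB per pass, half res: {half_mb:.1f}MB")
# -> full res: 63.3MB per pass, half res: 15.8MB - a 4x cut in the
#    bandwidth consumed every time the buffer is written or blended.
```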
With standard rasterisation out of the way, the ray tracing settings are intriguing. Ray tracing is supported on PS5 and Xbox Series X in the form of ray traced shadows, but I swiftly discovered that even when enabled, RT isn't present throughout the game. For example, it seems that the Fractured Jaw mission on consoles does not use ray tracing at all - presumably because the nature of the content is just too taxing to make the effect viable with a target 60fps. Memory usage of RT is also significant, which may offer one reason why Xbox Series S does not feature ray tracing at all. Another aspect to factor in is the additional CPU burden: RT requires a BVH (bounding volume hierarchy), effectively a copy of the scene geometry to shoot rays at in order to calculate the relevant effects. Fractured Jaw is a very complex stage, so the cost of setting it up for RT will not be insignificant. Consoles drop back to standard shadow maps here, and it's interesting to note that PS5 seems to possess shadow quality in excess of PC's ultra preset.
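For the curious, here's a minimal sketch of what a BVH looks like conceptually - a toy illustration of the data structure, not the game's (or any RT API's) actual implementation:

```python
# Toy sketch of a BVH (bounding volume hierarchy), the acceleration
# structure ray tracing needs: a tree of nested bounding boxes with
# actual triangles at the leaves, so rays can skip most of the scene.
from dataclasses import dataclass

@dataclass
class AABB:
    min_xyz: tuple   # corner of the axis-aligned bounding box
    max_xyz: tuple

@dataclass
class BVHNode:
    bounds: AABB                # box enclosing everything beneath this node
    left: "BVHNode" = None      # interior nodes have children...
    right: "BVHNode" = None
    triangles: list = None      # ...leaves hold the actual geometry

# Dynamic objects force parts of this hierarchy to be rebuilt or
# refitted every frame - the extra CPU burden mentioned above, and why
# a dense stage like Fractured Jaw is costly before a single ray fires.
```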
Actually putting all of this knowledge to the test in practical conditions is challenging, owing to the implementation of dynamic resolution scaling on the console versions - meaning an ever-shifting pixel count dependent on GPU load. However, the air strip set-piece in Turkey throws up an interesting anomaly: on PlayStation 5, it seems that DRS is disabled, and all of my pixel counts suggest native 4K rendering throughout - and yes, performance can't sustain the 60fps target as a consequence, hitting a 45.6fps average across the sequence. This is our best shot at directly stacking up console vs PC, but there's a very important caveat to factor in: the lower precision effects buffer on PS5, which we can't replicate on PC and whose performance penalty we can't begin to measure on our GPUs. Put simply, ballpark is the best we're going to get.
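As a point of reference, dynamic resolution scaling typically works along these lines - a simplified sketch of the general technique rather than Treyarch's actual heuristic:

```python
# Simplified dynamic resolution scaling loop: nudge the render scale
# up or down based on how close the last frame came to its budget.
# A generic sketch of the technique, not the game's real logic.
TARGET_MS = 1000.0 / 60.0   # 16.7ms budget for a 60fps target

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_MS:              # over budget: drop resolution
        scale *= TARGET_MS / last_frame_ms
    else:                                      # headroom: creep back up
        scale = min(1.0, scale * 1.02)
    return max(0.5, scale)                     # clamp to a floor, e.g. 50%

# On PS5's air strip sequence DRS appears disabled, so the scale stays
# pinned at 1.0 (native 4K) and frame times blow past budget instead -
# hence the 45.6fps average we measured.
```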
Regardless, at the top end, an RTX 3090 delivers an 81.2 per cent boost to performance in this segment at equivalent settings, while RTX 3070 is just 8.6 per cent faster. An RTX 2070 Super can't match PlayStation 5 - in fact, it's 20 per cent slower. On the AMD side of things, I found the RX 6800 XT's result to be off the pace: it has 72 compute units against the 36 inside PlayStation 5, it's based on the same architecture, and clock speeds are broadly equivalent, yet it delivered just 29.4 per cent more performance. Whether it's an optimisation issue or a driver issue, I expected more.
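Translating those percentage deltas into approximate frame-rates against the PS5's 45.6fps average is simple arithmetic:

```python
# Convert the measured percentage deltas into approximate frame-rates,
# using the PS5's 45.6fps average in this sequence as the baseline.
PS5_FPS = 45.6

deltas = {
    "RTX 3090": +0.812,        # 81.2 per cent faster
    "RX 6800 XT": +0.294,      # 29.4 per cent faster
    "RTX 3070": +0.086,        # 8.6 per cent faster
    "RTX 2070 Super": -0.200,  # 20 per cent slower
}

for gpu, delta in deltas.items():
    print(f"{gpu}: ~{PS5_FPS * (1 + delta):.1f}fps")
# -> RTX 3090: ~82.6fps, RX 6800 XT: ~59.0fps,
#    RTX 3070: ~49.5fps, RTX 2070 Super: ~36.5fps
```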
It's an interesting exercise, but in this case mostly an academic one in several respects. Beyond the difficulty of precisely matching settings - and the fact that we can't access dynamic resolution scaling on PC (more's the pity!) - it's not entirely representative of actual use-case scenarios. Take the RTX 2070 Super, for example. While our test shows a significant performance deficit up against PlayStation 5 in like-for-like conditions, engaging DLSS quality mode gives the GPU a 17.7 per cent performance advantage over the console - a stat which only rises when dropping to DLSS balanced or performance modes, both of which still look great at 4K resolution. I'd also say that quality mode delivers improved image quality over native rendering.
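It's worth quantifying that swing: moving from 20 per cent behind PS5 at native resolution to 17.7 per cent ahead with DLSS quality mode implies a hefty uplift for the same GPU:

```python
# The DLSS swing for the RTX 2070 Super, derived from our measured
# deltas against the PS5's 45.6fps baseline in this sequence.
PS5_FPS = 45.6
native_fps = PS5_FPS * (1 - 0.200)   # 20% slower than PS5 -> ~36.5fps
dlss_q_fps = PS5_FPS * (1 + 0.177)   # 17.7% faster than PS5 -> ~53.7fps

uplift = dlss_q_fps / native_fps - 1
print(f"Implied DLSS quality mode uplift: {uplift:.1%}")  # -> ~47.1%
```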
All told, this has been a fun way to 'take the temperature' of console launch titles and to see what kind of performance the new machines can deliver - and the results are impressive. When we carried out a similar exercise back in 2014, we found that a sub-£100 graphics card could match Xbox One and PlayStation 4 on key titles - a situation that clearly is not the case this time around, suggesting that Microsoft and Sony have been significantly more ambitious in delivering a generational leap. And of course, this is just the beginning. As the generation progresses, we'll see developers get much more from the hardware - and it'll be fascinating to see how PC evolves to keep its technological edge.