Lost Planet: DirectX 9 vs. 10
We check out the differences between the two PC demos.
The other DX10 promise, though, apart from the visual benefits, is performance. Microsoft has proffered some real chinny-reckon percentage stats for how much faster it thinks games will run under DX10 in Vista. Unfortunately, Lost Planet turns out to be an extra insult to the framerate injuries Vista has already inflicted upon most games. With all settings save shadow quality at max, at 1600x1200 with 8x anisotropic filtering and no anti-aliasing, on the Foxconn GeForce 8800 GTS in Windows Vista, the DirectX 10 demo knocked out 26 frames per second in the first, outdoor level, and 33 in the other, indoor one. The DirectX 9 demo on the same system - still in Vista - managed 29 and 40. That's a little more than could safely be attributed to natural margin of error (confirmed by repeat tests), and thus proof that DirectX 10, in Lost Planet's implementation at least, is hungry.
The difference is also that between perfectly playable and an intermittent, distracting chug. Not a price worth paying for more motion blur, frankly. On either demo, by the way, dropping the shadow resolution and HDR levels down one notch adds five to ten frames per second with minimal visual difference, ditching the sporadic slowdown entirely on our test system. Incidentally, running the DX9 demo on the same PC but in XP added just a couple of extra frames - possibly margin of error, but more likely Vista's well-documented universal performance penalty at work again.
It's also worth noting that the DX10 demo allows shadow quality to be set one notch higher than the DX9 one. You really can't see the difference in-game unless you've got golden eyes, but it slices the frame rate quite literally in half, way down to unplayable levels on a single 8800. It's possible that 3D card driver updates will fix that, but it's certainly not a visual improvement worth dropping a second graphics card in for, in case you're pondering the SLI or Crossfire route.
While we're on 3D cards, it's been reported that Lost Planet DX10 plays like a dog on ATI's very recently released first DX10 card, the HD 2900 (which we haven't got hold of for testing just yet). Lost Planet is an NVIDIA 'The Way It's Meant To Be Played'-branded game, and, if these reports are accurate, it would seem to have a GeForce bias in its coding as a result. This is potentially something to watch out for in future DX10 games. NVIDIA and ATI alike offer graphics engine help to some developers, usually in exchange for intro screen and back-of-the-box branding, and the significant differences between the two firms' latest GPUs mean the performance gulf between them may vary wildly from game to game. The decision on which graphics card to upgrade to could become a whole lot harder. But that's another story - one for later, perhaps, if such tech-talk hasn't already dragged Eurogamer's traffic to a sleepy halt.
So now, back to games! Lost Planet, whilst having some flashes of extreme prettiness, is unfortunately not the DirectX 10 showcase we were hoping for, not even slightly. The PC is certainly still the king of graphics; it's just that lately it's been a little cackhanded when it has to prove as much. That will change later in the year - Crysis, we need a hero. Get a bally move on, will you?
As to Lost Planet's merits as a PC game, well, some answers can be had from Eurogamer's review of its Xbox 360 iteration. Speaking personally, on the strength of the demo alone, its blatantly linear tunnel structure, references to A buttons and foes with unnaturally glowing weak spots don't convince me it's entirely at home on this platform. Fancy snow effects aside, it feels pretty throwback and generic in the two levels on show here, but maybe the full version will do a better job of breaking the ice.