Digital Foundry vs. OnLive
At stake: the fundamentals of gaming as we know it.
Performance Analysis
One of OnLive's many claims is that the service offers 720p60 gameplay. It's a crucial part of the marketing in that it gives the system a markedly higher performance level than the vast majority of games on the current HD console platforms. Very few console games run at 60FPS, and of those that do, fewer still can sustain that frame-rate. By targeting this higher performance level, OnLive can pitch the concept to hardcore gamers: upgrade to the cloud and never buy another piece of gaming hardware again.
Using PC-based architecture, achieving 720p60 on most games is a doddle. In PC enthusiast terms, 1280x720 barely counts as high-definition, and even entry-level enthusiast GPUs costing around £80 can offer 720p60 gameplay on most titles: certainly the console conversions that OnLive's shop is mostly stocked with, and definitely on the compromised performance profiles the system servers use.
But OnLive hardware needs to cope with multiple players making use of the same hardware, so the question is, can the servers sustain 720p60? The short answer is that it depends, but mostly it's a "no". In PC circles the performance analysis tool of choice is FRAPS, but its measurements aren't accurate for OnLive: it measures the frame-rate of the incoming video stream, not of the server-side game output.
Our method of analysis is to connect the PC running OnLive to one of our TrueHD capture cards, thus ensuring that the gameplay PC is doing nothing other than running the game, with the raw output of the machine itself completely at our disposal. From there we then run the pixel-perfect captures through our analysis tools.
On a basic level, frame-rate analysis works by counting unique frames per second, but OnLive hinders this approach somewhat because non-unique frames can look like the shots below. In the first screen we have the new frame offered up by the server. The second shot should be identical, but the compressor is still trying to resolve more detail. The analysis tool thinks it's a unique frame because it's so different, but clearly it's not.
So, an initial pass by the automated tool is then followed up by a painstaking manual check by eye to confirm accuracy. Yup, frame by frame. (The lengths we go to, etc.)
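The automated pass described above can be sketched in code. This is a minimal, hypothetical illustration of threshold-based unique-frame counting - the function names, the flat-list frame representation and the threshold value are our own assumptions for the example, not Digital Foundry's actual tool.

```python
# Hypothetical sketch: count "unique" frames while ignoring small
# differences caused by the video encoder refining the same image.
# Frames here are flat lists of 8-bit luma values (illustrative only).

def mean_abs_diff(a, b):
    """Average per-pixel difference between two frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def count_unique_frames(frames, threshold=8.0):
    """Count frames that differ meaningfully from their predecessor.

    Differences below the threshold are treated as the compressor
    resolving more detail, not as a genuinely new frame.
    """
    if not frames:
        return 0
    unique = 1  # the first frame is always new
    for prev, cur in zip(frames, frames[1:]):
        if mean_abs_diff(prev, cur) > threshold:
            unique += 1
    return unique

# A new frame, a compressor refinement of it, then a real scene change:
base = [100, 100, 100, 100]
refined = [102, 99, 101, 100]  # tiny encoder noise, same underlying image
changed = [10, 10, 10, 10]
print(count_unique_frames([base, refined, changed]))  # 2
```

The weakness of any fixed threshold is exactly the problem described above: a heavy compressor refinement can exceed it, which is why the automated pass still needs a manual frame-by-frame check.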
First up, one of the most variable gameplay experiences on the network: Assassin's Creed II. Here you can see that frame-rate varies enormously between 20FPS and 60FPS, making for a much jerkier experience than the same game running on Xbox 360, which has a lower average frame-rate but far more consistent performance.
Average frame-rates are also a bit of a problem with OnLive because - being an average - they smooth dramatic frame-by-frame variation into a single number. More pertinent is that ACII, and many of the other games, can render one frame in 16ms while the next takes anything up to 66ms or longer to arrive. This is probably a major contributing factor to the varying latency results.
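The point about averages can be demonstrated with a trivial example. The figures below are illustrative, not measured OnLive data: two streams of frame-times with an identical mean, one consistent and one with the 16ms/66ms swings described above.

```python
# Illustrative only: two frame-time sequences (milliseconds) with the
# same average, but very different worst-case delivery.

from statistics import mean

locked = [33.3] * 6                              # consistent ~30FPS pacing
erratic = [16.0, 66.0, 16.0, 50.6, 16.0, 35.2]   # same total time, wildly uneven

print(round(mean(locked), 1), round(mean(erratic), 1))  # identical averages
print(max(locked), max(erratic))                        # very different worst case
```

Both sequences report the same average frame-rate, but the second stutters between 16ms and 66ms frames - the single number hides exactly the inconsistency that matters to the player.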
Next up, Batman: Arkham Asylum. Detail levels are obviously pared back compared to the full-fat PC experience, and there's no PhysX support, but in common with Unreal Tournament III (another Unreal Engine 3 title), frame-rates are consistently high. Batman's visual make-up is good for improved consistency in the video encoder and playability isn't bad either. At 60FPS we get 166ms latency - compared to around 133ms for the Xbox 360 version.
Quite apart from the inconsistent frame-rate, there's also the issue of the graphical settings. OnLive visuals are bereft of features we'd want as standard on a PC or console title. Despite the GDC developer briefing mentioning FSAA (full-screen anti-aliasing) as a mandatory requirement at 720p, it appears to be absent from the games we tested. When an £80 GPU combined with a dual-core CPU will give 720p60 as a matter of routine on most console conversions these days (and usually with 2x multi-sampling anti-aliasing at the least), OnLive's initial claims that you're getting a state-of-the-art gaming experience from the cloud fall short.
On a related note, quite apart from the server-side mechanics, we also found that the OnLive client itself has trouble updating at 60 frames per second on lower-spec hardware. The games themselves seem to be v-synced (it aids compression), but playback on certain PCs can be compromised. This is something FRAPS can measure, but it's also something you can clearly see, as it manifests in the form of screen-tear.
The tearing is definitely generated client-side. This can be easily discerned as there is absolutely no sign of video compression artifacts around the cut between the two images. This is somewhat surprising as the 2.33GHz Core 2 Duo CPU we were using to test OnLive spec levels should easily be able to cope with a 720p60 video decode. A dedicated GPU was on hand with plenty of 2D horsepower to spare on rendering the image so that's not an excuse either.
Overall conclusions on the performance side of things are mixed. On the one hand, OnLive clearly offers higher average frame-rates than console on many of its titles. However, this is not the unqualified good it should be, for two reasons. Firstly, there is very little consistency in overall performance: video can be silky smooth one moment, horribly jerky the next. Performance can change enormously from one split-second to the next - one frame could be rendered in 16ms, the next in 66ms or with an even higher latency.
This ties in closely with the second major factor. With local gaming, higher frame-rates are typically accompanied by a much crisper, more precise degree of control, and it's exactly for this reason that games like Burnout Paradise and Call of Duty target 60FPS on console. There is no such precision with OnLive. A variable 150ms-200ms response is worse than most PS3 or Xbox 360 titles running at 30 frames per second, so there is a jarring disconnect between the higher-than-console frame-rate and the inconsistent response from the controls.
There are definite reasons why console game-makers prefer to target one specific frame-rate (typically 30FPS or 60FPS), and there are very few who run with an unlocked frame-rate in their wares. It's all about consistency in image and response. DiRT 2 might run with a slower frame-rate on console on average, but thanks to the locked 30FPS and the use of motion blur, it actually manages to look smoother than OnLive, which varies immensely between 30FPS and 60FPS.
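The locked-versus-unlocked argument comes down to frame-time jitter rather than raw throughput. As a rough sketch (the numbers are illustrative, not captured from DiRT 2 or OnLive), an unlocked stream oscillating between 30FPS and 60FPS posts a higher average frame-rate than a locked 30FPS stream, yet shows far more frame-to-frame variation:

```python
# Illustrative frame-times (ms): a locked 30FPS update versus an
# unlocked stream oscillating between 60FPS and 30FPS frames.

from statistics import mean, pstdev

locked_30 = [33.3] * 8            # DiRT 2-style locked update: zero jitter
unlocked = [16.7, 33.3] * 4       # 30-60FPS oscillation

avg_fps_locked = 1000 / mean(locked_30)
avg_fps_unlocked = 1000 / mean(unlocked)

print(avg_fps_unlocked > avg_fps_locked)         # True: higher average FPS...
print(pstdev(unlocked) > pstdev(locked_30))      # True: ...but far more jitter
```

The locked stream's standard deviation is zero - every frame arrives on the same cadence - which is why it can look and feel smoother despite the lower average, especially when paired with motion blur.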
To get some idea of the effect, PS3 owners should check out BioShock or BioShock 2 running with an unlocked frame-rate (there's a toggle in the options menu) - there you will see exactly the difference between a locked 30FPS and an uncapped frame-rate in terms of inconsistency in response and jerkiness of the on-screen image.
Few developers run with an unlocked frame-rate in their console games and the only really notable example of late has been God of War III on PlayStation 3, which uses sophisticated per-object and camera-based motion blur in an attempt to mitigate the judder. OnLive doesn't do anything like that and the difference is immediately apparent.