The Criterion Tech Interview: Part One
Meet the team behind Burnout's engine evolution.
You've got a lot of memory in these consoles in comparison to the previous generation, but your disc isn't really faster, so filling the memory becomes a lot harder. The challenge was managing the data and how to store it on disc, and how you get it off the disc in time to be used by the game. So, very, very heavy decompression is key.
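As a rough sketch of the pipeline being described: compressed blocks are read from disc asynchronously while a worker thread spends CPU decompressing them, trading processor time for effective disc bandwidth. Everything here – the types, the `decompress` stand-in, the threading shape – is an assumption for illustration, not Criterion's code:

```cpp
#include <cstddef>
#include <cstdint>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <vector>

// Hypothetical compressed block as laid out on disc.
struct CompressedBlock {
    std::vector<std::uint8_t> data;   // compressed payload read from disc
    std::size_t uncompressedSize = 0; // size after decompression
};

// Stand-in for a real codec – assumed, not Criterion's.
void decompress(const std::uint8_t* src, std::size_t srcLen,
                std::uint8_t* dst, std::size_t dstLen);

class StreamingLoader {
public:
    // The disc thread pushes each block as soon as its read completes,
    // so the drive can move straight on to the next read.
    void onBlockRead(CompressedBlock block) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            pending_.push(std::move(block));
        }
        cv_.notify_one();
    }

    // A worker thread burns CPU on decompression while the disc keeps
    // reading – trading processor time for effective disc bandwidth.
    void workerLoop() {
        for (;;) {
            CompressedBlock block;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [&] { return !pending_.empty(); });
                block = std::move(pending_.front());
                pending_.pop();
            }
            std::vector<std::uint8_t> out(block.uncompressedSize);
            decompress(block.data.data(), block.data.size(),
                       out.data(), out.size());
            // Hand 'out' to the game's asset system (omitted).
        }
    }

private:
    std::queue<CompressedBlock> pending_;
    std::mutex mutex_;
    std::condition_variable cv_;
};
```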
Just like the consoles themselves, they've all got their upsides and downsides. You need to take a holistic view of all of them and everything that's there, and then you can pick balance points that just work on both.
The vast majority of our code is completely identical – very, very little bespoke code. You pick a balance point and then you tailor each one accordingly. You'll find that one upside will counter another downside...
Yes, even on PC.
We try to set budgets for things like performance, the amount of memory you're using and the bandwidth we're using from the disc, then we work out what that means in terms of constraints for the artists and designers – what we can do in the game given the technical limitations. We stick to that rigidly, enforcing it in the rules that we have. For example, when artists try to add new pieces to the world: if you stick to the rules and to the budgets, it'll work. That's the balance we try to strike as a team. We make some commitments on the technical side, like we're going to hit the frame rate, we're going to keep the latency low, but we look to the artists, the content creators, the designers to play by the rules as well and stick to the constraints we give them, and ultimately that means we'll end up with a better product. Some teams will want to give as much power as possible to the content creators, and that's fine – you don't want to rein them in too much, but that's going to give you a ridiculously choppy frame rate, or areas where the engine just isn't going to cope. We'd rather allow our tech guys to impose constraints, and as long as our team know about it up front, they're very good at getting something that looks great out of budgets that might not be as high as some other teams'.
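To picture the sort of rule enforcement being described, here is a minimal sketch of a build-time budget gate – hard per-section limits checked when content is submitted, so an over-budget piece is rejected up front. The struct names, fields and thresholds are illustrative, not Criterion's actual pipeline:

```cpp
#include <cstddef>
#include <cstdio>
#include <string>

// Illustrative budgets only – not Criterion's real numbers.
struct SectionBudget {
    std::size_t maxMemoryBytes;      // RAM footprint for this world section
    std::size_t maxDiscBytesPerSec;  // streaming bandwidth while driving through
    double maxGpuMillis;             // GPU time at the worst camera angle
};

struct SectionStats {
    std::string name;
    std::size_t memoryBytes;
    std::size_t discBytesPerSec;
    double gpuMillis;
};

// Build-time gate: content that breaks the budget is rejected up front,
// rather than discovered later as a choppy frame rate.
bool checkBudget(const SectionStats& s, const SectionBudget& b) {
    bool ok = true;
    if (s.memoryBytes > b.maxMemoryBytes) {
        std::printf("%s: over memory budget (%zu > %zu bytes)\n",
                    s.name.c_str(), s.memoryBytes, b.maxMemoryBytes);
        ok = false;
    }
    if (s.discBytesPerSec > b.maxDiscBytesPerSec) {
        std::printf("%s: over streaming budget\n", s.name.c_str());
        ok = false;
    }
    if (s.gpuMillis > b.maxGpuMillis) {
        std::printf("%s: over GPU budget (%.2f ms > %.2f ms)\n",
                    s.name.c_str(), s.gpuMillis, b.maxGpuMillis);
        ok = false;
    }
    return ok;
}
```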
You get inventive – if you've got constraints, you can be inventive to get around them.
Yes, but some of our biggest technical challenges were about supporting the artists and supporting the people building the world. We put a lot of effort into the database asset management system at the backend so we could have lots of artists working on the same scene at the same time, which a normal art package isn't really designed to do.
If they wanted to. Most of the time, if they realise that a certain bit is locked and they can't modify it, that's fine.
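A toy version of the check-out model implied here might look like the following: the backend hands out exclusive per-asset locks, so many artists can work in one scene but never on the same piece at once. The class and its API are hypothetical:

```cpp
#include <map>
#include <optional>
#include <string>

// Hypothetical sketch of per-asset locking in a shared scene database.
class AssetLockTable {
public:
    // Returns false if someone else already holds the lock;
    // the artist's tool greys that piece out instead of allowing edits.
    bool tryCheckOut(const std::string& assetId, const std::string& user) {
        auto it = locks_.find(assetId);
        if (it != locks_.end() && it->second != user)
            return false;  // locked by another artist
        locks_[assetId] = user;
        return true;
    }

    void checkIn(const std::string& assetId, const std::string& user) {
        auto it = locks_.find(assetId);
        if (it != locks_.end() && it->second == user)
            locks_.erase(it);
    }

    // Lets the UI show who holds a piece, so "it's locked" isn't a mystery.
    std::optional<std::string> lockedBy(const std::string& assetId) const {
        auto it = locks_.find(assetId);
        if (it == locks_.end()) return std::nullopt;
        return it->second;
    }

private:
    std::map<std::string, std::string> locks_;  // assetId -> user
};
```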
So it was all about putting that effort in, and we wrote stuff that would go away overnight and measure performance... flying a camera around the whole world, rendering it in many directions and telling us whether people were in budget, basically – drawing a map with big red blobs on it pointing out which bits were too expensive.
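In spirit, that overnight job might look something like this: step a camera across a grid covering the world, render in several directions at each point, and record every view that busts the frame budget for the next morning's heat map. The engine hooks (`setCamera`, `renderAndMeasureGpuMillis`) are stand-ins, not a real API:

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Assumed engine hooks – stand-ins for illustration only.
void   setCamera(Vec3 pos, float yawDegrees);
double renderAndMeasureGpuMillis();

struct HotSpot { Vec3 pos; float yaw; double gpuMillis; };

// Sweep a camera over the whole world on a grid, render in several
// directions at each point, and record everywhere that busts the budget.
// The result feeds the "map with big red blobs" the artists see.
std::vector<HotSpot> overnightSweep(Vec3 worldMin, Vec3 worldMax,
                                    float gridStep, double budgetMillis) {
    std::vector<HotSpot> hotSpots;
    for (float x = worldMin.x; x <= worldMax.x; x += gridStep) {
        for (float z = worldMin.z; z <= worldMax.z; z += gridStep) {
            // Fixed camera height for simplicity; a real tool would
            // sample the ground height at (x, z).
            Vec3 pos{x, worldMin.y + 2.0f, z};
            for (float yaw = 0.0f; yaw < 360.0f; yaw += 45.0f) {
                setCamera(pos, yaw);
                double ms = renderAndMeasureGpuMillis();
                if (ms > budgetMillis)
                    hotSpots.push_back({pos, yaw, ms});
            }
        }
    }
    std::printf("%zu views over budget\n", hotSpots.size());
    return hotSpots;
}
```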
More importantly, to explain why and ask if we can help... Just telling them it's too expensive is no help. They might just go and take some textures out and that wouldn't be any use at all.
There can be many reasons why one scene might be more expensive to render than another. It might be that someone's stuck a huge texture onto something that's absolutely tiny, killing our bandwidth, or there's a bunch of alpha polygons one on top of another spending ages chewing GPU and drawing nothing – it happens by mistake sometimes. It's just a case of giving those guys the tools and the clues to see what's going wrong and fix it.
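Both of those mistakes are mechanically detectable, which is what makes tooling for them practical – for instance, comparing a texture's resolution against the largest screen area its mesh can ever cover, or measuring how many transparent layers the average pixel passes through. A hedged sketch with illustrative thresholds:

```cpp
#include <cstdio>
#include <string>

// Heuristic 1: a huge texture on a tiny object wastes bandwidth.
// Flag assets whose texel count dwarfs the pixels they can ever cover.
// The 16x ratio is an illustrative threshold.
bool textureOversized(const std::string& asset,
                      int texWidth, int texHeight,
                      int maxOnScreenPixels) {
    long texels = static_cast<long>(texWidth) * texHeight;
    if (texels > 16L * maxOnScreenPixels) {
        std::printf("%s: %dx%d texture but covers at most %d pixels\n",
                    asset.c_str(), texWidth, texHeight, maxOnScreenPixels);
        return true;
    }
    return false;
}

// Heuristic 2: stacked alpha polygons chew GPU while drawing nothing.
// Flag anywhere the average transparent overdraw runs too deep.
bool alphaOverdrawTooDeep(const std::string& asset,
                          double avgAlphaLayersPerPixel) {
    if (avgAlphaLayersPerPixel > 4.0) {  // illustrative threshold
        std::printf("%s: ~%.1f alpha layers per pixel\n",
                    asset.c_str(), avgAlphaLayersPerPixel);
        return true;
    }
    return false;
}
```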
Just talking about the artists for a second: they would have metrics when they built the game to say, will this fit in memory or not, will this load in from disc fast enough or not and, by and large, will this render fast enough or not. It wasn't much fun to work around those constraints at times, but as long as they knew how, they could.
There've definitely been improvements throughout the course of the DLC. For example, in adding night time we actually changed the entire lighting model. Hopefully people wouldn't have noticed too much.
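As a rough illustration of why adding night touches the whole lighting model: instead of reading one fixed set of lighting values, everything has to read values blended between authored time-of-day keys. This is a speculative sketch, not Criterion's actual system:

```cpp
// Hypothetical authored lighting state at one time of day.
struct LightState {
    float sunIntensity;
    float ambientR, ambientG, ambientB;
    float fogDensity;
};

// Linear blend between two authored lighting keys.
LightState lerp(const LightState& a, const LightState& b, float t) {
    return {
        a.sunIntensity + (b.sunIntensity - a.sunIntensity) * t,
        a.ambientR + (b.ambientR - a.ambientR) * t,
        a.ambientG + (b.ambientG - a.ambientG) * t,
        a.ambientB + (b.ambientB - a.ambientB) * t,
        a.fogDensity + (b.fogDensity - a.fogDensity) * t,
    };
}

// timeOfDay in [0,1) over a full day, with keys authored at, say,
// dawn/noon/dusk/night. Supporting night means every lighting consumer
// reads blended values like these rather than constants.
LightState evaluate(const LightState keys[4], float timeOfDay) {
    float scaled = timeOfDay * 4.0f;
    int i = static_cast<int>(scaled) % 4;
    int j = (i + 1) % 4;
    return lerp(keys[i], keys[j], scaled - static_cast<float>(i));
}
```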
It became scalable in a different way – cheaper in some situations, more expensive in others, but scalable in the direction we wanted. Of course, we saw the bikes added, animation added, the physics changed quite a lot to support two-wheeled vehicles – lots and lots of behind-the-scenes evolutions. The physics changed constantly throughout the course of the DLC, especially with the island and its immense jumps – the forces that go on there are quite "special". And in terms of tools, it was about being able to create content faster and more efficiently – just to be able to produce this amount of DLC, and not break the game.
Alex is constantly coming up with good ideas on how we can improve the quality of the visuals and the speed and performance. But with the DLC, one of our biggest constraints was that we didn't want to ship that entire world again – people wouldn't enjoy downloading that again as an add-on. We were constrained by the fact that the data on the disc had to be the data we would render.
In a sense, yes. We took some decisions during Paradise because we thought we'd do some sort of downloadable content, but at the time we weren't sure of the form it would take. Some of those decisions made it easier, some of them didn't, but we certainly learnt a lot.
It's been an interesting time. One of the less pleasant surprises is that while we've been free to change the code – it's a small amount of memory, a small patch, you can change more or less all of the code – as soon as you do that, the QA team have a nightmare because they have to test the entire game again. If we change lots of code, it becomes much more expensive.
That's largely what it is.
There are huge numbers of test scripts, things that are more prone to breakage than others, things you suspect are more prone to break based on the code you've changed – so it's working with the QA guys on that, and other things like checking that your saves are going to work, that a small change to the physics hasn't broken this vehicle in this particular Stunt Run...
That's one of our challenges as a team: we have to get far better at automating this kind of testing. If we can get the game to play itself, or play parts of itself, then we can convince the QA managers that they won't need to crash into every millimetre of every wall, because we can automatically throw a million cars at a million walls and tell them that the walls no longer have any holes in them.
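"Throwing a million cars at a million walls" could be harnessed roughly like this: sample points along every wall segment, launch a car at each one in a headless simulation, and flag anywhere a car ends up outside the world. The engine hooks (`spawnCar`, `stepPhysics`, `carInsideWorldBounds`) are assumed stand-ins:

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Assumed engine hooks for a headless simulation – stand-ins only.
void spawnCar(Vec3 pos, Vec3 velocity);
void stepPhysics(float seconds);
bool carInsideWorldBounds();

struct WallSegment { Vec3 start, end, inwardNormal; };

// Fire cars at points along every wall and report any that escape,
// so QA reads a report instead of ramming each wall by hand.
int sweepWalls(const std::vector<WallSegment>& walls,
               int samplesPerWall, float speed) {
    int holes = 0;
    for (const WallSegment& w : walls) {
        for (int i = 0; i < samplesPerWall; ++i) {
            float t = (i + 0.5f) / samplesPerWall;
            Vec3 target{
                w.start.x + (w.end.x - w.start.x) * t,
                w.start.y + (w.end.y - w.start.y) * t,
                w.start.z + (w.end.z - w.start.z) * t,
            };
            // Start a short distance inside the wall, drive straight at it.
            Vec3 pos{target.x + w.inwardNormal.x * 20.0f,
                     target.y,
                     target.z + w.inwardNormal.z * 20.0f};
            Vec3 vel{-w.inwardNormal.x * speed, 0.0f,
                     -w.inwardNormal.z * speed};
            spawnCar(pos, vel);
            stepPhysics(5.0f);
            if (!carInsideWorldBounds()) {
                std::printf("hole near (%.1f, %.1f, %.1f)\n",
                            target.x, target.y, target.z);
                ++holes;
            }
        }
    }
    return holes;
}
```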
The team should be playing the game and making sure that absolutely everything is fun, rather than playing it to make sure it isn't broken.
We don't mind spending a lot of money on QA to make sure the game is fun, we don't really want to spend it making sure we haven't made any silly mistakes.
Coming up in part two: in-depth discussion with the team on the leaps in physics quality that the current generation of consoles allows for, the move from PS2 to Xbox 360/PS3, and how the evolution of PC technology merged with Criterion's parallelisation philosophy. Plus: the team talk Black, the ultimate festival of first-person destruction on the previous generation of platforms.