Assassin's Creed: Unity and the stupidity of #resolutiongate

Assassin's Creed: Unity won't display at 1080p, even though the previous one did. Are we going backwards? Does it matter?

There's been a lot of talk about resolution recently, which I enjoy because watching people work themselves into a rage over numbers amuses me.

If a developer dares to suggest that there are resolutions other than 1080p (or, for PC types, a minimum of 1080p) that might be pretty okay to look at, or that a particular genre works perfectly well at some other frame rate than 60 FPS, or - scandalous - that developers can make aesthetic choices about these matters, they are howled down in the comments section. The numbers are supposed to go up, they scream. I won't enjoy this game unless it has bigger numbers than the one before.

We are obsessed with numbers going up, to the point where it seems we'd rather see that happen than anything genuinely new, interesting or beautiful. Forget about resolutions, will you? They don't matter.

To understand why resolution and frame rates don't matter that much we need to talk about a couple of other things: how graphics have evolved over the years, and how we engage with that evolution.

How many console generations have you been around for? This one we just had was my fifth, but the PS3 and Xbox 360 era was so abnormally long that it's quite possible some of you have only been really paying attention for one or two, even if you lived through a couple more.

The evolution of triangular boobs

If you can cast your mind back to the end of the 16-bit era you'll recall one of the most dramatic transitions in terms of graphics. The change from mostly 2D, sprite-based games to the explorable 3D worlds of the 32-bit era was one of the most obvious graphical evolutions the industry has ever produced. The next three generations haven't introduced anything as groundbreaking, despite the massive leaps in graphics technology, and that, unfortunately, makes generational leaps much harder to sell.

Although 3D displays gave it a half-hearted go, I don't think we'll see another revolution like that again until we embrace VR or direct brain interfaces or [insert future technology here]. Everything that has happened since we hit the "world you can run around in" point with the magic of polygons has been improving on that formula.

The improvements have been amazing, of course. Just compare any 3D game developed specifically for PS4 and Xbox One with one from that first generation of 3D world games - the first Tomb Raider, for example.

When you look at a 2014 and 1996 game side by side, the nearly two decades between them are super obvious. "Ah," you say. "The power of new hardware."

But of course it's not just the hardware, and the proof of that is when you look at games within the same generation. Tomb Raider, with its triangle boobs, mysteriously flat lighting, and tiny enclosed spaces (no, they really were tiny; they felt huge to you, but they weren't. Check the Anniversary designer commentary) isn't a patch on Tomb Raider 3, which appeared on many of the same platforms just two years later.

In that space of time, Core Design (and everyone else) learned a great deal about how to make games look good. It's not just a matter of improving optimisation, so that more polygons can be packed into a console's limited memory: it's about numerous effects that you only notice if you really look for them. Water. Lighting. Physics. Particles. Dynamic animations. The Lara Croft of Tomb Raider 3 isn't shaped like a Toblerone, but she also walks under an open sky, which occasionally rains. She tramps through water that doesn't look like sheets of paper on a treadmill. Her hair hangs behind her and is affected by gravity, inertia and momentum as she flips about the environment.
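To give a sense of what even the ponytail alone involves on the back end, here's a minimal, generic sketch of the textbook way this kind of effect is usually built - a chain of points integrated with Verlet and held together by a distance constraint. To be clear, this illustrates the general technique, not Core Design's actual code, and every name and number in it is made up for the example.

```python
# A generic sketch of "hair that reacts to gravity, inertia and momentum":
# a Verlet-integrated chain of points. Not any studio's real implementation;
# all values are illustrative.

GRAVITY = (0.0, -9.81)
DT = 1.0 / 60.0          # one simulated frame at 60 FPS
SEGMENT_LENGTH = 0.1     # fixed distance between neighbouring points

# One strand of hair; point 0 is pinned to the character's head.
strand = [{"pos": [0.0, -i * SEGMENT_LENGTH], "prev": [0.0, -i * SEGMENT_LENGTH]}
          for i in range(8)]

def step(strand, head_position):
    # Verlet integration: velocity is implicit in (pos - prev), which is what
    # gives the strand its inertia and momentum as the character moves.
    for p in strand[1:]:
        vx = p["pos"][0] - p["prev"][0]
        vy = p["pos"][1] - p["prev"][1]
        p["prev"] = list(p["pos"])
        p["pos"][0] += vx + GRAVITY[0] * DT * DT
        p["pos"][1] += vy + GRAVITY[1] * DT * DT

    # Pin the root of the strand to the (moving) head.
    strand[0]["pos"] = list(head_position)
    strand[0]["prev"] = list(head_position)

    # Relax the segment-length constraint a few times so it stays a strand.
    for _ in range(5):
        for a, b in zip(strand, strand[1:]):
            dx = b["pos"][0] - a["pos"][0]
            dy = b["pos"][1] - a["pos"][1]
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            correction = (dist - SEGMENT_LENGTH) / dist * 0.5
            if a is strand[0]:           # never move the pinned root
                b["pos"][0] -= dx * 2 * correction
                b["pos"][1] -= dy * 2 * correction
            else:
                a["pos"][0] += dx * correction
                a["pos"][1] += dy * correction
                b["pos"][0] -= dx * correction
                b["pos"][1] -= dy * correction

# Drag the head sideways for a second of frames: the strand trails behind,
# then swings back under gravity.
for frame in range(60):
    step(strand, head_position=(frame * 0.02, 0.0))
```

That's the cheapest possible version of one small effect, running once per frame for one strand, before anyone worries about collisions, wind, or the other several dozen systems competing for the same frame budget.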

These changes and their modern equivalents may seem small, but on the back end they're not. They take huge amounts of work and present tremendously difficult problems for developers. Tech like Unreal and CryEngine, as well as various in-house engines like Square Enix's Luminous, is making great strides in getting better and better and better. We're seeing huge improvements in the simulation of cloth, and the way light hits skin. Fog that eddies and swirls at your feet, and enemies who react to attacks as they come rather than when they finish their previous animation.

Developers do their best to communicate these evolutions, but as Crytek has said, it's increasingly "difficult to wow people", because the things that are changing in graphics tech are subtle. There are no more huge leaps from 2D to 3D, from a handful of textures to oodles of them, from closed worlds to open ones.

Despite the great progress being made, I think we're a long way off another major evolution in graphics tech of the sort we saw back in the 1990s. I suspect we'll make the jump to VR (a genuine revolution in gaming despite the "worse" graphics) or, you know, direct brain interfaces or whatever, before graphics tech does anything so massive that you stop and stare the way you did a few transitions ago.

Making games with graphics that meet generational expectations is alarmingly expensive, and the fidelity we see in tech demos simply isn't possible in interactive games, and may not be for a very long time. The next major evolution in graphics tech may well be something we, as the end user, don't really appreciate: better tools to automate the creation of graphics, taking a lot of the work and more importantly money out of making lusciously beautiful environments.

That's what's happening this console generation. Graphics are "better", sure, but the things that are better are hard to communicate and go largely unappreciated by the mass market who don't ever stop to think about how what they're seeing is made.

The numbers racket

Now let's wind it back, and talk about what happened a few generations ago. My colleagues may start to squirm in their seats a bit here, but I blame consumers, developers and publishers just as much: between the lot of us, we created a narrative and a dialogue space about graphics that was dominated by numbers.

This is an old problem. There's a joke that German games writers always ask how many polygons are on-screen in any game, but we all do it. We've been doing it for years. SHOOTY MCBANGBANG HAS MORE POLYGONS ON SCREEN THAN ANY GAME IN HISTORY - that's a story you and I can understand. It makes you click things, and it makes you buy games. Next time a game comes out, you naturally want to know how many polygons it has, and if it is more, you are happy.

So when we made the jump to PS3 and Xbox 360, we all together made HD a Thing, in order to sell a lot of video games. (Selling a lot of video games is good for everyone, see. The more video games that sell, the more diverse they are, and the more there is for everyone to enjoy, and hopefully get paid for.)

HD was a really compelling story. At the time the console transition happened, HD TVs still weren't super common. It's hard to remember, because it was a long time ago, but a lot of people hooked their first PS3 or Xbox 360 up to a CRT TV with an AV cable. Now everyone has a flatscreen; you can't even buy a CRT in most countries, and you have to pay recyclers to take them away.

The difference between an SD display and an HD display is immediately noticeable. It's not like the PS3 and Xbox 360 didn't have plenty to offer besides a higher max resolution, but you know how it is during console transitions - every game is being made cross-gen, and nobody knows how to optimise the damn things yet, so the difference between the old and new versions of a game isn't so noticeable that you can really sell consoles with it. (Many hardcore gamers - myself among them! - buy them based on the shimmering promise of what's to come. But we make up a very small fraction of the consumer base Sony, Microsoft and Nintendo are chasing.)

HD still hasn't really taken off in Japan. Everyone might have a great TV, but a lot of the most popular games aren't designed to take advantage of it. In the west, though, it became the dominant conversation about PS3 and Xbox 360. Suddenly everyone needed an HD display and had strong opinions on upscaling. The fact that the PS3 was initially the only console with "true" HD games (1080p) while Xbox 360 meandered along at 720p was a beautiful gift to flaming fanboy wars everywhere.

It went away after a while. Xbox 360 tech caught up with the PS3's (which had other, much more difficult optimisation walls to climb before approximate platform parity was achieved), games were either 1080p or they weren't, and only PC gamers carried on talking about not wanting pixels stretched across their screens.

Suddenly, a few years ago, it all came back: now everything had to be in 1080p, and it had to be 60 FPS. If it wasn't, there was a problem. The PC Master Race rose ascendant as ageing consoles failed to meet the challenge, although it was regularly brought crashing down by being stuck with resolution and frame rate-locked ports.

This recent generation has fed the drama extravagantly. The Xbox One's slight handicap led to platform disparity during the first few months of its lifetime, and oh boy, does that make a story. As games media, it's impossible not to talk about, no matter how bored you might be of doing so; fail to mention every single instance and you're showing some bias or other. And it is genuinely interesting, as one measure of how hardware is stacking up in these early, bitter battles, to see how well developers are able to optimise for each.

But there are so many other measures of a console or game's success, and its technical achievement, than this arbitrary standard of 1080p/60 FPS that we've all decided is the Holy Grail of gaming. We've decided that 60 FPS is better because it's twice as much as 30 FPS, or because we think it has something to do with refresh rates, or because we've been told it's better for shooters or racers or fighters for quite nebulous reasons. We've decided that no matter what resolution our display actually is, a game needs to fit it pixel-for-pixel, and anything else is substandard because we can definitely see the difference.
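For what it's worth, the raw arithmetic behind those numbers is easy to lay out. The sketch below just crunches pixel counts and per-frame time budgets for a few common resolutions and frame rates; the figures are generic illustrations, not anyone's confirmed performance targets.

```python
# Back-of-the-envelope numbers behind the 1080p/60 FPS argument. These are
# generic illustrations, not claims about any particular game's targets.

resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
full_hd = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / full_hd:.0%} of 1080p)")

for fps in (24, 30, 60):
    print(f"{fps} FPS: {1000 / fps:.1f} ms to simulate and render each frame")

# 1080p at 60 FPS means shading roughly 2.25x the pixels of 720p in half the
# frame time of 30 FPS - about 4.5x the raw pixel throughput, before any crowd
# AI, physics or lighting work gets a look-in.
```

Which is the whole point: the "bigger numbers" aren't free, they're paid for out of the same frame budget as everything else a developer might want to do.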

The resolution conspiracy

Whatever, people; if you want to insist that only 1080p 60 FPS will do, good luck with that boycott. I'm sorry you'll miss out on some really interesting games because of it. You will, though: although this generation of consoles is no doubt capable of 1080p 60 FPS when a developer decides - technical limitations aside - that it's the best aesthetic choice, it will be a while before the tools used to build games are optimised well enough for this generation to make that achievable without compromise.

Here we hit the crux of the issue: compromise. Armchair developers seem to believe that it's easy to reach the standards they've decided are best for games, and there's a lot of wild talk about Ubisoft (and others) deliberately holding back games when they could have made them "better". When you take off your tinfoil hat and rejoin the human race, you'll realise that no publisher willing to throw millions of dollars at a project is going to deliberately keep it from being its best. Why would it do that? Ubisoft didn't do that with Assassin's Creed: Black Flag - it bloody well patched the damn thing at the first opportunity, even though that made the PS4 version "better" in this stupid competition of numbers. It doesn't care what Microsoft thinks; it wants you to buy its game because it's the best game there is.

Ubisoft has said there may be "thousands" of active AI NPCs at any time.

If we assume that a developer has elected to use a lower resolution and frame rate due to technical limitations rather than aesthetic choice, then there's still no reason to start frothing at the mouth (not that it ever stops you). You have to ask yourself: what is it about this game that makes it different to others that came before it, which did reach these standards I have decided are the best ones?

Ubisoft has made no secret of it. Ubisoft Montreal chose to devote its resources to developing a new kind of city for Assassin's Creed: Unity. Unlike Rogue, which is an iteration, Unity is, well, a revolution - it makes big changes to the underlying tech in order to build a new kind of experience for the player. That means real crowds, not two dozen cookie-cutter extras filtering endlessly through a few select streets. It means AI that reacts to the player in more realistic ways than exclaiming in surprise every few seconds. It means buildings built on a 1:1 scale, and the largest single city ever seen in the series. It means seamless co-op (multiplayer is never friendly to resolution and frame rate), customisation, better stealth and parkour - everything Ubisoft has been trying to message while you've been marching up and down shouting "Give me 1080p 60 FPS or give me death - sorry, a comment thread to complain in!"

Far Cry 4 creative director Alex Hutchinson recently said he thinks all this talk of resolutions and frame rates is a product of the echo chamber of comments threads, which is certainly a possibility. He'd play a game even if it had SNES graphics, he said, as long as it was doing something cool and new and interesting.

So would I, Hutch (I can call him Hutch because he's Australian and we're probably related). But I have to admit the visuals are important to the experience - in a way that goes far beyond numbers and standards. When I go back and play older open world games, I'm immediately struck by how much less beautiful they are than current games, and it makes them less fun.

That said, it's not just the textures, lighting, cloth, particle effects, shadows and water - things I notice more and more. It's the empty spaces; the clusters of useless assets; the gormless two-dimensional NPCs. It's the feeling that, apart from smashing a few crates, my actions make no real impact on the world around me. It's the stiff and staid way I'm funnelled between invisible walls rather than being free to explore the space. It's the way the game cannot cope with my attempts to engage with it.

These are all part of the visual experience of a game, and they mean a lot more to me than how many hours someone spent painting a rock. If Ubisoft delivers on its promises to build the best, most realistic and, above all, most alive city video gaming has ever seen, then I don't care if it runs at 720p and locks me to 24 FPS.

If you do, I wonder what it is you want out of gaming - an experience, or a tech demo for your expensive gadgets?
