Greetings from the low-end side of town

If there’s one thing I’ve always envied console users for, it’s universal system compatibility. You buy a disc, pop it in, and performance (I’m ignoring TV quality here) will be the same for you and for every other player you encounter.

Not quite so for the PC gamer. I’m sure everybody is familiar with the so-called minimum requirements for a game, which brings me to the first problem: there’s huge ambiguity over what counts as acceptable game performance. In game A, meeting the minimum requirements might be just enough to load the game into memory and gaze at a slideshow of half-baked pictures, while in game B, meeting them might result in a fluid – although low-end – gaming experience.

So here’s my proposal for what a true minimum requirement would have to look like. A configuration which, averaged over the full game (the main quest or campaign if we’re talking non-linear gameplay):

  • renders at an acceptable framerate, say 25 or 30 fps,
  • at a fixed resolution (1024×768, say),
  • benchmarked on a fresh system with the given hardware.

Options should – of course – be set to low quality for this benchmark, but players should not have to deal with serious gameplay disadvantages. All vital objects should be rendered, and ‘pop-in’ (the sudden appearance of game objects at the horizon, to preserve GPU memory) should be non-obtrusive to gameplay. I’m running out of fingers counting the occurrences of a low-settings-configured game pointing me to things on the horizon which haven’t been rendered yet. A rough sketch of what such a pass/fail check could look like follows below.
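To make the idea concrete, here’s a minimal sketch in Python of such a check. It’s purely illustrative: the function name, the logging format and the 25 fps threshold are my own assumptions, not anything a publisher actually uses.

    # Hypothetical sketch: does a configuration meet the 'true minimum'?
    # frame_times_ms would be per-frame render times (in milliseconds)
    # logged over a full playthrough on a fresh system, at the fixed
    # resolution, with options set to low quality.
    def passes_minimum(frame_times_ms, target_fps=25):
        if not frame_times_ms:
            return False
        avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
        avg_fps = 1000.0 / avg_frame_time
        return avg_fps >= target_fps

    # A run hovering around 40 ms per frame (~25 fps) just scrapes by.
    print(passes_minimum([38.0, 40.0, 40.0, 41.0, 39.0]))  # True

Averaging frame times (rather than instantaneous fps readings) keeps one smooth menu screen from masking ten minutes of slideshow, which is exactly the ambiguity I’d like to kill.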

I am aware that game demos are the proverbial hook to draw new buyers in, but it is very important that they represent the state of the actual game, performance-wise. A lot of games are pirated just to check out overall performance, because the demo falls short as a benchmark. And once you’ve got a pirated copy of a game which performs so-so, who’s going to run to the retailer?

Now, I am aware that unlike consoles, not every PC system is the same, even if the hardware matches. I too shudder every time I get my hands on a box which is loaded with bloatware, degrading overall system performance. All I’m asking is for these requirements to be realistic. If an internet connection is required to activate the game, it has to be on the box. If there’s an install limit, same story.

Another thing: it is not that hard to come up with new and shiny graphical improvements for games. The algorithms are there, and hardware evolution is catching up. The bottom line is: do we really need them? In the middle of a trans-galactic firefight, who’s going to wonder whether or not the smoke on the battlefield is volumetric, has soft edges and reacts with correct physics? As a game developer, you’ve got to draw the line somewhere, because nowadays, it’s a thin line between a fantastic tech demo and a very crappy game experience.

Sure, if your game is all about smoke, you’d better make sure it’s the best damn smoke I’ve ever seen. If the game is about much more, careful planning of the performance budget is required. Case in point: Call Of Duty 4. Praised as one of the best games of 2007, it effectively balances spectacle and performance. Sure, it’s all scripted sequences and sprite-based explosions, but did anyone really mind?
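For the curious, here’s what I mean by a performance budget, as a minimal sketch. Everything in it is invented for the example: the subsystem names, the numbers and the 30 fps target are my assumptions, not Infinity Ward’s actual figures.

    # Illustrative per-frame millisecond budget for a 30 fps target.
    FRAME_BUDGET_MS = 1000.0 / 30  # ~33.3 ms available per frame

    budget_ms = {
        "gameplay & AI": 6.0,
        "physics": 5.0,
        "geometry": 10.0,
        "lighting": 6.0,
        "smoke & particles": 3.0,  # the shiny stuff gets what's left
        "post-processing": 3.0,
    }

    spent = sum(budget_ms.values())
    assert spent <= FRAME_BUDGET_MS, "over budget -- cut something shiny"
    print(f"{spent:.1f} of {FRAME_BUDGET_MS:.1f} ms spent")

The point is not the exact numbers but the discipline: every shiny effect has to pay for itself out of a fixed 33 ms, and when it doesn’t fit, it’s the smoke that goes, not the framerate.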

To conclude, some games I’ve been playing in this busy release season:

  • Left4Dead demo: I don’t know how they pull it off every time, but Valve surprised me once again with the performance of their latest build of the Source engine. I barely meet the system requirements, yet medium settings run just fine.
  • Fallout 3: It looks fugly on my rig. But it’s Fallout.
  • World Of Goo: I have yet to find a person who doesn’t play this game straight through for the first hour. The music, the animations, the puzzles … it’s actually really good.