If you're reading this, you're probably a PC gamer. Chances are you've invested a decent amount of money in a fast graphics card, a good-sized monitor, and more cheap RAM than you really needed. I'm willing to bet you've also played some of the latest shooters on that gaming rig of yours.
If my description fits you, then you must have realized that your PC can carry much bigger loads than the lightweight Modern Warfare engine and its ilk. The sad truth is that today's games are developed with six-year-old consoles in mind, and they look the part, too. High-end gaming PCs are roughly an order of magnitude more powerful than the Xbox 360 and PlayStation 3. Playing Modern Warfare 3 on the PC is a bit like taking a Ferrari to go grocery shopping; as flashy as it might look, the resources at hand are being woefully underused.
None of that should be news to you. The question is, what happens next?
Epic Games Technical Director Tim Sweeney said in September that Unreal Engine 4 won't be ready 'til "probably around 2014." Speaking to Develop the following month, Epic President Mike Capps noted, "I want Unreal Engine 4 to be ready far earlier than UE3 was; not a year after the consoles are released. I think a year from a console’s launch is perfectly fine for releasing a game, but not for releasing new tech. We need to be there day one or very early."
Unless there's some miscommunication inside Epic, those two statements tell us the successors to the Xbox 360 and PlayStation 3 won't be out until late 2013 or early 2014. That's a long time to wait with PCs getting more powerful and game developers still forced to target the same old platforms. However, I don't think that means we have to suffer continuing stagnation in PC graphics for the next two years. There's plenty that can be done to improve visual fidelity without tessellating everything and soaking images in photorealistic shader effects.
Mainly, I'm talking about five little visual eccentricities we've been living with for far too long—eccentricities that fast PC hardware could eradicate while we wait for the next generation of games.
- Jaggies. That's a big one, and some games have already made good progress toward eliminating jagged edges, but we ain't there yet. Ideally, multisampling would be combined with post-process techniques like FXAA to smooth polygon edges accurately and to eliminate jaggies in and around transparent textures. Some games already do that, but it needs to become the norm rather than the exception.
- Shimmering. That one is trickier but just as bothersome. In all too many titles, distant textures shimmer as you move around the game world. Sometimes, it's because of inadequate antialiasing. Sometimes, even cranking antialiasing and anisotropic filtering all the way up doesn't help. It needs to be fixed. Can you imagine watching a Pixar movie where some parts of the scene shimmer as the camera pans around? Unthinkable.
- Vsync. Unfortunately, vsync is the red-headed stepchild of the PC gaming world. Hardcore gamers leave it off because it can induce low frame rates and input lag, and we reviewers keep it disabled in our benchmarks, because if we didn't, we'd be obscuring the differences between graphics solutions. I'm not saying those aren't good reasons to leave vsync off, but the fact remains: screen tearing is an ugly thing that needs to go away. Again, imagine a Pixar movie; now think how it would look with screen tearing in action scenes.
- Frame rates below a constant 60 Hz. Playing id Software's Rage earlier this year was something of a revelation. Here was a game that prioritized fluidity of motion above all else, adjusting graphical detail dynamically in order to offer consistent smoothness. And it was good. There's nothing more frustrating than walking into a wide outdoor scene and having your frame rate plummet all of a sudden. I saw that all too often in Skyrim, which was buttery smooth almost everywhere and frustratingly choppy when I was trying to take in the view from a mountaintop. That's no good. Movies don't have smooth parts and stuttery parts. Neither does real life, and neither should video games. Some of you might interject that 30 Hz is a good enough target, since that's the norm for TV, but I disagree. If you're playing a fast-paced action game, you want quick-twitch responsiveness in the blink of an eye, or quicker.
- Microstuttering. This kinda ties into the 60 Hz thing, but it's an entirely different problem. Even if a game does manage to pull off a 60 FPS average frame rate, that's no guarantee the illusion of motion will be preserved. Multi-GPU solutions are especially prone to microstuttering, but single-GPU solutions aren't immune, either. You'll want to peruse Scott's Inside the second article for all the specifics and watch the video here to see the real-world implications. In a nutshell, microstuttering is difficult to quantify but easy to see; it damages the illusion of motion by having objects skip across the screen instead of gliding smoothly. It needs to disappear one way or another. I hate to bring up the Pixar movie analogy again, but I don't recall any skipping or jittering in The Incredibles.
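The Rage-style trade of detail for consistency boils down to a simple feedback loop: if recent frames blow the 60 Hz budget, shed rendering detail; if they come in comfortably under it, restore detail. Here's a minimal sketch of that idea in Python—the names, constants, and simulated frame times are all illustrative, not id's actual implementation:

```python
# A toy feedback controller in the spirit of Rage's dynamic detail:
# shrink the render-resolution scale when frames run over budget,
# grow it back when frames run comfortably under. All values are
# hypothetical, for illustration only.

TARGET_MS = 1000.0 / 60.0   # 60 Hz budget per frame (~16.7 ms)
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_scale(scale, frame_ms, step=0.05):
    """Nudge the resolution scale toward keeping frame_ms under budget."""
    if frame_ms > TARGET_MS:           # over budget: drop detail
        scale = max(MIN_SCALE, scale - step)
    elif frame_ms < TARGET_MS * 0.85:  # comfortably under: restore detail
        scale = min(MAX_SCALE, scale + step)
    return scale

# Simulate walking into a heavy outdoor scene: four frames over budget.
scale = 1.0
for frame_ms in [22.0, 21.0, 20.0, 19.0]:
    scale = adjust_scale(scale, frame_ms)
print(scale)  # the scale has dropped below 1.0 to protect the frame rate
```

A real engine would react within a single frame and scale several quality knobs at once, but the principle—sacrifice resolution, never smoothness—is the same.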
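As for microstuttering, a tiny sketch shows why average frame rates hide it, in the spirit of the frame-time analysis from the Inside the second article: two captures with identical average FPS can look completely different in motion. The frame times below are made up for illustration:

```python
# Why average FPS hides microstuttering: compare a steady capture with
# one that alternates short and long frames. The numbers are invented.

def avg_fps(frame_times_ms):
    """Average frame rate over a list of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """The single longest frame time, in milliseconds."""
    return max(frame_times_ms)

smooth  = [16.7] * 6                          # steady, even pacing
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]   # same total time, uneven pacing

print(round(avg_fps(smooth), 1), round(avg_fps(stutter), 1))  # identical averages
print(worst_frame(smooth), worst_frame(stutter))              # very different worst case
```

Both captures average out to the same frame rate, but the second one spends half its frames at 25+ ms—exactly the skipping-instead-of-gliding effect described above, and exactly why frame-time plots beat FPS counters.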
I think those are the big ones. Rage already got us part of the way there with a hard 60 Hz target and beautifully effective vsync. Now, other games need to follow suit and iron out the other kinks mentioned above. I certainly hope AMD and Nvidia will push developers in that direction, too. After all, extra graphics horsepower can be put to good use making games look smoother, cleaner, and more seamless—graphics horsepower that would otherwise go unused... or, more crucially, un-purchased. Yes, I know about PhysX, stereoscopic 3D, and PC-only DirectX 11 eye candy, but the GPUs that come out next year and the year after that will no doubt have the brawn to handle those things with cycles to spare.
Of course, if my wishes are fulfilled, then we'll be in an interesting position when the next-gen consoles do come out. If Epic's Samaritan demo is any indication, future titles will take another step toward photorealism. I expect hardware requirements will suddenly spike, but does that mean we'll be forced to trade away silky-smooth, shimmer-free graphics for a taste of all the eye candy future games can throw at us? I certainly hope not. I hope next-gen titles will manage to offer smooth, distraction-free imagery with an added dose of realism. Otherwise, what would be the point? Photorealism with screen tearing, shimmering textures, and microstuttering wouldn't be photorealism at all.