Now for a whole mess of issues
The Radeon R9 295 X2 is a multi-GPU graphics solution, and that very fact has triggered a whole mess of issues with a really complicated backstory. The short version is that AMD has something of a checkered past when it comes to multi-GPU solutions. The last time they debuted a new dual-GPU graphics card, the Radeon HD 7990, it resulted in one of the most epic reviews we've ever produced, as we pretty much conclusively demonstrated that adding a second GPU didn't make gameplay anywhere near twice as smooth as a single GPU. AMD has since added a frame-pacing algorithm to its drivers in order to address that problem, with good results. However, that fix didn't apply to Eyefinity multi-display configs and didn't cover even a single 4K panel. (The best current 4K panels use two "tiles" and are logically treated as dual displays.)

A partial fix for 4K came later, with the introduction of the Radeon R9 290X and the Hawaii GPU, in the form of a new data-transfer mechanism for CrossFire known as XDMA. Later still, AMD released a driver with updated frame pacing for older GPUs, like the Tahiti chip aboard the Radeon R9 280X and the HD 7990.

And, shamefully, we haven't yet tested either XDMA CrossFire or the CrossFire + 4K/Eyefinity fix for older GPUs. I've been unusually preoccupied with other things, but that's still borderline scandalous and sad. AMD may well have fixed its well-documented CrossFire issues with 4K and multiple displays, and that testing needs to happen soon.

Happily, the R9 295 X2 review seemed like the perfect opportunity to spend some quality time vetting the performance of AMD's current CrossFire solutions with 4K panels. After all, AMD emphasized repeatedly in its presentations that the 295 X2 is built for 4K gaming. What better excuse to go all out?


Source: AMD.

So I tried. Doing this test properly means using FCAT to measure how individual frames of in-game animation are delivered to a 4K panel. Our FCAT setup isn't truly 4K capable, but we're able to capture one of the two tiles on a 4K monitor, at a resolution of 1920x2160, and analyze performance that way. It's a bit of a hack, but it should work.
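
For the unfamiliar, the way FCAT works is that an overlay tints a bar along the edge of every rendered frame with the next color in a known repeating sequence. The analysis step then scans that bar in the captured video to find where one rendered frame ends and the next begins, and to flag "runt" frames that occupy only a sliver of the screen. Here's a rough sketch of that counting step; the colors, runt threshold, and data layout are simplified stand-ins for illustration, not the actual tool's internals.

```python
# Sketch of FCAT-style frame counting from the overlay bar colors. The
# overlay cycles through a known color sequence, one color per rendered
# frame, so each color run down the captured image is one frame's scanlines.
from itertools import groupby

RUNT_THRESHOLD = 20  # scanlines; an assumed cutoff, not FCAT's exact value

def count_frames(bar_colors):
    """bar_colors: the overlay bar's color on each scanline, top to bottom."""
    runs = [(color, sum(1 for _ in grp)) for color, grp in groupby(bar_colors)]
    runts = [color for color, height in runs if height < RUNT_THRESHOLD]
    return len(runs), runts

# One captured 2160-line frame containing three rendered frames, one a runt:
capture = ["lime"] * 1000 + ["aqua"] * 8 + ["red"] * 1152
print(count_frames(capture))  # (3, ['aqua'])
```

Note that a stray band of garbage pixels in that bar registers as spurious color transitions, which is exactly why a noisy capture wrecks the counting.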

Emphasis on should. Trouble is, I just haven't been able to get entirely reliable results. It works for GeForces, but the images coming in over HDMI-to-DVI-to-splitter-to-capture-card from the Radeons have some visual corruption in them that makes frame counting difficult. After burning a big chunk of last week trying to make it work by swapping in shorter and higher-quality DVI cables, I had to bail on FCAT testing and fall back on the software-based Fraps tool in order to get reliable results. I will test XDMA CrossFire and the like with multiple monitors using FCAT soon. Just not today.

Fraps captures frame times relatively early in the production process, when they are presented as final to Direct3D, so it can't show us exactly when frames are reaching the screen. As we've often noted, though, there is no single place where we can sample to get a perfect picture of frame timing. The frame pacing and metering methods used in multi-GPU solutions may provide regular, even frame delivery to the monitor, but as a result, the animation timing of those frames may not match their display times. Animation timing is perhaps better reflected in the Fraps numbers—depending on how the game engine tracks time internally, which varies from game to game.
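
A toy illustration may help, and I'll stress that it's just that, not any vendor's actual pacing logic. Take the uneven, alternating present intervals typical of multi-GPU microstuttering and space delivery to the display perfectly evenly: the panel sees a smooth cadence, but the animation timestamps baked into those frames still reflect the uneven present times.

```python
from itertools import accumulate

# Hypothetical Fraps-style present intervals (ms), alternating in the
# classic multi-GPU microstutter pattern:
present_intervals = [9, 24, 10, 23, 9, 24]

# Metered delivery: space the flips evenly at the average interval.
pace = sum(present_intervals) / len(present_intervals)
display_times = list(accumulate([pace] * len(present_intervals)))
animation_times = list(accumulate(present_intervals))

print(display_times)    # [16.5, 33.0, 49.5, ...]  an even cadence at the panel
print(animation_times)  # [9, 33, 43, 66, ...]     uneven motion inside frames
```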

This stuff is really complicated, folks.

Fortunately, although Fraps may not capture all the nuances of multi-GPU microstuttering and its mitigation, it is a fine tool for basic performance testing—and there are plenty of performance challenges for 4K gaming even without considering frame delivery to the display. I think that'll be clear very soon.

One more note: I've run our Fraps results though a three-frame low-pass filter in order to compensate for the effects of the three-frame Direct3D submission queue used by most games. This filter eliminates the "heartbeat" pattern of high-and-then-low frame times sometimes seen in Fraps results that doesn't translate into perceptible hitches in the animation. We've found that filtered Fraps data corresponds much more closely to the frame display times from FCAT. Interestingly, even with the filter, the distinctive every-other-frame pattern of multi-GPU microstuttering is evident in some of our Fraps results.
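
For the curious, here's a minimal sketch of that sort of filter: a centered three-frame moving average over the Fraps data. Treat the details, particularly the handling at the ends of the series, as assumptions rather than a spec of our exact implementation.

```python
def low_pass(frame_times, window=3):
    """Centered moving average; samples at the ends use a truncated window."""
    half = window // 2
    smoothed = []
    for i in range(len(frame_times)):
        neighborhood = frame_times[max(0, i - half):i + half + 1]
        smoothed.append(sum(neighborhood) / len(neighborhood))
    return smoothed

# The "heartbeat" pattern from a deep submission queue: alternating 8/25 ms
# samples flatten out toward the ~16.5 ms average the display actually sees.
print(low_pass([8, 25, 8, 25, 8, 25]))
```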

The 4K experience
We've had one of the finest 4K displays, the Asus PQ321Q, in Damage Labs for months now, and I've been tracking the progress of 4K support in Windows, in games, and in graphics drivers periodically during that time. This is our first formal look at a product geared specifically for 4K gaming, so I thought I'd offer some impressions of the overall experience. Besides, I think picking up a $3000 4K monitor ought to be a prerequisite for dropping $1500 on the Radeon R9 295 X2, so the 4K experience is very much a part of the overall picture.

The first thing that should be said is that this 31.5" Asus panel with a 3840x2160 pixel grid is a thing of beauty, almost certainly the finest display I've ever laid eyes upon. The color reproduction, the uniformity, the incredible pixel density, the really-good-for-an-LCD black levels—practically everything about it is amazing and wondrous. The potential for productivity work, video consumption, or simply surfing the web is ample and undeniable. To see it is to want it.

The second thing to be said is that, although Microsoft has made progress and the situation isn't bad under Windows 8.1 when you're dealing with the file explorer, desktop, or Internet Explorer, 4K support in Windows programs generally is still awful. That matters because you'll want to use high-PPI settings and have text sizes scaled up to match this display. Reading five-point text is not a good option. Right now, most applications do scale up their text size in response to the high-PPI control panel settings, but the text looks blurry. Frustrating, but usable.

The bigger issues have to do with the fact that today's best 4K displays, those that support 60Hz refresh rates, usually present themselves to the PC as two "tiles" or separate logical displays. They do so because, when they were built, there wasn't a display scaler ASIC capable of handling the full 4K resolution. The Asus PQ321Q can be connected via dual HDMI inputs or a single DisplayPort connector. In the case of DisplayPort, the monitor uses multi-stream transport mode to essentially act as two daisy-chained displays. You can imagine how this reality affects things like BIOS screens, utilities that run in pre-boot environments, and in-game menus the first time you run a game. Sometimes, everything is squished up on half of the display. Other times, the image is both squished and cloned on both halves. Occasionally, the display just goes black, and you're stuck holding down the power button in an attempt to start over.
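
Conceptually, assuming the driver composes a single 3840x2160 surface and then splits it into two 1920x2160 streams, one per MST stream or HDMI input, the arrangement looks something like this. The failure modes above are what you get when the split or the scaling goes wrong.

```python
import numpy as np

# The desktop composed as one 3840x2160 surface (rows, columns, RGB):
surface = np.zeros((2160, 3840, 3), dtype=np.uint8)

left_tile = surface[:, :1920]    # stream 1: columns 0-1919
right_tile = surface[:, 1920:]   # stream 2: columns 1920-3839

# "Squished" is the whole surface scaled into one 1920x2160 tile; "cloned"
# is the same tile sent down both streams.
print(left_tile.shape, right_tile.shape)  # (2160, 1920, 3) twice
```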

AMD and Nvidia have done good work making sure their drivers detect the most popular dual-tile 4K monitors and auto-configure them as a single large surface in Windows. Asus has issued multiple firmware updates for this monitor that seem to have helped matters, too. Still, it often seems like the tiling issues have moved around over time rather than being on a clear trajectory of overall improvement.

Here's an example from Tomb Raider on the R9 295 X2. I had hoped to use this game for testing in this review, but the display goes off-center at 3840x2160. I can't seem to make it recover, even by nuking the registry keys that govern its settings and starting over from scratch. Thus, Lara is offset to the left of the screen while playing, and many of the in-game menus are completely inaccessible.



AMD suggested specifying the aspect ratio for this game manually to work around this problem, but doing so gave me an entire game world that was twice as tall as it should have been for its width. Now, I'm not saying that's not interesting and maybe an effective substitute for some of your less powerful recreational drugs, because wow. But it's not great for real gaming.

Another problem that affects both AMD and Nvidia is a shortage of available resolutions. Any PC gamer worth his salt knows what to do when a game doesn't quite run well enough at the native resolution, especially with this many pixels at your command: just pop down to a lower res and let the video card or monitor scale things up to fill the screen. Dropping to 2560x1440 or 1920x1080 would seem like an obvious strategy with a display like this one. Yet too often, it's 3840x2160 or bust. The video drivers from AMD and Nvidia don't consistently expose even those two obvious subsets of 3840x2160, or anything else remotely close. I'm not sure whether this issue will be worked out in the context of these dual-tile displays; they've been around quite a while already without the right thing happening. We may have to wait until the displays themselves get better scaler ASICs.
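
For what it's worth, the math says those two resolutions ought to be easy targets for a scaler. A quick check of how a few common 16:9 resolutions map onto the 3840x2160 grid:

```python
# 1920x1080 upscales by an exact 2x2 pixel block; the others need
# fractional scales that a competent scaler can still handle.
candidates = [(1920, 1080), (2560, 1440), (3200, 1800), (1280, 720)]

for w, h in candidates:
    sx, sy = 3840 / w, 2160 / h
    kind = "integer" if sx.is_integer() and sy.is_integer() else "fractional"
    print(f"{w}x{h}: {sx:g}x scale ({kind})")
```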

There's also some intermittent sluggishness in using a 4K system, even with the very fastest PC hardware. You'll occasionally see cases of obvious slowness, where screen redraws are laborious for things like in-game menus. Such slowdowns have been all but banished at 2560x1600 and below these days, so it's a surprise to see them returning in 4K. I've also encountered some apparent mouse precision issues in game options menus and while sniping in first-person shooters, although such things are hard to separate precisely from poor graphics performance.

In case I haven't yet whinged enough about one of the coolest technologies of the past few years, let me add a few words about the actual experience of gaming in 4K. I've gotta say that I'm not blown away by it, when my comparison is a 27" 2560x1440 Asus monitor, for several reasons.

For one, game content isn't always 4K-ready. While trying to get FCAT going, I spent some time with this Asus monitor's right tile in a weird mode, with only half the vertical resolution active. (Every other scanline was just repeated.) You'd think that would be really annoying, and on the desktop, it's torture. Fire up a session of Borderlands 2, though, and I could play for hours without noticing the difference, or even being able to detect the split line, between the right and left tiles. Sure, Crysis 3 is a different story, but the reality is that many games won't benefit much from the increased pixel density. Their textures and models and such just aren't detailed enough.

Even when games do take advantage, I'm usually not blown away by the difference. During quick action, it's often difficult to appreciate the additional fidelity packed into each square inch of screen space.

When I do notice the additional sharpness, it's not always a positive. For example, I often perceive multiple small pixels changing quickly near each other as noise or flicker. The reflections in puddles in BF4 are one example of this phenomenon. I don't think those shader effects have enough internal sampling, and somehow, that becomes an apparent problem at 4K's high pixel densities. My sense is that, most of the time, lower pixel densities combined with supersampling (basically, rendering each pixel multiple times at an offset and blending the results) would probably be more pleasing overall than 4K is today. Of course, as with many things in graphics, there's no arguing that 4K plus supersampling would be even better, if that were a choice; in fact, supersampling may prove to be an imperative for high-PPI gaming, and it would happily soak up even more GPU power than 4K already demands. Unfortunately, 4X or 8X supersampling at 4K is not generally feasible right now.
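
Some back-of-the-envelope math shows why. Counting shaded samples per frame for a few resolution and supersampling combinations makes the problem plain: 4X supersampling at 4K and 60Hz means shading roughly two billion samples per second.

```python
# Shaded samples per frame and per second for various configs at 60 Hz.
configs = [
    ("2560x1440, no SSAA", 2560 * 1440, 1),
    ("3840x2160, no SSAA", 3840 * 2160, 1),
    ("2560x1440, 4X SSAA", 2560 * 1440, 4),
    ("3840x2160, 4X SSAA", 3840 * 2160, 4),
    ("3840x2160, 8X SSAA", 3840 * 2160, 8),
]

for name, pixels, taps in configs:
    samples = pixels * taps
    print(f"{name}: {samples / 1e6:.1f}M samples/frame, "
          f"{samples * 60 / 1e9:.2f}G samples/s")
```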

Don't get me wrong. When everything works well and animation fluidity isn't compromised, gaming at 4K can be a magical thing, just like gaming at 2560x1440, only a little nicer. The sharper images are great, and edge aliasing is much reduced at high PPIs.

I'm sure things will improve gradually as 4K monitors become more common, and I'm happy to see the state of the art advancing. High-PPI monitors are killer for productivity. Still, I think some other display technologies, like G-Sync/FreeSync-style variable refresh intervals and high-dynamic-range panels, are likely to have a bigger positive impact on gaming. I hope we don't burn the next few years on cramming in more pixels without improving their speed and quality.