Earlier this week, we posted a news item about an article written by Ryan Shrout over at PC Perspective. In the article, Ryan revealed some problems with using Radeon CrossFire multi-GPU setups with multiple displays.
Those problems look superficially similar to the ones we explored in our Radeon HD 7990 review. They were partially resolved—for single displays with resolutions of 2560x1600 and below, and for DirectX 10/11 games—by AMD's frame pacing beta driver. AMD has been forthright that it has more work to do in order to make CrossFire work properly with multiple displays, higher resolutions, and DirectX 9 games.
I noticed that many folks reacted to our news item by asking why this story matters, given the known issues with CrossFire that have persisted literally for years. I have been talking with Ryan and looking into these things for myself, and I think I can explain.
Let's start with the obvious: this story is news because nobody has ever looked at frame delivery with multi-display configs using these tools before. We first published results using Nvidia's FCAT tools back in March, and we've used them quite a bit since. However, getting meaningful results from multi-display setups is tricky when you can only capture one video output at a time, and, rah rah other excuses—the bottom line is, I never took the time to try capturing, say, the left-most display with the colored FCAT overlay and analyzing the output. Ryan did so and published the first public results.
That's interesting because, technically speaking, multi-display CrossFire setups work differently than single-monitor ones. We noted this fact way back in our six-way Eyefinity write-up: the card-to-card link over a CrossFire bridge can only transfer images up to four megapixels in size. Thus, a CrossFire team connected to multiple displays must pass data from the secondary card to the primary card over PCI Express. The method of compositing frames for Eyefinity is simply different. That's presumably why AMD's current frame-pacing driver can't work its magic on anything beyond a single, four-megapixel monitor.
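The arithmetic behind that limit is easy to check. The sketch below (an illustration, not anything from AMD's driver; the exact cutoff and the 4×2^20 interpretation of "four megapixels" are assumptions) shows why a single 2560x1600 panel squeaks under the bridge's capacity while Eyefinity and tiled 4K configs blow past it:

```python
# Rough arithmetic on the CrossFire bridge's "four megapixel" frame limit.
# Assumption: "four megapixels" means 4 * 2^20 pixels, which covers a
# single 2560x1600 display but nothing larger.
BRIDGE_LIMIT = 4 * 1024 * 1024  # assumed per-frame pixel capacity

configs = {
    "2560x1600 single display": 2560 * 1600,
    "3x 1920x1080 Eyefinity":   3 * 1920 * 1080,
    "3840x2160 tiled 4K":       3840 * 2160,
}

for name, pixels in configs.items():
    verdict = "fits over the bridge" if pixels <= BRIDGE_LIMIT else "must go over PCIe"
    print(f"{name}: {pixels / 1e6:.2f} MP -> {verdict}")
```

Any config in the second category has to fall back to compositing over PCI Express, which is where the frame-pacing driver currently can't help.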
We already know that non-frame-paced CrossFire solutions on a single display are kind of a mess. Turns out that the problems are a bit different, and even worse, with multiple monitors.
I've been doing some frame captures myself this week, and I can tell you what I've seen. The vast majority of the time, CrossFire with Eyefinity drops every other frame with alarming consistency. About half of the frames just don't make it to the display at all, even though they're counted in software benchmarking tools like Fraps. I've seen dropped frames with single-display CrossFire, but nothing nearly this extreme.
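For readers wondering how you can tell a frame was dropped when Fraps still counts it: the FCAT overlay tags each rendered frame with the next color from a fixed, repeating palette, so a color missing from the captured video means that frame never reached the display. Here's a minimal sketch of that counting logic; the palette and function names are illustrative, not FCAT's actual sequence or code:

```python
# Hypothetical sketch of FCAT-style dropped-frame counting. Each rendered
# frame is tagged with the next color in a fixed repeating palette; if the
# capture skips a color, that frame was rendered but never displayed.
# This eight-color palette is illustrative, not FCAT's real sequence.
PALETTE = ["white", "lime", "blue", "red", "teal", "navy", "green", "aqua"]

def count_dropped(captured):
    """Count palette entries skipped between consecutive captured frames."""
    dropped = 0
    for prev, cur in zip(captured, captured[1:]):
        # How many palette steps did we advance (with wrap-around)?
        step = (PALETTE.index(cur) - PALETTE.index(prev)) % len(PALETTE)
        dropped += max(step - 1, 0)  # skipped colors = dropped frames
    return dropped

# Every other frame dropped, as with CrossFire + Eyefinity:
print(count_dropped(["white", "blue", "teal", "green"]))  # -> 3
```

Software tools like Fraps count every frame the game renders, which is exactly why capture-based analysis was needed to expose the every-other-frame drops described above.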
Also, Ryan found a problem in some games where scan lines from two different frames become intermixed, causing multiple horizontal tearing artifacts on screen at once. (That's his screenshot above.) I've not seen this problem in my testing yet, but it looks to be a little worse and different from the slight "leakage" of an old frame into a newer one that we observed with CrossFire and one monitor. I need to do more testing in order to get a sense of how frequently this issue pops up.
The bottom line is that Eyefinity and CrossFire together appear to be a uniquely bad combination. Worse, these problems could be tough to overcome with a driver update because of the hardware bandwidth limitations involved.
This story is a bit of a powder keg for several reasons.
For one, the new marketing frontier for high-end PC graphics is 4K displays. As you may know, current 4K monitors are essentially the same as multi-monitor setups in their operation. Since today's display ASICs can't support 4K resolutions natively, monitors like the Asus PQ321Q use tiling. One input drives the left "tile" of the monitor, and a second feeds the right tile. AMD's drivers handle the PQ321Q just like a dual-monitor Eyefinity setup. That means the compositing problems we've explored happen to CrossFire configs connected to 4K displays—not the regular microstuttering troubles, but the amped-up versions.
Ryan tells me he was working on this story behind the scenes for a while, talking to both AMD and Nvidia about problems they each had with 4K monitors. You can imagine what happened when these two fierce competitors caught wind of the CrossFire problems.
For its part, Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with its competition. One of the big issues Nvidia emphasized in this context is how Radeons using dual HDMI outputs to drive a 4K display can exhibit vertical tearing right smack in the middle of the screen, where the two tiles meet, because the tiles aren't being refreshed in sync. This problem is easy to spot in operation.
GeForces don't do this. Fortunately, you can avoid this problem on Radeons simply by using a single DisplayPort cable and putting the monitor into DisplayPort MST mode. The display is still treated as two tiles, but the two DP streams use the same timing source, and this vertical tearing effect is eliminated.
I figure if you drop thousands of dollars on a 4K gaming setup, you can spring for the best cable config. So one of Nvidia's main points just doesn't resonate with me.
And you've gotta say, it's quite the aggressive move, working to highlight problems with 4K displays just days ahead of your rival's big launch event for a next-gen GPU. I had to take some time to confirm that the Eyefinity/4K issues were truly different from the known issues with CrossFire on a single monitor before deciding to post anything.
That said, Nvidia deserves some credit for making sure its products work properly. My experience with dual GeForce GTX 770s and a 4K display has been nearly seamless. Plug in two HDMI inputs or a single DisplayPort connection with MST, and the GeForce drivers identify the display and configure it silently without resorting to the Surround setup UI. There's no vertical tearing if you choose to use dual HDMI inputs. You're going to want to use multiple graphics cards in order to get fluid gameplay at 4K resolutions, and Nvidia's frame metering tech allows our dual-GTX 770 SLI setup to deliver. It's noticeably better than dual Radeon HD 7970s, and not in a subtle way. Nvidia has engineered a solution that overcomes a lot of obstacles in order to make that happen. Give them props for that.
As for AMD, well, one can imagine the collective groan that went up in their halls when word of these problems surfaced on the eve of their big announcement. The timing isn't great for them. I received some appeals to my better nature, asking me not to write about these things yet, telling me I'd hear all about AMD's 4K plans next week. I expect AMD to commit to fixing the problems with its existing products, as well as to unveil a newer and more capable high-end GPU. I'm looking forward to it.
But I'm less sympathetic when I think about how AMD has marketed multi-GPU solutions like the Radeon HD 7990 as the best solution for 4K graphics. We're talking about very expensive products that simply don't work like they should. I figure folks should know about these issues today, not later.
My hope is that we'll be adding another chapter to this story soon, one that tells the tale of AMD correcting these problems in both current and upcoming Radeons.