Triple-screen gaming on today’s graphics cards

Nearly three years have passed since AMD made multi-screen gaming a reality with Eyefinity-equipped Radeons. Matrox technically got there first with its ill-fated Parhelia graphics card—and a decade ago, in fact—but its TripleHead scheme never caught on with gamers. The Parhelia’s underlying GPU wasn’t really fast enough to produce smooth frame rates across multiple displays, and that didn’t encourage developers to take advantage of the capability.

Eyefinity has been much more successful. It appeared first in the Radeon HD 5870, which had ample horsepower to deliver a smooth gaming experience across multi-display setups. More than a year later, Nvidia followed up with its own implementation, dubbed Surround. By then, the ball was already rolling with developers. Most of today’s new blockbusters support the obscenely high resolutions multi-screen setups can display.

Over the last couple of years, other factors have conspired to make Eyefinity and Surround configs more attractive. While the pixel-pushing power of PC graphics cards continues to grow at a rapid pace, games still tend to be designed with anemic console hardware in mind. Having enough graphics grunt for a smooth gaming experience on a single monitor is no longer the question; high-end GPUs can easily run some of the latest titles at six-megapixel resolutions with the eye candy turned all the way up.

Six megapixels is the approximate total resolution of three 1080p monitors. LCDs with 1080p resolution have gotten a lot more affordable; even those featuring IPS panels have migrated south of $300. Three-screen setups often cost less than a single 30″ monitor, and their additional screen real estate has productivity perks that extend beyond wrap-around gaming. You don’t need to wear dorky 3D glasses, either.

The stars would seem to be aligned for triple-head gaming to really take off. To handicap its chances, we rigged up a three-screen array and played a stack of the latest games on a couple of high-end graphics cards: Asus’ Radeon HD 7970 DirectCU II TOP and Gigabyte’s GeForce GTX 680 OC. Both cards have juiced-up clock speeds, beefy custom coolers, and display outputs galore. Keep reading to see how they fared in our look at the state of surround gaming on the PC.

The allure of multiple displays

The case for triple-display setups comes down to money, pixels, and the allocation of screen real estate. When we said 1080p IPS displays could be purchased for under $300, we were being conservative. Newegg has Dell’s DisplayPort-equipped UltraSharp U2312HM for only $250. The screen’s 23″ panel has 1920×1080 pixels, and it’s not even the least expensive option with a 1080p resolution. Asus’ VS229H-P costs a scant $164 and spreads the same number of pixels over a 21.5″ panel. Budget IPS monitors typically feature an e-IPS variant of the technology that offers six rather than eight bits of color per channel. e-IPS displays still tend to look better than budget LCDs based on TN panel tech, though. Those TN displays cost even less than low-end IPS models.

Do the math, and it’s clear: a triple-wide monitor setup costs a lot less than a king-sized single display like Dell’s 30″ UltraSharp U3011. The U3011’s 8-bit panel and 2560×1600 resolution are impressive, but the $1200 price tag is daunting, to say the least. The 27″ UltraSharp U2711 is only $750, though its 2560×1440 resolution is a little lower. Massive IPS monitors don’t get much cheaper unless you hit eBay and buy bare-bones displays direct from Korea.

Granted, there’s definitely some appeal to having one’s desktop consolidated on a single, large surface. Monster monitors are particularly well-suited to photo editing, and it’s hard to argue with 30 inches of uninterrupted goodness. Still, the 30″ UltraSharp offers about a third fewer pixels than a triple-wide 1080p config (roughly 4.1 versus 6.2 megapixels). The total screen area is much smaller, too, and it lacks the wrap-around feel that makes surround gaming unique.
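
If you want to check that math yourself, here’s a quick back-of-the-envelope comparison using only the resolutions quoted above. The helper function is ours, purely for illustration; it isn’t from any vendor tool.

```python
# Back-of-the-envelope pixel counts for the configurations discussed above.
def megapixels(width, height, count=1):
    """Total resolution of `count` identical panels, in megapixels."""
    return width * height * count / 1_000_000

triple_1080p = megapixels(1920, 1080, count=3)   # three 1080p panels
u3011 = megapixels(2560, 1600)                   # Dell's 30" UltraSharp

print(f"Triple 1080p:  {triple_1080p:.1f} MP")   # ~6.2 MP
print(f'30" 2560x1600: {u3011:.1f} MP')          # ~4.1 MP
print(f"Shortfall: {1 - u3011 / triple_1080p:.0%}")   # about a third fewer pixels
```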

Some folks have taken advantage of the fact that flat-panel TVs offer PC-compatible inputs and even larger dimensions. TVs are relatively inexpensive, too, but their resolutions generally top out at 1080p. To avoid seeing individual pixels, one has to sit farther back, making the screen appear smaller. Playing games from a distance also takes away some of the intimacy.

When I last upgraded the displays in my own home office, I settled on a trio of Asus 24″ ProArt PA246Q monitors. These sub-$500 screens have 8-bit panels, loads of input ports, plenty of adjustment options, and a resolution of 1920×1200. I bought these displays primarily for productivity purposes. I’ve had multiple monitors connected to my desktop PC for years, and the extra screen real estate is extremely helpful when juggling the various computer-driven tasks that make up a typical work day in the Benchmarking Sweatshop. There’s more to it than just having more pixels. Being able to group applications on separate displays is the first thing I miss when switching to a single-screen desktop or notebook.

In addition to providing a large digital workspace, the three matched displays are perfect for gaming. Having three of the same screen is essential. Subtle variations in brightness, contrast, and color temperature become readily apparent when you’re staring at an image spanning multiple monitors. Even with three identical models, I had to break out our colorimeter to match the calibration of each screen exactly.

To further ensure a consistent picture, the side screens should be angled inward to provide a dead-on view when you swivel your head. TN panels don’t look very good from off-center angles, and even IPS displays can suffer from subtle color shift due to anti-glare screen coatings. The ideal angle will depend on how close you sit to the center display.

When we first looked at Eyefinity, Scott concluded that a triple-wide landscape configuration was the best option for games. I concur. Running three screens in portrait mode produces an image that’s much less stretched—3600×1920 versus 5760×1200 for my monitors—but the bezels are closer to the middle of the display area and are therefore much more annoying. Regardless of whether you’re running a portrait or landscape config, you’ll want to seek out displays with the narrowest bezels possible.
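
For reference, the combined-resolution math for the two orientations is simple; the sketch below assumes three 1920×1200 panels and no bezel compensation.

```python
# Combined desktop dimensions for three 1920x1200 panels, before bezel correction.
def combined(width, height, count=3, portrait=False):
    """Width and height of `count` identical panels placed side by side."""
    if portrait:
        width, height = height, width   # each panel rotated 90 degrees
    return width * count, height

print(combined(1920, 1200))                  # (5760, 1200) -- landscape
print(combined(1920, 1200, portrait=True))   # (3600, 1920) -- portrait
```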

Obviously, tripling the number of displays results in a much heavier rendering load. To keep the pixels flowing smoothly, a fast graphics card is required. Let’s look at two candidates.

Asus’ Radeon HD 7970 DirectCU II TOP

We’ll start with Asus’ Radeon HD 7970 DirectCU II TOP because, well, it’s the biggest. This monster is 5.1″ tall, 2.1″ thick and occupies three expansion slots. The card’s 11″ length is about average, making the DirectCU a rather stout offering. Behold, the Henry Rollins of graphics cards:

Maybe the racing stripes are more befitting of a muscle car, but I’ve exceeded my allotment of automotive analogies for probably all of eternity. The DirectCU simply looks badass. A lot of that is due to the mass of metal sitting under the matte-black cooling shroud.

Beneath the blades of one of the dual cooling fans, we see an intricate network of six copper heatpipes. The plumbing feeds into a pair of finned radiators that wouldn’t look out of place atop a desktop CPU. Baby’s got back, too. Check out the brushed metal plate affixed to the back side of the card:

The screwed-on panel is riddled with ventilation holes to prevent hot air from accumulating. These holes are hexagonal, nicely complementing the angular lines of the rest of the cooler.

All this additional cooling hints at higher clock speeds, and Asus delivers. The core clock speed of the DirectCU’s Tahiti GPU has been raised from its default 925MHz to an even 1GHz. That speed matches the Radeon HD 7970 GHz Edition, but the Asus card lacks AMD’s new opportunistic clock-boosting mojo. The DirectCU’s 3GB of GDDR5 memory operates at 5.6 GT/s, a little slower than the 7970 GHz Edition’s 6 GT/s memory.

Of course, those speeds aren’t written in stone. The DirectCU is geared toward overclockers, and its top edge features solder points for hardcore tweakers who want to monitor and control onboard voltages precisely. The card can draw more power than typical Radeons, too. It has dual 8-pin PCIe power connectors, an upgrade over the 6+8 config found on standard flavors of the 7970. Fancy power regulation components abound, and Asus even includes an auxiliary heatsink that should be slapped onto the MOSFETs when the card is cooled with liquid nitrogen.

The DirectCU also has quite a collection of display ports. And DisplayPort ports. Mmmm… ports.

Radeon HD 7970 cards usually offer dual Mini DisplayPort outputs alongside single DVI and HDMI connectors. The DirectCU has two DVI ports and four full-sized DisplayPort outs, enough connectivity for a six-screen Eyefinity wall. There are a couple of caveats, though. The included HDMI adapter works only with the right DVI port. Also, the left DVI output offers a dual-link connection only when the left-most DisplayPort output is disabled. A switch near the CrossFire connectors flips between the output configurations.

Running a three-screen setup on the DirectCU requires the use of at least one DisplayPort connector. Unless you have a compatible monitor, you’ll need an active DisplayPort adapter, which runs about $25 on Newegg. The DisplayPort requirement isn’t unique to the Asus card. All Radeons are afflicted with this limitation except for a handful of custom Sapphire models that integrate active DisplayPort adapters onto their circuit boards.

Our LCDs have DisplayPort inputs, so we didn’t have to bother with active adapters. Since we didn’t want to make things easy for the cards, we used a mix of connectors: one DisplayPort, one DVI, and one HDMI with the adapter included in the box. The setup process was a breeze.

Once the displays are positioned and the standard Catalyst drivers are installed, putting together an Eyefinity array takes all of a couple minutes in the control panel setup wizard. The user is presented with a few configuration options, including the 3×1 setup we prefer. Once the basic layout is set, the next step ascertains the position of each screen and delivers an ultra-wide desktop. Ours measured 5760×1200 pixels to start.

We then applied bezel correction, which AMD’s drivers pretty much nailed automatically before we did a little fine-tuning. This feature extends Eyefinity’s virtual display beneath the bezels, creating the illusion that they’re merely bars blocking your view of the world. Without bezel correction, the images produced by multi-screen setups are distorted; the scene stops at the bezel border and continues on the other side as if there’s been no interruption. AMD’s bezel-correction interface is easy to use, and it left us with a 6084×1200 desktop.
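
For the curious, the arithmetic behind that number is straightforward. The sketch below works backward from our 6084-pixel-wide desktop to the number of “hidden” pixels the drivers tuck behind each bezel gap; it’s purely illustrative and isn’t part of AMD’s or Nvidia’s tools.

```python
# Working backward from the bezel-corrected desktop to the pixels hidden per gap.
panel_width = 1920          # horizontal pixels per panel
panels = 3
corrected_width = 6084      # what the Catalyst wizard left us with
gaps = panels - 1           # two bezel gaps in a 3x1 array

hidden_per_gap = (corrected_width - panel_width * panels) // gaps
print(f"{hidden_per_gap} pixels hidden behind each bezel gap")    # 162

# Going the other way: a per-gap figure yields the virtual desktop width.
print(panel_width * panels + hidden_per_gap * gaps)               # 6084
```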

The final step is deciding whether to have the Windows taskbar span all three screens or just sit on one of them. Ideally, the spanning option would intelligently group one’s taskbar items on each display. Instead, it puts the Start button all the way over on the left and proceeds from there, which isn’t terribly convenient. You’re probably better off with a single-screen taskbar on the center display. (Incidentally, the stretched taskbar’s flaws extend to Nvidia Surround setups, too. Blame Microsoft.)

That’s AMD’s hat in the ring. Let’s see what Nvidia has to offer.

Gigabyte’s GeForce GTX 680 OC

The GeForce camp is represented by Gigabyte’s hot-clocked spin on the GTX 680. At first glance, the card appears less imposing than its Asus counterpart. Physically, it is. The 10.8″ GeForce is slightly shorter and a fair bit squatter than the Radeon, and the cooler monopolizes only two slots. There’s no metal skin on the card’s back, just a stabilizing spine that runs along its top edge.

While the Asus card looks brutish, the Gigabyte has a certain sleekness. Credit the smooth, flowing lines of the WindForce cooling shroud. This plastic piece channels airflow from the three fans that adorn the card. Although the glossy finish picks up fingerprints with ease, the card will likely spend most of its life face-down in a mid-tower enclosure and out of view. If you can’t keep your hands off the thing once it’s been installed, you may have bigger problems.

The GTX 680 OC has more fans than the Radeon, but each of them is smaller. Gigabyte’s blades are 74 mm in diameter, while the Asus spinners have 93-mm wingspans. We’ll look at the thermal and acoustic performance of the two cards a little later in the review.

For now, feast your eyes on the copper heatpipes lurking under the smoked fan blades. The heatsink may have only three pipes, but those pipes are longer than the six on the Asus card.

Under the heatsink, the card’s Kepler GPU runs at 1071MHz, 65MHz faster than the GTX 680’s default of 1006MHz. The maximum boost clock is 1137MHz, up 79MHz from stock. There’s no increase for the memory speed, though. The GTX 680’s 2GB of GDDR5 memory remains at 6 GT/s.

Like Asus, Gigabyte populates its card with fancy electrical components. However, there are fewer extras for extreme overclockers looking to ride the ragged edge. There are also fewer outputs in the rear cluster.

The GeForce makes do with DisplayPort, HDMI, and dual DVI outputs. Unlike Eyefinity, Nvidia’s Surround scheme doesn’t require a DisplayPort connection. Triple-screen arrays can be driven using the DVI and HDMI outputs alone. (You can use DisplayPort if you wish, of course.) A fourth screen can be connected, but it can’t participate in multi-monitor gaming.

Multiple GPUs in an SLI team used to be a requirement for 2D Surround configurations, but the latest Kepler-based GeForces enable multi-screen gaming using a single card. The GTX 680 seems more cooperative than the dual GeForce GTX 580s we used for some SLI testing a while back. It didn’t complain when we plugged in our diverse collection of display cables. As on the Radeon, the setup process was a breeze.

Nvidia uses a slightly different interface for organizing one’s display array, but it’s no more difficult to navigate. The UI for bezel compensation does take a little longer to work through, but only because the Nvidia drivers start with zero correction instead of applying their own estimate.

Although bezel correction is fantastic for games, I’m not sure I’m a fan of it for standard desktop work. When the Windows desktop spans multiple displays, it’s rare to have a single application stretched across more than one. If that does happen, you typically don’t want any information hidden behind the bezels.

Even in games, menus and vital on-screen elements can be obscured by bezel compensation. The problem is particularly notable with triple-wide portrait configs. Fortunately, Nvidia has a keyboard shortcut to allay the issue. Hitting Ctrl+Alt+B provides a peek at the pixels behind the bezels. You don’t want to be peeking in the heat of battle, but it’s nice to have the option when negotiating game interfaces that haven’t been optimized for multi-monitor arrays.

Apart from bezel peeking, which we didn’t find necessary in the games we played, there’s little practical difference between Nvidia Surround and AMD Eyefinity. That said, adding a couple of screens has a substantial impact on the gaming experience.

At the wheel

First, a warning: not all games can take advantage of multiple displays. Mirror’s Edge, the first one we tried, spit out the following:

The side screens were blank, the resolution was way off, and the interface was impossible to navigate. There’s apparently a way around the problem, but it requires downloading and running an FOV-hacking executable, which may raise red flags with Steam. Disappointed, we turned to a game we knew would work: DiRT 3. This poster child for Eyefinity has been showcased across three screens in numerous demos, and surely it would deliver.

On a single 1920×1200 display, the game looks like so:

At 6084×1200 across three screens, you get this:

The cockpit view offers a much wider perspective. The image is a little stretched at the left and right extremes, which is more visible if we look at a shot of the actual monitors.

Yeah, that’s still pretty awesome. With the room’s lights turned off, the bezels look like little more than the bars of a roll cage surrounding the camera. The angled-in side screens fill one’s peripheral vision, wrapping the player in the game. It’s terribly cliché to say this, but the experience is more immersive. I was drawn into the game, quite literally, and found myself leaning forward and focusing intently as the blurred landscape whipped past my periphery.

Having a full view of those side windows really adds to the sensation of speed. The windows are helpful when weaving through traffic, too. My primary focus didn’t need to drift too far from the center display to notice other cars coming up on either side. On a couple of occasions, while sliding through corners at extreme angles that would impress Jeremy Clarkson, I found my head swiveling to look through the side window—the part of the car facing forward. For the first time in a while, I felt like I was having a next-generation gaming experience.

The resized images don’t convey the scale properly. Here are a couple more you can click on to get full-sized screenshots at 6084×1200:

Click to see the full-size screenshots

The cockpit view is by far my favorite for driving games played on a single screen, so it’s no surprise I preferred it with our triple-wide config. Switching to the third-person camera altered the experience a little. I felt a little removed from the action and found myself leaning back into the chair. I wasn’t disengaged, but was instead inclined to take in the larger picture. I’d also switched from rally racing to the game’s gymkhana mode, which involves the sort of hooning around that’s more entertaining to admire from an external perspective.

Click to see the full-size screenshot

DiRT 3 is old news, so we fired up its successor, DiRT Showdown, to see if anything had changed. The game also ran smoothly at full detail with 4X antialiasing enabled. However, it wasn’t as enjoyable. Showdown lacks a cockpit view, and the hood-mounted camera is a poor substitute. The floating rear camera does suit Showdown‘s arcadey flavor; it’s just not as compelling on a three-screen setup.

Want to really feel like you’re at the wheel? Try Shift 2 Unleashed, a simulation-flavored title in the Need for Speed franchise with a slick helmet-cam option. Instead of sitting stationary inside the car, the helmet cam pans toward the apex of each corner, as your eyes naturally would.

Click to see the full-size screenshot

On a single display, the shifting gaze makes the helmet cam even more engaging than the cockpit view. In surround, it’s sublime. Even though a good chunk of the side screens is dominated by the black interior of the helmet, the automatic panning definitely enhances the wrap-around effect of a triple-screen setup. The extra displays make Shift 2 feel more like a simulator and less like a video game.

Again, resized screenshots don’t do the experience justice. You can click on the image above for the full-fat version. If you don’t want to wait for that to load, check out the image below. It’s the little car to the left side of the windscreen, only at full size.

Now imagine a few of those zooming in and out of your peripheral vision while jockeying for position in the pack. “Intense” is a good word to describe the experience. “Fun” works, too. Best of all, Shift 2 was buttery smooth and hiccup-free at 6084×1200 with all the detail levels maxed, including a dose of antialiasing. Smooth frame delivery is essential to games in this genre; you don’t want any stuttering while flirting with the limits of traction on the final corner of a long race. No doubt thanks to its console roots, Shift 2 doesn’t present much of a challenge to modern high-end graphics cards.

Click to see the full-size screenshot

Notice that the HUD elements are all confined to the center of the screenshots; they’re laid out on the middle display as if there were no flanking screens. The same is true in the DiRT games, and that’s how I prefer the HUDs to be positioned. Having HUD elements that spill over onto the side screens requires too much head turning, diverting attention from what’s right in front of you.

Before it was called up for surround testing, Shift 2 had been sitting untouched in my Steam library for months. Having the helmet cam spread across three displays got me hooked again, though. I found myself making excuses to indulge in one more race. The bumper cam had to be tested, and it felt insanely fast. A night race was next, and it was a bit of a disappointment at first. The side screens were mostly dark and added little to the experience. Then I whizzed through a few lit areas and saw the lights streak from the foreground to the edges of my previously desolate periphery. Wow.

Particularly when played with first-person cameras, driving games are a natural fit for multi-display rigs. The experience is so good that I’m pondering a new wheel-and-pedal setup to create the ultimate virtual driving machine in my home office.

A selection of shooters

If three-screen setups are ideal for first-person driving, what about shooting? At last, an excuse to log more time with Battlefield 3. Life as a hardware reviewer has its moments. Playing BF3 across a combined 72 inches of display area just a couple of feet from my face qualifies as one of the better moments in recent memory.

Click to see the full-size screenshot

Battlefield 3 is one of only a handful of games we tried with an easily adjustable field of view. Tweaking may be necessary depending on the angle of the side screens. 75 degrees seemed to be the perfect FOV setting for our three-screen array. Any higher, and the picture felt warped, as if it were being stretched down a tunnel. Lower FOV settings felt flatter and two-dimensional.
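
If you’re wondering how much of your visual field an array like this actually occupies, a little trigonometry gives a rough idea. The dimensions below are hypothetical rather than measurements from our rig, and an in-game FOV slider won’t necessarily map one-to-one to this geometric angle, so treat it as context rather than a prescription.

```python
import math

def subtended_angle_deg(screen_width_cm, viewing_distance_cm):
    """Horizontal angle a flat, head-on screen occupies in your field of view."""
    return math.degrees(2 * math.atan(screen_width_cm / (2 * viewing_distance_cm)))

# Hypothetical figures: a roughly 52-cm-wide 24" panel viewed from about 60 cm away.
center = subtended_angle_deg(52, 60)
print(f"Center screen: about {center:.0f} degrees")     # ~47 degrees

# With the side screens angled in toward the viewer, each one subtends roughly the
# same angle as the center panel, so the trio covers something like three times that.
print(f"Whole array: roughly {3 * center:.0f} degrees")
```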

Again, more immersive seems like the right way to describe the experience. Sorry, the thesaurus is drawing a blank on synonyms. Having one’s peripheral vision filled with the game world added considerably to the illusion of being right there on the battlefield. The first-person perspective felt even more natural than the cockpit and helmet cams from the driving games, perhaps because there was no dashboard or windshield hogging the field of view. The bezels were visible, of course, but they didn’t bug me; bezel compensation works pretty well. Battlefield 3 also does a good job of keeping the important HUD and UI elements within the boundaries of the center display.

At least in the single-player portion of the game, most everything goes down on that middle screen. The missions are largely linear progressions through waves of enemies that tend to pop up right in front of you. On the larger conquest maps that make up much of Battlefield 3‘s multiplayer component, however, the action comes from all sides.

Click to see the full-size screenshot

Here, the side screens can offer a real tactical advantage. Flanking attackers became easier to spot, though my reflexes weren’t always quick enough to pick them off before I went down in a hail of bullets. At least you’ll see it coming.

When resized to fit our pages, the ultra-wide screenshots are admittedly a little lacking. For a sense of scale, here’s an element from the above scene at full resolution. See that little guy on the left? He’s large enough to spot easily, even with one’s attention focused on the center display.

And, yes, he looks a little chubby. The distortion is hardly noticeable when you’re actually playing the game, though. BF3 multiplayer provides little downtime to take in the scenery, and my attention was mostly concentrated on the center display. Although I’d see other players in my periphery, I rarely looked right at them on the side screens. Flicking the mouse to reposition the crosshairs felt more natural than turning my head.

Battlefield 3 is one of the most graphically demanding games around, and rendering it at more than six megapixels was no easy task for our high-end graphics cards. The game felt smooth with the “ultra” detail preset, but only after we disabled multisampled antialiasing. More on BF3 performance in a moment.

Click to see the full-size screenshot

Of all the games we lined up for surround testing, Battlefield 3 multiplayer was easily the highlight for me. The experience was engrossing and highly addictive. The game looks even more stunning on three screens than it does on one, and the wider perspective perfectly suits the chaos that ensues when 64 players race between multiple conquest points.

First-person shooters in general are perfect candidates for the wider field of view that multi-display setups provide. Serious Sam: BFE works quite well with multiple displays, offering a much broader view of the action. Here’s how the game looks at 1920×1200:

Now, behold the same scene stretched across three screens:

The HUD in the 1920×1200 shot is a little wonky because I forgot to re-adjust the screen width setting. Users can change the width of the HUD to ensure details like their health, armor, and ammo are all shown on the middle display of a three-screen config. The HUD can be widened to push those details to the outer edges of the side screens. There’s also an adjustable field-of-view setting, just like in Battlefield 3.

Click to see the full-size screenshots

Serious Sam: BFE is a fresh spin on old-school shooter gameplay, so the levels are largely linear journeys through one group of baddies after another. The maps are massive, though, and enemies often pop up beside or even behind you. I tend to take advantage when given the room to roam, and I often found my peripheral vision filled with rocket trails and potential targets. BFE is packed with huge battles against hordes of enemies, and those encounters felt even more epic when painted across three screens.

Even though widening one’s window on the world doesn’t change the scale, the game’s already towering architecture felt somehow larger and more imposing. Fortunately, rendering those massive structures didn’t prove too challenging. We didn’t have to dial back the detail at all when running the game at our full bezel-corrected resolution of 6084×1200.

Click to see the full-size screenshots

Rage, id Software’s latest opus, wasn’t quite as cooperative when played at such a high resolution. We had to disable antialiasing to get smooth frame rates, and even then, the game’s texture pop-in issues sullied the experience. Most of the textures on the middle screen looked fine, but there was plenty of low-res ugliness to the left and right of center. Higher-resolution textures could routinely be seen popping into view, taking away from the otherwise gorgeous vistas draped across our triple-wide array.

Texturing problems aside, this game is poorly optimized for multiple monitors. The HUD’s elements are drawn on the farthest corners of a three-screen config, making them all but useless in the heat of battle. I can’t count how many unexpected reloads or weapon switches were required because I’d lost track of my ammo. Having the mini map in the upper-right corner is problematic, as well.

Then there’s the main menu, which sits on the far edge of the right display. So does the dialog box that presents new missions and the interface governing item purchases. Want to sell something? Swivel your head all the way to the left, because the menu pops up on the other side. Mercifully, the inventory interface appears in the middle of the main display.

Rage applies a letterbox effect to in-game cinematics, which looks fine on a single screen. However, the black bars didn’t stretch all the way across our three-screen array. The underlying picture wasn’t distorted, but the odd cropping served as another reminder that Rage wasn’t designed to exploit the potential of modern PCs. That’s a shame, because the game’s rich environmental detail deserves a wider perspective.

Despite its flaws, Rage still felt more engaging on three screens than on one. The first-person perspective is ideally suited to surround configs even when the implementation isn’t perfect. Let’s see what happens when the camera takes a step back.

Taking the third

The latest Batman title is one of the best games of the past year. We’ve invested a lot of hours gliding and brawling through Arkham City, and we were curious to see how the game’s third-person perspective translated to a triple-screen setup. Here’s what you get with a single display:

And this is the view with a triple-wide config:

Again, adding screens provides a much broader view of the game world. The thick atmosphere that permeates Arkham City feels even more enveloping with three displays. There is some distortion on the left and right screens, especially at the extremes. The horizontal stretching wasn’t distracting, probably because my attention was focused primarily on the middle screen.

Arkham City keeps the action fairly centered. While the side screens definitely give the environment more body, actual bodies are rarely seen in one’s peripheral vision. Even in massive brawls, the action is almost entirely confined to the center display.

Click to see the full-size screenshot

The screenshot above is pretty typical of Arkham City combat. Virtually all the enemies cluster on the middle display, leaving little to see on the flanking screens. Here’s a resized crop of the above image showing only what’s pictured on the middle monitor.

Yep, that pretty much covers all the important elements in the scene. The side screens contribute little beyond additional ambiance, at least in a triple-wide config. We did encounter a few instances where it would have helped to have a taller display array. Batman might be a good candidate for a multi-portrait setup, provided you can tolerate the bezels. They were barely noticeable on our three-way landscape config but would have crossed right over the action in portrait mode.

Arkham City‘s HUD placement is problematic for landscape setups, though it probably wouldn’t be an issue for portrait configs. The vital details appear near the top-left corner of the display area, which is definitely out of the way on a triple-wide setup.

The rest of the game steers clear of trouble save for one artistic flourish. At the end of a combat sequence, as the Dark Knight deals the final blow to the last enemy standing, the camera pans and zooms for a more cinematic view of the violent finale. This effect looks great on a single screen, but it falls apart in surround because the climax-cam sticks to the middle display, causing the side screens to go dark. The first time I saw the effect on our three-screen config, I felt like I had been yanked out of the game.

Click to see the full-size screenshot

Perhaps that’s a credit to just how much the side displays add to the immersion. Losing the additional perspective felt jarring, especially if it had been a while since the last instance. The worst part is knowing there are more interruptions to come—and that they’ll strike after some of the most intense moments of the game.

At least Batman doesn’t break into cinematic sequences as often as Max Payne 3. The third chapter in the bullet-time-infused trilogy can barely go a few minutes without taking a break from the action to advance the narrative. As in Arkham City, the cinematics are confined to the center display, breaking the wrap-around feel that the side screens create.

Click to see the full-size screenshot

When I’m at the controls, most of the action in Max Payne 3 goes down in slow motion. Even with time ticking away at a crawl, I didn’t find myself peering beyond the center display. That’s the focus of the action, where the bulk of the enemies appear, and thankfully, where the HUD is displayed. The surround displays add extra flavor, of course, but they don’t convey a tactical advantage.

Max Payne 3 is gorgeous, with incredibly detailed textures that seem to cover every surface, whether it’s right in front of you or on the periphery. Aside from the issue with the cinematics, the game looked great and ran smoothly with all the details maxed out at full resolution.

Click to see the full-size screenshots

Unfortunately, the ultra-wide perspective that triple-landscape configs provide seems less than ideal for third-person games, at least when the camera is this close. Third-person titles tend to keep the action intimate, which leaves little to fill one’s peripheral vision.

Our testing methods

We don’t intend for this review to provide a comprehensive look at graphics performance with triple-screen setups, but we have run a few tests to offer a sense of how the two cards stack up against one another. If you’re looking for performance results from a more extensive collection of games and cards, see our GeForce GTX 690 review, which employed a similar three-screen setup.

Today, we’re focused on a couple of hot-clocked cards from Asus and Gigabyte. We’ve put these beasts in the ring against each other to see which one comes out on top.

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least five times, and we’ve reported the median result.

Our test system was configured like so:

Processor         Core i7-3890X
Motherboard       Asus P9X79 PRO
Chipset           Intel X79 Express
Memory size       16GB (4 DIMMs)
Memory type       Corsair Vengeance DDR3 SDRAM at 1600MHz
Memory timings    9-9-9-24 1T
Chipset drivers   INF update 9.2.3.1022,
                  Rapid Storage Technology Enterprise 3.1.0.1068
Audio             Asus Xonar DSX with 7.12.8.1800 drivers
Hard drive        Intel 520 Series 240GB SATA
Power supply      Corsair AX850
OS                Windows 7 Ultimate x64 Edition, Service Pack 1,
                  DirectX 11 June 2010 Update

 

                                      Driver revision       GPU base core clock (MHz)   Memory clock (MHz)   Memory size (MB)
Asus Radeon HD 7970 DirectCU II TOP   Catalyst 12.6 beta    1000                        1400                 3072
Gigabyte GeForce GTX 680 OC           GeForce 304.48 beta   1071                        1501                 2048

Thanks to Intel, Corsair, and Asus for helping to outfit our test rigs with some of the finest hardware available. Asus and Gigabyte supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.

  • We measured total system power consumption at the wall socket using a Watts Up Pro digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Arkham City with DirectX 11 at 6048×1200 and FXAA enabled.

  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was placed approximately 8″ above the graphics card and out of the path of direct airflow. The CPU cooler’s fan was also unplugged when we took our noise readings.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used MSI’s excellent Afterburner utility to keep tabs on GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Batman: Arkham City

For this test, we threw down with a pack of the Joker’s henchmen in an all-out brawl for 90 seconds.

We tested at 6048×1200 with the detail levels maxed, DirectX 11 effects enabled, and antialiasing at its highest setting.

Now, we should preface the results below with a little primer on our testing methodology. Along with measuring average frames per second, we delve inside the second to look at frame rendering times. Studying the time taken to render each frame gives us a better sense of playability, because it highlights issues like stuttering that can occur—and be felt by the player—within the span of one second. Charting frame times shows these issues clear as day, while charting average frames per second obscures them.

For example, imagine one hypothetical second of gameplay. Almost all frames in that second are rendered in 16.7 ms, but the game briefly hangs, taking a disproportionate 100 ms to produce one frame and then catching up by cranking out the next frame in 5 ms—not an uncommon scenario. You’re going to feel the game hitch, but the FPS counter will only report a dip from 60 to 56 FPS, which would suggest a negligible, imperceptible change. Looking inside the second helps us detect such skips, as well as other issues that conventional frame rate data measured in FPS tends to obscure.
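
Here’s that hypothetical second worked out in a few lines, just to show how little the FPS counter moves; the numbers are the same ones used in the example above.

```python
# The hypothetical second described above: a 100 ms hitch plus a 5 ms catch-up
# frame take the place of two ordinary 16.7 ms frames.
normal = 1000 / 60                      # ~16.7 ms per frame at a steady 60 FPS
extra = (100 + 5) - 2 * normal          # time lost to the hitch, ~71.7 ms
fps = 60 - extra / normal               # frames that still fit into that second
print(f"Reported frame rate: ~{fps:.0f} FPS")   # ~56 FPS, despite an obvious stall
```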

We’re going to start by charting frame times over the totality of a representative run for each system—though we conducted five runs per system to be sure our results are solid. These plots should give us an at-a-glance impression of overall playability, warts and all. (Note that, since we’re looking at frame latencies, plots sitting lower on the Y axis indicate quicker solutions.)

Frame time (ms)    FPS rate
8.3                120
16.7               60
20                 50
25                 40
33.3               30
50                 20

The Gigabyte GTX 680 OC suffers from more frequent latency spikes than the Asus card. With few exceptions, the magnitude of those spikes is relatively low. The GTX 680’s frame latencies rarely exceed 40 milliseconds, which means the corresponding frame rate doesn’t often dip below 25 FPS. Although the 7970 has fewer latency spikes, the magnitude of those spikes is much greater, often hitting 50-60 ms. That works out to a frame rate of just 17-20 FPS.

We can slice and dice our raw frame-time data in other ways to show different facets of the performance picture. Let’s start with something we’re all familiar with: average frames per second. Though this metric doesn’t account for irregularities in frame latencies, it does give us some sense of typical performance.

The Gigabyte card has a clear lead in the FPS arena, but we’re not done yet. We can demarcate the threshold below which 99% of frames are rendered. The lower the threshold, the more fluid the game. This metric offers a sense of overall frame latency, but it filters out fringe cases.

Of course, the 99th percentile result only shows a single point along the latency curve. We can show you that whole curve, as well. With single-GPU configs like these, the right-hand side of the graph—and especially the last 5% or so—is where you’ll want to look. That section tends to be where the best and worst solutions diverge.

In our frame-time plots, the Asus HD 7970 TOP exhibits more severe latency spikes than the Gigabyte card. Those deviations are illustrated nicely by our percentile curves, which show the 7970’s frame times rising sharply for the last ~3% of frames. The GTX 680’s frame times also increase for the final few percent, but they mostly stay below 40 milliseconds.

Finally, we can rank solutions based on how long they spent working on frames that took longer than 50 ms to render. The results should ideally be “0” across the board, because the illusion of motion becomes hard to maintain once frame latencies rise above 50 ms or so. (50 ms frame times are equivalent to a 20 FPS average.) Simply put, this metric is a measure of “badness.” It tells us about the scope of delays in frame delivery during the test scenario.
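
To make these metrics concrete, here’s a minimal sketch of how average FPS, the 99th-percentile frame time, and the time-beyond-threshold figure all fall out of a list of raw frame times. The sample data is fabricated for illustration and has nothing to do with either card’s results; swapping threshold_ms for 33.3 gives the variant we use for Battlefield 3 later on.

```python
# Minimal sketch of the latency-oriented metrics. The sample frame times below are
# fabricated for illustration; they are not results from either card.
def summarize(frame_times_ms, threshold_ms=50.0):
    times = sorted(frame_times_ms)
    avg_fps = 1000 * len(times) / sum(times)
    pct99 = times[int(0.99 * len(times)) - 1]      # simple nearest-rank 99th percentile
    beyond = sum(t - threshold_ms for t in times if t > threshold_ms)
    return avg_fps, pct99, beyond

sample = [17, 18, 16, 21, 19] * 99 + [64] * 5      # 500 frames, five ugly spikes
fps, pct99, beyond = summarize(sample, threshold_ms=50.0)
print(f"Average: {fps:.0f} FPS")                   # ~54 FPS
print(f"99th percentile frame time: {pct99} ms")   # 21 ms -- the rare spikes are filtered out
print(f"Time spent beyond 50 ms: {beyond:.0f} ms") # 70 ms of 'badness'
```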

The 7970 spends nearly twice as long as the GTX 680 working on frames that take longer than 50 milliseconds to render. That said, we’re looking at less than 0.6 seconds over the course of a 90-second session. Arkham City feels smooth on both of these cards, which is impressive considering the six-megapixel resolution and maxed detail settings we used for testing.

Battlefield 3

We tested Battlefield 3 by playing through the start of the Kaffarov mission, right after the player lands. Our 90-second runs involved walking through the woods and getting into a firefight with a group of hostiles, who fired and lobbed grenades at us.

Apart from disabling deferred antialiasing, we kept Battlefield 3‘s “ultra” detail settings and ran the game at a bezel-corrected 6084×1200.

Frame times were more consistent in BF3 than they were in Arkham City. The plots largely hover around 30 milliseconds per frame, which translates to an average frame rate of about 33 FPS. As you can see, though, there are several latency spikes beyond 40 ms. During our 90-second test runs, the spikes were a little more frequent on the GTX 680 than on the 7970.

That discrepancy is obscured by the FPS averages, which show the Gigabyte GTX 680 OC ahead of the Asus HD 7970 TOP.

However, if we look at the frame time below which 99% of the frames were delivered, the TOP comes out on, er, top. Let’s bust out the percentile curves for a more detailed look.

While the turbo-charged Asus card has the lowest frame latencies at the far right side of the curve, the GTX 680 is pretty close. Both cards are comfortably under the 50-millisecond threshold we used to quantify “badness” in Batman, so we’ve adjusted our time-beyond calculation to tally the amount of time spent on frames that take longer than 33.3 milliseconds to render. 33.3 ms corresponds to an average frame rate of 30 FPS.

The difference here is just 8 milliseconds, which is pretty inconsequential over a 90-second run. Battlefield 3 played very well on both cards as long as multisampled antialiasing was disabled. Turning on AA resulted in noticeable latency spikes—the sort you’d definitely want to avoid in a twitchy first-person shooter.

Power consumption

The Asus HD 7970 TOP looks far more imposing than the Gigabyte GTX 680 OC, but does it actually draw more power?

Yes. The difference amounts to just 15W under load, but it’s more than twice that at idle. Surprisingly, the Asus HD 7970 TOP failed to drop into its ultra-low-power ZeroCore mode when the display went into standby. The problem persisted after we disabled DisplayPort audio, and we’ve seen similar behavior from other 7-series Radeons connected to DisplayPort monitors. ZeroCore may not work correctly with DisplayPort, which handicaps its usefulness for Eyefinity configurations.

Of course, the ZeroCore issue affects only the “display off” results. The Asus card still has much higher power draw than we’d expect from a Radeon HD 7970.

Noise levels and GPU temperatures

Since the Radeon consumes more power, its cooler has more heat to dissipate. Let’s see how the noise levels and GPU temperatures compare.

When little is being asked of the GPU, the 7970 is five decibels quieter than the GTX 680. The difference is audible from several feet away, although the hum of the Gigabyte card didn’t prove to be distracting when running on an open test system.

The delta narrows to just a few decibels under load. The GTX 680 is still louder, but the difference is less pronounced.

Our temperature results confirm that the 7970’s lower noise levels don’t come from Asus letting the GPU cook before spinning up the fans. Just two degrees Celsius separate our GPU temperature results.

Conclusions

In a few short years, surround gaming has gone from being somewhat of an exotic luxury to something far more attainable. A potent graphics card is still required, but with the latest generation of graphics hardware, one needs only a single high-end card to play recent titles at six-megapixel resolutions. In most games, the detail can be turned all the way up without sacrificing smooth frame delivery.

Not all games will work perfectly with multi-screen configs, of course. Some are tuned better than others, and older titles probably won’t work at all. We didn’t encounter any newer releases with show-stopping issues, though—just unfortunate HUD and menu placements and annoying cinematic quirks. That’s progress, but there’s clearly more to be made.

Triple-screen landscape setups seem to be the most popular configurations right now, and for good reason. The ultra-wide perspective is perfect for first-person shooters and driving games. All the examples we played from those genres were considerably more engaging and enjoyable on three screens versus one. The wider perspectives also conveyed competitive benefits: enemies could be spotted in one’s periphery, and so could traffic. Even when we encountered issues, they weren’t severe enough to ruin the overall experience.

With third-person titles, the case isn’t as compelling. A wider perspective certainly adds atmosphere, but peripheral vision doesn’t feel as natural when the camera is hovering behind one’s avatar. It didn’t help that two of the third-person games we tried, Arkham City and Max Payne 3, have an annoying tendency to blank the side monitors and display cinematic sequences on only the center screen. Nothing spoils the immersive benefits of a multi-screen array like interrupting the action to put something on just one display.

Multi-monitor configs will likely remain in the minority among gamers, which doesn’t give developers a lot of incentive to cater to them. That’s a shame, because I think a three-screen setup might just be the best upgrade for hardcore gamers. I’d take one over a single large display any day, not only for wrap-around gaming, but also to have the extra real estate for desktop tasks.

At the moment, AMD’s Eyefinity and Nvidia’s Surround schemes each have benefits and limitations. Eyefinity works with up to six displays but requires DisplayPort connectivity for even three-screen arrays. Nvidia’s Surround implementation has no DisplayPort requirement but can’t be extended beyond a trio of screens. Surround’s bezel-peeking option is nice, though. The truth is, both approaches work well and are easy to configure.

The same is true for the two graphics cards we used for testing. Both cards sell for about the same price right now; the Asus Radeon HD 7970 DirectCU II TOP costs $500 at Newegg, while the Gigabyte GeForce GTX 680 OC rings in at $525. Newegg has both cards in stock, and the Radeon comes with a handful of free games. It also runs quieter but blocks an additional slot. The Gigabyte is noticeably louder. However, it’s a two-slot affair that draws less power. Which card is best depends largely on your current system configuration and whether you can afford to wait for the 7970 GHz Edition cards poised to hit the market. Let’s hope some of those come with active DisplayPort adapters.

And three displays.

The biggest endorsement I can give multi-screen gaming is the fact that I have little desire to return to a single screen. Now that I’ve experienced gaming in surround, it feels like something’s missing when there’s nothing in my peripheral vision—a wider perspective I no longer want to go without.

Comments closed
    • C10 250
    • 7 years ago

    I have to thank this article. I’ve had my GPU for almost a year now. When I originally tried it, I hooked up 3 monitors using a DVI-to-DisplayPort cable and foolishly assumed I could not use three monitors. One active adapter and a third monitor later, I’ve got a whole new appreciation for PC gaming.

    This left me with one problem…

    Even when gaming, I’m a compulsive multitasker. With two monitors I’ve always got something active on the second screen when I’m playing a game. Eyefinity has got me using all my monitors on one game. My solution was to order this just prior to this post:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814500221

      • C10 250
      • 7 years ago

      The card arrived today and doesn’t seem to work.

    • Ph.D
    • 7 years ago

    Nice write-up.
    I have to say though, I was worried at first and the more I read the more those worries were confirmed.

    Setups like these are just so ridiculously expensive. My god. The screens are one thing, at least they tend to last for a while, but the GPUs are another thing altogether.

      • Airmantharp
      • 7 years ago

      I sense a little anxiety in your post, so let me add to it:

      This is only 6MP; you can get 4MP today in a 30″ panel, and either can be successfully driven by a single high-end GPU, as shown. But what about tomorrow?

      4K panels are upon us, with 7MP-13MP resolutions in store (http://en.wikipedia.org/wiki/4K_resolution). I'm looking forward to the day I can put one of those on my desktop, and I'm glad to know right now that we'll have the GPU power on tap to utilize all of those pixels!

    • C10 250
    • 7 years ago

    Am I the only one on eyefinity at 3840×1024?

      • Airmantharp
      • 7 years ago

      Probably- monitors in that resolution range are hard to find these days!

        • C10 250
        • 7 years ago

        I was able to pick up two Acer V173 monitors for practically nothing from my brother and his Computer Class classmate. It made it a little easier to pay $119.00 for a matching model on Newegg. Besides, at that resolution, my cheap little Sapphire 2GB 6850 can keep up pretty well.

          • Airmantharp
          • 7 years ago

          It should be a blast on one monitor, but that surround resolution is four megapixels. You’ll need to upgrade your GPU to push that; I was using a pair of HD6950 2GB cards with moderate success, and while your card has enough memory if you bottom out the settings, to enjoy anything recent you’ll need something faster than your HD6850 2GB :).

    • sigher
    • 7 years ago

    How is three screens for people that wear glasses? Does the frame of their glasses make it useless since they have to maintain a distance to not have it obscure the sidescreens? I don’t think I ever heard anybody say anything about that.

      • Meadows
      • 7 years ago

      You haven’t heard anyone complain about that because it’s a ridiculous notion.

        • sigher
        • 7 years ago

        Great, we got the childish trolish stuff done, so now someone can give a proper reply if so inclined.

          • Corrado
          • 7 years ago

          Its no different than your regular peripheral vision when you wear glasses.

          • Meadows
          • 7 years ago

          What Corrado said. You seem to believe that wearing glasses is like wearing scuba goggles, or something.

      • Firestarter
      • 7 years ago

      Wearing glasses does not render your peripheral vision useless. You really only see sharply in a tiny section of your field of view, namely that which your eyes are currently focused on, and that is the only bit where glasses really help (well except for people who are otherwise completely blind of course).

        • sigher
        • 7 years ago

        But in the described setup it relies on the secondary screens giving you a ‘sense’ of your surroundings rather than a superclear image, and in the described setup the two extra screens are angled to the side, and with glasses that is where the frame have the arms that go over the ears, and although in normal life it doesn’t matter too much I imagine that if you pay a lot of money to get sideviews that make you aware and then that part becomes useless until you turn your head it’s a bit of a disadvantage.
        But as I said that depends on how close you are and how much to the side the other screens are.

        And I tried it with some glasses to see if the frame does affect you and with the ones I tried it with it did block the direct sideview in a central line next to the eyes, although I could see under and over it but I think with two screens it would largely block things depending on distance and positioning.
        I’m not wearing glasses normally though so I’m not sure how it works out in reality for those that do.
        Oh and I guess if you wear wireframes or a design with large lenses it also would not be that much of a blocking effect possibly.

        So do any of you that replied wear glasses yourself? Because guessing isn’t the same as testing.

          • DeadOfKnight
          • 7 years ago

          I wear glasses every day and the frames cut off a part of my vision. Does that mean I should just take the side view mirrors off of my car? The extra data you’re getting is still valuable and immersive with 3 screens, probably even more so for those who wear glasses because it should be easy to ignore the bezels just like the frames.

            • sigher
            • 7 years ago

            What an aggressive vibe here, I’m just making a comment wondering if anybody ever tested it with people wearing glasses and I get an attitude as if I just advocated murdering kittens..

            As for your remarks, you will I hope direct your view occasionally at those mirrors at appropriate moments, and you obviously can’t compare those moments of necessity for safety with the cost and effort of adding two screens and a beefy GFX-card for more immersive gaming. And then possibly find it’s half negated by the presence of frames, which I wasn’t saying is so but was actually asking about. And yes you can look at those screens and you will of course always have that benefit, but as stated in the article the immersive effect comes a lot from noticing large objects and movement in your peripheral vision and I just wondered if that works for people with glasses and if any testing has been done in that regard, so people with glasses can hear about it beforehand, which is what review sites are all about you know. Incidentally I recall seeing various reviews of 3D glasses where they frequently specifically went on about how they were not suitable for people with glasses, so it’s not uncommon to take such things into account in reviews.

            • DeadOfKnight
            • 7 years ago

            3D is completely different. I would never recommend 3D to people who wear glasses because it’s terribly uncomfortable to wear both sets of glasses and in the case of a lazy eye the 3D effect will seem blurry and horrible because you need both eyes correctable to 20/20 for it to work nicely, and even then some people still get headaches and it just doesn’t work as intended.

            Here we’re talking about peripheral vision. People who wear glasses automatically have worse peripheral vision because the lenses don’t wrap around your entire face and are really only working according to prescription when you are looking straight through the optical center of the lens. The fact of the matter is, even for people who have perfect vision their peripheral vision is still not as good as what is being directly focused on. It’s true that people who wear glasses are still at a disadvantage, but not specifically from the frames being in the way (this is something you have to get used to even in real life scenarios, your vision is going to be cut off at the edge of the lens regardless of how thick the frames are because of the magnification making the picture larger and overlapping). The point we’re trying to make is that even at a disadvantage, there is still a benefit and an extra level of immersion when given that extra peripheral data to simulate actually being there in the game, because unless you are completely blind then you’re still going to detect shapes and movement to your left and right and then you can adjust your camera to focus on those objects if you so desire. Unless you have some crazy thick goggles that will obscure the entire left and right screens then it’s still going to be a better experience, even if your vision is bad.

    • just brew it!
    • 7 years ago

    On a somewhat related tangent, I was pleasantly surprised to discover that the cheap (sub-$100) Radeon card I used in my most recent build at work fully supports triple head, even in Linux. The third head needs to be analog VGA, but still… it has me thinking about scrounging up a third monitor. Been doing a fair bit of FPGA work lately, and being able to stretch out the logic analyzer window across triple monitors would be really nice.

    • dashbarron
    • 7 years ago

    I’ve had a lot of issues getting Surround to work on a pair of 570s and still haven’t been able to make it work. NVIDIA was definitely slacking in this department.

      • Airmantharp
      • 7 years ago

      Try letting us help you in the forums? That’s why they’re there :).

    • RealPjotr
    • 7 years ago

    I don’t get this 48:9 aspect ratio!? It would be much nicer to stand these three monitors up in portrait, giving 3×1080 = 3240 by 1920, which is about a 15:9 aspect ratio. That means the same game view shape as a single monitor, but three times the resolution at a reasonable price, and no bezel in the middle of the screen like you get with two monitors.

    • thermistor
    • 7 years ago

    Great “State of the Nation” article. Thanks!

    I’ve enjoyed my 3x 22″ @ 1680×1050 for a while. The HL2 series is the best supported, til I ran BFBC2…and that looked very good, general fisheye distortion notwithstanding.

    The only plan I have is to get a second 6850 and Crossfire it, then maybe try BF3.

    My opinion is that switching in and out of Eyefinity takes only a few minutes, and the pain of going back and forth is just not an issue.

    It’s also exciting to see several titles in the comment thread as well as the article with multi-monitor support. That will have at least some influence on future purchases.

      • Airmantharp
      • 7 years ago

      I’d recommend selling the HD6850 if you’re interested in playing BF3, and getting a faster single card.

        • Airmantharp
        • 7 years ago

        For whichever coward down-ranked me, why don’t you go research my point instead?

        I’m not against SLi/Crossfire, and I even ran Crossfire for over a year; I know quite well firsthand that one faster card, even if it isn’t as fast as two slower cards combined, will definitely provide a better experience. Reviews from TechReport, AnandTech, and the [H] at least back me up, and I’m sure there are others out there.

    • south side sammy
    • 7 years ago

    I didn’t and don’t have time to read the whole thing, but… all the pictures except for the first one don’t show the bezels… I still can’t see how having 2 big black strips in front of your eyes, blocking what’s on the screens in front of you, makes it better… yes, I’d like 3 big displays, but something needs to be done to make the bezel disappear. Or one big wide screen that will stretch the image the same way 3 monitors will.

      • Airmantharp
      • 7 years ago

      Thin bezels are here, there’s just not much of a real market for them on computer desks.

      And a perspective-correct curved display is an easy thing to do with OLED, and will probably exist when that technology becomes cheap enough to produce.

    • Meadows
    • 7 years ago

    Crappy geometry distortion again. It was especially egregious in the Batman title.

    This won’t be mainstream until game engines embrace multiple monitors by default and use a truncated torus viewfield instead of a frustum. That way, pixel information would always “hit the monitors” at roughly 90 degrees regardless of how many you chain together, instead of at ever-shallower angles as you add monitors to the sides, which is what produces the stretched geometry of a straight, forward-looking viewing frustum.
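
    To put a number on that stretching: with a single flat projection plane, horizontal screen position goes as the tangent of the view angle, so magnification grows as 1/cos^2 of the angle off the crosshair. A minimal sketch, assuming one roughly 120-degree frustum spread across three screens:

        import math

        def stretch_factor(theta_deg):
            """Horizontal magnification at an angle theta off the crosshair for a
            single flat projection plane: d(tan x)/dx = 1/cos^2(x)."""
            t = math.radians(theta_deg)
            return 1.0 / math.cos(t) ** 2

        # One ~120-degree frustum across three screens puts the outer edges
        # roughly 60 degrees off-centre.
        for angle in (0, 20, 40, 60):
            print(f"{angle:>2} deg off-centre: drawn ~{stretch_factor(angle):.1f}x wider than at the crosshair")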

      • Airmantharp
      • 7 years ago

      I assume that you’re referring to the difference in how the game perceives the location of the pixels to be rendered versus where they are actually placed on the desk in reference to the seated position?

      I’m not sure how FOV calculations work, but I assume that a 90-degree FOV would mean 45 degrees from the stare-point (where the cross-hair would be) in both directions on the horizontal plane, and that the ‘render plane’ would actually be equidistant from the stare-point.

      So essentially, the render-plane would be a perfect curve, while the monitors themselves are three separate planes that approximate that curve if properly placed. That would mean that the only spot on any of the monitors that would be undistorted would be the very center where the panel hopefully intersects with the render-plane, right?

        • Meadows
        • 7 years ago

        Yes, and the rest of it would only be *mildly* distorted, thus difficult to notice.

        As it stands, if you combine 3 or 5 monitors today, only the very centre of ONLY THE FIRST monitor shows a perfect image. That means you have a whole 2+ monitors’ worth of area showing [i<]miscellaneous garbage[/i<] (in a geometrical, overstated manner of speaking). [i<]Anything[/i<] is better than that.

          • Airmantharp
          • 7 years ago

          I’m not sure why the surround monitors couldn’t be angled so at least the center of each intersected the render-plane at a tangent, but I’m not trying to solve the problem geometrically on graph paper either :).

          The problem still pops out at me, and makes me wonder whether we’ll get some way to correct for it in software before we get actual curved screens; with as much work as these companies put into their drivers for multi-display rendering, I’m not counting on it.

            • Meadows
            • 7 years ago

            It’s not the screens. If you angle the monitors using a videogame that currently renders with a viewing frustum (read: any), the game will still pretend all the monitors make up [i<]one, flat, unbroken rectangle[/i<] (the faraway plane of the frustum - it's flat, no matter how wide). Therefore, if you [i<]angle[/i<] the side monitors, you make it [i<]even worse[/i<] than it already is.

            • Airmantharp
            • 7 years ago

            That answers my question, thanks!

            I was assuming that FOV meant that the render-plane was a curve, not understanding what a frustum is. If it’s an unbroken plane, then yeah, there’s no way to get this right.

            I’d like to see some real perspective correction from these companies in the future; it looks like it would pay off, especially once we have the ability to put more screens together with higher-performance cards.
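
            One way to get the kind of perspective correction being wished for here is to give each monitor its own camera, yawed so each panel’s centre is viewed head-on, rather than stretching a single wide frustum across all three. A minimal sketch of that idea; the function name and the 45-degree per-panel FOV are illustrative assumptions, not anything current drivers or engines actually expose:

                def per_monitor_cameras(panel_fov_deg=45.0, panels=3):
                    """Hypothetical sketch: give each panel its own camera, yawed so the
                    panel's centre is viewed head-on, instead of stretching one wide
                    frustum across all three. panel_fov_deg should match the angle each
                    panel subtends at the viewer's eye, with the side monitors toed in
                    by the same amount."""
                    cameras = []
                    for i in range(panels):
                        yaw = (i - (panels - 1) / 2) * panel_fov_deg  # centre panel gets yaw 0
                        cameras.append({"yaw_deg": yaw, "h_fov_deg": panel_fov_deg})
                    return cameras

                for cam in per_monitor_cameras():
                    print(cam)  # yaw -45, 0, +45 degrees; 45 degrees of horizontal FOV each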

            • DeadOfKnight
            • 7 years ago

            I feel that a 3:1 (width:height) ratio is about as far as you can go before it starts becoming a nuisance. Unfortunately, putting two 16:10 screens together isn’t so pleasant with the bezels running right down the middle.

            If you put 3 16:9 screens in portrait mode, it largely eliminates the problem, but that’s a totally different experience without the ultra-wide aspect covering your peripheral vision; it would be similar to a 27″ display, only bigger.

            3 16:10 screens in portrait mode (30:16) would be a little better and is probably the best you can get avoiding obscene geometry distortion without going up to 6 panels.

            A 6 monitor setup (of some cheap 1080p TN panels, of course) would probably be the ideal way to go for immersive multi-monitor gaming. [url=https://techreport.com/articles.x/18756<]Example Here![/url<] At least we now have bezel compensation, or it wouldn't even be worth the trouble.
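
            For reference, the raw arithmetic behind the ratios being discussed, as a quick sketch assuming stock 1920x1080 and 1920x1200 panels with bezels ignored:

                # Combined resolution and aspect ratio for the layouts mentioned above.
                layouts = {
                    "3 x 1080p landscape":      (3 * 1920, 1080),      # 5760x1080 -> 5.33:1, i.e. 48:9
                    "3 x 1080p portrait":       (3 * 1080, 1920),      # 3240x1920 -> 1.69:1, i.e. 27:16
                    "3 x 1200p portrait":       (3 * 1200, 1920),      # 3600x1920 -> 1.88:1, i.e. 30:16
                    "3x2 grid of 1080p panels": (3 * 1920, 2 * 1080),  # 5760x2160 -> 2.67:1
                }
                for name, (w, h) in layouts.items():
                    print(f"{name}: {w}x{h}, aspect ratio {w / h:.2f}:1")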

            • Airmantharp
            • 7 years ago

            Definitely, which is why I’d love to use a portrait orientation with higher resolution screens. Three of my 30″ ZR30w’s would be perfect if the bezels wouldn’t also be a nuisance there.

            • DeadOfKnight
            • 7 years ago

            I only wish they’d provide support for a PLP setup with a 30″ 2560×1600 sandwiched between two 20″ 1600×1200 displays. The distortion wouldn’t be so great and the bezels would be spaced out enough to be out of the way of the UI in most cases. It’d also be a great setup for doing actual work. [url=https://techreport.com/forums/viewtopic.php?f=37&t=76693<]Example[/url<]

            • Airmantharp
            • 7 years ago

            Hell, I actually have the monitors for that; I bought them over a year ago believing that support was just around the corner. It still isn’t here, as easy as implementing a solution in drivers appears to be.

            • Meadows
            • 7 years ago

            [url<]http://en.wikipedia.org/wiki/Viewing_frustum[/url<]

    • Entroper
    • 7 years ago

    I really feel like multi-monitor gaming is a lot of cost for not enough benefit. It’s clear to me that what we really need are better head-mounted displays that can produce a superior experience with fewer pixels and far less desk space.

    I’m really rooting for the Oculus Rift project linked from here a few weeks back. Hopefully with Carmack supporting such efforts, the rest of the industry will follow.

      • Airmantharp
      • 7 years ago

      It’s really just a need for more pixels, on one or more screens- bezels are also getting in the way.

      Without bezels, setting up three 27″ 2560×1440 monitors in portrait would be a fairly good solution, once you get around to powering them properly.

    • kamikaziechameleon
    • 7 years ago

    I could have gone for some super-nerdy, no-holds-barred sort of stuff, like how dual GPUs this generation fare in games like BF3, or how a 2560×1600 triple-monitor setup fares.

      • Airmantharp
      • 7 years ago

      That’s been done on other sites; this is his personal setup.

    • drfish
    • 7 years ago

    I was on the [url=https://techreport.com/forums/viewtopic.php?f=37&t=70862<]triple screen bandwagon[/url<] for over 2 years and just recently switched to a single 27" Korean monitor. A couple of thoughts: the bezel "problem" and the "distortion" are, IMO, things you can completely write off if you are worried about them. They just weren’t an issue for me at all; I didn’t even see them in use, and I had some huge bezels on my monitors. For FPS games (I’m not into racing) there [i<]were[/i<] a handful of times where the extra FOV unquestionably allowed me to live a little bit longer dodging spies or whatever in TF2, but I can’t say it happened often.

    However, going back to a single screen is like a breath of fresh air. For one, [u<][b<]and this is unfair in [i<]some[/i<] ways[/b<][/u<], the higher single-panel resolution of my 27" versus my 22" (1680x1050) screens makes a remarkable difference on the screen I pay the most attention to. Second (and also unfair), the quality of an IPS panel versus the cheap TNs I was running is remarkable; it’s almost the difference between color and black & white, though I’m being a little dramatic, I guess. Third, dropping ~40% of the pixels I’m trying to draw cut my GPUs (2GB 5870s) some slack and extended their useful life, while allowing me to run at higher settings in games like Arma II/Day Z. Make no mistake, triple-monitor gaming is a lot cheaper to get into than it used to be, but it will change your upgrade cycle/investment if you want to keep it. Fourth, not having to fiddle with all the custom HUD and FOV hacks is amazing; what a pain, and I can’t believe that even for a niche product, something as simple as giving those controls to gamers still isn’t universal. Lastly, most of the games [i<]I[/i<] play are not FPS or similar titles. I gravitate to strategy and simulations like Civ, Anno, and Sins, as well as a decent number of indie titles, which had me using only 1/3rd of my screens a lot of the time. Make no mistake, I have 250+ hours of Civ5 on triple screens under my belt, and it can be "cool", but it’s not clearly [i<]better[/i<] the way an FPS tends to be.

    In summary, I’ve been there and back again, and while 3 screens can be really neat, I found that for me one screen is the better option.

      • dashbarron
      • 7 years ago

      Nice writeup.

      Do you play 1404 or have you moved to 2070? Any love for 1701 still?

        • drfish
        • 7 years ago

        1404 was the first one I put any serious time into but 2070 overcame it pretty quickly. I think I tried the demos for some older versions in the past but never bought them. I’d go back to 1404 but since Ubi are jerks I can’t buy the expansion on Steam anymore. 🙁

    • Pantsu
    • 7 years ago

    I’ve been using 3x1080p for a year or so, and it’s been great for the most part. There have been lots of issues, of course, but many of them have since been ironed out, and game support is getting better. Still, Eyefinity/Surround is very much a niche enthusiast thing, and you can’t expect it to work perfectly.

    I’m using mixed monitors, and while there is a difference in image quality, after hand-calibrating them it’s not big enough to really bother me very much. My side panels are IPS, and the central monitor is a thin-bezel Samsung 120Hz model. This gives me a nice thin bezel, since I can hide the side monitors’ bezels behind the thin Samsung. Also, when Eyefinity doesn’t work well, I still have 120Hz or S3D to use.

    I do wish display makers would offer more thin-bezel models, especially IPS and 120Hz ones. 1920×1200 might also be better, since 3×1080 can get a bit too stretched and doesn’t offer enough vertical viewing area. The stretching is an issue, though not necessarily as big a one as it might seem in the pictures, since you concentrate on the center monitor.

    I agree that 3×1080 can also be a bit of a strain for work if you need to constantly shift your view from left to right. A 30″ with a smaller side monitor in pivot might be a better option, but that’s much more expensive. Also, in gaming, a 30″ doesn’t offer any more viewing area. It does make the picture sharper and bigger, which can be a benefit in FPS games, but many of the 30″ panels are also a bit slow, certainly compared to 120Hz monitors.

      • NeelyCam
      • 7 years ago

      Has Apple patented “zero bezel” yet? If not, I got first dibs

        • DeadOfKnight
        • 7 years ago

        I plan to patent a robot that will clean your entire house and do your homework for you, I just haven’t gotten around to it.

    • CityEater
    • 7 years ago

    Has anyone mentioned this?

    [url<]http://rationalcraft.com/Winscape_Blog/Entries/2012/7/5_Winscape_v4!.html[/url<] I've got three screens in landscape but would shift to portrait if I could get this going as my desktop. BYO red footage.

    • Bensam123
    • 7 years ago

    Bam! Definitely a subject I was looking forward to you guys taking another look at.

    Something that is keeping me away from this is the bezels. Honestly, it’s sort of sad that there aren’t any monitors with ultra-thin bezels designed for something like this, or at least bezels so small that they’re barely noticeable. I also cringe at the thought of bezel compensation, though, because you lose information; while some games make up for it, that’s something that would really bug me.

    It would be nice if there were a sort of halfway bezel compensation mode that takes the angle of the monitors into account so the images match up cleanly on each side of the bezels, but with no information hidden under the bezels themselves. You wouldn’t actually lose any information, but the image would still line up more smoothly.
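
    For context, standard bezel compensation simply renders extra hidden columns behind each seam, which is exactly what the suggestion above would skip. A minimal sketch of that arithmetic; the 80-pixel figure for a pair of adjoining bezels is an illustrative guess, not a measured value:

        def virtual_width(panel_px, panels, seam_px, compensate=True):
            """Width of the combined render target for a row of identical panels.
            With bezel compensation, seam_px hidden columns are rendered behind each
            pair of adjoining bezels so geometry lines up across the gap; without it,
            nothing is hidden but straight lines kink at every seam. seam_px is an
            assumed pixel-equivalent width for two adjoining bezels."""
            hidden = seam_px * (panels - 1) if compensate else 0
            return panel_px * panels + hidden

        print(virtual_width(1920, 3, 80))                    # 5920 rendered, 5760 visible
        print(virtual_width(1920, 3, 80, compensate=False))  # 5760 rendered and visible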

    Another thought is having the setup take the angle of the side monitors into account and convey it to the game, so the game can correctly display images across all three displays without warping on the outer edges. I suppose the game would have to see three monitors instead of one large one to do this, and it would mean more work for developers, but I think it would end up the better option. I suppose Eyefinity and Surround are almost hacks to make gaming work like this, since no developers are doing anything along these lines. The only game I’ve seen with an option to utilize a secondary monitor is Supreme Commander 2, which you can cleanly move across both displays. It’s really flipping awesome.

    Batman is a good example of where the game gets really fisheyed with too wide of a FoV.

    I think there should be a desktop mode and a gaming mode, not just for bezel correction. Rather when you’re in desktop mode, the desktop functions as if you have three separate monitors connected (like a normal multi-monitor setup) and when in gaming mode all three monitors function as one large one. Cleanly being able to tab between these two states would be outstanding. I could see something like this happening with a bit of virtualization. Having all three displays functioning as two different groups of logical displays depending on what is being done.

    Furthermore, you should be able to disable it for games where you don’t want to use it or find the experience undesirable. Being able to set up profiles for games, or better yet having default profiles that they test beforehand and simply enable (but that you can change), would help. Just like how they do Crossfire/SLI rendering profiles.

    There definitely seems to be a lot of room for surround gaming to grow. I STILL believe 3D gaming is a huge gimmick and will continue to believe so. This comes from someone who has watched 3D movies, watched home movies in 3D, and played games in 3D; it’s just not worth what it adds, in my opinion. If AMD (or Nvidia) worked harder at marketing this and showing its benefits with very little work, making it seem more transparent, they might really have a leg up on their competitor.

    Just a note about RAGE: look for RAGE tweaks. The game was pretty terrible for me until I tweaked the .ini, turning off all the console BS. I don’t know if that would be against the benchmarking ethos here, though, as it isn’t a typical experience for someone playing RAGE.

    “…except for a handful of custom Sapphire models that integrate active DisplayPort adapters onto their circuit boards.”

    That actually sounds like a selling point I never heard of before.

    • CaptTomato
    • 7 years ago

    3 screens, sounds like a flop to me, looked silly when I saw it in action as well.

      • ermo
      • 7 years ago

      In my experience, 3 screen setups are very good for when you are driving racing sims in cockpit view because the extra screen real estate gives you much more spatial awareness and allows you to focus naturally on the apex of the corner you are approaching.

        • CaptTomato
        • 7 years ago

        I saw Dirt3 spread across 3 screens and it looked silly to me.
        Anyway, whatever floats your boat, I’m using a 26in 8 bit Acer and I’m not eager to go to a single TN screen let alone 3 of them, lol.

    • The Dark One
    • 7 years ago

    Hey Geoff, the link to [url<]https://techreport.com/image.x/surround-gaming/giant_dirt3-cockpit.jpg[/url<] on the fourth page doesn't actually bring up a panoramic image like it should.

    • jeffz6
    • 7 years ago

    I have gone back to a single display. Switching between Eyefinity and a single display when certain games don’t support Eyefinity is too hard, and the two side images are way too stretched. I’d rather play with full AA on one screen than tone it down for triple-head.

    Although Dead Space was amazing with 3 screens! You really needed it for that game.

    • indeego
    • 7 years ago

    I’ve done this setup before, and there is a point of diminishing returns: it occurs where your head moves. If you game a lot and find a setup that works, I can see the appeal, but for everyday computing there is a downside to having too much information off in your peripheral vision. You find yourself turning your body, not just your head, and sitting further away to take more information in, when sitting closer for detail would be the more natural resting position.

    Also bezels. For some reason one bezel break is fine, two is/was very distracting and annoying.

    Hopefully Google’s Glasses won’t be nearly as gimmicky as they appear now.

    Thanks for the article.

      • Ifalna
      • 7 years ago

      Yeah, bezels suck. I don’t really get why they don’t create a screen that has no bezel on the side that “connects” to the other screens. The panel should still be perfectly stable with bezels on the other two or three edges holding it. They could sell “sets” of three screens.

      • Bensam123
      • 7 years ago

      That wouldn’t be a diminishing return, since it’s a single condition, just an FYI. I agree, though; that’s why it’s important for developers to actually make UIs that are aware of multiple monitors.

    • torquer
    • 7 years ago

    I currently run 3 Dell 20″ 1080p monitors from 2 GTX 670s in SLI. Overall, I’m pretty happy with the solution, and it works fairly well. Battlefield 3 spanning 3 displays is a dream, though I cannot achieve an acceptable frame rate at “ultra” with MSAA. I have to use FXAA, which is fine, but it bothers my geekitude just a bit. I imagine if the 670s had 3GB of RAM each, it might not be such an issue. As it stands, I’m generally pegged at 60 fps, dropping as low as 40-45 in very explosion-intensive multiplayer on some maps.

    Some games don’t play nice with it, though. Civ V is a good example: it will play for a few minutes, then crash. Installing new drivers always seems to mess it up, and the latest beta has changed how the taskbar is handled; it now spans 3 displays instead of staying on the center one, and there seems to be no way to change this. I also use it for running 3 simultaneous, windowed copies of EverQuest 2, and it works great for that, though that’s hardly a well-designed game with a modern graphics engine.

    All in all, it’s a very nice solution, and it’s hard to go back to single-monitor gaming. However, I would sure prefer that Nvidia come up with a way to revert back to a single-monitor SLI setup automatically for games that don’t properly support Surround. I don’t like having to manually undo my whole setup to play an unsupported game, then completely set it up again once I’m done. That seems very clunky to me.

    • Nardella
    • 7 years ago

    Your article is not properly researched. Matrox has had a product called the TripleHead2Go (TH2G) for years now, making triple-screen gaming a reality on virtually all graphics cards and in many, many games.

      • Airmantharp
      • 7 years ago

      And Matrox has not made a GPU that can play games in over a decade, let alone a competitive one.

      Not hating on Matrox, they were cool back in the day. Just not for gaming :).

        • Nardella
        • 7 years ago

        There was no reason to, the TH2G made it possible to use three screens on any card. I am still wondering why my comment has been down ranked.

          • BobbinThreadbare
          • 7 years ago

          I don’t know why either, but I will say it’s not the same as native support for a variety of reasons.

            • Airmantharp
            • 7 years ago

            Yeah, I haven’t looked this thing up yet, but I assume that it does the same basic thing as current Nvidia/AMD drivers do by telling the OS that the display space of three monitors is instead one large display space.

            It might even work really well, except that it will have at least all of the problems current solutions do, and probably more, and it cannot do its job without adding latency, which kills FPSs.

          • indeego
          • 7 years ago

          Your comment is not properly researched. Runs at 50Hz at highest resolution. That is a massive, massive penalty. I’m willing to bet there’s added latency or interference (every Matrox extender product I’ve used has it.) Reviews are stale/sub-par. Costs as much as an entire monitor. This is a far worse solution than just using the built in features of the graphics card.

            • Nardella
            • 7 years ago

            Currently the TH2G has limited applications. It was designed to be used with three 1280*1024 displays which for a long time were the highest quality screens that were affordable. I will test the latency in the next few days and report back.

            I reiterate that for a long time after the Parhelia the TH2G was supreme.

            The author of the article makes it seem like the first viable option for triple screen gaming was Eyefinity. This is incorrect.

            • Airmantharp
            • 7 years ago

            ‘Viable’ is pretty loose; a Dell U3011 is less viable for gaming than an HP ZR30w, because the Dell’s circuitry introduces input lag to add unneeded features and cost.

            The Parhelia was great for its time, the first card with a true 256-bit memory bus, not that it could render for crap (at the time!). And back then, there wasn’t a solution that could render to 3x1280x1024 at 60+ Hz (hopefully 85+), so I do think the point is moot.

      • Jason181
      • 7 years ago

      I disagree; the context is [i<]video cards[/i<], and not an add-on box. I think you're being downvoted for the less than diplomatic tone of your post. I saw the reviews of the triple head when it came out, and it was not suited to gaming.

        • Airmantharp
        • 7 years ago

        I knew there was a reason it didn’t make a splash in the enthusiast world. It’s just one of those things I’ve forgotten due to unimportance.

      • Bensam123
      • 7 years ago

      He mentions the Parhelia in the article, but that didn’t go anywhere. TR has talked about the DualHead2Go and TripleHead2Go multiple times in news snippets; I’m pretty sure they never denied its existence.

      Just because they offer a product doesn’t make it viable. There’s the latency others mentioned, as well as signal quality. The TH2G is also a dumb device with absolutely no support behind it, so while both Nvidia’s and AMD’s solutions can improve and add new features (like better bezel compensation), this product is a relic from back when Matrox did things remotely related to the interests of its consumers.

      There is no way to make a game TH2G aware either.

    • Airmantharp
    • 7 years ago

    For those interested, here is a review of three of each of these GPUs with the same monitor setup at [url=http://www.hardocp.com/article/2012/04/25/geforce_gtx_680_3way_sli_radeon_7970_trifire_review/4<]HardOCP,[/url<] and I’ve linked it straight to the BF3 MP page. The thing I’m highlighting is the author’s subjective comments on the playability of GCN versus Kepler, and how GCN in Crossfire seems to be less ‘smooth’ than Kepler in SLI, for anyone looking at this setup and wanting to duplicate it with two or more GPUs.

    • Airmantharp
    • 7 years ago

    Before I make any comments, I want to thank you for this article. Your perspective, measurements and opinions on this setup just made the tech-collective smarter as a whole. We’ve been waiting for it!

    Further, great pictures, and great example of how this kind of setup should be done, kudos.

    I almost feel bad now for having anything not positive to say, because I cannot fault you for any of it.

    First, the whole ‘surround’ thing is a little overblown. Or perhaps not, as it’s not really that common. The perspective I’m coming from is summed up in your pictures and comments: basically, the side screens at best add some blurry details to the experience, either because the rendering is warped as it should be, or because you’re seeing it out of your peripheral vision. This makes a good case for using less expensive monitors for the surround positions. When you say that ‘it was more natural to move the mouse than my head/eyes’, you expose one of the basic human limitations of this setup: why put nice monitors there if you’re not going to use them?

    Second, the case can be made much more for a higher resolution single screen, exceeding that of today’s 30″ panels at 4MP, probably in 16:9 guise. One of the things I absolutely love about having a single 4MP 16:10 display is that the things rendered on it are of higher resolution. I get more detail up close and over distance, which makes a difference in certain circumstances, such as shooters. I can see, or the monitor can resolve, stuff that other people simply cannot with lower resolution setups.

    Now I’m not knocking your setup at all; I still think it’s awesome, and I completely understand having the desktop space for improved workflow, the value of three smaller monitors over one larger one, and the sheer joy of being able to render games across all three screens. So again, thank you for your time, effort and expertise!

      • Bensam123
      • 7 years ago

      Some things you have to see firsthand to believe. Simply looking at objective material doesn’t yield the subjective experience, which really changes things for you.

      I could definitely see where you’re coming from with using crappy monitors on the sides. But if the monitors don’t match up, it would serve as a distraction: not just the bezels, but the refresh rate, response time, color, and any sort of overdrive feature that is in use. Basically, any detail down to the smallest nuance would start to stand out, since the picture is supposed to look fluid across all three displays; you’d be actively comparing the two side displays to the center whenever you use it.

    • TurtlePerson2
    • 7 years ago

    Congrats on the /. Mr. Gasior.

    • MaxTheLimit
    • 7 years ago

    Product key?

    [url<]https://techreport.com/image.x/surround-gaming/giant_batman-scape.jpg[/url<]

    • leor
    • 7 years ago

    I recently got on the 3 screen train, and it is very cool, but I’m having a problem of a different sort.

    My 3 screens are Dell 3011s, and even 2x 7950s overclocked to 1GHz still can’t get me smooth frame rates.

    I’ll understand if this problem garners little sympathy, but it’s damn annoying. It feels like driving a Ferrari with a boot attached.

      • DeadOfKnight
      • 7 years ago

      Should have gone with 3 Dell 2410s. With the money saved you could buy 2 7970s with money to spare and you wouldn’t have that problem. It would still be an impressive setup and the screens are still very big and would deliver the immersion you’re looking for. You’d also have a lot more desk space to work with.

        • leor
        • 7 years ago

        They’re actually wall mounted on Ergotron arms, very cool setup, looks like I have terminators coming out of the walls and holding up my screens.

        I originally had one 3011 flanked by a pair of 2410s for my work setup, and I wanted to go Eyefinity, so I swapped the 2410s for 2 more 3011s. Coming from a pair of 6950s, which ran the previous setup in Eyefinity (although not at the native res of the 30), I had no idea the 7950s would be so far from up to the task. From what I’ve seen, the difference between 7950s and 7970s in Crossfire is not big enough to bridge the gap.

        It seems for the moment there is literally no setup out there that can drive these 3 screens smoothly.

          • Airmantharp
          • 7 years ago

          Specifically, two or more 4GB Kepler cards are what you’re looking for. 6MP (3x1080p/1200p) is about the limit of Kepler’s 2GB VRAM and GCN’s 3GB VRAM solution.

          Further, Kepler has been shown to be smoother than GCN in multi-GPU configurations. From personal experience, going from a pair of HD6950 2GB cards to one slower GTX670 2GB at 2560×1600 was like night and day. I have no doubt the HD79X0 would have provided the same result, highlighting the choppiness of CFX, but I feel that SLI is a better solution and I wanted to keep that option open, as one GTX670 is really the bare minimum for a single GPU at 4MP in games like BF3 MP.
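
          Some back-of-the-envelope buffer math behind those megapixel figures, counting 32-bit colour buffers only; textures, G-buffers, and AA samples are what actually consume the VRAM, so treat these as floors rather than budgets:

              setups = {
                  "3 x 1920x1080 (about 6 MP)":  3 * 1920 * 1080,
                  "3 x 1920x1200 (about 7 MP)":  3 * 1920 * 1200,
                  "3 x 2560x1600 (about 12 MP)": 3 * 2560 * 1600,
              }
              for name, pixels in setups.items():
                  mib = pixels * 4 / 2**20  # one 32-bit colour buffer
                  print(f"{name}: {pixels / 1e6:.1f} MP, ~{mib:.0f} MiB per colour buffer, "
                        f"~{4 * mib:.0f} MiB with 4x MSAA")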

          • DeadOfKnight
          • 7 years ago

          All I’m saying is, if I had that kind of money I wouldn’t be complaining; I’d probably be out buying 3 more U3011s and a couple more 7950s for the ultimate eyefinity experience, even if I have to use a non-native resolution for games.

          • Bensam123
          • 7 years ago

          Tri or quad 7950s… although that’s starting to edge into territory that isn’t normally tread, so it wouldn’t be as well paved.

            • leor
            • 7 years ago

            I actually tried trifire, 3x 7950s, but I didn’t see much of a difference, so I returned the card. This was a couple of months ago and maybe the drivers are better now, but I suspect what some other people said might be right and I’m running into a VRAM issue.

            I might be SOL until big Kepler comes out, or whatever AMD has up their sleeve with the 8000 series. Of course, when the next generation of consoles is released and games become more demanding, I might be back at square one. This type of setup might just be too much of an edge case to be a viable gaming system.

            • Airmantharp
            • 7 years ago

            I’d suggest that you start looking for buyers, and take a closer look at two or three of [url=http://www.amazon.com/EVGA-SuperClocked-Dual-Link-Graphics-04G-P4-2673-KR/dp/B007Z3HZLM/ref=sr_1_3?ie=UTF8&qid=1342020331&sr=8-3&keywords=gtx+680+4gb<]Evga GTX670 4GB Superclocked cards[/url<] :).

            • Bensam123
            • 7 years ago

            If it has something to do with the memory buffer or how the card deals with information (a technological limitation), simply throwing more memory at it won’t necessarily fix the problem.

            It could just be that AMD cards have problems dealing with this many pixels, and simply using an Nvidia card would fix things, although I sort of doubt that’s the case. He’s talking about quite a few pixels.

          • Darkmage
          • 7 years ago

          Just a guess, but I think your problem is more related to Crossfire than it is to EyeFinity. Reading back over the TR reviews since they went to their “inside the second” analyses, there is a significant penalty to game smoothness when you add multiple GPUs. Whether it’s SLI, Crossfire or a multi-GPU graphics card, the overhead in coordinating the multiple pieces seems like it would be a likely culprit.

          At the very least, it’s worth investigating if one 7950 can’t drive a smoother game for you. 🙂

            • Airmantharp
            • 7 years ago

            I’d corroborate that with my own experience, but one problem that leor is going to have is that he’d have to test this at 2560×1600, for which a pair of current high-end GPUs is a little overkill. The problem there is that the cards may be so much faster than needed that they effectively mask any micro-stuttering that he’s seeing at 7680×1600.

            And it won’t solve his problem, unfortunately. He needs at least two current high-end GPUs to make 12MP playable, and they each have to have the VRAM to boot. Given that multiple sources have pointed to GCN not being as smooth in configurations of two or more cards than Kepler is, the only real solution seems to be replacing his HD7950s with at least a pair of 4GB GTX670s.

            I’d like to point out that I’m not trying to bash AMD here, they make great stuff and I use and recommend them where deserved, but both my personal experience and prevailing observations have shown it to be inferior to SLi. This is not to say that SLi is perfect or that it provides an experience that is as smooth as either brand’s single GPUs muster, but just that it’s better done than Crossfire right now.

        • Airmantharp
        • 7 years ago

        You know, I’ve got to down-rate you for this one bro (should I say Devildog?). In most cases, including mine with a single 30″, I resolved to buy the screen and then to figure out how to power it.

        I’m absolutely jealous of leor’s setup; it’s the kind of overkill I can appreciate, even if it isn’t appreciated as much here at TR!

          • DeadOfKnight
          • 7 years ago

          Well, I have to say it’s a good setup, and those screens are likely to last much longer than the GPUs, which will probably be replaced by a solution that is up to the task later on. Basically, I was just addressing his complaint with a “what did you expect?” I wouldn’t be complaining if I had that setup; I would be thrilled. If he wants fluid gaming at native resolution, there’s really nothing up to the task of driving those screens simultaneously in current games, other than a few odd titles with quad-Crossfire optimization, and even then he might need custom cards with boosted RAM.

            • Airmantharp
            • 7 years ago

            We’re still waiting for confirmation, probably coming from affluent hobbyists on the [H]. It’s reasonably expected based on current VRAM usages and fill-rate results that the 12MP 3×30″ setup will require, at least, a pair of 4GB Kepler cards, if not three or four. A pair of 8GB GTX690s, if they were ever produced, would work as well.

            Steep prices to be sure, but then again purchasing and providing space for three $1000-1200 monitors isn’t so inexpensive either. You do get what you’re paying for though!

            • DeadOfKnight
            • 7 years ago

            It’s [url=http://rog.asus.com/116862012/graphics-cards-2/first-look-asus-mars-iii/<]coming[/url<]!

            • Airmantharp
            • 7 years ago

            You know, with TR showing the frame-times for the GTX690 being half of what they are on GTX680 SLi, I’d think that one of these cards would be great for an 11MP-12MP setup, and two would be ideal, but I’d venture to say that you might want to use an X79 board with a full 32-lane PCIe 3.0 complement; just make sure you get one without a PLX chip where the lanes are hard wired to the X79 silicon.

            It’d even be cooler and quieter than 4xGTX6x0 given the absurd way they designed those coolers. I just hope that the price isn’t beyond the $1200-$1300 range. $200 for four more gigabytes of GDDR5 and another $100 more for custom PCB and cooler over the stock card would be reasonable I think.

            • DeadOfKnight
            • 7 years ago

            I think it was only specific scenarios that showed that much of an improvement (although with the lower clockspeed it’s still very much noteworthy), otherwise they were pretty much the same. Still, I would question anybody wanting to go with a 680 SLI setup over the 690 because the 690 is clearly using top shelf binned chips and has a much better reference design for basically the same price and performance, plus it allows for a greater variety of SLI configs using many more motherboards and PSUs. Given the value of the 670 though, it would be smarter to go that route (not that the OP seems to have a very tight budget to work with).

      • Airmantharp
      • 7 years ago

      If you can afford it, give a pair of GTX670s a try. Murmurings around the web (especially at the [H]) suggest that Kepler works quite a bit better in multi-GPU configurations, for whatever reason. And they’re not really faster, they’re just smoother.

      My experience with 2x HD6950s may not apply to GCN, but man, that setup didn’t feel nearly as fast as I was expecting it to be on a single ZR30w (same panel, HP make).

        • Bensam123
        • 7 years ago

        TR benchmarks point out that SLI configs have less variance in them too, but that doesn’t fix his problem if this isn’t micro-stuttering and is instead a limitation from pushing so many megapixels.

          • Airmantharp
          • 7 years ago

          True-

          But with Kepler’s better memory efficiency and the availability of 4GB cards, I’d think that he’d be able to get a substantial boost in performance by switching to the green team.

          The frame-rates may not improve drastically, but I’m willing to bet that the difference in ‘feel’ will be night and day.

      • MrJP
      • 7 years ago

      I’d imagine it’s the VRAM limitation you’re running into with 12MP to drive, since even with Crossfire you’re still limited to 3GB effectively. Hence a pair of current 670s or 680s would likely be even worse. 4GB 680s might be enough when introduced, and there’s also a 6GB 7970 on the way from Sapphire if you want to be really sure. I’d think the extra memory bandwidth of the 7970s might be quite valuable in your situation, once you’ve got enough VRAM.

        • Airmantharp
        • 7 years ago

        GTX670 4GB cards are available right now; I linked an Evga one above for leor. If a single Kepler can handle 6MP with 2GB, two should be able to handle 12MP with 4GB, I believe. He might want to leave room for a third though :).

          • leor
          • 7 years ago

          It’s a decent thought, but at this point it seems like it would make more sense to wait for big Kepler than to buy the second-best little Kepler.

          Can’t be more than a few months, right?

            • Airmantharp
            • 7 years ago

            Some have said September, so consider that the earliest. We know quite a bit about the actual GPU from the Tesla K20 unveiling.

            But if you can fix the problem now, well… 🙂

            • DeadOfKnight
            • 7 years ago

            Also consider that big kepler, if it is even released in a consumer product, has a lot more die space dedicated to compute. While it would likely still blow away the 680, it won’t be doing it at great value unless NVIDIA for some reason decides to lower all their prices. I think you’d be better served by a set of GTX 670 4GB cards.

    • rhysl
    • 7 years ago

    Would love a rig like that… but the wife does not like the words “immersive” and “gaming” to go together, sadly.

      • squeeb
      • 7 years ago

      lol

      • Vaughn
      • 7 years ago

      Tell her to give you back your balls 🙂

    • Jambe
    • 7 years ago

    Nice assembly of shots there, and a fun read. I do wish 4K (e.g. QFHD) would take off so you wouldn’t have to mess with all the extra cabling and the bezels. I guess you’d lose the wraparound qualities of multiple monitors, though. Unless… can they make curved LCDs like that crazy-expensive curved CRT monitor I recall?

    The image below the para on [url=https://techreport.com/articles.x/23217/5<]page five[/url<] beginning "At least in the single-player portion of the game..." is not showing up. [url=https://techreport.com/image.x/surround-gaming/giant_bf3-multi.jpg<]The link[/url<] just leads to a TR placeholder image.

      • Dissonance
      • 7 years ago

      Doh. Fixed the link. Thanks!

    • flip-mode
    • 7 years ago

    This was a fun read. Some of those screenshots are pretty neat.

      • Darkmage
      • 7 years ago

      Likewise. I definitely enjoyed the article.

      Once I finish the basement and move my home office… oh yes, new displays and new GPU shall be mine.

        • flip-mode
        • 7 years ago

        Student loan and home loan with negative equity heaping upon regular family expenses have me hurting pretty bad. I’ll be surprised if I ever gaze upon 3 displays in my household.

          • MKEGameDesign
          • 7 years ago

          I feel that pain too. I keep saving up for upgrades but pull that money out to actually buy some games for it in its current state.

    • wingless
    • 7 years ago

    Great write-up Geoff! I’ve been waiting for a test like this for a few months now.

    PS: I prefer Tri-Crossfire/SLI to power each monitor in a woefully inefficient manner. /1%’er
