Stereoscopic 3D has made its way into just about every medium of visual entertainment over the past few years. Most blockbuster films are in 3D nowadays, with movie theaters delighted to charge extra for the privilege—and for the disposable polarizing glasses. 3D televisions from the likes of Sony and Samsung are being sold at Best Buy. Even some game consoles, both set-top and handheld, now offer stereoscopic graphics.
The PC, too, has jumped on this bandwagon, thanks in large part to the efforts of Nvidia and AMD. Though these two companies have different philosophies and largely incompatible implementations, they’ve now been pushing stereo 3D on the PC for years.
Our last in-depth look at stereo 3D gaming was in February 2009, shortly after the debut of Nvidia’s 3D Vision technology. Back then, game compatibility left much to be desired, the performance hit was sizable, and the entry price was awfully steep—$199 for the glasses, $349 for a compatible display, and even more for a graphics card fast enough to do them justice. Our verdict was that, while promising, 3D Vision just wasn’t ready for prime time.
Much has happened since. Prices for both the glasses and compatible displays have fallen. Updated Nvidia glasses, as well as displays based on a new backlight technology, have hit the market. Some monitor vendors are now bundling the Nvidia glasses with their displays. AMD has entered the field with a looser standard called HD3D, which promises many of the same benefits as 3D Vision. Most importantly, the list of supported games has grown—substantially. Both companies now tout stereo compatibility with hundreds of titles, including recent triple-A releases like Battlefield 3.
Things are looking up.
Over the past few weeks, I’ve been tinkering with a pair of stereo 3D setups—one based on 3D Vision 2, the other based on HD3D—to get a sense of the current state of affairs. I was curious to get a feel for not just how well the technology works and how the AMD and Nvidia solutions compare, but also whether stereoscopy is a worthwhile addition to the PC gaming experience.
Before we get to the big questions, we should start by explaining what 3D Vision 2 and HD3D entail.
Stereoscopic 3D glasses: Nvidia (3D Vision 2) on the left, Samsung (HD3D Gold) on the right.
Nvidia’s GeForce 3D Vision 2
With 3D Vision, Nvidia was the first GPU maker to really push stereoscopic 3D gaming on the PC in a big way. The latest version of this technology still involves the same three basic components: glasses, displays, and software.
Nvidia’s 3D goggles are bundled with a number of compatible monitors and laptops. They’re also available on their own for $149.99 with a wireless receiver in the box, or $99.99 for the glasses only, if your display or laptop already has a receiver built in. That’s not cheap, but these are much more sophisticated than the simple polarizing glasses passed out at the movies. In both cases, the idea is to show each eye a different image, tricking the brain into perceiving depth. Nvidia’s active-shutter goggles do so by opening and closing their shutters 120 times each second. (I understand the process involves a liquid-crystal layer in each lens, which goes dark when an electric current is applied.) Compatible displays spit out images for the left and right eye in rapid succession, also at 120Hz. All the glasses have to do is shield one eye when the display is rendering an image intended for the other, and vice versa—presto, there’s your illusion of depth.
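For the curious, the alternation boils down to a simple schedule. Here's a little Python sketch of the idea; the 120Hz figure comes from Nvidia's spec, but the function and variable names are ours, not anything from the 3D Vision software:

```python
# Illustrative sketch of frame-sequential stereo: the display alternates
# left- and right-eye images on successive refreshes, and the glasses black
# out whichever eye the current refresh isn't meant for.
REFRESH_HZ = 120

def shutter_schedule(n_refreshes):
    """Return (refresh_index, eye_shown) pairs for n consecutive refreshes."""
    return [(i, "left" if i % 2 == 0 else "right") for i in range(n_refreshes)]

per_eye_hz = REFRESH_HZ // 2  # each eye sees every other refresh: 60 images/s
```

The takeaway: at a 120Hz panel refresh, each eye effectively gets a 60Hz stream, which is why these displays need to run at double the usual refresh rate in the first place.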
The displays and goggles have to be synchronized, which is where that wireless receiver comes in. It uses infrared signals, just like a TV remote. Also, since the glasses have to sync up and flicker their shutters many times each second, they require power. The Nvidia glasses have a battery built into the frame; you’ll just need to charge them via the included micro-USB cable every once in a while.
Nvidia says the 3D Vision ecosystem includes more than 20 compatible displays. There are a number of compatible laptops and projectors, too, but desktop monitors are what concern us today. Companies like Asus, Acer, BenQ, and ViewSonic all sell 3D Vision monitors, which typically have 1080p resolutions, panel sizes in the 23″-27″ range, and prices upward of $300. In most cases, dual-link DVI is the input of choice for stereo 3D. Some monitors have HDMI 1.4 support, too, but that interface limits the refresh rate of 1080p 3D images to 30Hz—fine for movies, but not so good for fast-paced action games. A small minority of 3D Vision panels (only a couple of models from BenQ, according to Nvidia) employ DisplayPort for stereo 3D.
Last October, Nvidia announced 3D Vision 2, an update to the hardware side of its stereo 3D implementation. 3D Vision 2 introduced new glasses, which are thinner, more flexible, and have 20% larger lenses than their predecessors. Also, 3D Vision 2 included a display technology called LightBoost, which strives to compensate for the dimming effect of stereo 3D glasses.
Without LightBoost, 3D Vision-compatible panels leave their backlights on constantly. To make sure image persistence, or ghosting, doesn’t interfere with the illusion of depth, both shutters in the glasses close while the display changes frames. So, you get a pattern that looks like this: display renders left frame, left shutter opens and then closes again; display renders right frame, right shutter opens and then closes again; rinse, repeat. The default state of the shutters, in other words, is closed.
On a LightBoost display, the shutters are open by default, and the backlight takes over ghosting-prevention duties by switching itself off while the frames change. What’s the point of transferring duties in this fashion, you ask? First, it allows the shutters to stay open longer and to close only to make sure each eye sees the right image. That arrangement makes images viewed through the glasses appear brighter. Second, because the backlight doesn’t have to be on all the time in stereo 3D mode, the display can get away with running the backlight at a higher intensity when it is on, without going beyond the monitor’s power spec. The result: an even brighter picture.
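Some back-of-the-envelope arithmetic illustrates that second benefit. The numbers below are assumptions picked for the sake of the math, not measured specs, and the model captures only the strobing-within-a-power-budget effect, not the longer shutter-open time:

```python
# Hypothetical illustration of the LightBoost power-budget argument.
# All figures here are assumed for arithmetic's sake, not real panel specs.
refresh_ms = 1000 / 120      # ~8.33 ms per displayed frame at 120Hz
transition_ms = 2.0          # assumed time the panel spends changing frames

# Conventional scheme: backlight always on at base intensity, but the eye
# only gathers light while the shutter is open (after the frame settles).
conventional_light = 1.0 * (refresh_ms - transition_ms)

# LightBoost scheme: the backlight strobes off during transitions, so it can
# run proportionally brighter while on without raising average power draw.
boosted_intensity = refresh_ms / (refresh_ms - transition_ms)
lightboost_light = boosted_intensity * (refresh_ms - transition_ms)

gain = lightboost_light / conventional_light  # ~1.32x under these assumptions
```

Under those made-up numbers, the strobed backlight alone buys roughly a third more light per frame, before the longer shutter-open window adds its own contribution.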
The concept of LightBoost isn’t exclusive to 3D Vision 2 monitors (we’ll get to that in a minute), but it does nicely address one of the common complaints about active-shutter stereo 3D implementations.
The last piece of the puzzle is software. Stereo 3D support is built into the GeForce drivers, and Nvidia says it can “convert existing games in real time (on the fly) into 3D.” On top of that, the company allows developers to “create their own native 3D games that also work with our entire 3D Vision ecosystem of products.” The result is a list of 3D Vision-compatible games with over 650 entries. To the user, 3D Vision’s software implementation is remarkably seamless and consistent. You’ll first want to head to the Nvidia control panel to enable stereo 3D, like so:
Once that’s done, a text overlay should come up at the bottom right of the screen whenever you launch a compatible 3D game. The overlay will say how well the game is supported and whether you need to tweak any settings to get the best experience. From there, enabling stereoscopy is as simple as hitting Ctrl-T. You can adjust the image depth by hitting Ctrl-F3 and Ctrl-F4, bring up a “laser sight” with another hotkey (in case the game’s built-in crosshair doesn’t play nice with 3D Vision), and tweak a handful of other settings with other, more esoteric shortcuts. They’re all detailed in the control panel.
There’s more to 3D Vision, including support for 3D televisions over HDMI 1.4, which requires special software. If you have multiple 1080p monitors, 3D Vision Surround will spread stereo 3D goodness over a trio of displays. Those features lie beyond the scope of this article, though; to make things manageable, we’re focusing on single-monitor desktop PC gaming.
Nvidia’s GeForce 3D Vision 2 — continued
So, how do you put together a 3D Vision 2-infused gaming PC? The first step is finding a good display, preferably with LightBoost. (Nvidia’s 3D Vision system requirements page is a good place to start your search.) Check to see whether the display comes with the Nvidia glasses or not. If it doesn’t, you’ll need to buy a glasses-and-receiver kit separately. You’ll also want a fast graphics card, because remember, you’ll be asking it to render twice as many frames each second—one for each eye—when stereo 3D is enabled. I won’t spoil our performance results early, but I’d advise against going with anything much slower than a GeForce GTX 560 Ti 448 or GeForce GTX 570 if you intend to crank up the eye candy at 1080p.
The hardest part, I suppose, is balancing all of those ingredients if you’re on a budget. Right now, the only Nvidia-approved monitors available at Newegg are based on 1080p panels, making it difficult to cheap out on the GPU front without running games at lower than the display’s native resolution. You could choose one of the more affordable 3D Vision panels, but be careful. At $330, Acer’s GD235HZbid might seem like a better deal than Asus’ VG236H, which has the same panel size and a $440 price tag. However, if you read the fine print, you’ll see the Asus display comes with the Nvidia goggles and the Acer does not. Add the $150 3D Vision kit, and the Acer display’s total cost goes up to $480.
Picking out your ideal 3D Vision setup is probably going to involve hours of careful research and price comparisons. You’ll want to look at the pros and cons of each display and check out a few reviews. You’ll also want to keep in mind that 3D Vision displays use TN panels, which might make them less suitable for tasks requiring high color accuracy. Perhaps you’ll find yourself compelled to keep a high-fidelity display (likely based on an IPS panel) for, say, photo editing, and complement it with a 3D Vision panel for stereoscopic gaming.
Once you have all the components selected, configuring and using your 3D Vision setup for stereo 3D gaming should be fairly straightforward.
AMD’s HD3D
Nvidia has made 3D Vision a tightly integrated offering, while AMD’s HD3D is a looser approach that encompasses a number of different solutions to the same problems. HD3D also hasn’t been around as long. Initial driver support arrived in October 2010 for Radeon HD 5000-series and newer graphics cards.
AMD doesn’t sell its own stereoscopic glasses, so makers of displays, TVs, and projectors must take over that responsibility. In practice, that means a gaggle of different glasses offerings, most compatible only with one display brand. Also, because AMD doesn’t impose a certain panel-and-glasses technology on its partners, sequential-frame designs with active-shutter goggles (a la 3D Vision) aren’t the only game in town. A small number of HD3D-compatible displays have line-interleave designs and passive, polarized 3D glasses. As I understand it, line-interleave displays work by applying different polarization to odd and even lines on the display, with each lens on the glasses filtering out one set of lines.
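The line-interleave scheme is easy to sketch in code. This toy version (our own illustration, not any vendor's implementation) also makes the main trade-off obvious: each eye only ever sees half of the panel's rows, so vertical resolution per eye is effectively halved:

```python
# Toy sketch of line-interleaved (passive, polarized) stereo: even rows carry
# one eye's image, odd rows the other's, and each polarized lens filters out
# the rows meant for the opposite eye.
def interleave(left_rows, right_rows):
    """Weave two per-eye images (given as lists of rows) into one frame."""
    return [l if i % 2 == 0 else r
            for i, (l, r) in enumerate(zip(left_rows, right_rows))]

frame = interleave(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"])
```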
The presence of different stereo 3D implementations and incompatible glasses means there’s a certain degree of inconsistency in the HD3D experience—more so than with 3D Vision. Last year, AMD took a step toward addressing the issue by rolling out an HD3D certification program. The program has two tiers: silver and gold. To receive HD3D gold certification, hardware makers must submit their products to AMD for testing. Silver certification can be obtained by doing testing in-house and submitting the results to AMD. In either case, certified displays must meet certain minimum requirements, support existing 3D middleware and Blu-ray 3D movies, and have printed documentation that is “clear and easy to follow.” Just like Nvidia, AMD supports stereoscopic displays with dual-link DVI, DisplayPort, or HDMI 1.4 inputs.
Right now, AMD’s list of recommended stereo 3D monitors is only seven entries long. All of the recommended displays are based on either 23″ or 27″ panels, and four of ’em are from Samsung. Prices start at a scant $269.99 for LG’s 23″ line-interleave monitor with bundled passive glasses, and they range up to $699.99 for Samsung’s 27″ S27A950D with active-shutter goggles. A PDF file on AMD’s website lists three more HD3D-compatible displays from ViewSonic and Zalman, but those seem to be older and harder to find.
If HD3D isn’t tied down to a particular kind of panel or glasses, some of you may be thinking about using 3D Vision hardware. Bzzt. Wrong. AMD tells us 3D Vision gear is “bound by license” to Nvidia graphics cards. Even if AMD wanted it to, 3D Vision hardware won’t work with Radeons. Too bad.
Things are also fragmented on the software front, where two competing middleware vendors, DDD and iZ3D, provide the software glue that lets most compatible games run in stereo 3D mode on AMD hardware. DDD’s TriDef seems to be the most popular, the most frequently updated, and the one with support for the most titles—468, by DDD’s count, with user-submitted profiles for “over 40” additional games also available on the TriDef forums. TriDef costs $49.99 on its own, but some monitor vendors bundle it with their displays. iZ3D, meanwhile, sells for $39.99.
Unlike Nvidia’s stereo implementation, TriDef requires users to start games through its launcher app. In-game operation is similar, though, with keyboard shortcuts bound to certain functions. On the numpad, hitting the asterisk key will enable or disable stereo 3D, pressing + or – will adjust the depth, and punching 0 will bring up an on-screen display with other settings. Users can also hit a hotkey to enable Virtual 3D, a “method of improving performance by using information from the Z-buffer in the DirectX graphics pipeline.” Virtual 3D may work in games that lack a proper TriDef profile, but DDD cautions that it can cause visual distortion around object edges.
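DDD doesn't publish how Virtual 3D works internally, but depth-based reprojection is the general family of techniques, and a one-dimensional toy version (entirely our own illustration) shows where that edge distortion comes from:

```python
# 1-D toy of depth-based view synthesis: shift each pixel horizontally by a
# disparity that grows as depth shrinks, faking a second eye's viewpoint.
# Holes open up at depth discontinuities, i.e. around object edges, which is
# the kind of artifact DDD warns about.
def synthesize_eye(row, depth, max_disparity=2):
    """row: pixel values; depth: per-pixel depth in (0, 1], 1.0 = far plane."""
    out = [None] * len(row)
    for x, (pix, z) in enumerate(zip(row, depth)):
        shift = round(max_disparity * (1.0 - z))  # nearer pixels shift more
        if 0 <= x + shift < len(out):
            out[x + shift] = pix
    return out  # None entries are holes the middleware must paint over

# "c" sits closer to the camera (depth 0.5), so it shifts and leaves a hole.
shifted = synthesize_eye(["a", "b", "c", "d"], [1.0, 1.0, 0.5, 1.0])
```

Real middleware fills those holes by stretching or blurring neighboring pixels, which is plausibly why object edges look smeared in Virtual 3D mode.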
It’s worth noting that both TriDef and iZ3D also support Nvidia hardware, and both existed long before 3D Vision or HD3D. The history page on DDD’s website says that, way back in 2003, the firm “licensed its TriDef 3D suite of software to Sharp for deployment on its revolutionary 3D laptop PC – the Sharp Actius RD3D.” Wikipedia tells us Neurok Optics, which became iZ3D through a joint venture with Chi Mei Optoelectronics in 2007, used to sell its own stereo 3D monitors as far back as 2006. The gist is that, unlike 3D Vision, AMD’s initial HD3D push appears to have been more of a low-budget effort based on pre-existing products from third-party vendors.
AMD seems to be moving away from that approach, at least in part. It’s now pushing its own quad-buffer software API, which allows game developers to implement native 3D support for AMD GPUs without the need for third-party middleware. Several blockbuster titles, including Battlefield 3, DiRT 3, and Deus Ex: Human Revolution, already use this API. Because AMD’s Catalyst Control Center lacks a 3D configuration pane, however, games that implement the quad-buffer API must expose stereo 3D settings (typically an on/off switch and depth sliders) through their own configuration menus.
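The quad-buffer concept itself predates AMD's API: the swap chain holds front and back buffers for each eye, and the engine renders every frame twice from two horizontally offset cameras. Here's a conceptual sketch; the class and method names are purely illustrative and are not AMD's actual interface:

```python
# Conceptual sketch of quad-buffered stereo (names are our own invention).
class QuadBufferedSwapChain:
    def __init__(self):
        # Four buffers: front/back for each of the left and right eyes.
        self.buffers = {(s, e): None for s in ("front", "back")
                                     for e in ("left", "right")}

    def render(self, eye, image):
        self.buffers[("back", eye)] = image

    def swap(self):  # present both eyes' back buffers at once
        for eye in ("left", "right"):
            self.buffers[("front", eye)] = self.buffers[("back", eye)]

def render_stereo_frame(chain, scene, eye_separation=0.064):
    # Render the same scene twice, cameras nudged apart by an assumed
    # interocular distance, then present the pair together.
    chain.render("left", (scene, -eye_separation / 2))
    chain.render("right", (scene, +eye_separation / 2))
    chain.swap()

chain = QuadBufferedSwapChain()
render_stereo_frame(chain, "frame0")
```

The "twice per frame" part is also why stereo roughly doubles the GPU load, a point we'll return to in the benchmarks.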
Like 3D Vision 2, HD3D supports Blu-ray 3D playback and, since the release of the Catalyst 12.1 driver, 3D gaming across triple-monitor configs. Again, though, we’ll be restricting our focus to single-monitor PC gaming for this article.
Picking out an HD3D setup involves the same balancing act as on the Nvidia side. The small selection of AMD-recommended panels does make things a tad more straightforward, as does the fact that all the solutions seem to come with glasses in the box. That said, you’ll have to decide whether you favor line-interleave or active-shutter stereo implementations. We’ve only had a chance to test out active-shutter designs, so we unfortunately can’t provide much assistance on that front. You may also want to check if your chosen monitor comes with TriDef or iZ3D software—that could save you the expense of having to purchase the software separately. The 23″ Samsung monitor AMD sent us comes with TriDef.
On the GPU side of things, AMD’s Radeon HD 6970 should be a good starting point if you want smooth, responsive gameplay at 1080p with the eye candy cranked up. AMD’s new Radeon HD 7900-series cards are also worth a look if you can afford them, and they should offer substantially higher performance. See our review of the Radeon HD 7970 for more details.
Our test hardware
The 3D Vision 2 and HD3D displays we used for testing aren’t strictly comparable—one is larger and more expensive than the other. Sorry about that. The thing is, these are the 3D panels AMD and Nvidia are sending out to testers right now. Perhaps Nvidia wants to wow reviewers, while AMD is happier to highlight the existence of more budget-conscious solutions. Either way, we’ll try not to penalize the AMD solution unfairly because our 3D Vision display is nicer. Keep in mind that 27-inch 3D monitors are available on the HD3D side, and that cheaper, smaller panels are available in the 3D Vision camp.
Asus’ VG278H fills in as our 3D Vision 2 monitor. This is a fairly high-end offering, with a $699.99 price tag and a 27″ 1080p panel. According to Asus, the VG278H boasts 300 cd/m² luminance, a 50,000,000:1 contrast ratio, a 2-ms response time, and LightBoost backlighting technology. It also features an assortment of VGA, DVI, and HDMI inputs, and it can be adjusted vertically and tilted. Happily, Asus opted for a matte finish on the panel.
The VG278H is bundled with a set of 3D Vision 2 goggles from Nvidia. The goggles interface with an IR emitter built into the top of the monitor, where you might normally see a webcam, so there’s no need for an auxiliary base station. The IR emitter is mounted on a hinge and can be tilted up and down (but not left or right) to accommodate the user’s sitting position.
For our HD3D testing, we’re using the Samsung S23A750D, a more affordable offering with a $349.99 asking price, a 23″ 1080p panel, and no height adjustment functionality. (The stand offers only tilt adjustment.) Samsung touts a 1,000,000:1 contrast ratio, 250 cd/m² luminance, and a 2-ms response time. Judging by what AMD told us, this monitor has an LED backlight pulsing technology very similar to LightBoost, although it’s not called by that name. You’ll find DisplayPort and HDMI inputs around the back, Samsung’s active-shutter glasses in the box, and DDD’s TriDef software on the driver CD.
Oh, and this display is stamped with AMD’s HD3D gold seal of approval.
While the Asus offering looks the same as any other PC monitor, Samsung’s S23A750D is a strange beast. It’s quite literally glossy all over, even on the back, and it features touch controls laid out on a wedge-shaped, cylindrical base. There’s no secondary display, mind you; the controls bring up an OSD on the main panel, just like you’d expect. It’s a neat concept, but the execution could use some work. On one hand, the controls are easier to get to than if they were laid out along the edge of the panel. On the other, I found myself occasionally having to tap the buttons multiple times to get a response.
I’m not a fan of the glossy finish, either. You’re guaranteed to leave an ugly fingerprint anytime you touch this monitor, whether it’s to adjust the tilt or to work the OSD controls. Also, I’m somewhat concerned that the reflective coating on the panel could be detrimental to stereoscopic image quality. Some folks have argued that stereo 3D confuses the brain by forcing the eyes to converge on one plane and focus on another. With its glossy coating, the Samsung display serves up artificial depth on a flat plane, while at the same time reflecting true depth, on which the eyes can converge and focus normally.
One last thing to note: while the 3D Vision 2 goggles bundled with the Asus display can be recharged via micro-USB, Samsung’s 3D goggles have a small compartment in the frame that accommodates a CR2025 battery, which you’ll have to replace once it’s depleted. Both displays have built-in receivers, though.
To drive these monitors, we selected two competing graphics cards:
Asus’ take on the GeForce GTX 570, the ENGTX570 DCII/2DIS/1280MD5, will be hooked up to our 3D Vision 2 panel. This card sells for $349.99 (or $329.99 after a mail-in rebate) and features a custom, triple-slot DirectCU II cooler with dual 80-mm fans. Asus clocks the card at Nvidia’s prescribed speeds of 742MHz for the GPU core and an effective 3.8GHz for the memory. Two DVI ports, one DisplayPort, and one HDMI output line the rear cluster.
For our HD3D setup, we’re using Asus’ Radeon HD 6970 card, the EAH6970 DCII/2DI4S/2G. This card also has a dual-fan, triple-slot DirectCU II cooler and stock clock speeds, but it sports a smaller vent and more display outputs than the Nvidia offering. (There’s a total of two DVI ports and four DisplayPort, er, ports.) This puppy used to be sold for $359.99 after rebate at Newegg, but it seems to have been deactivated. The card is out of stock at other e-tailers, too.
To be clear, we’ll be running our tests with only one of each of these two cards. We have several good reasons for that choice, most notably the microstuttering issues associated with multi-GPU configs. We want to rule those out when testing 3D performance in stereo. We also believe that, for 3D Vision 2 and HD3D to be appealing to the masses, they ought not be bound to pricey multi-GPU setups and their associated caveats. Using high-end, single-GPU cards is a good way to address the performance requirements of real-time stereoscopic 3D without going overboard.
Because of the differences between our displays, we’ll first address things like compatibility and playability to get a feel for each solution’s software support. Then, we’ll talk about the image quality on each display with the bundled glasses. Finally, we’ll discuss whether stereo 3D—in either implementation—provides a compelling step up from standard 2D graphics in each game.
We’re going to kick things off with a game AMD and Nvidia have named as a poster child for 3D support: EA DICE’s Battlefield 3. Next, we’ll move on to games recommended by each company (but not both). That should give us an idea how much overlap exists in the game support between the two solutions.
Battlefield 3
At 1080p with the “high” detail preset, Battlefield 3 looked good and ran fairly smoothly on both setups. However, unlike the 3D Vision setup, the HD3D rig exhibited a few kinks—even with AMD’s new Catalyst 12.1a driver. Sometimes, overlays like blood splatters and dust particles, which are meant to look like they’re stuck to the screen, would only be visible to one eye. I saw the same problem in the single-player campaign. At the start of one mission, when you slowly wake up after an earthquake, an overlay of the character’s eyelids opening and closing suffered from the same bug. At certain points, one eye would be getting a totally dark image, while the other would see the full 3D scene. Disconcerting. Seeing dust and blood splatters through only one eye seemed to cause eye strain over prolonged multiplayer sessions, too.
What about the displays and glasses?
I actually tried Battlefield 3 on the Samsung panel with the Radeon first, and I must confess to being sorely disappointed. The image was too dark, details were too small, and trying to sit closer to the display resulted in serious eye strain. (Turns out Samsung’s documentation recommends sitting no closer than 20″ from the display in 3D mode.) A large part of being a skilled BF3 player involves spotting camouflaged enemies in hard-to-see places—grass, bushes, behind rocks, and the like. The dimness of the Samsung setup with stereo 3D enabled was a handicap, and it got me blindsided by enemies I really should have been able to spot.
To make matters worse, the Samsung panel had some nasty ghosting going on. With each eye, I could see a faint outline of the image intended for the other eye. Switching the glasses off and on again to re-sync them didn’t help. Somehow, though, cycling display inputs through the monitor’s OSD reduced ghosting in a noticeable way. Go figure.
After that, switching to the Asus monitor with the GeForce was like night and day. Not only was the image bigger, which helped with both the immersion and the spotting of bad guys, but it was also brighter and closer to the level of contrast one might expect from a 2D panel. Best of all, I saw little to no ghosting and experienced almost no eye strain. My only beef was that, sometimes, I caught reflections of the game action in the glossy frame of the Nvidia goggles. Really, Nvidia? Why would you make that glossy, of all things?
In any case, I got sucked into the action and ended up losing track of time playing BF3 on the 3D Vision rig. It was delightful. Bullets whizzed at me menacingly, dogfights gained a whole other dimension, and crawling through grass and bushes was suddenly a whole lot more realistic. No doubt about it, the 3D Vision 2 rig made the game more fun and visually engrossing than my personal, non-stereoscopic gaming setup.
Batman: Arkham City
The latest entry in Rocksteady Studios’ Batman series is featured prominently in 3D Vision 2’s official compatibility list. True to Nvidia’s promise, the game was smooth and exhibited no visual bugs that I noticed in 3D Vision mode. (To keep frame times low, I played at 1080p using high detail levels, but I left DirectX 11 features disabled.) I’m afraid I can’t say as much for HD3D, which lacks proper stereoscopic support for the game altogether. Arkham City doesn’t use AMD’s quad-buffer API, and it doesn’t have a complete HD3D profile in TriDef. Trying TriDef’s profile-creation feature resulted in awful artifacting. The Virtual 3D option provided the best approximation of what I saw on the Nvidia config, but the distortion around objects and characters was bothersome.
While I couldn’t stomach much of Arkham City in HD3D, I spent some time playing on the 3D Vision 2 setup. As in BF3, enabling stereo 3D added to the experience. Arkham City‘s streets are dark and visually noisy, which can make the game feel a little uniform without the illusion of depth. (In fact, I think one can say the same thing for all too many Unreal Engine 3 games—there’s something about UE3 and gritty environments, isn’t there?) Adding depth made Arkham City‘s game world pop, causing level geometry, objects, and characters to stand out, to gain more visual separation from one another. Believe it or not, I got the feeling playing in stereo 3D made the levels easier to navigate.
All in all, though, stereo 3D wowed me less in Arkham City than it did in Battlefield 3. Perhaps that’s because Arkham City is a third-person game that doesn’t throw objects and bullets directly at the player’s face. Then again, its masterfully voice-acted cut scenes did look great with a dash of stereoscopy. Hmm.
Deus Ex: Human Revolution
Deus Ex: Human Revolution is one of those rare games where AMD’s Gaming Evolved logo pops up amid the other unskippable animations at the start. The game employs AMD’s quad-buffer API, which made TriDef unnecessary on our HD3D setup. There, the game ran well and looked essentially perfect with all the detail settings turned up. On the 3D Vision 2 config, the Nvidia overlay warned us to turn the shadow detail down from “soft” to “normal.” We found that playing with ambient occlusion at its highest setting caused some choppiness, as well. There were no major problems once we turned down those two settings, although we still noticed some strange jumpiness when circle-strafing. We’ll have some DE:HR benchmarks in a minute, so stay tuned.
Comparing the Asus and Samsung displays, I’d say their image quality was largely similar. The Samsung did produce more ghosting, which made my eyes strain a little more, but cycling the input source seemed to lessen the problem just like it did in BF3.
With all that said, I don’t think DE:HR is a good poster child for the benefits of stereoscopy. The flat textures, simple level geometry, and overwhelmingly dark environments looked just as flat in stereo 3D mode as they did without pseudo depth. I tried fiddling with the depth adjustments for 3D Vision 2 and HD3D, but instead of making the game look significantly better, adding depth made playing uncomfortable. DE:HR can be incredibly engrossing, and I expected stereo 3D to make it more immersive. It wasn’t to be.
Portal 2
The folks at AMD we spoke to about stereo 3D suggested trying a Source Engine game. Since my Team Fortress 2 skills leave a lot to be desired, I ran Valve’s latest commercial Source title, Portal 2. The game was buttery smooth on both the 3D Vision 2 and HD3D setups with everything cranked up at 1080p. Enabling stereo 3D on the Nvidia rig involved the same maneuver as with the other titles—hitting Ctrl-T once in-game. On the AMD system, I had to use the TriDef launcher to start the game. The two solutions produced essentially identical image quality… with one small exception: on the HD3D setup, graffiti in the game’s secret areas was only visible through one eye, and it moved with the camera. Although not a show-stopping issue by any stretch, this flaw did break the immersion somewhat.
The Asus and the Samsung displays were both flattering to the game. The Samsung solution’s lower brightness wasn’t the handicap as it had been in BF3, nor was it particularly noticeable. However, both panels exhibited noticeable ghosting in Portal 2, especially around the bright and colorful graphics placed near buttons, doors, and other objects of note. I was especially disappointed to see ghosting on the Asus panel, which had done a better job of keeping image persistence in check up to that point.
Despite these visual anomalies, playing Portal 2 in stereo 3D was a ton of fun. The game’s relatively simple, highly stylized levels really popped, and stereoscopy made looking through portals a whole other experience, especially when the blue and orange portals weren’t on the same axis. Vertigo, ahoy! I don’t know if the effect warrants spending hundreds of dollars on stereo 3D gear, but it was undeniably cool. Both the HD3D and the 3D Vision 2 configs seemed to go a little overboard with stereoscopic depth initially, which caused some visual discomfort. I ended up reducing the stereo depth using keyboard shortcuts.
Trine 2
Our last test subject, Trine 2, came highly recommended by Nvidia. I was curious to try the game, having enjoyed the original quite a bit when it came out a few years back. For those unfamiliar with the series, Trine and Trine 2 are action/puzzle side-scrollers with snazzy, colorful graphics. In both games, getting through levels involves switching between three characters on the fly and using their unique abilities to solve puzzles and to fight bad guys.
With our Nvidia setup, the storybook-style title screens and in-game graphics rendered perfectly in three dimensions. HD3D lacks proper support for Trine 2, but TriDef’s Virtual 3D mode was a surprisingly good consolation prize. The distortion around objects wasn’t nearly as obvious as it had been in Arkham City. Although the title screens only rendered in 2D, the in-game action looked fine, with just the right amount of depth. The game played smoothly at 1080p, as well, just like on the 3D Vision config.
Unfortunately, Trine 2 has a lot of high-contrast graphics, which led to noticeable ghosting and, in turn, eye strain on both displays.
I actually had to cut my play-testing short, even on the 3D Vision 2 setup. After a while, the game just gave me a headache. That’s really a shame, because stereo 3D was surprisingly effective in Trine 2. Without the illusion of depth, I tended to focus more on the character. In stereo 3D, the backdrop and foreground really came to life, which made the game world more… well, three-dimensional. The effect wasn’t breathtaking in every scene, but it did wow me on a regular basis as I played through the first few levels.
In a nutshell
Before we move on to hard performance numbers, charts, and graphs, let’s boil down what we’ve learned so far to a few key observations.
I think it’s safe to say 3D Vision 2 has a serious edge over HD3D in terms of game compatibility. None of the titles I played on the Nvidia setup exhibited any visual bugs in stereo 3D mode. All of the games actually worked, even if I had to tweak the settings in Deus Ex: Human Revolution. Meanwhile, the HD3D setup lacked proper support for two of the games I ran, and it exhibited visual artifacts in two others—Battlefield 3 and Portal 2. Only Deus Ex, an AMD “Gaming Evolved” release, ran in stereo mode without noticeable kinks.
Based on my experience, the combination of Asus’ VG278H display and 3D Vision 2 glasses delivers a better visual experience than Samsung’s S23A750D with its bundled glasses. Until we cycled through its display inputs, the HD3D Samsung panel had serious ghosting problems in every game we tried. Even without the ghosting, the display was too dark to make Battlefield 3 multiplayer an appealing proposition in stereo 3D mode. The Asus display and Nvidia spectacles offered substantially brighter images and caused less eye strain. Yes, the Asus panel costs more ($700 vs. $350 for the Samsung) and has a higher luminance rating (300 cd/m² vs. 250 cd/m²), so the comparison isn’t entirely fair. Nevertheless, using the two displays side by side demonstrated just how vital image brightness is to a stereo 3D setup. You can save several hundred bucks by going with a monitor that has a weaker backlight, but doing so could seriously damage the stereoscopic experience and make gameplay punishing rather than engrossing.
A few words must also be said about the benefits of stereo 3D over traditional 3D gaming. In some games, like BF3 and Portal 2, the added immersiveness and “wow” factor was undeniable. In others, like Deus Ex, stereo 3D didn’t really do much. And in Trine 2, the experience was mixed—the game looked beautiful, but ghosting and eye strain made it painful to play for too long. Our selection of titles admittedly wasn’t exhaustive, but our experience shows the addition of stereo 3D doesn’t automatically make things better. Considering the high cost of entry and the sacrifices that must be made (namely, turning down graphical detail in order to keep frame times low in stereo 3D mode), prospective users should carefully consider just how much they have to gain from a stereo setup.
Speaking of sacrifices, I’d like to throw in one final complaint. I wear prescription glasses; the frames are titanium, with thick rubber around the temples. Unfortunately, that design didn’t play well with the 3D goggles: once I put headphones on over everything, the pressure against my temples quickly became uncomfortable. Many of you not blessed with 20/20 vision likely wear contacts, but those who don’t should be aware that active-shutter glasses are relatively chunky.
Let’s now look, in detail, at the performance impact of stereo 3D on both of our setups.
Our testing methods
Our test system was configured as follows:
| Component | Details |
|---|---|
| Processor | Intel Core i5-750 |
| North bridge | Intel P55 Express |
| Memory size | 4GB (2 DIMMs) |
| Memory type | Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz |
| Memory timings | 9-9-9-24 1T |
| Chipset drivers | INF update 126.96.36.1995, Rapid Storage Technology 10.1.0.1008 |
| Audio | Integrated Via VT1828S with 188.8.131.5200 drivers |
| Graphics | Asus Radeon HD 6970 (EAH6970 DCII/2DI4S/2G) with Catalyst 12.1a drivers |
| | Asus GeForce GTX 570 (ENGTX570 DCII/2DIS/1280MD5) with GeForce 290.53 beta drivers |
| Power supply | Corsair HX750W |
| OS | Windows 7 Ultimate x64 Edition, Service Pack 1 |
Thanks to AMD, Asus, Intel, Corsair, Kingston, and Western Digital for helping to outfit our test rigs.
We used the following applications for our empirical testing:
As ever, we did our best to deliver clean benchmark numbers. We used the Fraps utility to record frame rates while playing a 90-second sequence through each level we tested. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence at least five times per video card in order to counteract any variability. Vertical refresh sync (vsync) was disabled for all empirical tests.
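For the curious, the per-run arithmetic is straightforward. Here’s a minimal Python sketch of how an average frame rate can be derived from a Fraps-style frame-time log and then averaged across repeated runs (the function names and sample data are ours, purely illustrative):

```python
def run_fps(frame_times_ms):
    """Average FPS for one run: frames rendered divided by elapsed seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def average_fps(runs):
    """Mean of the per-run averages across repeated test runs."""
    return sum(run_fps(r) for r in runs) / len(runs)

# Two hypothetical runs of five frames each, 20 ms per frame (i.e., 50 FPS).
runs = [[20.0] * 5, [20.0] * 5]
print(average_fps(runs))  # 50.0
```

Averaging the per-run averages, rather than pooling every frame into one bucket, keeps a single unusually long run from dominating the result.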
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
The performance — Deus Ex: Human Revolution
We tested Deus Ex: Human Revolution by playing through the start of the game’s first real mission, at the Sarif Manufacturing plant. We mostly shot our way through, but some sneaking took place, as well.
Based on what we saw in our subjective testing, we decided to run the game at 1080p with high FXAA antialiasing and all of the image quality options cranked up—save for shadows and ambient occlusion, which were set to “normal.”
Enabling stereo 3D causes not just a rise in frame times, but also an increase in frame-time variability. The inconsistency is visible through most of the run on the AMD config, where frame times jump quickly and repeatedly from under 20 ms to 25-30 ms, and then back under 20 ms. Such variation isn’t a huge concern at numbers that low, though, since they’d translate to frame rates around 30 FPS or better. The Nvidia card produces higher frame-time peaks, which have the potential to be more noticeable.
When we look at average frame rates, which paint an incomplete picture of overall smoothness, the Nvidia GPU seems to take less of a hit from stereo 3D than its AMD rival. However, the two configs have equivalent average frame rates in stereo mode.
This metric gives us a sense of overall frame latencies; it tells us the threshold below which 99% of the frames were rendered by each setup. Even in stereo 3D mode, frame times were reasonably short for both the Radeon and the GeForce. (The GeForce’s threshold of 36 ms per frame corresponds to around 28 FPS.) In a game like Deus Ex, which favors slow sneaking over twitchy action, that seems to be fast enough to ensure largely smooth gameplay—despite the frame-time spikes.
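To make the metric concrete: the 99th-percentile frame time is simply the value at or below which 99% of frames complete. Here’s a quick Python sketch using the nearest-rank method (the sample log is invented, not our actual data):

```python
import math

def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time (ms) at or below which pct% of frames were delivered
    (nearest-rank method)."""
    ordered = sorted(frame_times_ms)
    rank = math.ceil(len(ordered) * pct / 100.0)  # 1-based rank
    return ordered[rank - 1]

# Hypothetical 10-frame log with a single 36 ms spike.
log = [16, 17, 16, 15, 18, 16, 36, 17, 16, 15]
p99 = percentile_frame_time(log)  # 36 ms
print(round(1000.0 / p99))        # ~28 FPS equivalent
```

The appeal of this number over a simple average is that it captures the slow outliers a player actually feels, rather than letting a few fast frames paper over them.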
Even with stereo 3D enabled, the AMD config never takes longer than 50 ms to generate a frame. (50 ms/frame works out to a frame rate of 20 FPS and is our tentative threshold for acceptable frame rendering times.) In stereo 3D mode, the GTX 570 spends a little bit of time beyond 50 ms, but not too terribly much in the context of a 90-second test run.
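One plausible way to compute such a “time spent beyond 50 ms” figure—summing only the portion of each slow frame that exceeds the threshold—looks like this in Python (our own sketch; the sample numbers are made up):

```python
def time_beyond(frame_times_ms, threshold_ms=50.0):
    """Total milliseconds spent past the threshold, counting only the
    excess portion of each offending frame."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Hypothetical log: two frames exceed 50 ms, by 15 ms and 30 ms respectively.
log = [16.0, 65.0, 17.0, 80.0, 16.0]
print(time_beyond(log))  # 45.0
```

Counting only the excess means a card that barely misses the threshold on many frames is penalized less than one that blows far past it a few times.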
From a seat-of-the-pants perspective, the Nvidia setup feels less fluid, with intermittent sputtering that compromises the illusion of motion. It’s not less responsive, mind you; we didn’t notice any input delay or anything of the sort. There were, however, occasional skips and rare, subtle oscillations in movement speed and animation.
The performance — Battlefield 3
We benchmarked Battlefield 3 by playing through the start of the Kaffarov mission, right after the player lands. Our 90-second runs involved walking through the woods and getting into a firefight with a group of hostiles, who fired and lobbed grenades at us.
The game was run at the “high” detail preset on both cards, first with stereo 3D disabled, and then with it enabled.
Without stereoscopy enabled, frame times are remarkably consistent, though there are some intermittent spikes on the Nvidia setup toward the end of the run.
Enable stereo 3D, and things go a little crazy—even on the AMD setup. Not only are frame times higher on average, but there’s much more variation, with both configs exhibiting some frame-time spikes. Those spikes are particularly bad on the Nvidia side, where peak frame times approach 80 ms—equivalent to 13 FPS.
This traditional measurement shows us the average performance hit of stereo 3D, without taking spikes and frame-time irregularities into account. Clearly, rendering a 1080p scene twice per frame (once for each eye) is no picnic.
In this game, enabling stereo 3D raises our general frame time threshold quite a bit, especially on the Nvidia GPU. 68 ms per frame, which corresponds to 15 FPS, is too high for a fast-paced action game like this one.
Here, we see how much time each card spends churning out frames that take longer than 50 ms to render. True to what we saw in the plots above, the Nvidia setup is at a considerable disadvantage. Evidently, the impact of stereo 3D isn’t just limited to lower average frame rates (or higher average frame times). Frame-time spikes can be a serious problem. They threaten to damage the illusion of motion and make gameplay choppy.
Here, I’m having a hard time reconciling my gut impression with the recorded data. As I noted earlier, the game was playable and enjoyable on the Nvidia system, especially in multiplayer mode, and I don’t recall gameplay being obviously sputtery. That’s odd, because sputtering was clearly noticeable in Deus Ex despite far fewer frame-time peaks above 50 ms. One possible explanation we’ve kicked around is that the frame-time spikes in BF3 may be related to having lots of particle effects on screen, like when things are exploding. Those aren’t moments when one can move and react especially well anyway, so the slowdowns may be less noticeable. Another possible explanation is that frame-time spikes may somehow feel less disruptive when average frame times are already fairly high—which they were in BF3, but not in Deus Ex.
We’ve learned a couple of things from this little exercise: stereo 3D has come a long way since the early days of 3D Vision, and it still has a way to go.
Inconsistent game compatibility is still a potential issue. In our experience, Nvidia is far and away the winner on that front, but judging by the frame time spikes we measured and the need to tweak settings in Deus Ex, even the best solution on the market isn’t perfect. AMD, meanwhile, suffers from a fragmented hardware ecosystem and patchy support for even major titles. Battlefield 3 still has visual bugs with the latest driver release, and Arkham City lacks proper support of any kind right now. More worryingly, we encountered a rendering bug in Portal 2, a game that’s been out for over nine months and really ought to work without issues at this point.
Another problem lies with the displays and their matching active-shutter glasses. The vast majority of 3D monitors are based on TN panels, which have improved in recent years but still exhibit clearly inferior color reproduction and viewing angles compared to IPS offerings. TN panels have quicker response times, a vital quality for active-shutter 3D configs. However, our testing shows that even Asus’ VG278H monitor, a $700 solution with lightning-quick response times, still produces bothersome ghosting in certain games—and it’s the display Nvidia sends to reviewers. That makes the prospect of slower IPS panels coupled with active-shutter goggles seem unworkable, at least for now.
On top of that, the rapid flickering of active-shutter glasses can cause eye fatigue. One day, after testing, I went to the theater to see Tintin in 3D. I was amazed at how much more comfortable the stereoscopic visuals were, simply because the RealD polarizing glasses didn’t flicker. Even with wide game compatibility, no ghosting, and perfect color reproduction and viewing angles, any stereo 3D setup that relies on active-shutter glasses will be flawed to some degree.
The price and performance issues are worth noting, too. A stereo 3D config requires considerable graphics horsepower. Based on our performance results, a $300+ card seems de rigueur for stereoscopic gaming at 1080p. Add the price of a sufficiently big and bright 120Hz panel with the latest glasses, and you could be looking at a thousand-dollar expense. That’s a lot of money to spend on a capability that doesn’t improve every game substantially, a capability that forces one to trade visual fidelity (in the sense of graphical bells and whistles) for the illusion of depth.
Help is on the way, at least for some of the problems we encountered.
AMD says Microsoft will include native stereo 3D support in DirectX 11.1, which will ship with Windows 8, in the form of a new quad-buffer API. (Incidentally, that API will be incompatible with AMD’s own quad-buffer API.) We don’t know yet if DX11.1 will be back-ported to Windows 7, but in time, game developers should be able to support both Nvidia and AMD stereoscopic setups through the same API calls. That doesn’t mean they will, especially considering the abundance of shoddy console ports on the market today. If devs do take advantage of DX11.1’s quad-buffer API, the stereo 3D experience should be more consistent across different configurations.
In other news, AMD told us about an IPS-based 3D monitor due out this quarter. The display uses line interleaving coupled with passive, polarizing glasses—a design that should offer greater color fidelity without the troublesome flickering associated with active-shutter goggles. AMD says the polarization constrains vertical viewing angles in stereo mode, but horizontal viewing angles are purportedly “outstanding.” Our limited experience with line interleaving makes us wary of other potential tradeoffs, though.
All of this raises one simple question: if you can afford to, should you hop on the stereo 3D bandwagon now?
Personally, I think the negatives still outweigh the positives. On one hand, you’ve got the high cost of entry, flickering, ghosting, a huge performance hit, and compatibility kinks (on the HD3D side). On the other, you’ve got… what? A fleeting “wow” factor that’s there in some games and absent in others? It’s true PC gaming is all about enhancing the experience in subtle ways, but I think stereo 3D requires users to jump through too many hoops for too few enhancements. All things being equal, I would undoubtedly prefer stereo 3D over a lack of it—but things are not equal. Not anywhere close.
If you’re feeling the itch and can afford a good 3D setup, I won’t try to stop you. You’ll certainly enjoy the experience in a few games, and depending on your point of view, that might make the investment worthwhile. I will, however, suggest that you favor 3D Vision 2 until HD3D’s patchy game compatibility is, er, less patchy. There’s really no sense in going through all the trouble of building a stereo gaming rig only to encounter compatibility problems in every other game. Also, whatever you do, don’t skimp on the display; it can make or break the experience.