PC gaming in 3D stereo: 3D Vision 2 vs. HD3D

Stereoscopic 3D has made its way into just about every medium of visual entertainment over the past few years. Most blockbuster films are in 3D nowadays, with movie theaters delighted to charge extra for the privilege—and for the disposable polarizing glasses. 3D televisions from the likes of Sony and Samsung are being sold at Best Buy. Even some game consoles, both set-top and handheld, now offer stereoscopic graphics.

The PC, too, has jumped on this bandwagon, thanks in large part to the efforts of Nvidia and AMD. Though these two companies have different philosophies and largely incompatible implementations, they’ve now been pushing stereo 3D on the PC for years.

Our last in-depth look at stereo 3D gaming was in February 2009, shortly after the debut of Nvidia’s 3D Vision technology. Back then, game compatibility left much to be desired, the performance hit was sizable, and the entry price was awfully steep—$199 for the glasses, $349 for a compatible display, and even more for a graphics card fast enough to do them justice. Our verdict was that, while promising, 3D Vision just wasn’t ready for prime time.

Much has happened since. Prices for both the glasses and compatible displays have fallen. Updated Nvidia glasses, as well as displays based on a new backlight technology, have hit the market. Some monitor vendors are now bundling the Nvidia glasses with their displays. AMD has entered the field with a looser standard called HD3D, which promises many of the same benefits as 3D Vision. Most importantly, the list of supported games has grown—substantially. Both companies now tout stereo compatibility with hundreds of titles, including recent triple-A releases like Battlefield 3.

Things are looking up.

Over the past few weeks, I’ve been tinkering with a pair of stereo 3D setups—one based on 3D Vision 2, the other based on HD3D—to get a sense of the current state of affairs. I was curious to get a feel for not just how well the technology works and how the AMD and Nvidia solutions compare, but also whether stereoscopy is a worthwhile addition to the PC gaming experience.

Before we get to the big questions, we should start by explaining what 3D Vision 2 and HD3D entail.

Stereoscopic 3D glasses: Nvidia (3D Vision 2) on the left, Samsung (HD3D Gold) on the right.

Nvidia’s GeForce 3D Vision 2

With 3D Vision, Nvidia was the first GPU maker to really push stereoscopic 3D gaming on the PC in a big way. The latest version of this technology still involves the same three basic components: glasses, displays, and software.

Nvidia’s 3D goggles are bundled with a number of compatible monitors and laptops. They’re also available on their own for $149.99 with a wireless receiver in the box, or $99.99 for the glasses only, if your display or laptop already has a receiver built in. That’s not cheap, but these are much more sophisticated than the simple polarizing glasses passed out at the movies. In both cases, the idea is to show each eye a different image, tricking the brain into perceiving depth. Nvidia’s active-shutter goggles do so by opening and closing their shutters 120 times each second. (I understand the process involves a liquid-crystal layer in each lens, which goes dark when an electric current is applied.) Compatible displays spit out images for the left and right eye in rapid succession, also at 120Hz. All the glasses have to do is shield one eye when the display is rendering an image intended for the other, and vice versa—presto, there’s your illusion of depth.

The displays and goggles have to be synchronized, which is where that wireless receiver comes in. It uses infrared signals, just like a TV remote. Also, since the glasses have to sync up and flicker their shutters many times each second, they require power. The Nvidia glasses have a battery built into the frame; you’ll just need to charge them via the included micro-USB cable every once in a while.

Nvidia says the 3D Vision ecosystem includes more than 20 compatible displays. There are a number of compatible laptops and projectors, too, but desktop monitors are what concern us today. Companies like Asus, Acer, BenQ, and ViewSonic all sell 3D Vision monitors, which typically have 1080p resolutions, panel sizes in the 23″-27″ range, and prices upward of $300. In most cases, dual-link DVI is the input of choice for stereo 3D. Some monitors have HDMI 1.4 support, too, but that interface limits the refresh rate of 1080p 3D images to 30Hz—fine for movies, but not so good for fast-paced action games. A small minority of 3D Vision panels (only a couple of models from BenQ, according to Nvidia) employ DisplayPort for stereo 3D.

Last October, Nvidia announced 3D Vision 2, an update to the hardware side of its stereo 3D implementation. 3D Vision 2 introduced new glasses that are thinner and more flexible and have 20% larger lenses than their predecessors. Also, 3D Vision 2 included a display technology called LightBoost, which strives to compensate for the dimming effect of stereo 3D glasses.

Without LightBoost, 3D Vision-compatible panels leave their backlights on constantly. To make sure image persistence, or ghosting, doesn’t interfere with the illusion of depth, both shutters in the glasses close while the display changes frames. So, you get a pattern that looks like this: display renders left frame, left shutter opens and then closes again; display renders right frame, right shutter opens and then closes again; rinse, repeat. The default state of the shutters, in other words, is closed.

On a LightBoost display, the shutters are open by default, and the backlight takes over ghosting-prevention duties by switching itself off while the frames change. What’s the point of transferring duties in this fashion, you ask? First, it allows the shutters to stay open longer and to close only to make sure each eye sees the right image. That arrangement makes images viewed through the glasses appear brighter. Second, because the backlight doesn’t have to be on all the time in stereo 3D mode, the display can get away with running the backlight at a higher intensity when it is on, without going beyond the monitor’s power spec. The result: an even brighter picture.
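
If the interplay of shutters and backlight is hard to picture, the back-of-the-envelope sketch below may help. Every timing figure in it is an invented round number chosen to make the arithmetic legible; none of them are Nvidia's actual specifications.

```python
# Illustrative comparison of per-eye light throughput for the two
# sequencing schemes described above. All timing figures here are
# made-up round numbers for the sake of the arithmetic, not Nvidia's
# actual specifications.

REFRESH_HZ = 120                     # the display alternates eyes at 120Hz
FRAME_MS = 1000 / REFRESH_HZ         # ~8.33 ms per eye image

# Conventional scheme: backlight always on, shutters closed by default,
# each one opened only briefly once the panel has settled on its frame.
SHUTTER_OPEN_MS = 3.0                # assumed open window per frame
conventional_duty = SHUTTER_OPEN_MS / FRAME_MS

# LightBoost-style scheme: shutters open by default, backlight strobed
# off while the panel transitions between frames. Because the backlight
# is off part of the time, it can also be driven harder when it is on.
BACKLIGHT_ON_MS = 5.5                # assumed strobe window per frame
BOOST_FACTOR = 1.3                   # assumed extra intensity while on
lightboost_duty = (BACKLIGHT_ON_MS / FRAME_MS) * BOOST_FACTOR

print(f"Conventional per-eye light throughput: {conventional_duty:.0%}")
print(f"LightBoost-style per-eye throughput:   {lightboost_duty:.0%}")
# With these made-up numbers, the strobed scheme delivers more than
# twice the light per eye, which is the brightening described above.
```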

The concept of LightBoost isn’t exclusive to 3D Vision 2 monitors (we’ll get to that in a minute), but it does nicely address one of the common complaints about active-shutter stereo 3D implementations.

The last piece of the puzzle is software. Stereo 3D support is built into the GeForce drivers, and Nvidia says it can “convert existing games in real time (on the fly) into 3D.” On top of that, the company allows developers to “create their own native 3D games that also work with our entire 3D Vision ecosystem of products.” The result is a list of 3D Vision-compatible games with over 650 entries. To the user, 3D Vision’s software implementation is remarkably seamless and consistent. You’ll first want to head to the Nvidia control panel to enable stereo 3D, like so:

Once that’s done, a text overlay should come up at the bottom right of the screen whenever you launch a compatible 3D game. The overlay will say how well the game is supported and whether you need to tweak any settings to get the best experience. From there, enabling stereoscopy is as simple as hitting Ctrl-T. You can adjust the image depth by hitting Ctrl-F3 and Ctrl-F4, bring up a “laser sight” with another hotkey (in case the game’s built-in crosshair doesn’t play nice with 3D Vision), and tweak a handful of other settings with other, more esoteric shortcuts. They’re all detailed in the control panel.

There’s more to 3D Vision, including support for 3D televisions over HDMI 1.4, which requires special software. If you have multiple 1080p monitors, 3D Vision Surround will spread stereo 3D goodness over a trio of displays. Those features lie beyond the scope of this article, though; to make things manageable, we’re focusing on single-monitor desktop PC gaming.

Nvidia’s GeForce 3D Vision 2 — continued

So, how do you put together a 3D Vision 2-infused gaming PC? The first step is finding a good display, preferably with LightBoost. (Nvidia’s 3D Vision system requirements page is a good place to start your search.) Check to see whether the display comes with the Nvidia glasses or not. If it doesn’t, you’ll need to buy a glasses-and-receiver kit separately. You’ll also want a fast graphics card, because remember, you’ll be asking it to render twice as many frames each second—one for each eye—when stereo 3D is enabled. I won’t spoil our performance results early, but I’d advise against going with anything much slower than a GeForce GTX 560 Ti 448 or GeForce GTX 570 if you intend to crank up the eye candy at 1080p.

The hardest part, I suppose, is balancing all of those ingredients if you’re on a budget. Right now, the only Nvidia-approved monitors available at Newegg are based on 1080p panels, making it difficult to cheap out on the GPU front without running games at lower than the display’s native resolution. You could choose one of the more affordable 3D Vision panels, but be careful. At $330, Acer’s GD235HZbid might seem like a better deal than Asus’ VG236H, which has the same panel size and a $440 price tag. However, if you read the fine print, you’ll see the Asus display comes with the Nvidia goggles and the Acer does not. Add the $150 3D Vision kit, and the Acer display’s total cost goes up to $480.

Picking out your ideal 3D Vision setup is probably going to involve hours of careful research and price comparisons. You’ll want to look at the pros and cons of each display and check out a few reviews. You’ll also want to keep in mind that 3D Vision displays use TN panels, which might make them less suitable for tasks requiring high color accuracy. Perhaps you’ll find yourself compelled to keep a high-fidelity display (likely based on an IPS panel) for, say, photo editing, and complement it with a 3D Vision panel for stereoscopic gaming.

Once you have all the components selected, configuring and using your 3D Vision setup for stereo 3D gaming should be fairly straightforward.

AMD’s HD3D

Nvidia has made 3D Vision a tightly integrated offering, while AMD’s HD3D is a looser approach that encompasses a number of different solutions to the same problems. HD3D also hasn’t been around as long. Initial driver support arrived in October 2010 for Radeon HD 5000-series and newer graphics cards.

AMD doesn’t sell its own stereoscopic glasses, so makers of displays, TVs, and projectors must take over that responsibility. In practice, that means a gaggle of different glasses offerings, most compatible only with one display brand. Also, because AMD doesn’t impose a certain panel-and-glasses technology on its partners, sequential-frame designs with active-shutter goggles (a la 3D Vision) aren’t the only game in town. A small number of HD3D-compatible displays have line-interleave designs and passive, polarized 3D glasses. As I understand it, line-interleave displays work by applying different polarization to odd and even lines on the display, with each lens on the glasses filtering out one set of lines.
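
Here's a minimal sketch of how a line-interleaved frame might be assembled, going by the odd/even row split described above. The arrays and the helper function are our own stand-ins, not anything from a real display driver.

```python
# Minimal sketch of line-interleaved stereo packing: even rows carry one
# eye's image and odd rows the other's. The display polarizes the two
# row sets oppositely, and each lens of the passive glasses passes only
# its own set, so each eye sees half the vertical resolution.
import numpy as np

def interleave(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack two full frames into a single line-interleaved frame."""
    assert left.shape == right.shape
    out = left.copy()
    out[1::2] = right[1::2]          # odd rows come from the right view
    return out

# Toy 4x4 "frames": zeros for the left eye, ones for the right.
left = np.zeros((4, 4), dtype=np.uint8)
right = np.ones((4, 4), dtype=np.uint8)
print(interleave(left, right))       # rows alternate 0s and 1s
```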

The presence of different stereo 3D implementations and incompatible glasses means there’s a certain degree of inconsistency in the HD3D experience—more so than with 3D Vision. Last year, AMD took a step toward addressing the issue by rolling out an HD3D certification program. The program has two tiers: silver and gold. To receive HD3D gold certification, hardware makers must submit their products to AMD for testing. Silver certification can be obtained by doing testing in-house and submitting the results to AMD. In either case, certified displays must meet certain minimum requirements, support existing 3D middleware and Blu-ray 3D movies, and have printed documentation that is “clear and easy to follow.” Just like Nvidia, AMD supports stereoscopic displays with dual-link DVI, DisplayPort, or HDMI 1.4 inputs.

Right now, AMD’s list of recommended stereo 3D monitors is only seven entries long. All of the recommended displays are based on either 23″ or 27″ panels, and four of ’em are from Samsung. Prices start at a scant $269.99 for LG’s 23″ line-interleave monitor with bundled passive glasses, and they range up to $699.99 for Samsung’s 27″ S27A950D with active-shutter goggles. A PDF file on AMD’s website lists three more HD3D-compatible displays from ViewSonic and Zalman, but those seem to be older and harder to find.

Since HD3D isn’t tied down to a particular kind of panel or glasses, some of you may be thinking about using 3D Vision hardware. Bzzt. Wrong. AMD tells us 3D Vision gear is “bound by license” to Nvidia graphics cards. Even if AMD wanted otherwise, 3D Vision hardware won’t work with Radeons. Too bad.

Things are also fragmented on the software front, where two competing middleware vendors, DDD and iZ3D, provide the software glue that lets most compatible games run in stereo 3D mode on AMD hardware. DDD’s TriDef seems to be the most popular, the most frequently updated, and the one with support for the most titles—468, by DDD’s count, with user-submitted profiles for “over 40” additional games also available on the TriDef forums. TriDef costs $49.99 on its own, but some monitor vendors bundle it with their displays. iZ3D, meanwhile, sells for $39.99.

Unlike Nvidia’s stereo implementation, TriDef requires users to start games through its launcher app. In-game operation is similar, though, with keyboard shortcuts bound to certain functions. On the numpad, hitting the asterisk key will enable or disable stereo 3D, pressing + or – will adjust the depth, and punching 0 will bring up an on-screen display with other settings. Users can also hit a hotkey to enable Virtual 3D, a “method of improving performance by using information from the Z-buffer in the DirectX graphics pipeline.” Virtual 3D may work in games that lack a proper TriDef profile, but DDD cautions that it can cause visual distortion around object edges.
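
DDD doesn't document exactly how Virtual 3D works, but depth-buffer reprojection schemes generally follow the pattern sketched below. Treat it as a hypothetical illustration of why object edges distort, not as DDD's actual code; every name in it is ours.

```python
# A generic depth-image-based rendering sketch, not DDD's algorithm
# (which isn't public). The idea behind Z-buffer tricks like Virtual 3D:
# render the scene once, then shift each pixel horizontally by a
# disparity derived from its depth to synthesize the other eye's view.
# Pixels uncovered by the shift have no data, which is where the edge
# distortion DDD warns about comes from.
import numpy as np

def synthesize_second_view(color: np.ndarray, depth: np.ndarray,
                           max_disparity_px: int = 8) -> np.ndarray:
    """Shift pixels left by a depth-dependent amount; nearer shifts more."""
    h, w = depth.shape
    out = np.zeros_like(color)       # unfilled pixels stay black: holes
    # Depth 1.0 = far plane (no shift); depth 0.0 = near plane (max shift).
    disparity = ((1.0 - depth) * max_disparity_px).astype(int)
    for y in range(h):
        for x in range(w):           # a real version resolves overlaps by depth
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                out[y, nx] = color[y, x]
    return out

# Toy frame: a bright near square (depth 0.2) over a dim far background.
color = np.full((16, 16), 50, dtype=np.uint8)
depth = np.ones((16, 16))
color[4:12, 4:12] = 200
depth[4:12, 4:12] = 0.2
second = synthesize_second_view(color, depth)
# The square lands six pixels to the left in the new view; the strip it
# vacated is a hole that a real implementation would have to fill in.
```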

It’s worth noting that both TriDef and iZ3D also support Nvidia hardware, and both existed long before 3D Vision or HD3D. The history page on DDD’s website says that, way back in 2003, the firm “licensed its TriDef 3D suite of software to Sharp for deployment on its revolutionary 3D laptop PC – the Sharp Actius RD3D.” Wikipedia tells us Neurok Optics, which became iZ3D through a joint venture with Chi Mei Optoelectronics in 2007, used to sell its own stereo 3D monitors as far back as 2006. The gist is that, unlike 3D Vision, AMD’s initial HD3D push appears to have been more of a low-budget effort based on pre-existing products from third-party vendors.

AMD seems to be moving away from that approach, at least in part. It’s now pushing its own quad-buffer software API, which allows game developers to implement native 3D support for AMD GPUs without the need for third-party middleware. Several blockbuster titles, including Battlefield 3, DiRT 3, and Deus Ex: Human Revolution, already use this API. Because AMD’s Catalyst Control Center lacks a 3D configuration pane, however, games that implement the quad-buffer API must expose stereo 3D settings (typically an on/off switch and depth sliders) through their own configuration menus.
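
We can't reproduce AMD's API here, but the general shape of native stereo rendering is simple enough to sketch. Everything in the snippet below (the Camera type, the render_scene() stub, the buffer names) is a hypothetical stand-in for whatever the real API exposes.

```python
# Structural sketch of native quad-buffer stereo rendering: the general
# pattern, not AMD's actual API. The driver exposes separate left and
# right back buffers; the game derives a camera per eye from the
# user-facing separation/convergence settings and renders twice per frame.
from dataclasses import dataclass

@dataclass
class Camera:
    x: float
    y: float
    z: float
    convergence: float               # distance to the zero-parallax plane

def eye_cameras(center: Camera, separation: float):
    """Offset the mono camera half the eye separation to each side."""
    half = separation / 2.0
    left = Camera(center.x - half, center.y, center.z, center.convergence)
    right = Camera(center.x + half, center.y, center.z, center.convergence)
    # A real renderer would also skew each eye's projection matrix so the
    # two frusta cross at the convergence distance (off-axis projection).
    return left, right

def render_scene(cam: Camera, target: str) -> None:
    # Stub standing in for a real draw call into the named back buffer.
    print(f"rendering into {target} from eye at x = {cam.x:+.4f}")

def draw_stereo_frame(center: Camera, separation: float) -> None:
    left_cam, right_cam = eye_cameras(center, separation)
    render_scene(left_cam, target="back_left")
    render_scene(right_cam, target="back_right")
    # A present/swap call would then flip both buffers in sync with the
    # display, which alternates them at 120Hz for the shutter glasses.

draw_stereo_frame(Camera(0.0, 1.7, 0.0, convergence=5.0), separation=0.065)
```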

Like 3D Vision 2, HD3D supports Blu-ray 3D playback and, since the release of the Catalyst 12.1 driver, 3D gaming across triple-monitor configs. Again, though, we’ll be restricting our focus to single-monitor PC gaming for this article.

Picking out an HD3D setup involves the same balancing act as on the Nvidia side. The small selection of AMD-recommended panels does make things a tad more straightforward, as does the fact that all the solutions seem to come with glasses in the box. That said, you’ll have to decide whether you favor line-interleave or active-shutter stereo implementations. We’ve only had a chance to test out active-shutter designs, so we unfortunately can’t provide much assistance on that front. You may also want to check if your chosen monitor comes with TriDef or iZ3D software—that could save you the expense of having to purchase the software separately. The 23″ Samsung monitor AMD sent us comes with TriDef.

On the GPU side of things, AMD’s Radeon HD 6970 should be a good starting point if you want smooth, responsive gameplay at 1080p with the eye candy cranked up. AMD’s new Radeon HD 7900-series cards are also worth a look if you can afford them, and they should offer substantially higher performance. See our review of the Radeon HD 7970 for more details.

Our test hardware

The 3D Vision 2 and HD3D displays we used for testing aren’t strictly comparable—one is larger and more expensive than the other. Sorry about that. The thing is, these are the 3D panels AMD and Nvidia are sending out to testers right now. Perhaps Nvidia wants to wow reviewers, while AMD is happier to highlight the existence of more budget-conscious solutions. Either way, we’ll try not to penalize the AMD solution unfairly because our 3D Vision display is nicer. Keep in mind that 27-inch 3D monitors are available on the HD3D side, and that cheaper, smaller panels are available in the 3D Vision camp.

Asus’ VG278H fills in as our 3D Vision 2 monitor. This is a fairly high-end offering, with a $699.99 price tag and a 27″ 1080p panel. According to Asus, the VG278H has a 300 cd/m² luminance rating, a 50,000,000:1 contrast ratio, a 2-ms response time, and LightBoost backlighting technology. It also features an assortment of VGA, DVI, and HDMI inputs, and it can be adjusted vertically and tilted. Happily, Asus opted for a matte finish on the panel.

The VG278H is bundled with a set of 3D Vision 2 goggles from Nvidia. The goggles interface with an IR emitter built into the top of the monitor, where you might normally see a webcam, so there’s no need for an auxiliary base station. The IR emitter is mounted on a hinge and can be tilted up and down (but not left or right) to accommodate the user’s sitting position.

For our HD3D testing, we’re using the Samsung S23A750D, a more affordable offering with a $349.99 asking price, a 23″ 1080p panel, and no height adjustment functionality. (The stand offers only tilt adjustment.) Samsung touts a 1,000,000:1 contrast ratio, 250 cd/m² luminance, and a 2-ms response time. Judging by what AMD told us, this monitor has an LED backlight pulsing technology very similar to LightBoost, although it’s not called by that name. You’ll find DisplayPort and HDMI inputs around the back, Samsung’s active-shutter glasses in the box, and DDD’s TriDef software on the driver CD.

Oh, and this display is stamped with AMD’s HD3D gold seal of approval.

While the Asus offering looks the same as any other PC monitor, Samsung’s S23A750D is a strange beast. It’s quite literally glossy all over, even on the back, and it features touch controls laid out on a wedge-shaped, cylindrical base. There’s no secondary display, mind you; the controls bring up an OSD on the main panel, just like you’d expect. It’s a neat concept, but the execution could use some work. On one hand, the controls are easier to get to than if they were laid out along the edge of the panel. On the other, I found myself occasionally having to tap the buttons multiple times to get a response.

I’m not a fan of the glossy finish, either. You’re guaranteed to leave an ugly fingerprint anytime you touch this monitor, whether it’s to adjust the tilt or to work the OSD controls. Also, I’m somewhat concerned that the reflective coating on the panel could be detrimental to stereoscopic image quality. Some folks have argued that stereo 3D confuses the brain by forcing the eyes to converge on one plane and focus on another. With its glossy coating, the Samsung display serves up artificial depth on a flat plane, while at the same time reflecting true depth, on which the eyes can converge and focus normally.

One last thing to note: while the 3D Vision 2 goggles bundled with the Asus display can be recharged via micro-USB, Samsung’s 3D goggles have a small compartment in the frame that accommodates a CR2025 battery, which you’ll have to replace once it’s depleted. Both displays have built-in receivers, though.

To drive these monitors, we selected two competing graphics cards:

Asus’ take on the GeForce GTX 570, the ENGTX570 DCII/2DIS/1280MD5, will be hooked up to our 3D Vision 2 panel. This card sells for $349.99 (or $329.99 after a mail-in rebate) and features a custom, triple-slot DirectCU II cooler with dual 80-mm fans. Asus clocks the card at Nvidia’s prescribed speeds of 742MHz for the GPU core and an effective 3.8GHz for the memory. Two DVI ports, one DisplayPort, and one HDMI output line the rear cluster.

For our HD3D setup, we’re using Asus’ Radeon HD 6970 card, the EAH6970 DCII/2DI4S/2G. This card also has a dual-fan, triple-slot DirectCU II cooler and stock clock speeds, but it sports a smaller vent and more display outputs than the Nvidia offering. (There’s a total of two DVI ports and four DisplayPort, er, ports.) This puppy used to be sold for $359.99 after rebate at Newegg, but it seems to have been deactivated. The card is out of stock at other e-tailers, too.

To be clear, we’ll be running our tests with only one of each of these two cards. We have several good reasons for that choice, most notably the microstuttering issues associated with multi-GPU configs. We want to rule those out when testing 3D performance in stereo. We also believe that, for 3D Vision 2 and HD3D to be appealing to the masses, they ought not be bound to pricey multi-GPU setups and their associated caveats. Using high-end, single-GPU cards is a good way to address the performance requirements of real-time stereoscopic 3D without going overboard.

The experience

Because of the differences between our displays, we’ll first address things like compatibility and playability to get a feel for each solution’s software support. Then, we’ll talk about the image quality on each display with the bundled glasses. Finally, we’ll discuss whether stereo 3D—in either implementation—provides a compelling step up from standard 2D graphics in each game.

We’re going to kick things off with a game AMD and Nvidia have named as a poster child for 3D support: EA DICE’s Battlefield 3. Next, we’ll move on to games recommended by each company (but not both). That should give us an idea of how much overlap exists in game support between the two solutions.

Battlefield 3

At 1080p with the “high” detail preset, Battlefield 3 looked good and ran fairly smoothly on both setups. However, unlike the 3D Vision setup, the HD3D rig exhibited a few kinks—even with AMD’s new Catalyst 12.1a driver. Sometimes, overlays like blood splatters and dust particles, which are meant to look like they’re stuck to the screen, would only be visible to one eye. I saw the same problem in the single-player campaign. At the start of one mission, when you slowly wake up after an earthquake, an overlay of the character’s eyelids opening and closing suffered from the same bug. At certain points, one eye would be getting a totally dark image, while the other would see the full 3D scene. Disconcerting. Seeing dust and blood splatters through only one eye seemed to cause eye strain over prolonged multiplayer sessions, too.

What about the displays and glasses?

I actually tried Battlefield 3 on the Samsung panel with the Radeon first, and I must confess to being sorely disappointed. The image was too dark, details were too small, and trying to sit closer to the display resulted in serious eye strain. (Turns out Samsung’s documentation recommends sitting no closer than 20″ from the display in 3D mode.) A large part of being a skilled BF3 player involves spotting camouflaged enemies in hard-to-see places—grass, bushes, behind rocks, and the like. The dimness of the Samsung setup with stereo 3D enabled was a handicap, and it got me blindsided by enemies I really should have been able to spot.

To make matters worse, the Samsung panel had some nasty ghosting going on. With each eye, I could see a faint outline of the image intended for the other eye. Switching the glasses off and on again to re-sync them didn’t help. Somehow, though, cycling display inputs through the monitor’s OSD reduced ghosting in a noticeable way. Go figure.

After that, switching to the Asus monitor with the GeForce was like night and day. Not only was the image bigger, which helped with both the immersion and the spotting of bad guys, but it was also brighter and closer to the level of contrast one might expect from a 2D panel. Best of all, I saw little to no ghosting and experienced almost no eye strain. My only beef was that, sometimes, I caught reflections of the game action in the glossy frame of the Nvidia goggles. Really, Nvidia? Why would you make that glossy, of all things?

In any case, I got sucked into the action and ended up losing track of time playing BF3 on the 3D Vision rig. It was delightful. Bullets whizzed at me menacingly, dogfights gained a whole other dimension, and crawling through grass and bushes was suddenly a whole lot more realistic. No doubt about it, the 3D Vision 2 rig made the game more fun and visually engrossing than my personal, non-stereoscopic gaming setup.

Batman: Arkham City

The latest entry in Rocksteady Studios’ Batman series is featured prominently in 3D Vision 2’s official compatibility list. True to Nvidia’s promise, the game was smooth and exhibited no visual bugs that I noticed in 3D Vision mode. (To keep frame times low, I played at 1080p using high detail levels, but I left DirectX 11 features disabled.) I’m afraid I can’t say as much for HD3D, which lacks proper stereoscopic support for the game altogether. Arkham City doesn’t use AMD’s quad-buffer API, and it doesn’t have a complete HD3D profile in TriDef. Trying to use TriDef’s profile-creation feature resulted in awful artifacting. The Virtual 3D option provided the best approximation of what I saw on the Nvidia config, but the distortion around objects and characters was bothersome.

While I couldn’t stomach much of Arkham City in HD3D, I spent some time playing on the 3D Vision 2 setup. As in BF3, enabling stereo 3D added to the experience. Arkham City‘s streets are dark and visually noisy, which can make the game feel a little uniform without the illusion of depth. (In fact, I think one can say the same thing for all too many Unreal Engine 3 games—there’s something about UE3 and gritty environments, isn’t there?) Adding depth made Arkham City‘s game world pop, causing level geometry, objects, and characters to stand out, to gain more visual separation from one another. Believe it or not, I got the feeling playing in stereo 3D made the levels easier to navigate.

All in all, though, stereo 3D wowed me less in Arkham City than it did in Battlefield 3. Perhaps that’s because Arkham City is a third-person game that doesn’t throw objects and bullets directly at the player’s face. Then again, its masterfully voice-acted cut scenes did look great with a dash of stereoscopy. Hmm.

Deus Ex: Human Revolution

Deus Ex: Human Revolution is one of those rare games where AMD’s Gaming Evolved logo pops up amid the other unskippable animations at the start. The game employs AMD’s quad-buffer API, which made TriDef unnecessary on our HD3D setup. There, the game ran well and looked essentially perfect with all the detail settings turned up. On the 3D Vision 2 config, the Nvidia overlay warned us to turn the shadow detail down from “soft” to “normal.” We found that playing with ambient occlusion at its highest setting caused some choppiness, as well. There were no major problems once we turned down those two settings, although we still noticed some strange jumpiness when circle-strafing. We’ll have some DE:HR benchmarks in a minute, so stay tuned.

Comparing the Asus and Samsung displays, I’d say their image quality was largely similar. The Samsung did produce more ghosting, which made my eyes strain a little more, but cycling the input source seemed to lessen the problem just like it did in BF3.

With all that said, I don’t think DE:HR is a good poster child for the benefits of stereoscopy. The flat textures, simple level geometry, and overwhelmingly dark environments looked just as flat in stereo 3D mode as they did without pseudo depth. I tried fiddling with the depth adjustments for 3D Vision 2 and HD3D, but instead of making the game look significantly better, adding depth made playing uncomfortable. DE:HR can be incredibly engrossing, and I expected stereo 3D to make it more immersive. It wasn’t to be.

Portal 2

The folks at AMD we spoke to about stereo 3D suggested trying a Source Engine game. Since my Team Fortress 2 skills leave a lot to be desired, I ran Valve’s latest commercial Source title, Portal 2. The game was buttery smooth on both the 3D Vision 2 and HD3D setups with everything cranked up at 1080p. Enabling stereo 3D on the Nvidia rig involved the same maneuver as with the other titles—hitting Ctrl-T once in-game. On the AMD system, I had to use the TriDef launcher to start the game. The two solutions produced essentially identical image quality… with one small exception: on the HD3D setup, graffiti in the game’s secret areas was only visible through one eye, and it moved with the camera. Although not a show-stopping issue by any stretch, this flaw did break the immersion somewhat.

The Asus and the Samsung displays were both flattering to the game. The Samsung solution’s lower brightness wasn’t the handicap it had been in BF3, nor was it particularly noticeable. However, both panels exhibited noticeable ghosting in Portal 2, especially around the bright and colorful graphics placed near buttons, doors, and other objects of note. I was especially disappointed to see ghosting on the Asus panel, which had done a better job of keeping image persistence in check up to that point.

Despite these visual anomalies, playing Portal 2 in stereo 3D was a ton of fun. The game’s relatively simple, highly stylized levels really popped, and stereoscopy made looking through portals a whole other experience, especially when the blue and orange portals weren’t on the same axis. Vertigo, ahoy! I don’t know if the effect warrants spending hundreds of dollars on stereo 3D gear, but it was undeniably cool. Both the HD3D and the 3D Vision 2 configs seemed to go a little overboard with stereoscopic depth initially, which caused some visual discomfort. I ended up reducing the stereo depth using keyboard shortcuts.

Trine 2

Our last test subject, Trine 2, came highly recommended by Nvidia. I was curious to try the game, having enjoyed the original quite a bit when it came out a few years back. For those unfamiliar with the series, Trine and Trine 2 are action/puzzle side-scrollers with snazzy, colorful graphics. In both games, getting through levels involves switching between three characters on the fly and using their unique abilities to solve puzzles and to fight bad guys.

With our Nvidia setup, the storybook-style title screens and in-game graphics rendered perfectly in three dimensions. HD3D lacks proper support for Trine 2, but TriDef’s Virtual 3D mode was a surprisingly good consolation prize. The distortion around objects wasn’t nearly as obvious as it had been in Arkham City. Although the title screens only rendered in 2D, the in-game action looked fine, with just the right amount of depth. The game played smoothly at 1080p, as well, just like on the 3D Vision config.

Unfortunately, Trine 2 has a lot of high-contrast graphics, which led to noticeable ghosting and, in turn, eye strain on both displays.

I actually had to cut my play-testing short, even on the 3D Vision 2 setup. After a while, the game just gave me a headache. That’s really a shame, because stereo 3D was surprisingly effective in Trine 2. Without the illusion of depth, I tended to focus more on the character. In stereo 3D, the backdrop and foreground really came to life, which made the game world more… well, three-dimensional. The effect wasn’t breathtaking in every scene, but it did wow me on a regular basis as I played through the first few levels.

In a nutshell

Before we move on to hard performance numbers, charts, and graphs, let’s boil down what we’ve learned so far to a few key observations.

I think it’s safe to say 3D Vision 2 has a serious edge over HD3D in terms of game compatibility. None of the titles I played on the Nvidia setup exhibited any visual bugs in stereo 3D mode. All of the games actually worked, even if I had to tweak the settings in Deus Ex: Human Revolution. Meanwhile, the HD3D setup lacked proper support for two of the games I ran, and it exhibited visual artifacts in two others—Battlefield 3 and Portal 2. Only Deus Ex, an AMD “Gaming Evolved” release, ran in stereo mode without noticeable kinks.

Based on my experience, the combination of Asus’ VG278H display and 3D Vision 2 glasses delivers a better visual experience than Samsung’s S23A750D with its bundled glasses. Until we cycled through its display inputs, the HD3D Samsung panel had serious ghosting problems in every game we tried. Even without the ghosting, the display was too dark to make Battlefield 3 multiplayer an appealing proposition in stereo 3D mode. The Asus display and Nvidia spectacles offered substantially brighter images and caused less eye strain. Yes, the Asus panel costs more ($700 vs. $350 for the Samsung) and has a higher luminance rating (300 cd/m² vs. 250 cd/m²), so the comparison isn’t entirely fair. Nevertheless, using the two displays side by side demonstrated just how vital image brightness is to a stereo 3D setup. You can save several hundred bucks by going with a monitor that has a weaker backlight, but doing so could seriously damage the stereoscopic experience and make gameplay punitive rather than engrossing.

A few words must also be said about the benefits of stereo 3D over traditional 3D gaming. In some games, like BF3 and Portal 2, the added immersiveness and “wow” factor were undeniable. In others, like Deus Ex, stereo 3D didn’t really do much. And in Trine 2, the experience was mixed—the game looked beautiful, but ghosting and eye strain made it painful to play for too long. Our selection of titles admittedly wasn’t exhaustive, but our experience shows the addition of stereo 3D doesn’t automatically make things better. Considering the high cost of entry and the sacrifices that must be made (namely, turning down graphical detail in order to keep frame times low in stereo 3D mode), prospective users should carefully consider just how much they have to gain from a stereo setup.

Speaking of sacrifices, I’d like to throw in one final complaint. I wear prescription glasses. Their frames are made of titanium, and the titanium is encased in thick rubber on the temples. Unfortunately, that design didn’t play well with 3D goggles when I tried to wear headphones—the pressure against my temples quickly became uncomfortable. Many of you not blessed with 20/20 vision likely wear contacts, but those who don’t should be aware that active-shutter glasses are relatively chunky.

Let’s now look, in detail, at the performance impact of stereo 3D on both of our setups.

Our testing methods

Our test system was configured as follows:

Processor: Intel Core i5-750
Motherboard: Asus P7P55D
North bridge: Intel P55 Express
South bridge:
Memory size: 4GB (2 DIMMs)
Memory type: Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.0.1025, Rapid Storage Technology 10.1.0.1008
Audio: Integrated Via VT1828S with 6.0.1.8700 drivers
Graphics: Asus Radeon HD 6970 (EAH6970 DCII/2DI4S/2G) with Catalyst 12.1a drivers
          Asus GeForce GTX 570 (ENGTX570 DCII/2DIS/1280MD5) with GeForce 290.53 beta drivers
Power supply: Corsair HX750W
OS: Windows 7 Ultimate x64 Edition, Service Pack 1

Thanks to AMD, Asus, Intel, Corsair, Kingston, and Western Digital for helping to outfit our test rigs.

We used the following applications for our empirical testing:

As ever, we did our best to deliver clean benchmark numbers. We used the Fraps utility to record frame rates while playing 90-second sequences through each level we tested. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence at least five times per video card in order to counteract any variability. Vertical refresh sync (vsync) was disabled for all empirical tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

The performance — Deus Ex: Human Revolution

We tested Deus Ex: Human Revolution by playing through the start of the game’s first real mission, at the Sarif Manufacturing plant. We mostly shot our way through, but some sneaking took place, as well.

Based on what we saw in our subjective testing, we decided to run the game at 1080p with high FXAA antialiasing and all of the image quality options cranked up—save for shadows and ambient occlusion, which were set to “normal.”

Enabling stereo 3D causes not just a rise in frame times, but also an increase in frame-time variability. The inconsistency is visible through most of the run on the AMD config, where frame times jump quickly and repeatedly from under 20 ms to 25-30 ms, and then back under 20 ms. Such variation in numbers that low isn’t a huge concern, though, since they’d translate to frame rates around 30 FPS or better. The Nvidia card produces higher frame-time peaks, which have the potential to be more noticeable.

When we look at average frame rates, which paint an incomplete picture of overall smoothness, the Nvidia GPU seems to take less of a hit from stereo 3D than its AMD rival. However, the two configs have equivalent average frame rates in stereo mode.

This metric gives us a sense of overall frame latencies; it tells us the threshold below which 99% of the frames were rendered by each setup. Even in stereo 3D mode, frame times were reasonably short for both the Radeon and the GeForce. (The GeForce’s threshold of 36 ms per frame corresponds to around 28 FPS.) In a game like Deus Ex, which favors slow sneaking over twitchy action, that seems to be fast enough to ensure largely smooth gameplay—despite the frame-time spikes.

Even with stereo 3D enabled, the AMD config never takes longer than 50 ms to generate a frame. (50 ms/frame works out to a frame rate of 20 FPS and is our tentative threshold for acceptable frame rendering times.) In stereo 3D mode, the GTX 570 spends a little bit of time beyond 50 ms, but not too terribly much in the context of a 90-second test run.
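
For those wondering how we arrive at these numbers, the sketch below shows one way to compute all three metrics (average FPS, the 99th-percentile frame time, and time spent beyond 50 ms) from a Fraps-style list of per-frame render times. The sample data is invented purely for illustration.

```python
# One way to compute the three metrics used on these pages from a
# Fraps-style list of per-frame render times, in milliseconds.
def analyze(frame_times_ms, threshold_ms=50.0):
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s         # 1000 / avg frame time
    # 99th-percentile frame time: 99% of frames completed at least this fast.
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    # Time spent beyond the threshold: count only the portion of each slow
    # frame past 50 ms, the excess that hurts the illusion of motion.
    beyond = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
    return avg_fps, p99, beyond

times = [16, 17, 18, 16, 55, 17, 72, 18, 16, 17] * 50   # 500 fake frames
fps, p99, beyond = analyze(times)
print(f"average: {fps:.1f} FPS")
print(f"99th percentile: {p99:.0f} ms")
print(f"time beyond 50 ms: {beyond:.0f} ms")
```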

From a seat-of-the-pants perspective, the Nvidia setup feels less fluid, with intermittent sputtering that compromises the illusion of motion. It’s not less responsive, mind you; we didn’t notice any input delay or anything of the sort. There were, however, occasional skips and rare, subtle oscillations in movement speed and animation.

The performance — Battlefield 3

We benchmarked Battlefield 3 by playing through the start of the Kaffarov mission, right after the player lands. Our 90-second runs involved walking through the woods and getting into a firefight with a group of hostiles, who fired and lobbed grenades at us.

The game was run at the “high” detail preset on both cards, first with stereo 3D disabled, and then with it enabled.

Without stereoscopy enabled, frame times are remarkably consistent, though there are some intermittent spikes on the Nvidia setup toward the end of the run.

Enable stereo 3D, and things go a little crazy—even on the AMD setup. Not only are frame times higher on average, but there’s much more variation, with both configs exhibiting some frame time spikes. Those spikes are particularly bad on the Nvidia side, where peak frame times approach 80 ms—equivalent to 13 FPS.

This traditional measurement shows us the average performance hit of stereo 3D, without taking spikes and frame-time irregularities into account. Clearly, rendering a 1080p scene twice per frame (once for each eye) is no picnic.

In this game, enabling stereo 3D raises our general frame time threshold quite a bit, especially on the Nvidia GPU. 68 ms per frame, which corresponds to 15 FPS, is too high for a fast-paced action game like this one.

Here, we see how much time each card spends churning out frames that take longer than 50 ms to render. True to what we saw in the plots above, the Nvidia setup is at a considerable disadvantage. Evidently, the impact of stereo 3D isn’t just limited to lower average frame rates (or higher average frame times). Frame-time spikes can be a serious problem. They threaten to damage the illusion of motion and make gameplay choppy.

Here, I’m having a hard time reconciling my gut impression with the data recorded. As I noted earlier, the game was playable and enjoyable on the Nvidia system, especially in multiplayer mode, and I don’t recall gameplay being obviously sputtery. That’s odd, because sputtering was clearly noticeable in Deus Ex, which had far fewer frame-time peaks above 50 ms. One possible explanation we’ve kicked around is that the frame-time spikes in BF3 may be related to having lots of particle effects on screen, like when things are exploding. Those aren’t moments when one can move and react especially well anyway, so the slowdowns may be less noticeable. Another possible explanation is that frame-time spikes may somehow feel less disruptive when average frame times are already fairly high—which they were in BF3, but not in Deus Ex.

Conclusions

We’ve learned a couple of things from this little exercise: stereo 3D has come a long way since the early days of 3D Vision, and it still has a way to go.

Inconsistent game compatibility is still a potential issue. In our experience, Nvidia is far and away the winner on that front, but judging by the frame time spikes we measured and the need to tweak settings in Deus Ex, even the best solution on the market isn’t perfect. AMD, meanwhile, suffers from a fragmented hardware ecosystem and patchy support for even major titles. Battlefield 3 still has visual bugs with the latest driver release, and Arkham City lacks proper support of any kind right now. More worryingly, we encountered a rendering bug in Portal 2, a game that’s been out for over nine months and really ought to work without issues at this point.

Another problem lies with the displays and their matching active-shutter glasses. The vast majority of 3D monitors are based on TN panels, which have improved in recent years but still exhibit clearly inferior color reproduction and viewing angles compared to IPS offerings. TN panels have quicker response times, a vital quality for active-shutter 3D configs. However, our testing shows that even Asus’ VG278H monitor, a $700 solution with lightning-quick response times, still produces bothersome ghosting in certain games—and it’s the display Nvidia sends to reviewers. That makes the prospect of slower IPS panels coupled with active-shutter goggles seem unworkable, at least for now.

On top of that, the rapid flickering of active-shutter glasses can cause eye fatigue. One day, after testing, I went to the theater to see Tintin in 3D. I was amazed at how much more comfortable the stereoscopic visuals were, simply because the RealD polarizing glasses didn’t flicker. Even with wide game compatibility, no ghosting, and perfect color reproduction and viewing angles, any stereo 3D setup that relies on active-shutter glasses will be flawed to some degree.

The price and performance issues are worth noting, too. A stereo 3D config requires considerable graphics horsepower. Based on our performance results, a $300+ card seems de rigueur for stereoscopic gaming at 1080p. Add the price of a sufficiently big and bright 120Hz panel with the latest glasses, and you could be looking at a thousand-dollar expense. That’s a lot of money to spend on a capability that doesn’t improve every game substantially, a capability that forces one to trade visual fidelity (in the sense of graphical bells and whistles) for the illusion of depth.

Help is on the way, at least for some of the problems we encountered.

AMD says Microsoft will include native stereo 3D support in DirectX 11.1, which will ship with Windows 8, in the form of a new quad-buffer API. (Incidentally, that API will be incompatible with AMD’s own quad-buffer API.) We don’t know yet if DX11.1 will be back-ported to Windows 7, but in time, game developers should be able to support both Nvidia and AMD stereoscopic setups through the same API calls. That doesn’t mean they will, especially considering the abundance of shoddy console ports on the market today. If devs do take advantage of DX11.1’s quad-buffer API, the stereo 3D experience should be more consistent across different configurations.

In other news, AMD told us about an IPS-based 3D monitor due out this quarter. The display uses line interleaving coupled with passive, polarizing glasses—a design that should offer greater color fidelity without the troublesome flickering associated with active-shutter goggles. AMD says the polarization constrains vertical viewing angles in stereo mode, but horizontal viewing angles are purportedly “outstanding.” Our limited experience with line interleaving makes us wary of other potential tradeoffs, though.

All of this raises one simple question: if you can afford to, should you hop on the stereo 3D bandwagon now?

Personally, I think the negatives still outweigh the positives. On one hand, you’ve got the high cost of entry, flickering, ghosting, a huge performance hit, and compatibility kinks (on the HD3D side). On the other, you’ve got… what? A fleeting “wow” factor that’s there in some games and absent in others? It’s true PC gaming is all about enhancing the experience in subtle ways, but I think stereo 3D requires users to jump through too many hoops for too few enhancements. All things being equal, I would undoubtedly prefer stereo 3D over a lack of it—but things are not equal. Not anywhere close.

If you’re feeling the itch and can afford a good 3D setup, I won’t try to stop you. You’ll certainly enjoy the experience in a few games, and depending on your point of view, that might make the investment worthwhile. I will, however, suggest that you favor 3D Vision 2 until HD3D’s patchy game compatibility is, er, less patchy. There’s really no sense in going through all the trouble of building a stereo gaming rig only to encounter compatibility problems in every other game. Also, whatever you do, don’t skimp on the display; it can make or break the experience.

Comments closed
    • Kslope
    • 8 years ago

    3D is much better than higher resolution; it just needs to be done right in terms of:
    – Display (a DLP projector for a 100% flicker- and ghost-free experience; absolutely stunning image, orders of magnitude better than the cinema’s, which is based on circularly polarized light [ghosting])
    – Both eyes’ viewpoint configuration (separation & convergence; these parameters are screen-size dependent, but this is a little bit complex to explain in a post).
    If you are a serious gamer who wants the absolute best gaming experience, you are doing yourself a great disservice by not gaming in 3D (it is so much better that it is a one-way road, really, and much, much better than movies@3D).
    I’d also like to offer a constructive criticism for techreport in the hope that they can provide us a more accurate article in the future and be pioneers in this field, as its conclusions are clearly based on a lack of an adequate display plus a lack of understanding of how to parameterize a 3D image.
    Also, I’d like to add that Tridef drivers are absolutely fantastic, and the Power3D option is the best thing that has happened to 3D ever!!!
    Obviously, if the image has flicker and/or ghosting, 3D is just a 5-minute gimmick, which seems to be the experience of most people in this forum, including the reviewer.

    • Chrispy_
    • 8 years ago

    It’s yet another VHS/Betamax, DVD+R/DVD-R, Blu-ray/HD-DVD war all over again.

    Wake me up when it’s standardised to the point that any 3D-capable display can work with any 3D-capable graphics card running a game that is 3D-vendor agnostic.

    At present there are too many conflicting criteria making the consumer choice totally pathetic.
    (And call me a cynic, but 3D gaming just isn’t ready for primetime yet).

    • thebeastie
    • 8 years ago

    I ordered my Sony HMZ-T1 visor. What should I expect with PC 3D gaming?
    I know the advantage with this is flicker-free, proper 3D, as I have a dedicated screen for each eye.

    I got a GTX 295 but would upgrade if there is any benefit from either Nvidia or AMD.
    All major tech sites like this one should be reviewing these, since the visor blows away the flicker problems of 3D gaming and you get your own private 750-inch screen.

    • MountainKing
    • 8 years ago

    Hey Cyril,

    I just wanted to say thanks for doing this review. Information and benchmarks on the stereo 3D setups are really hard to find. I couldn’t imagine going out and spending money on a 3D monitor and glasses that would only work for PC games (and only some at that). However, as more and more people buy 3DTV’s, it seems like a reasonable choice to try hooking up your PC to your 3DTV to see how some games might look on it. I think this would be the greatest area of growth for stereo 3D games on the PC, as you don’t need to lay out any additional money to try it if you already have the 3DTV and glasses. The only thing you might need to do is buy a slightly better video card than you otherwise would have for your PC gaming. So I’m hoping you’ll continue to cover stereo 3D, and maybe consider adding stereo 3D benchmarks to your regular video card reviews. I’d love to see benchmarks for stereo 3D for BF3 and Dirt 3 on a 3DTV with the 7950, 7970 and GTX 580 for example.

    Thanks,
    MK

    • Forge
    • 8 years ago

    Ironically, the 3D rendering used to be better than it is. I bought a GeForce 2 Ultra that came with shutter glasses (via VGA passthrough for sync+power, no less!). Ghosting was a huge problem, but 3D depth really wasn’t.

    I upgraded to a GeForce 3 and saw more Z issues. Things were flat to each other instead of being deeper/shallower. I upgraded to a GF4 and got even more.

    Why? Because the 3D hardware was getting more and more ‘clever’ about compressing or working around Z issues. Instead of keeping a full Z buffer all throughout rendering, things were getting flattened or tiled. Now, those are givens. Nobody keeps a full, proper Z buffer anymore. We’re only just starting to return to that, with the pushes for 3DV and HD3D.

    Progress, it is not!

    • Maboroshi Daikon
    • 8 years ago

    I’ve got the Samsung S27A950D display and, out of the box, there’s a lot more crosstalk than there should be. The color stability is lacking as well. A few changes greatly improve the situation though:

    1) Set the Magic Angle setting to Group View.
    2) Turn down the refresh setting in the menus. Paradoxically, setting this to the fastest setting causes more ghosting and I think the default is at the highest. I have mine set at the lowest setting. It doesn’t remove ghosting, but it does cut it back.
    3) Change the % in front of monitor setting in DDD. By changing it, you can reduce the distance of the left-right images on objects in the far distance that pop in front of a stark blue sky. This will help mitigate some of the effects of ghosting.

    With all that said, I still use the monitor mostly for 120Hz tearing-free gaming.

    • tanker27
    • 8 years ago

    I wish this fascination with 3D would die already…….>.<

    • Pax-UX
    • 8 years ago

    You left out the single most important point about a 3D setup: 120Hz gaming. Even if you find the 3D crap and flip the glasses on eBay, 120Hz gaming is the way it’s meant to be played! It’s also the reason to invest in a good monitor and then maybe buy 3D later if you can get a deal on glasses.

    I picked up the LG W2363D at a good price of €180, and I would never buy or recommend anything less than 120Hz for someone looking to upgrade their monitor.

      • HisDivineOrder
      • 8 years ago

      When a 120hz monitor is not using a TN-based panel, then I’d say it’s the future of monitors. Until then… ehhhh.. no.

        • indeego
        • 8 years ago

        When IPS is less than the 3X premium it asks for just for slightly better color reproduction and viewing angle, then we’ll talk.

          • Firestarter
          • 8 years ago

          IPS displays can be had for less than equivalent 120hz displays though.

            • indeego
            • 8 years ago

            TN w/120Hz@higher rez > IPS@same or lower res. That’s just like my opinion, man.

            Visual clarity and fluidness are better than viewing angle (when do we look at our displays too far from center?). Also, there are better TN displays than many people give credit for, and worse. I think the TN hate comes from the crappy Acer TNs I’ve seen.

            • Firestarter
            • 8 years ago

            Depends on what you need. Me, I need 120hz, but my gf needs an IPS-display. She can’t even see the judders from playing a ~24fps movie on a 25fps (progressive scan) TV, heh. And I’d imagine that’s true more or less for a lot of people, who are just not bothered by tearing or juddering or simply don’t notice it.

    • Squeazle
    • 8 years ago

    Small pip:
    Deus Ex: Human Revolution now has skippable endorsements at the beginning, as per a patch a month or two ago, which I learned about on this very website.

    • jensend
    • 8 years ago

    I’ve said it before and I’ll say it again: every 3d game out there should use head-tracking motion parallax.

    Stereopsis (the way that the slightly different images from your two eyes are combined to help perceive depths) is only a very small part of our depth perception in everyday life. Motion parallax, the fact that when we move our heads slightly our view of nearby things changes more than our view of farther things, is a much more important factor.

    Using stereopsis by itself for a 3d effect requires funky glasses and expensive monitors. It means you have decreased detail (either less temporal detail because things are rendered twice, less spatial detail because things are interleaved with polarization effect, or less color detail for red-cyan anaglyphs). You aren’t used to having this effect be so strong while all the other visual depth cues are lacking, so it causes headaches and eye fatigue.

    Whenever there’s just one person in front of a screen, you can detect small head motions and adjust frames accordingly to get a motion parallax 3d effect. This looks considerably more realistic than stereoscopic 3d, it causes no headaches or eye fatigue, it would have no major impact on framerates or detail, and it can be done with just a standard webcam.

    I can’t understand why people haven’t started doing this.

      • Palek
      • 8 years ago

      Something like this, right?

      [url<]http://www.youtube.com/watch?v=Jd3-eiid-Uw[/url<]

        • jensend
        • 8 years ago

        Right. Fairly simplistic demo, but that’s the basic idea.

        His version required having an IR sensor and wearing two infrared lights on your head, which makes it dead simple to sense your head’s location and alignment.

        In the intervening four years, lots of libraries have been developed which recognize and track your head without wearing emitters (FaceAPI, Cachya, FreeTrack, stuff from OpenCV, and I think it’s built into the Kinect package). It’s quite feasible with just a standard webcam, though a sensor like Kinect (which has an IR laser emitter and reads the reflections) would work in all light conditions while a webcam would be quite dependent on the lighting.

          • Firestarter
          • 8 years ago

          A Kinect solution would probably add some lag and be less precise, compared to something head-mounted dedicated to tracking. I guess any lag, frametime spikes or loss of precision would be even more important with head movements than it is already with hand/mouse movements, as it would more easily cause nausea/dizziness.

          Time for a gaming cap maybe?

            • Palek
            • 8 years ago

            Imagine how awesome it would be to actually use your head to look around a corner in an FPS!

            • Firestarter
            • 8 years ago

            And get your head shot off 😛

            • Palek
            • 8 years ago

            Comes with the territory!

            • Palek
            • 8 years ago

            Argh, reply fail.

            • Meadows
            • 8 years ago

            I used to try doing that in videogames when I was quite young, and it took me a while to get accustomed to the fact that I [u<]can't[/u<] see farther than my crosshair will go.

            • jensend
            • 8 years ago

            Unless you’re operating with a plain webcam and poor lighting, I’m fairly sure the software can reliably track your head with plenty of precision.

            The processing time lag required for a Kinect or webcam solution is really not an issue on modern CPUs.

            Regardless of what kind of sensor you have, the main question affecting latency is the sensor’s capture rate. You’re right that this could be an issue with Kinect since it captures at 30fps (I wasn’t aware of that until after you brought this up). Kinect and 30fps webcams would be OK for many types of games, but a 60fps camera would really help.

            The advantage of the Wii Remote used in the video Palek linked to is that its IR camera operates at 100 fps. It’s really low resolution (128×96), but it’s got a chip on it that does a good job of interpolating the tracking data 8x, to an effective 1024×768. All in all, it’s pretty nice. The gaming cap would be a simple way to take advantage of something like that.

          • sschaem
          • 8 years ago

          Tracking needs to be absolutely and perfectly precise; otherwise it would be like having eyes floating in a cup of water.
          And we are not there yet, by a long shot.

          And see my other post on how this alone doesn’t create a 3D effect for the direct viewer. It’s just a neat way to control the camera.

            • jensend
            • 8 years ago

            Wrong, wrong, wrong. Nothing to do with human perception has to be “absolutely and perfectly precise”; it just has to have enough precision to reach perceptual transparency. A 640×480 webcam has enough resolution to resolve motions of your head subtending just a few arcminutes, which is precise enough for this application. The diagonal field of view of a webcam is usually ~60 degrees, and 60 degrees / 800 pixels (= sqrt(640² + 480²)) = 4.5 arcminutes. If you’re sitting two feet from your webcam, that’s a motion of just 0.8 mm. Unless you’ve got your head in a vise, you’re likely to move your head more than that quite frequently, and moving it much less than that makes rather little difference in your perception of a 3D scene.

            I get sick of people pretending they are experts when they’re speaking out of their rear. I’m no expert but at least I’ll think before I post.
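
            (Those back-of-the-envelope numbers do check out; here’s a quick way to verify them:)

[code<]
import math

# 640x480 webcam with a ~60-degree diagonal field of view.
diag_px = math.hypot(640, 480)        # 800 pixels across the diagonal
arcmin_per_px = 60 * 60 / diag_px     # 3600 arcmin / 800 px = 4.5
dist_mm = 2 * 304.8                   # two feet, in millimeters
motion_mm = dist_mm * math.tan(math.radians(arcmin_per_px / 60))
print(arcmin_per_px, round(motion_mm, 2))  # 4.5 arcminutes, ~0.8 mm
[/code<]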

            • sschaem
            • 8 years ago

            You are just trying to make the idea work when it doesn’t.

            But even if someday we can capture the eye telemetry perfectly and instantly with a webcam, and the player is injected with caffeine to make him jitter constantly, the result is still perceived as totally flat by the viewer.

            • Firestarter
            • 8 years ago

            So this image ( [url<]http://i.imgur.com/fTnuq.gif[/url<] ) does not create any illusion of depth for you at all? I mean, I'm perfectly aware that the surface that I'm viewing it on is flat, but that doesn't degrade the illusion for me.

            • Anonymous Hamster
            • 8 years ago

            I’ve downloaded the software from the above demo (as can anyone), and I can tell you that it does work very well. The 3D effect is very striking. You don’t have to believe me; if you’ve got a WiiMote and a PC with Bluetooth, you can try it out for yourself.

        • l33t-g4m3r
        • 8 years ago

        or [url<]http://en.wikipedia.org/wiki/FreeTrack[/url<]

      • sschaem
      • 8 years ago

      “I can’t understand why people haven’t started doing this.”

      Because people are mostly sitting still at a PC to play games.
      And unless you move your head up/down, left/right, or tilt it, you get ZERO 3D effect.

      Also, because the effect is flat 2D, it looks really weird in most cases without stereoscopy in the mix.

      The best would be VR glasses (stereo vision with ZERO crosstalk/ghosting + head tracking).

      I’ve said it before and I’ll say it again: the best 3D is your brain.
      Try covering one eye during a scene with smooth motion and correct DOF (game movie, CGI, …).
      When both eyes are ‘enabled’, your brain overrides your depth perception and says “dude, that’s 2D, it’s all flat.”
      Cover one eye and your brain gets a workout. It has no clue the image is projected flat and will actually generate depth info.
      The problem is the brain effect only lasts maybe 10 seconds….
      So if you see a guy covering one eye at a showing of a Disney movie during a motion scene, that’s me 🙂
      The result can be absolutely stunning!

        • jensend
        • 8 years ago

        Sure, if you lock your head in a vise while playing your FPS, this won’t make any difference. But most of the motion-parallax effect comes not from major movements like swinging your head around, but from fairly minor movements you naturally make all the time, even when you’re sitting still. They move the camera around a lot in the demos because it’s not the viewer of the video whose head is being tracked, so they have to make the motion really big (orders of magnitude larger than the motions your head makes while you watch the video) for you to perceive the effect.

        It doesn’t “look really weird in most cases without the combination of stereoscopic.” You’re talking out of your rear on this one. Your brain’s primary depth-perception cues really are monocular, and you will perceive depth just fine without the stereoscopic effect. How do you think most kinds of animals manage to make it through life dealing with a 3D world when the overlap between their eyes’ visual ranges is small to nil?

          • sschaem
          • 8 years ago

          You have no understanding of how the eye and brain communicate.
          Why do you think we perceive a movie projected on a flat wall as flat? Because both eyes converge to a single point.

          Doing the parallax trick doesn’t stop this effect from happening; flat is perceived as flat.
          And no, micro movements don’t help, and to capture micro movements you need near-infinite precision in your eye-telemetry capture.
          Your brain sees an absolutely flat image when you emulate the camera position.

          The only way the parallax idea works is if you close one eye, so you remove that information going to your brain.
          And without both eyes the effect is temporary, and very limited.

          I’m telling you this is not used because
          a) you can’t achieve a 3D effect unless you can provide each eye its own distinct view
          b) tracking needs to be so precise that it’s still a problem to achieve effects a human will find believable firsthand.

          This method is OK for augmented reality, but that’s as far as we have gone.
          The best results are with a 2D capture system (like a camera, not a free-moving eyeball) hardwired to a calibration system.

            • jensend
            • 8 years ago

            You have no data to back you up, every psychologist and neurologist out there disagrees with you, and very simple experiments show you are wrong. I’m sick of hearing your garbage.

        • Anonymous Hamster
        • 8 years ago

        Even if you are sitting at your desk and playing a 3D game, your head moves a little bit. Having the scene react to even a small amount of head motion will make a noticeable difference vs. there being no reaction. And, once you provide that kind of interaction, you’ll probably move your head a bit more to peek around corners and such. Who knows; you might even try to dodge things. But if you know there’s no feedback, then of course you have no reason to move your head at all. It’s a chicken & egg kind of issue.

      • gc9
      • 8 years ago

      Games that support TrackIR support head tracking, not only 3d position but also 3d rotation.
      Head tracking is available in many games.

      Toshiba’s glasses-free 3d notebooks (not ready for prime-time in 2011) are also advertised to provide head tracking.
      There was a rumor recently saying that Microsoft is experimenting with adding Kinect to notebooks.
      So some manufacturers are trying to integrate head tracking.

      Is there a standard API that isn’t specific to the tracking hardware?

    • Palek
    • 8 years ago

    Cyril: what you refer to as “ghosting” – that is, when the image intended for the left eye is partially visible for the right eye and vice versa – is called crosstalk in the consumer electronics industry.

      • sschaem
      • 8 years ago

      Ghosting is perfectly valid and correct to use (as is crosstalk).

      Ghosting has also been used in the video industry for decades, as it better describes the end result.
      The stereo community also adopted ghosting over crosstalk in the ’80s because it better portrays the phenomenon.

      And if you look at articles on stereo glasses from the past 20 years, ghosting is favored over crosstalk.

        • Palek
        • 8 years ago

        I worked in the industry for years, and [u<]every single person[/u<] I ever talked to about this phenomenon referred to it as crosstalk. Yes, it is similar to ghosting but the point is that the signal crosses over to the wrong channel, hence the word "cross" in crosstalk.

          • Black69ta
          • 8 years ago

          I tend to agree with you: crosstalk is when the signal bleeds into the other channel, while ghosting is when the previous frame still shows up on the next frame. I remember a trick on my Pentax 35mm camera: you could take a shot without advancing the film, then take another shot on top of it, making the first image ghost onto the second.

          However, ghosting could also describe this phenomenon, since both the right and left images display in the same space, and the image from one “ghosts” onto the next.

          That said, many industry insiders use a different vernacular than consumers do; think of the medical field’s “coronary infarction” versus “Mr. Smith, you had a heart attack.” On the consumer side, I have never heard of crosstalk outside of the audio field.

            • Chrispy_
            • 8 years ago

            There are two meanings of the word ghosting in this debate.

            “Ghosting” used to describe ghost-like semi-transparent images meant for the other eye – specifically from stereo crosstalk.
            “Ghosting” used to describe the fading trailing edges of moving objects on a screen caused by slow LCD response time.

            Both these uses of the word “ghosting” describe symptoms of LCD crystals moving too slowly to fool the human eye, but one ghosting is a subset of the other:

            [b<]Stereo crosstalk is perceived as ghosting because of the ghost image meant for your other eye, but it is caused by the LCD response-time phenomenon most commonly known as ghosting. If I could be arsed to draw a Venn diagram, it would be four concentric circles, with 'ghosting' as a subset of 'crosstalk', as a subset of 'ghosting', as a subset of 'LCD response-time issues'.[/b<]

            Nobody is wrong; English vocabulary is just too limiting sometimes. Try German: they just string descriptive words together and remove the spaces to create a long but [i<]incredibly accurate[/i<] word.

    • yammerpickle2
    • 8 years ago

    I have a VG236H monitor and twin GTX 580s with the Nvidia glasses. It’s nice and immersive; it’s made me jump back from the screen reflexively a couple of times when the action was close. But it is a pain, and I game with them only about 10% of the time. The flicker and ghosting are annoying. I can’t speak to the second version of the displays and glasses, but I hear they’re a nice improvement. Give it a couple more years and it will be even better; I figure shortly after the release of the next-gen consoles, which should do a better job with 3D, you’ll see 3D become more mainstream. I just hope by then 4K 3D 120Hz displays are available.

      • HisDivineOrder
      • 8 years ago

      Hopefully in a few years they’ll have figured out how to give us glasses-free 3D that doesn’t destroy our wallets and doesn’t muddy up the 2D.

      Or they could just skip that and go straight to holograms. I’d settle for that.

      • l33t-g4m3r
      • 8 years ago

      “twin GTX580” Therein lies the problem: SLI or lowered detail settings is a requirement. Overall, I’d prefer jensend’s motion-parallax idea.

    • ColeLT1
    • 8 years ago

    I’m legally blind in one eye, so I always feel like I’m missing out on 3d. Thanks for making me feel better, lol, not missing a thing 😀

      • jensend
      • 8 years ago

      Even people who are completely blind in one eye generally manage to deal just fine with most everyday tasks requiring some depth perception. Most of our everyday depth perception cues are monocular, and one of the most important is motion parallax, which could be implemented for games without any goofy glasses or expensive monitors.

        • ColeLT1
        • 8 years ago

        Yeah, I deal fine. Mine was injury-related, and it took a year or so after surgery to regain usable depth perception. It’s retina damage, specifically to the macula, which makes it impossible for that eye to focus on anything or see anything directly in its line of sight; I can only see at the edge of that eye’s periphery, so my brain simply ignores its blind spot. I see in 3D, but I can’t see any 3D effects anymore unless I were to look away from the screen.

        TLDR – No 3D for me

    • ew
    • 8 years ago

    This is just like when they did it back in the GeForce 2 days, except now you need a special monitor to “enjoy” crummy compatibility and poor image quality.

    • Pantsu
    • 8 years ago

    I have the Samsung monitor, and from personal experience I can say that while stereoscopic 3D gaming on this monitor can be a great experience when it works, most of the time it’s far from perfect. You’ll really need a fast GPU to play at 1080p and high quality. Even my OC 7970 can’t push 60 FPS in all situations.

    Then there’s the lack of support: while there’s “support” for hundreds of games via TriDef, it’s rarely perfect, and you’ll have to adjust settings and perhaps live with some bugs. Personally, I only play S3D in ME3 and Metro 2033, but that’s because I can play all my other games with Eyefinity. While both games work well enough, they still have a few small issues, like distortion, crosstalk, and crosshair problems. These are the run-of-the-mill problems you’ll encounter with stereoscopic 3D at the moment. Thankfully, TriDef has extensive settings to counter such issues, but this really isn’t for someone looking for a hassle-free solution.

    Beyond TriDef and HD3D, the Samsung monitor has its own 2D-to-3D conversion that can add some extra depth to 2D content. I find it a nice addition, though far from perfect: the depth effect isn’t as good as with true S3D content, and it suffers from crosstalk. It also works while gaming but brings a lot of input lag with it.

    Overall, the current S3D-with-glasses experience is uncomfortable and buggy, especially while gaming. I use prescription glasses and headphones while I game, and adding even the thin Samsung 3D glasses on top is uncomfortable. Besides that, S3D strains my eyes, so I tend to play only short periods with it on. While the stereoscopic effect is really immersive when it works, I’m constantly annoyed by bugs, uncomfortable glasses, and eye strain.

    All in all, at the moment I wouldn’t buy a monitor just for S3D gaming (I bought the Samsung for 120Hz and as the center monitor of my Eyefinity setup, since it has slim bezels), but it does bring some extra value if you’re looking for a 120Hz monitor. I’d go as far as to say the S23A700D is probably the best bang-for-your-buck 120Hz monitor available at the moment. Of course, if someone is hell-bent on S3D gaming, the Nvidia solution is the superior one.

    Hopefully once OLED monitors are out in the (not too) distant future we’ll also get glasses free 3D without these issues, since it really can make a world of difference in terms of immersion in games.

      • Firestarter
      • 8 years ago

      I’m looking to buy that S23A700D sometime next week.

    • Firestarter
    • 8 years ago

    All I want is a proper 120Hz display. As for 3D, I think head tracking is more important for immersion than stereoscopic imaging. A head-mounted display with a large FOV, high resolution, and tracking: now that I would like to see as a consumer product!

    • squeeb
    • 8 years ago

    I’ve had a 3D Vision setup since Dec 2010, and I’ve probably not logged more than 10 hours with the glasses. However, the 120Hz LCD is amazing – one of the best upgrades ever.

    • Spyder22446688
    • 8 years ago

    I have no interest in 3D games, 3D movies, or 3D television, and wish the entire fad would die. Am I in the minority?

    Wearing those glasses while watching a movie or television seems especially silly, and probably gets pretty uncomfortable after a short while.

      • sschaem
      • 8 years ago

      It’s only a matter of time before they sell polarized contact lenses 🙂

      My take is that 3D will take off when consoles support 3D games and TVs have 3D built in, whether you want it or not.

      So I don’t think 3D will go away.

      The big thing with TVs and gaming, actually, is that you can use a Glasses Player 1 and Glasses Player 2 mode
      and have full-screen multiplayer gaming. It could be a good alternative to split screen.

        • Stargazer
        • 8 years ago

        [quote<]The big thing with TVs and gaming, actually, is that you can use a Glasses Player 1 and Glasses Player 2 mode and have full-screen multiplayer gaming. It could be a good alternative to split screen.[/quote<] Now, *that* would be neat.

          • derFunkenstein
          • 8 years ago

          That’s one of the things the PS3 is using 3D for, though it’s not particularly common yet.

          [url<]http://www.theverge.com/2011/06/06/playstation-3d-display-hands-on-e3-2011/[/url<]

      • DancinJack
      • 8 years ago

      [quote<]I have no interest in 3D games, 3D movies, or 3D television, and wish the entire fad would die. Am I in the minority?[/quote<] Not if the sample size is you and me. 3D on the desktop, at least in its current state, is not even close to interesting to me. I'd much, much rather have a higher-res picture than 3D.

      • colinstu
      • 8 years ago

      I’m with ya on this one.

    • sschaem
    • 8 years ago

    Until game developers add it natively, this feature is not ready for prime time.

    That’s what the new range of consoles is supposed to bring: 1080p 3D gaming that looks rock solid (not in terms of flickering, but in terms of gameplay).

    I still remember playing this game [url<]http://en.wikipedia.org/wiki/Drakan:_Order_of_the_Flame[/url<] on my TNT2 with the Elsa wireless LCD glasses and Nvidia's 3D stereo drivers back in 1999... whoa, 13 years ago! Not much has changed, at all, in the world of 3D PC gaming in all those years. [url<]http://www.sharkyextreme.com/hardware/reviews/video/elsa_erazor3/g.jpg[/url<]

      • squeeb
      • 8 years ago

      Wow dude, sharkyextreme…now that is old skool. I used to read them back in like ’99-00

      • Waco
      • 8 years ago

      I had a third-party setup very similar to that with the VGA pass-through. It actually worked surprisingly well considering the cost at the time ($50 IIRC) but was a headache-inducing nightmare if you played for more than 30-40 minutes. My monitor at the time could only do 60 Hz even at low resolutions and the flickering was absolutely painful.
