What it takes continued
The other bit of hardware needed for GeForce 3D Vision is this IR transmitter, which helps synchronize and activate the glasses' shuttering mechanism. The picture above shows the back side of the transmitter, which houses a rather important control: that wheel adjusts the amount of depth in 3D games. Turning it up provides more stereo separation and greater depth, while dialing it back does the opposite. Although 3D Vision also supports 3D movies and videos, the depth wheel only works in games, because only games generate images on the fly.
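Conceptually, the depth wheel scales the separation between the two virtual cameras the driver renders from. Here's a minimal sketch of that idea with hypothetical names; Nvidia hasn't published how its driver actually implements the adjustment:

```python
def stereo_eye_positions(camera_x, separation):
    """Return hypothetical x-coordinates for the left and right
    virtual cameras, offset symmetrically around the mono camera.
    A larger 'separation' (the depth wheel turned up) yields more
    parallax between the two rendered images, hence more apparent
    depth; dialing it back flattens the image toward mono."""
    half = separation / 2.0
    return camera_x - half, camera_x + half

# Roughly human eye spacing, in meters (illustrative value only)
left, right = stereo_eye_positions(0.0, 0.064)
```

This also hints at why the wheel is games-only: the driver can re-render a game scene from two camera positions on the fly, but a pre-recorded 3D movie ships with its two views already baked in.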
In order for any of this to work, you'll need a display capable of 120Hz refresh rates. In our case, we tried out a pre-production version of a 120Hz LCD, the Samsung 2233RZ. This 22" display has a native resolution of 1680x1050, with a 16:10 aspect ratio and a 5ms rated response time. The $399 MSRP is mighty pricey for a 22" monitor. Samsung's own SyncMaster 2233BW sells for about 220 bucks at online retailers. List and street prices don't always match up, of course, but you're still likely to be paying quite a premium for a 120Hz display.
Beyond its 120Hz capability, our pre-production sample of the SyncMaster 2233RZ isn't anything special, either. To make an entirely unfair comparison, next to the Dell 3007WFP-HC we usually have on our GPU test bench, the 2233RZ has noticeably inferior color reproduction, with visible loss of contrast and slight color shift at acute viewing angles. Perhaps production models will be improved somewhat when they arrive (they're slated for release in April), but I doubt Samsung will be able to achieve the color reproduction of the best LCDs in combination with 120Hz quickness.
ViewSonic has also announced a 120Hz 22" LCD with a funny name, the FuHzion VX2265wm. Happily, although its specs are similar to the Samsung, the FuHzion has a bad spelling discount of 50 bucks, bringing its suggested retail price to $349. That's it for LCDs at present, although there's promise on the horizon. Nvidia informs us that LG has a 23" 120Hz panel planned for later this year, and that display will have a 1920x1080 native resolution. That's more my speed, considering that the 3D Vision kit itself costs $199. Seems to me like this is something of a premium product, and 1680x1050 isn't really a premium resolution.
If you're into rather larger displays, 3D Vision is also compatible with a host of 1080p DLP HDTVs from Mitsubishi. And, if you have real money lying around, it'll also work with an HD 3D projector from a company called LightSpeed Design. I have a hunch that puppy will cost you more than, say, a nice Audi.
The final piece of the 3D Vision puzzle is a PC with a suitable GeForce graphics card. Nvidia has a list of compatible GPUs, most of which are at the higher end of the product spectrum. The oldest graphics card on the list is the GeForce 8800 GTX, and the cheapest is the GeForce 9600 GT. Anything newer or more powerful than those cards, including the GTX 200 series, ought to work. SLI is supported, as well, but only in two-way configurations.
So... does it work? In a word, yep. Nvidia's decision to limit 3D Vision to displays with very high refresh rates makes this technology easily superior to most past attempts at stereoscopic 3D on the PC. There's noticeably less flicker, and the illusion of depth works better as a result.
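The flicker improvement follows from simple arithmetic: active shutter glasses alternate frames between eyes, so each eye sees half the display's refresh rate. A quick sketch of that relationship:

```python
def per_eye_rate(display_hz):
    """With active shutter glasses, frames alternate between the
    left and right eyes, so each eye receives images at half the
    display's refresh rate."""
    return display_hz / 2

# A 120Hz panel delivers 60 images per second to each eye, while
# older stereo setups on 60Hz displays managed only 30 per eye,
# which is well within the range where flicker is perceptible.
print(per_eye_rate(120))  # 60.0
print(per_eye_rate(60))   # 30.0
```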
The biggest catch, at present, is spotty game compatibility. Most games aren't designed with stereoscopic 3D in mind, and to cope with a variety of potential issues, Nvidia has created a host of game-specific profiles, much like the profiles it uses for SLI. 3D Vision profiles are a little more complicated, though. If SLI doesn't work, the fall-back behavior is pretty simple: lower performance. Frustrating, maybe, but not devastating. If 3D Vision has a compatibility problem, well, all manner of funky things might happen visually, many of which can ruin the sense of depth in the display or send your visual system into a tizzy. What's more, 3D Vision's incompatibilities tend to involve certain rendering techniques, so Nvidia will oftentimes ask you to disable some features of a game for the sake of compatibility. In fact, the game profiles will show compatibility information directly onscreen when a game starts up, like so:
Most of the games I tried (all of them relatively new releases) required a few adjustments, many of which meant compromising on visual fidelity somewhat. The most common trouble spot seems to be shadowing algorithms. The profiles frequently recommend dialing back the quality of shadowing in the game's options, if not disabling shadows entirely.
I tried to get specifics out of Nvidia about what the issues are. Is it one approach, like stencil shadow volumes, that causes problems? But Nvidia has taken the "vague PR blob" approach to answering any and all questions about the technical specifics of GeForce 3D Vision. As a result, we have few tools for handicapping the prospects for future game compatibility with this technology. Instead, Nvidia offers only the reassurance that 3D Vision compatibility is a problem very much like SLI compatibility, and claims that it will take the same approach to surmounting any obstacles: a combination of collaboration with game developers and vigilant profile development. That sounds good, I suppose, but we're left having to trust that Nvidia will be able to herd cats well enough to make this work.