One of the minor highlights of CES last week was Nvidia's announcement and demonstrations of its 3D Vision Surround technology. This scheme combines the funny-glasses-based depth-giving tech of GeForce 3D Vision with multi-monitor gaming capability reminiscent of
Matrox's TripleHead and AMD's Eyefinity.
The basic proposition is fairly straightforward: to extend the 3D Vision scheme to three separate monitors placed side-by-side, making for a panoramic display surface that gives the impression of depth. The technical bits under the covers quickly become complicated, though. You must, of course, have the funny glasses and the IR transmitter hooked to the computer, and the game must support both 3D Vision and unconventional display resolutions. All three monitors must support a 120Hz refresh rate in order to keep up with the switching rate of the glasses and maintain an effective 60Hz refresh. And you'll need to have two high-end Nvidia graphics cards, either based on the upcoming GF100 GPU or the current GTX 200 series, because neither generation of GPU can drive more than two displays simultaneously.
You'll probably want the GPU power from multiple cards, anyhow, since 3D Vision Surround will tax your graphics subsystem as if it were a tobacco company. In fact, given its need to drive demand for new high-end GPUs, Nvidia makes this point with a certain relish. Today's high-end gaming systems typically top out with a single 30" display at 2560x1600 resolution. At a 60Hz refresh rate, you'll have to service about 246 million pixels per second to drive that display. But three 1080p displays at 120Hz in a 3D Vision Surround setup will push roughly 747 million pixels per second—that's equivalent to six non-3D displays at the same resolution.
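The arithmetic above is easy to sanity-check. A minimal sketch (the figures assume full-screen redraws at the stated refresh rates; `pixels_per_second` is a hypothetical helper, not anything from Nvidia's tooling):

```python
def pixels_per_second(width, height, refresh_hz, num_displays=1):
    """Raw pixel throughput for a display configuration."""
    return width * height * refresh_hz * num_displays

# Single 30" display at 2560x1600, 60Hz
single_30 = pixels_per_second(2560, 1600, 60)

# Three 1080p displays at 120Hz (3D Vision Surround)
surround_3d = pixels_per_second(1920, 1080, 120, num_displays=3)

print(f'30" display:        {single_30 / 1e6:.0f} Mpixels/s')    # ~246
print(f"3D Vision Surround: {surround_3d / 1e6:.0f} Mpixels/s")  # ~746
print(f"Ratio: {surround_3d / single_30:.1f}x")                  # ~3.0x
```

The surround figure also equals six 1080p displays at 60Hz, which is where the "six non-3D displays" comparison comes from: doubling the refresh rate for the shutter glasses doubles the load on each of the three panels.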
One of my first questions, upon hearing about 3D Vision Surround, was whether the scheme would require a set of Bono-style wrap-around glasses. After all, most multi-monitor gaming setups with wide-aspect displays emphasize peripheral vision, with the outer displays angled in toward the viewer. Yet most 3D glasses only help when looking straight ahead, given their flat lenses. Turns out that setting up the displays at varying angles from the viewer creates perspective problems for 3D schemes, so the preferred configuration is to put all three displays side by side at the same angle.
When set up that way, I was surprised to learn, 3D Vision Surround can be pretty darned effective. Nvidia was using a racing game in its main demo (not sure which game), and the sensation of depth across a wide display area gave it the feel of a simulator or some sort of high-end arcade outfit. I have been somewhat dubious about AMD's Eyefinity (although, I admit, I need to spend more time with it), in part because the presence of the display bezels tends to ruin the panoramic effect for me. But with the illusion of depth added to the mix, that problem seems to melt away. My visual system seems to filter out the bezels effortlessly, and one is left with the impression of simply looking through a window at a world of depth beyond. The illusion is more than the sum of its parts: better than panoramic displays or 3D Vision alone.
Nvidia says it plans to add a bezel correction feature to its drivers, which apparently wasn't working in the demos we saw at CES. That should help further the illusion.
I came away thinking that 3D Vision Surround might actually be an attractive alternative to what AMD now offers with Eyefinity. After all, if you're going to go to the trouble of buying three 1080p displays and creating space for them in front of your computer, adding the 3D glasses and additional GPU horsepower doesn't seem like a major added expense or hassle—and the payoff is quite nice. Nvidia does plan to support triple-display gaming without 3D glasses, as well, which should be useful when the 3D scheme is unsupported or unwanted.
The flip side of the coin is this: AMD already has Eyefinity out in the market now, and it, too, was showing demos of 3D glasses-based gaming in its CES meeting area. Odds are AMD and Nvidia could converge on the same multi-display, depth-enhanced sort of offering within the next six months or so. If that happens, rest assured, we stand ready to point and laugh just as soon as you slip on a pair of those ridiculous glasses.