Bindless textures
One of Kepler's new capabilities is something Nvidia calls "bindless textures," an unfamiliar name, perhaps, for a familiar feature. Ever since John Carmack started talking about "megatexturing," the world of real-time graphics has been on notice that the ability to stream in and manage large amounts of texture data will be an important capability in the future. AMD added hardware support for streaming textures in its Southern Islands series of GPUs, and now Nvidia has added it in Kepler. Architect John Danskin said this facility will lead to a "dramatic increase" in the number of textures available at runtime, and he showed off a sample scene that might benefit from the feature, a ray-traced rendition of a room whose ceiling is covered entirely with murals.

Unfortunately, neither AMD's nor Nvidia's brand of streaming textures is supported in DirectX 11.1, and we're not aware of any roadmaps from Microsoft for the next generation of the API. For the time being, Kepler's bindless texture capability will be exposed to programmers via an OpenGL extension.
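For the programming-minded, here's a rough sketch of what using that capability might look like. The source detail is simply "an OpenGL extension," so treat the specifics below, which assume Nvidia's GL_NV_bindless_texture extension, as illustrative rather than definitive.

```cpp
// Minimal sketch of bindless texture setup, assuming the extension in
// question is GL_NV_bindless_texture. Requires a loader such as GLEW that
// exposes the NV entry points.
#include <GL/glew.h>

GLuint64 makeBindlessHandle(GLuint texture)
{
    // Ask the driver for a 64-bit handle identifying this texture; the
    // handle stands in for a traditional texture-unit binding.
    GLuint64 handle = glGetTextureHandleNV(texture);

    // A handle must be made resident before shaders may sample through it.
    glMakeTextureHandleResidentNV(handle);
    return handle;
}

// The handle is then fed to an ordinary sampler uniform:
//   glUniformHandleui64NV(location, handle);
// which is what lets a shader reference far more textures than there are
// binding points; no glBindTexture calls are needed at draw time.
```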

Hardware video encoding
For a while now, video encoding has been hailed as an example of a compute-intensive task that can be accelerated via GPU computing. However, even highly parallel GPU computing can't match the efficiency of dedicated hardware—and these days, nearly everything from smartphone SoCs to Sandy Bridge processors comes with an H.264 video encoding block built in. Now, you can add GK104 to that list.

Nvidia says the GK104's NVENC encoder can compress 1080p video into the H.264 format at four to eight times the speed of real-time playback. The encoding hardware supports the same H.264 profile levels as the Blu-ray standard, including the MVC extension for stereoscopic 3D video. Nvidia cites power efficiency as NVENC's primary advantage over GPU-compute-based encoders.

AMD claims to have added similarly capable video encoding hardware to its 7000-series Radeons, but we have yet to see a program that can take advantage of it. With the 7970 having been on the market for roughly four months now, the length of that delay makes us worry that some sort of hardware bug could be holding things up. Fortunately, Nvidia was able to supply a beta version of CyberLink's MediaEspresso with NVENC support right out of the chute.

Although we were able to try out NVENC and test its performance, we'd caution you that the results we're about to share are somewhat provisional. Many of these hardware encoders are essentially black boxes. They sometimes offer some knobs and dials to tweak, but making them all behave the same as one another, or the same as a pure-software routine, is nearly impossible—especially in MediaEspresso, which offers limited control over the encoding parameters. As a result, the quality levels involved here will vary. Heck, we couldn't even get Intel's QuickSync to produce an output file of a similar size.

We started with a 30-minute sitcom recorded in 720p MPEG2 format on a home theater PC and compressed it into H.264/AAC format at 1280x720. Our target bitrate for NVENC and the MediaEspresso software encoder was 10 Mbps. We weren't able to raise the bitrate for Intel's QuickSync beyond 6 Mbps, so it was clearly doing a different class of work. We've still reported the results below; make of them what you will.
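To put those bitrates in perspective, here's a quick back-of-the-envelope calculation of the video payload each one implies for a 30-minute clip. This is our own simplification; it ignores audio and container overhead.

```cpp
#include <cstdio>

int main()
{
    // 30-minute clip; sizes consider the video bitrate only, ignoring
    // audio and container overhead (our simplification).
    const double seconds = 30 * 60.0;
    const double bitrates_mbps[] = { 10.0,   // NVENC and software-encoder target
                                      6.0 }; // QuickSync's effective cap

    for (double mbps : bitrates_mbps) {
        double megabytes = mbps * seconds / 8.0;  // Mb/s * s / 8 bits per byte
        std::printf("%4.1f Mbps -> roughly %.0f MB\n", mbps, megabytes);
    }
    return 0;
}
// Roughly 2,250 MB at 10 Mbps versus 1,350 MB at 6 Mbps, which is why the
// QuickSync output file can't match the others in size.
```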

While encoding with NVENC, our GTX 680-equipped test system's power consumption hovered around 126W. When using the software encoder running on the Core i7-3820, the same system's power draw was about 159W, so the GK104's hardware encoder has obvious benefits for power efficiency.

For what it's worth, all three of the output files contained video that looked very nice. We couldn't easily detect any major difference between them with a quick check using the naked eye; none had any obvious visual artifacts. A more rigorous quality and performance comparison might turn up substantial differences, but we think most end users will likely find NVENC and QuickSync acceptable for their own video conversions. We may have to revisit this ground in a more detailed fashion in the future.

Better display support
For several generations, AMD has had a clear advantage over Nvidia in the number and variety of displays and output standards its GPUs have supported. The Kepler generation looks to close that gap with a host of notable additions, including (at last!) the ability to drive more than two displays simultaneously.

The default output config on the GeForce GTX 680 consists of a pair of DVI ports (both dual-link), an HDMI port, and a full-sized DisplayPort connector. The HDMI output is compliant with version 1.4a of the HDMI standard, and the DisplayPort output conforms to version 1.2 of that spec. The GTX 680 can drive 4K-class monitors at resolutions up to 3840x2160 at 60Hz.
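A little napkin math shows why the DisplayPort 1.2 connector is the one doing the heavy lifting for that 4K claim. The link-rate figures below are our own reference numbers for the two standards, not something Nvidia supplied.

```cpp
#include <cstdio>

int main()
{
    // Uncompressed pixel bandwidth for 3840x2160 at 60Hz, 24 bits per pixel,
    // before blanking intervals are counted.
    const double gbps_needed = 3840.0 * 2160.0 * 60.0 * 24.0 / 1e9;  // ~11.9 Gbps

    // Approximate payload rates after 8b/10b encoding (our reference figures).
    const double dp12_gbps   = 17.28;  // DisplayPort 1.2: four lanes at HBR2
    const double hdmi14_gbps = 8.16;   // HDMI 1.4a: 340MHz TMDS clock

    std::printf("4K60 needs ~%.1f Gbps of pixel data; DP 1.2 carries %.2f Gbps, "
                "HDMI 1.4a only about %.2f Gbps\n",
                gbps_needed, dp12_gbps, hdmi14_gbps);
    return 0;
}
```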

Impressively, connecting monitors to all four of the card's default outputs simultaneously shouldn't be a problem. The GTX 680 can drive four monitors at once, three of which can be grouped into a single large surface and used as part of a Surround Gaming setup. That's right: SLI is no longer required for triple-display gaming. Nvidia recommends using the fourth monitor as an "accessory display" for email, chatting, and so on. I recommend just buying three displays and using alt-tab, but whatever.

The GeForce camp is looking to catch up to Eyefinity on the software front, as well, with a host of new driver features in the pipeline. Most notably, it will be possible to configure three displays as a single large surface for the Windows desktop while keeping the taskbar confined to one monitor. Also, users will be able to define custom resolutions. Last but perhaps niftiest, Surround Gaming is getting a new twist. When playing a game with bezel compensation enabled, the user will be able to punch a hotkey in order to get a peek "behind" the bezels, into space that's normally obscured. If it works as advertised, we'd like that feature very much for games that aren't terribly smart about menu placement and such.

In fact, writing about these things makes us curious about how Eyefinity and Surround Gaming stack up with the latest generation of games. Hmmmmmm...