
GPU Boost enters its second generation
I said earlier that the PC market has gone to a weird place recently. Truth is, very high-end graphics cards have been in a strange predicament for a while now, thanks to the tradeoffs required to achieve the very highest performance—and due to conflicting ideas about how a best-of-breed video card should behave. The last couple of generations of dual-GPU cards from AMD have stretched the limits of the PCIe power envelope, giving birth to innovations like the dual-BIOS "AUSUM" switch on the Radeon HD 6990. Those Radeons have also been approximately as quiet as an Airbus, a fact that doesn't fit terribly well with the growing emphasis on near-silent computing.

With the Titan, Nvidia decided not to pursue the absolute best possible performance at all costs, instead choosing to focus on two other goals: good acoustics and extensive tweakability. The tech that makes these things possible is revision 2.0 of Nvidia's GPU Boost, which is exclusive to Titan cards. As with version 1.0, introduced with the GTX 680, GPU Boost 2.0 dynamically adjusts GPU speeds and voltages in response to workloads in order to get the best mix of performance and power efficiency. Boost behavior is controlled by a complicated algorithm with lots of inputs.

What makes 2.0 different is the fact that GPU temperature, rather than power draw, is now the primary metric against which Boost's decisions about frequency and voltage scaling are made. Using temperature as the main reference has several advantages. Because fan speeds ramp up and down with temperatures, a Boost algorithm that regulates temperatures tends to produce a constant fan speed while the GPU is loaded. In fact, that's what happens with Titan, and the lack of fan variance means you won't notice the sound coming from it as much. Furthermore, Nvidia claims Boost 2.0 can wring 3-7% more headroom out of a chip than the first-gen Boost. One reason for the extra headroom is the fact that cooler chips tend to leak less, so they draw less power. Boost 2.0 can supply higher voltages to the GPU when temperatures are relatively low, allowing for faster clock speeds and better performance.
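
Nvidia hasn't published the Boost 2.0 algorithm itself, so the sketch below is purely illustrative rather than the real thing. It shows the basic control idea: step the clock (and the voltage bin tied to it) up while the GPU sits comfortably under its temperature target, and back it off as the target is approached, so the fan never has to chase a moving thermal load. The 80°C target, 837MHz base clock, and roughly 13MHz boost increments match figures cited for the Titan; the thresholds and step logic are made up for illustration.

```python
# Hypothetical sketch of a temperature-targeted boost loop. This is NOT
# Nvidia's algorithm; it only illustrates the control idea behind Boost 2.0.

TEMP_TARGET_C = 80      # Titan's default temperature target
CLOCK_STEP_MHZ = 13     # Kepler boost clocks move in roughly 13MHz bins
BASE_CLOCK_MHZ = 837    # Titan's base clock
MAX_CLOCK_MHZ = 1200    # rough ceiling Nvidia cites for most Titans

def boost_step(current_clock_mhz, gpu_temp_c):
    """Return the next clock target given the current GPU temperature."""
    if gpu_temp_c < TEMP_TARGET_C - 2:
        # Plenty of thermal headroom: raise the clock (and voltage) one bin.
        return min(current_clock_mhz + CLOCK_STEP_MHZ, MAX_CLOCK_MHZ)
    if gpu_temp_c > TEMP_TARGET_C:
        # Over target: drop a bin rather than ramping the fan.
        return max(current_clock_mhz - CLOCK_STEP_MHZ, BASE_CLOCK_MHZ)
    # Near the target: hold steady, which is what keeps fan speed constant.
    return current_clock_mhz

# Example: a cool GPU steps up; a hot one steps back down.
print(boost_step(876, 65))   # -> 889
print(boost_step(1006, 81))  # -> 993
```

The real hardware still weighs power draw, voltage limits, and other inputs alongside temperature, but on the Titan the temperature term is the one that dominates.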

Via Boost 2.0, Nvidia has tuned the Titan for noise levels lower than those produced by a GeForce GTX 680 reference card, at a peak GPU temperature of just 80°C. That's somewhat unusual for a top-of-the-line solution, if you consider the heritage of cards like the GeForce GTX 480 and the aforementioned Radeon HD 6990.

I'm happy with this tuning choice, because I really prefer a quiet video card, even while gaming. I think a best-of-breed solution should be quiet. Some folks obviously won't agree. They'll want the fastest possible solution, noise or not.


EVGA's Precision utility

Fortunately, Boost 2.0 incorporates a host of tuning options, which will be exposed via tweaking applications from board makers. For instance, EVGA's Precision app, pictured above, offers control over a range of Boost 2.0 parameters, including temperature and power targets, fan speed curves, and voltage. Yep, I said voltage. With Boost 2.0, user control of GPU voltage has returned, complete with the ability to make your GPU break down early if you push it too hard.

As you can see in the shot above, our bone-stock Titan card is running at 966MHz. We had the "rthdribl" graphics demo going in the background, and our board was happy to exceed its 876MHz Boost clock for a good, long time. As you might imagine given the fairly conservative default tuning, there's apparently quite a bit of headroom in these cards. You can take advantage of it, if you're willing to tolerate a little more noise, higher GPU temperatures, or both. Nvidia tells us it's found that most Titans will run at about 1.2GHz without drama.
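
If you'd rather watch that behavior yourself than trust a utility's overlay, the GPU's telemetry is readable through Nvidia's NVML library. Below is a minimal monitoring sketch using the pynvml Python bindings; it's read-only, polling the graphics clock, temperature, power draw, and fan speed once a second, and it doesn't touch any of the Boost 2.0 tuning knobs, which remain the province of vendor tools like Precision.

```python
# Minimal read-only GPU telemetry sketch using pynvml (Nvidia's NVML bindings).
# Prints the live graphics clock, temperature, power draw, and fan speed.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetClockInfo, nvmlDeviceGetTemperature,
                    nvmlDeviceGetPowerUsage, nvmlDeviceGetFanSpeed,
                    NVML_CLOCK_GRAPHICS, NVML_TEMPERATURE_GPU)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        clock = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)    # MHz
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # deg C
        power = nvmlDeviceGetPowerUsage(gpu) / 1000.0               # watts
        fan = nvmlDeviceGetFanSpeed(gpu)                            # percent
        print(f"{clock} MHz  {temp} C  {power:.1f} W  fan {fan}%")
        time.sleep(1)
finally:
    nvmlShutdown()
```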

Further tweaking possibilities include control over the green LEDs that illuminate the "GEFORCE GTX" lettering across the top of the Titan card. EVGA has a utility for that. And Nvidia says it will expose another possibility—somewhat oddly, under the umbrella of GPU Boost 2.0—via an API for use by third-party utilities: display overclocking. Folks have already been toying with this kind of thing, pushing their LCD monitors to higher refresh rates than their official specs allow. Nvidia will be enabling its partners to include display overclocking options in tools like Precision. We haven't seen an example of that feature in action yet, but we stand ready and willing to sacrifice our 27" Korean monitor for the sake of science.