During my early days as a PC enthusiast, we didn't really care about noise. The howl of a stack of high-speed fans was the mark of a powerful system, like the throaty exhaust of a badass custom chopper. Perhaps because we really didn't have a choice, we embraced the din and used it to intimidate lesser PCs.
Today, silence is golden. CPU makers are increasingly focused on lowering power consumption, and everyone seems to have embraced the idea that computers should be as quiet as possible. Enthusiasts still have an insatiable desire for moar power. We've just combined that thirst with the pursuit of quiet cooling systems and smart fan control algorithms.
CPUs used to be the biggest consumers of power within PCs. As such, they required the most aggressive—and noisiest—cooling. These days, however, GPUs consume much more power than CPUs. The Core i7-2600K, which sits atop the Sandy Bridge lineup, has a 95W thermal envelope. Nvidia's mid-range GeForce GTX 560 Ti has a much higher 170W TDP, and power consumption only increases as you climb up the GPU ladder. Even the most power-hungry desktop CPUs of the last couple of years have topped out in the 125-140W range.
If you look at the chips themselves, it's easy to see why. Graphics processing is an inherently parallelizable task, making it relatively easy to increase performance by cramming more cores into each slice of GPU silicon. With core counts in the hundreds, modern GPUs are made up of many more transistors than desktop CPUs. The GF114 GPU behind the GeForce GTX 560 Ti, for example, has twice as many transistors as a quad-core Sandy Bridge chip.
I've long been a vocal proponent of robust, user-configurable fan speed controls on motherboards. With that horse beaten to within an inch of its life, it's time to saddle up a new pony: graphics cards are long overdue for better fan speed controls.
To their credit, modern graphics cards are much quieter than the noisy Dustbusters of old. They're also smart enough to adjust fan speeds based on GPU temperatures. Users don't have much say in that fan control logic, though. Drivers and tuning applications sometimes offer a fan speed slider, but using it typically disables temperature-based scaling and locks the fan at a single speed, which is no good if you alternate between gaming and mundane desktop tasks. Unlike the best fan control options for CPUs, there's no way to set temperature thresholds, targets, or corresponding fan speeds, nor any way to dictate how aggressively your GPU cooler responds to changes in temperature.
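To make the gap concrete, here's a minimal sketch of the difference between the fixed slider GPU tools offer and the threshold-based control motherboards give CPU fans. Every number and function name here is invented for illustration, not taken from any real driver or card:

```python
# Hypothetical fan controllers. All thresholds and speeds are made up.

def fixed_slider(speed_pct):
    """What most GPU tuning apps offer: one locked speed, regardless of load."""
    def controller(temp_c):
        return speed_pct
    return controller

def threshold_controller(thresholds):
    """CPU-style control: thresholds is a list of (temp_c, fan_pct) pairs,
    sorted by temperature. Returns the speed for the highest threshold met."""
    def controller(temp_c):
        speed = thresholds[0][1]
        for t, pct in thresholds:
            if temp_c >= t:
                speed = pct
        return speed
    return controller

quiet = threshold_controller([(0, 30), (60, 50), (75, 80), (85, 100)])
print(quiet(45))  # 30 -- idle desktop work stays quiet
print(quiet(78))  # 80 -- gaming load ramps the fan up
```

The fixed slider returns the same speed whether you're gaming or reading email; the threshold version is the kind of behavior the column argues GPU tools should expose.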
Thanks to the popularity of custom cooler designs, there's often a wide range of noise levels and GPU temperatures among different graphics cards. Some cooler designs are simply better than others. Manufacturers often have different tolerances for GPU temperatures, and their propensity to tweak clock speeds adds another wrinkle to the cooling equation. Take the GeForce GTX 560 Ti again. We tested four flavors of the card in our initial GPU review, and each had very different noise levels and GPU temperatures. Noise levels ranged from 38-43 decibels under load, while GPU temperatures were spread between 56 and 70°C. There certainly seem to be differing opinions on how best to balance GPU temperatures and fan speeds, so why not let end users decide for themselves?
As long as safe limits are taken into consideration, there's really no good reason to prevent users from fiddling with whatever variables are offered by a graphics card's fan control intelligence. Besides, some GPUs already feature built-in throttling mechanisms that serve as a last line of defense against overheating.
After all the ranting I've done about motherboard-based fan speed controls, I feel a little silly for neglecting graphics cards for so long. The loudest component in the average enthusiast's PC is likely to be the graphics cooler, and we should have just as much control over it as we do over the CPU cooler—if not more.
Ideally, I'd like to see graphics drivers allow users to shape the fan's speed profile by dragging multiple points along a line graph with temperature on one axis and fan speed on the other. If AMD and Nvidia aren't going to step up, there's no reason why card makers shouldn't offer similar functionality through their own tweaking software. Just about anything would be an improvement over the manual sliders we have now.
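The curve-dragging scheme described above boils down to linear interpolation between user-placed points. As a rough sketch of what a driver could do with such a curve (the point values below are hypothetical, not from any vendor tool):

```python
# Sketch of a multi-point fan curve: the user places points on a
# temperature/fan-speed graph, and the driver interpolates between them.
# The curve values are invented for illustration.

def fan_speed(points, temp_c):
    """points: list of (temp_c, fan_pct) pairs, sorted by temperature.
    Linearly interpolates; clamps below the first and above the last point."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

curve = [(40, 25), (60, 40), (80, 75), (90, 100)]
print(fan_speed(curve, 70))  # 57.5 -- halfway between the 60°C and 80°C points
```

A user who wants a quieter card just drags the middle points down; one who wants lower temperatures drags them up. The control logic itself doesn't change.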