Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 9:10 am

So, the writers at TR and Real World Tech have been cross-complimenting each other over the past year, and that's how I learned about RWT. David Kanter has an article up on the interaction between heat production in a CPU and its temperature. The money quote:

The authors estimated that lowering the junction temperature from 85C to 30C saves 7W for a typical chip – more than the power consumption of an entire core at 2GHz. Put another way, using conventional air cooling would increase chip power consumption by 12%.


Whoa! Now, every review I've read about high-end CPU cooling, either air or water, has focused on the direct impact on CPU temperature and attainable maximum clock speed. Does anyone have access to hard benchmark data supporting this assertion that more effective cooling has a significant impact on CPU power consumption?

Can the hardware gurus at TR benchmark the power consumption with stock cooling versus enthusiast-level cooling? Because I'm suddenly very curious to see if this claim actually holds water ;)
jbrandmeyer
Gerbil In Training
 
Posts: 8
Joined: Tue Nov 22, 2011 10:32 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 9:46 am

I have no idea if the 7W figure is accurate, but it is definitely plausible. CMOS circuits consume more power (in the form of leakage current) when they are hot. In extreme cases this can lead to what's called "thermal runaway": a chip with inadequate cooling heats up, which raises power consumption (and heat production) even more, in a positive feedback loop that quickly fries the chip.

Edit: Also note that the 7W power savings cited in the RWT article is from a 55C drop in die temperature. That's a really big drop! In a desktop PC environment you're not going to be able to get a 55C drop compared to a decent air cooler without resorting to mechanical refrigeration (i.e. something with a compressor); and the compressor will consume more than 7W.
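To make the feedback loop concrete, here's a toy numerical sketch of it. Every constant below is invented for illustration (real leakage models are far more involved), but it shows how a high enough thermal resistance makes the loop unstable:

```python
import math

# Toy model of the leakage/temperature feedback loop described above.
# All constants are invented for illustration, not measured values.
P_DYN = 60.0   # switching power, W (treated as temperature-independent here)
L0 = 5.0       # leakage power at the reference temperature, W
T0 = 25.0      # reference temperature, C
S = 30.0       # leakage e-folding scale, C
T_AMB = 25.0   # ambient temperature, C

def total_power(t_die):
    """Dynamic power plus exponentially temperature-dependent leakage."""
    return P_DYN + L0 * math.exp((t_die - T0) / S)

def settle(r_th, iters=200):
    """Iterate T = T_amb + R_th * P(T); return the fixed point, or None on runaway."""
    t = T_AMB
    for _ in range(iters):
        t_next = T_AMB + r_th * total_power(t)
        if t_next > 150.0:
            return None  # thermal runaway: no stable operating point
        if abs(t_next - t) < 1e-6:
            return t_next
        t = t_next
    return t

print(settle(0.3))  # good cooler (0.3 K/W): settles to a stable die temperature
print(settle(1.2))  # poor cooler (1.2 K/W): None, the loop runs away
```

With the better cooler the loop converges because each degree of rise adds only a little leakage; with the worse one, each watt of leakage adds more temperature than the loop can absorb.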
(this space intentionally left blank)
just brew it!
Administrator
Gold subscriber
 
 
Posts: 37479
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 11:29 am

We test this effect every time we do a video card roundup that includes different versions of the same card. We test different coolers and report temperatures and power draw. Heck, I even let video cards warm up for several minutes at peak load before taking a power draw reading. The effect is real, but it's fairly minimal, generally speaking--a few watts here or there, mostly. You will see it in our reported data, though, if you look carefully.
Scott Wasson - "Damage"
Editor - The Tech Report
Damage
TR Staff
Gold subscriber
 
 
Posts: 1671
Joined: Wed Dec 26, 2001 7:00 pm
Location: Lee's Summit, Missouri, USA

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 11:32 am

Very interesting. It's one of those obvious things I had never given the first thought to. Thanks for bringing this up, OP.
ModernPrimitive
Gerbil Team Leader
 
Posts: 249
Joined: Thu Aug 03, 2006 7:12 am
Location: USA

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 11:42 am

just brew it! wrote:Edit: Also note that the 7W power savings cited in the RWT article is from a 55C drop in die temperature. That's a really big drop! In a desktop PC environment you're not going to be able to get a 55C drop compared to a decent air cooler without resorting to mechanical refrigeration (i.e. something with a compressor); and the compressor will consume more than 7W.

Yea... basically, the effect may be real, but there's little point in chasing it: a big drop in temperature requires a lot of cooling, which itself consumes a lot of power. Even with water cooling, the radiator fans (two or more) and the pump will easily consume more than 7W of power running at full speed.
My subscription allows you people to exist on this site and makes me a better human being than you'll ever be
JohnC
Gerbil Jedi
Gold subscriber
 
 
Posts: 1861
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 2:22 pm

Thanks, Damage. Yes, I do see the change on the recent 6950 round-up. Looks like a 14C temperature change between the XFX and MSI cards also corresponded to 9W of power savings at the wall. Neglecting the tiny difference in clock speed between them, is it still a fair comparison? Does AMD mandate a particular power supply design for the card mfrs?
jbrandmeyer
Gerbil In Training
 
Posts: 8
Joined: Tue Nov 22, 2011 10:32 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 2:29 pm

It isn't quite an apples-to-apples comparison. Since the cards have different coolers on them, there could've been differences in power draw from the fan motor as well. A high CFM fan can pull several watts all by itself...
(this space intentionally left blank)
just brew it!
Administrator
Gold subscriber
 
 
Posts: 37479
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 3:11 pm

Except in this case, it is the lower-wattage unit with the lower temperature.
jbrandmeyer
Gerbil In Training
 
Posts: 8
Joined: Tue Nov 22, 2011 10:32 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 3:28 pm

This absolutely happens. What really happens though is the resistance in the circuit/component increases with heat (thermal resistance). Most electronic components have what's called an "R theta" (change in resistance) or a P_d (power dissipation), measured in Celsius/Watt and Watt/Celsius respectively. The reason this happens is that as the wire or component heats up, the thermal resistance increases. Since Power = Current^2 * Resistance, where the current is held constant for this simple case, it is quite easy to see how more power is drawn/dissipated as the resistance increases.
'To Start Press Any Key'. Where's the ANY key?
If something's hard to do, then it's not worth doing
You know, boys, a nuclear reactor is a lot like a woman. You just have to read the manual and press the right buttons.
mmmmmdonuts21
Gerbil Elite
 
Posts: 590
Joined: Wed Jul 16, 2008 9:09 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 4:09 pm

jbrandmeyer wrote:Except in this case, it is the lower-wattage unit with the lower temperature.

So how do you separate cause from effect? Is the wattage lower *because* of the lower temperature? Or is the chip drawing less power due to natural variation from chip-to-chip, differences in the on-card BIOS, slight variations in VRM output, etc., and this is resulting in lower temperatures?

mmmmmdonuts21 wrote:This absolutely happens. What really happens though is the resistance in the circuit/component increases with heat (thermal resistance). Most electronic components have what's called an "R theta" (change in resistance) or a P_d (power dissipation), measured in Celsius/Watt and Watt/Celsius respectively. The reason this happens is that as the wire or component heats up, the thermal resistance increases. Since Power = Current^2 * Resistance, where the current is held constant for this simple case, it is quite easy to see how more power is drawn/dissipated as the resistance increases.

Umm... no.

While it is indeed true that most metals and metal alloys have a positive temperature coefficient of resistivity (resistance rises with temperature), you're on the wrong track here. It is the voltage which is constant, not the current. So rising resistance would result in *less* power being dissipated, because the current would *drop* in an inverse relationship with the rising resistance.

We're dealing with transistors, which are made from silicon. Semiconductors like silicon have a *negative* temperature coefficient of resistivity -- their resistance drops with increasing temperature. And this is what results in increased current flow, and higher power dissipation.
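A quick numeric check of that point, with arbitrary values and the supply voltage held constant:

```python
# With a fixed supply voltage, rising resistance means LESS power, not more.
# Values are arbitrary, chosen only to show the trend.
V = 1.0  # supply voltage, volts (held constant)
for R in (1.0, 1.2, 1.5):  # resistance rising with temperature
    I = V / R              # Ohm's law: current falls as R rises
    print(f"R={R:.1f} ohm  I={I:.3f} A  P={V * I:.3f} W")
# P falls from 1.000 W (R=1.0) to 0.667 W (R=1.5).
```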
(this space intentionally left blank)
just brew it!
Administrator
Gold subscriber
 
 
Posts: 37479
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 9:11 pm

just brew it! wrote:
jbrandmeyer wrote:Except in this case, it is the lower-wattage unit with the lower temperature.

So how do you separate cause from effect? Is the wattage lower *because* of the lower temperature? Or is the chip drawing less power due to natural variation from chip-to-chip, differences in the on-card BIOS, slight variations in VRM output, etc., and this is resulting in lower temperatures?

mmmmmdonuts21 wrote:This absolutely happens. What really happens though is the resistance in the circuit/component increases with heat (thermal resistance). Most electronic components have what's called an "R theta" (change in resistance) or a P_d (power dissipation), measured in Celsius/Watt and Watt/Celsius respectively. The reason this happens is that as the wire or component heats up, the thermal resistance increases. Since Power = Current^2 * Resistance, where the current is held constant for this simple case, it is quite easy to see how more power is drawn/dissipated as the resistance increases.

Umm... no.

While it is indeed true that most metals and metal alloys have a positive temperature coefficient of resistivity (resistance rises with temperature), you're on the wrong track here. It is the voltage which is constant, not the current. So rising resistance would result in *less* power being dissipated, because the current would *drop* in an inverse relationship with the rising resistance.

We're dealing with transistors, which are made from silicon. Semiconductors like silicon have a *negative* temperature coefficient of resistivity -- their resistance drops with increasing temperature. And this is what results in increased current flow, and higher power dissipation.


My bad JBI. Sorry for the bad info.
'To Start Press Any Key'. Where's the ANY key?
If something's hard to do, then it's not worth doing
You know, boys, a nuclear reactor is a lot like a woman. You just have to read the manual and press the right buttons.
mmmmmdonuts21
Gerbil Elite
 
Posts: 590
Joined: Wed Jul 16, 2008 9:09 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Wed Nov 30, 2011 11:22 pm

just brew it! wrote:Semiconductors like silicon have a *negative* temperature coefficient of resistivity -- their resistance drops with increasing temperature


Be careful with the temperature coefficient of resistivity: different device types have different coefficients. Diodes are typically negative temperature coefficient devices. Power MOSFETs are almost always positive temperature coefficient. Power IGBTs can be positive or negative, depending on the design of the transistor and the operating region. The metal wires within a CPU are certainly positive coefficient, and that does increase power dissipation at higher temperatures**. CPU switching transistors could be positive or negative, I don't know for sure. All of those effects are roughly linear in temperature. The subthreshold leakage referred to by David Kanter is both positive and exponential in temperature.
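For a feel of how strong that exponential is: a common rule of thumb (my approximation, not a figure from the article) is that subthreshold leakage roughly doubles every ~10 C. Over the article's 85 C to 30 C span, that compounds dramatically:

```python
# Rule-of-thumb scaling: subthreshold leakage doubles roughly every 10 C.
# The doubling interval is an approximation, not a measured figure.
def leakage_ratio(t_hot, t_cold, doubling_c=10.0):
    """Factor by which leakage grows between t_cold and t_hot."""
    return 2.0 ** ((t_hot - t_cold) / doubling_c)

r = leakage_ratio(85.0, 30.0)
print(f"85C vs 30C: leakage differs by ~{r:.0f}x")  # ~45x
```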

just brew it! wrote:So how do you separate cause from effect?


It is very easy to show that the increase in drawn power is not solely responsible for the increase in temperature. For a linear material or stack of linear materials (like a CPU die in its package), the relationship between thermal power and temperature is \dot{Q} = U A (T_source - T_sink), where \dot{Q} is thermal power (watts), U is a constant property of the material (usually just its conductivity), and A is the cross-sectional area for heat transfer. The product UA is a thermal conductance, in watts per kelvin; its inverse 1/UA is a "thermal resistance," in kelvin of rise per watt across that surface. Anything short of a fluid interface boundary follows this law, and for a constant fluid flow rate, even fluid boundaries do too. If the effect observed in the 6950 review is 9 W for a 16 K increase, and it were attributed entirely to thermal resistance, then 1/UA would be a whopping 1.8 K/W. For a 250 W card, that implies a total temperature rise above ambient of 450 C! Since this clearly is not the case, the higher temperature must come from something other than the extra drawn power (namely, the less effective cooler).
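Redoing that arithmetic explicitly (the numbers are the ones from the paragraph above):

```python
# Sanity-checking the implied thermal resistance from the 6950 numbers above.
delta_t = 16.0  # observed temperature difference, K
delta_p = 9.0   # observed power difference, W

r_th = delta_t / delta_p  # implied 1/UA, if power alone explained the temperature
print(f"1/UA = {r_th:.2f} K/W")                           # ~1.78 K/W
print(f"250 W card: {250.0 * r_th:.0f} C above ambient")  # ~444 C, clearly absurd
```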


** So, why does increasing wire resistance increase power dissipation? Because the transistors in a CPU are mostly switching the gate capacitance of other transistors on and off. If the resistance in the gate drive path goes up, then more energy is consumed in switching the transistor. In ECE101 terms, the effective circuit is that of a source capacitor, switch, wire resistor, and gate capacitor in a series loop. In the ideal case, the wire resistance is zero, and the act of closing the switch losslessly transfers a small amount of energy (1/2 CV^2) from the large source cap to the tiny gate cap. The gate voltage rises to the threshold voltage, and the transistor turns on. If the wire resistance is non-zero, then some energy is also dissipated in that wire resistance (integral of I^2 R dt), in addition to the energy required to charge the transistor's gate.

My knowledge of this theory comes from power electronics, but the basic principles should also apply to CPUs. At our scale, gate drive power is tiny compared to the switching and conduction losses of the power transistors, so we don't care much about optimizing gate drive losses. So it was somewhat shocking to see that power dissipation in CPUs is all about the gate drive power itself.
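Setting the wire-resistance question aside, the 1/2 CV^2 per-transition figure leads straight to the standard back-of-envelope dynamic power estimate, P = alpha * C * V^2 * f. Every number below is invented purely for scale, not taken from any real chip:

```python
# Back-of-envelope dynamic switching power: P = alpha * C * V^2 * f.
# (Each full charge/discharge cycle of a capacitance C dissipates C*V^2.)
# All numbers are invented for scale, not from a real chip.
ALPHA = 0.1       # activity factor: fraction of capacitance switched per cycle
C_TOTAL = 100e-9  # total switched capacitance, farads (100 nF)
V = 1.0           # supply voltage, volts
F = 2e9           # clock frequency, Hz (2 GHz)

p_dyn = ALPHA * C_TOTAL * V**2 * F
print(f"dynamic power ~ {p_dyn:.0f} W")  # 20 W with these made-up numbers
```

Note that leakage doesn't appear in this formula at all, which is why the temperature effect discussed in this thread shows up on top of the dynamic power rather than through it.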
jbrandmeyer
Gerbil In Training
 
Posts: 8
Joined: Tue Nov 22, 2011 10:32 am

Re: Review Request: Good cooling reduces power consumption?

Posted on Thu Dec 01, 2011 8:29 am

mmmmmdonuts21 wrote:My bad JBI. Sorry for the bad info.

Mmm... looks like I owe you an apology as well. :oops:

The post following yours seems to indicate that MOS transistors can have a positive temperature coefficient (even though it is negative for bulk silicon)... and there's a mechanism by which the temperature coefficient of the metal wires could come into play as well, due to switching effects (I hadn't considered that). However, it appears that the subthreshold leakage mentioned by jbrandmeyer (and explained by David Kanter in his article) dominates, since it is exponential with temperature. What I'm gathering from all this is that MOS transistors effectively have a positive temperature coefficient of resistance when they're switched "on", but a negative one when they're switched "off" (due to the leakage current).

So it looks like we were both half right, and half wrong!
(this space intentionally left blank)
just brew it!
Administrator
Gold subscriber
 
 
Posts: 37479
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

