TR Forum Tidings: The cost of graphics card power use

— 6:00 AM on May 14, 2008

A week ago, we published an article exploring today's graphics cards from a value perspective, mixing our extensive benchmark results with pricing data from online retailers. Shortly after publication, forum regular Dragmor suggested we factor power consumption into the equation, since more power-hungry cards tend to mean heftier power bills. A couple of days later, he decided to take matters into his own hands and posted his own results on our forums.

Dragmor describes his methodology as follows:

I based usage on 6 hours idle and 2 hours load per day. I think this is a decent usage model for a gamer. Some days the PC would be off, some days would involve lots of web browsing, and some days lots of gaming. The power values for the cards were taken from X-bit labs, since they provide values for the cards by themselves. I couldn't find the numbers for the 8800 Ultra on X-bit, so I used TR's numbers together with X-bit's numbers for the other cards to subtract out the rest of TR's test system's power.

His numbers cover nine cards, and they paint a pretty insightful picture. AMD's Radeon HD 3850 costs the least to run, followed by the GeForce 9600 GT, Radeon HD 3870, and GeForce 8800 GT. Assuming electricity at 12 cents per kWh, Dragmor worked out that these cards cost a respective $9.08, $11.93, $12.02, and $15.95 per year, and that's for the cards alone, not as part of a complete system. At the other end of the spectrum is the GeForce 9800 GX2, which according to Dragmor costs $34.09 per year.
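
For those who'd like to plug in their own numbers, the arithmetic behind these figures is straightforward. The short Python sketch below follows Dragmor's stated usage model of six idle hours and two load hours per day at 12 cents per kWh; the 20 W idle and 80 W load figures are placeholders for illustration, not actual X-bit labs measurements.

    # Yearly cost of a graphics card's power draw, using Dragmor's usage
    # model: 6 hours idle + 2 hours under load per day, at $0.12 per kWh.
    IDLE_HOURS_PER_DAY = 6
    LOAD_HOURS_PER_DAY = 2
    PRICE_PER_KWH = 0.12  # dollars

    def yearly_cost(idle_watts, load_watts):
        # Daily draw in watt-hours, converted to kWh over a year, then priced
        daily_wh = idle_watts * IDLE_HOURS_PER_DAY + load_watts * LOAD_HOURS_PER_DAY
        return daily_wh / 1000 * 365 * PRICE_PER_KWH

    # Placeholder example: a card idling at 20 W and drawing 80 W under load
    print(f"${yearly_cost(20, 80):.2f} per year")  # prints $12.26 per year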

Considering yearly home power bills tend to run in the hundreds of dollars (at least), even 34 bucks isn't much. Then again, the data assumes only eight hours of operation per day, while many enthusiasts leave their computers on 24/7.
