A note on GeForce GTX 480 noise levels

Now that GeForce GTX 400-series graphics cards are out in the wild, although in limited numbers, I should say something quickly about the main issues for which these cards, and especially the GTX 480, have gained a reputation: power, noise, and heat. I talked about this some on the latest podcast, but I don’t think I communicated it all that well in the context of our GeForce GTX 480 and 470 review.

I feel like the GTX 480 is getting a bit of a bad rap.

Yes, the GF100 cards’ performance isn’t all that it should be, and that’s almost certainly because the GPUs wouldn’t reach Nvidia’s projected clock speeds with all of their units enabled. Typically in such cases, and again almost surely in this one, the established power and thermal envelopes are a constraining factor. Higher clock speeds might have been possible by feeding the chip additional voltage, but doing so would have pushed the GPU’s power draw, heat, and cooling demands into unreasonable territory. The GF100 is a large chip, and such problems are compounded in several ways by a large die area and lots of transistors.

Dealing with these issues is a balancing act, one that every chip company has faced to one degree or another in turning out a product. Competitive issues aside, I think the balance Nvidia has struck with the GeForce GTX 480 is largely a reasonable one. You can look at the numbers we measured in our review, but the basics are pretty clear. For power draw and GPU temperatures, the GTX 480 stays within the generally established boundaries for the industry. That’s not to say that the Radeon HD 5870 doesn’t look a darn sight better in terms of power draw, but a test system equipped with a single GTX 480 draws 60W less than the same system with dual Radeon HD 5870s in CrossFire. We’re not talking about a paragon of efficiency here, but the GTX 480 isn’t out on the bleeding edge, either. No new PSU standards were created with the introduction of this product.

Similarly, Nvidia has obviously biased the fan profiles on the GTX 480 toward lower noise levels rather than toward lower GPU temperatures, yet the GPU temperature readings we got for the card were only a few degrees higher than what we saw from the Radeon HD 5870.

More notably, the GTX 480’s cooler is an impressive bit of engineering that attempts to mitigate the effects of the GPU’s relatively high power consumption—and thus heat production. Have a look at the noise levels we measured while running a real game, Left 4 Dead, that generally produces higher power draw numbers than most:

Once more, the Radeon HD 5870 is quieter—but only the 1GB version with the stock cooler. The slightly overclocked Asus Matrix card with 2GB of RAM was louder than the GTX 480. I don’t want to overstate it, but heck, another example might be considered a victory of sorts: the GTX 480 is quieter than the GeForce GTX 295, even though the GTX 480 draws about the same amount of power under load.

For those of you who think that doesn’t count for much, you’re forgetting the bad old days of the GeForce FX 5800 Ultra, when Nvidia had some similar problems with a new GPU and attempted to make up for it by reducing image quality in multiple ways—skimping on texture filtering and dropping down to lower-fidelity texture formats, mostly—and strapping a cooler to the side of the card that we derisively dubbed the Dustbuster. If Nvidia is compromising on image quality with the GTX 400 series, we sure haven’t detected it yet. The texture filtering algorithm looks to be the same as in other recent GeForces, which is to say excellent. And a single GTX 480 is nowhere near as loud as ye olde Dustbuster. I’m hesitant to compare across such vast differences in time and equipment, but have a look at these numbers:

A single FX 5800 Ultra was nearly 9 dB louder than a GTX 480. Both objectively and subjectively, the difference between the two is huge.
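
For context on what 9 dB means: decibel scales are logarithmic, so that gap works out to roughly an eightfold difference in radiated sound power (perceived loudness is a separate, messier question). A quick sketch of the arithmetic:

```python
def power_ratio(delta_db):
    """Convert a sound-level difference in dB to a ratio of sound power."""
    return 10 ** (delta_db / 10)

# A 9 dB gap is close to an 8x difference in radiated sound power.
print(round(power_ratio(9), 1))  # -> 7.9
```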

To take these iffy cross-review comparisons even further, we measured the Radeon HD 4890 at 50.6 dB under load not long ago, slightly higher than the 49.9 dB at which we measured the GTX 480. Our SLI noise level results prove the GTX 480’s cooler is capable of spinning up to higher speeds, but a single card just didn’t go there during the course of our testing. We tested on an open test bench, and your results may vary in either direction in an enclosure, depending on cooling and venting. Still, that bit about the GTX 480 fitting well within the established boundaries of the market applies.

Remember, when you hear an incensed fanboy spouting off about how awful the GTX 480’s noise and heat levels are, that his point only applies in a very limited sense, relative to some slightly better competition. I wouldn’t hesitate to put a GeForce GTX 480 into a gaming rig of my own on that basis. Yes, I would prefer the Radeon HD 5870’s overall combination of attributes, especially in terms of price and performance. But let’s be clear: in an undeniably tough situation, Nvidia has avoided the temptation to reach the highest possible performance levels at the cost of reasonable power draw and acoustics. Folks seem to be missing that fact, which has caused Nvidia to send out its viral drones to spread the message. Such silliness shouldn’t be necessary. This is one lesson we’re happy they’ve learned, and I’d hate to fail to acknowledge it.

Comments closed
    • sammorris
    • 9 years ago

    This article horrifies me. If it’s not a response to Nvidia threatening a lawsuit over not giving their product a good enough rep, then it’s simply poor writing.
    I can’t claim to be unbiased, nor do I have first-hand experience of using a GTX480 (or a 470, which is often overlooked, despite being almost as bad in all the same ways).

    [inb4 ‘then your opinion is irrelevant’ tards.]

    The noise level of these cards is largely unimportant. Cards have been loud before and will be again. The latest drivers for my HD4870X2s make them hideously loud under load, but at idle they are quiet, which is where it really matters. The GTX480s aren’t silent at idle, but they aren’t terrible. In my opinion, noise level is the weakest criticism of the GTX400 series, though I will say that giving the GTX480 the thumbs up for being quieter than the FX5800 is like complimenting someone for not being a fascist dictator: it means absolutely nothing. (The comparison should not have been included either; noise-level measurements are so specific to the test setup that numbers from completely different tests are useless.)

    The real issue is the cause of that noise, the cards’ vast power consumption, and thus heat output. The amount of electrical power used to generate its tiny bonus in processing power is disgraceful.

    “in an undeniably tough situation, Nvidia has avoided the temptation to reach the highest possible performance levels at the cost of reasonable power draw and acoustics”
    Reasonable acoustics maybe. Reasonable power draw? Really? The HD5970 uses less power, and that’s two GPUs. That’s not reasonable power consumption, that’s ridiculous power consumption.

    “For power draw and GPU temperatures, the GTX 480 stays within the generally established boundaries for the industry”
    Given that it is by far the most power-hungry single-GPU card ever, and the hottest-running card by reference design (driver bugs excluded) ever, I’d dispute that. It’s a trend-setter. Never before has a card been so power-hungry and so hot.
    Nvidia’s manufacturing does not deal well with those sorts of temperatures. Until you reach the maximum limit (and to be fair, the GTX480 doesn’t, not even in SLI), temperatures mean little to the end user; they don’t affect how something works, so there’s no issue. However, time and time again I see overheated GeForces simply break from it. I almost never see that happen to ATI cards. That’s not fanboyism, that’s just real-world experience. The huge temperatures of the GTX400 series are a real reliability issue. I don’t expect these to last that long before they all start going pop. THAT is an issue.

    Long story short, the GTX400 series performs well in the market, but is not appropriately priced in all markets (i.e. is absurdly overpriced in the UK), runs far too hot for long-term reliability, makes a heck of a racket, and uses a planet-melting amount of power. (A GTX480 SLI setup that plays games 6 hours a day could cost an HD5670 every year in wasted electricity compared to two HD5870s)
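
    To put rough numbers on that electricity claim, here is a back-of-envelope sketch; every figure in it is an assumption chosen for illustration, not a measured value:

```python
# Back-of-envelope check of the claim above; all inputs are assumptions.
extra_watts = 230      # assumed extra draw of GTX 480 SLI over HD 5870 CrossFire
hours_per_day = 6
price_per_kwh = 0.12   # assumed UK tariff, GBP

annual_kwh = extra_watts / 1000 * hours_per_day * 365
annual_cost = annual_kwh * price_per_kwh
print(round(annual_kwh), round(annual_cost, 2))  # -> 504 60.44
```

    Around 60 pounds a year under those assumptions, which is indeed in the neighborhood of a budget card’s price.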

    This has now reached the stage where I’m going to be carefully inspecting techreport benches for bias, if articles like this are showing up.

      • Anonymous Coward
      • 9 years ago

      Well, just so you know, I found that this blog/article was in fact well balanced and entirely appropriate. In fact, whereas the article was written with subtle and carefully targeted praise that was easily understood to stop well short of any sort of endorsement, the responses such as yours are absolutist, loud and crude.

        • sammorris
        • 9 years ago

        The article is written relatively well, it does not clearly show bias to one manufacturer or another. However, the fact that it draws conclusions that are entirely untrue makes me question it, and future articles like it.

    • kc77
    • 9 years ago

    I can understand the need to “clarify” a position but comparing something released in 2003 to a product in 2010 doesn’t really make a valid point. If that were the case we could compare a Phenom to a P4 or Core 2 to a Thunderbird.

    Unfortunately this product isn’t powerful enough to make the case that it’s as successful as people would like it to be.

      • RampantAndroid
      • 9 years ago

      I disagree – I think you CAN compare the 5800 to the GTX480, since everyone is complaining about noise. I don’t know how long you’ve been following PC hardware (you recognize the age of the card, so you know something of it, obviously), but the 5800 WAS a leaf blower. The card lived on the market for a short period of time before it was dropped for the 5900 and, later, the 5950. By those days’ standards, the 5800 was terrible. I remember Maximum PC hated it…and the GTX480 isn’t comparable noise-wise, apparently. Frankly, I about fell out of my chair when he compared the GTX480 to the 5800 Ultra, because when people talk about video cards running loud and hot, I think of the 5800 Ultra. And I do compare some CPUs to the old Prescott-core P4s…we had a Prescott at work as a build machine a while ago. The PC name was “space heater” – because it ran so hot. He didn’t compare the GTX480 to the 5800 Ultra and say it performed better, he compared noise.

      I’ve been digging around, and to me the GTX480 seems a decent choice – reviews place it doing well against the 5870 (and the 5870 2GB version costs the same as the GTX480.) While the temperatures and power consumption leave something to be desired, that doesn’t invalidate its better perf with AA turned on, and, seemingly, its better minimum framerate – even if the max is still not as good as the 5970’s. Should I actually buy the GTX480, I’ll be sure to borrow a coworker’s decibel meter and do my own comparison of my GTX280 to the GTX480, if only to satisfy my own curiosity.

        • kc77
        • 9 years ago

        I didn’t say it wasn’t possible. I could compare a Yugo to a Corvette and there’s really nothing to stop me from making the comparison. However, when it comes to reviewing PC hardware we don’t usually make excuses for hardware that displays characteristics that are worse than its previous generation’s, and displays temperatures that are not in line with the current generation’s characteristics.

        You’ve never seen anyone in a review say well at least it’s not as hot as a P4 for something that has been released in today’s marketplace.

        Also, what makes matters worse is that anyone can easily overclock a 5870 to 950–1000 on a stock cooler, and it will compete with or beat the 480 while still drawing less power and running at lower temperatures.

        You can put lipstick on a pig but that doesn’t make it a beauty queen.

    • xtalentx
    • 9 years ago

    While I am an ATI man I am very sad that Fermi is such a turd. I was looking forward to some price wars and reaping the benefits. Hopefully NVIDIA turns things around and gets a competitive product out the door soon.

      • RampantAndroid
      • 9 years ago

      “While I am an ATI man I am very sad that Fermi is such a turd. ”

      How does this make any sense to you? While you’re an ATI man, you think an nVidia product is crap. OK…since you prefer ATI, I kinda expect that you don’t like nVidia products. Now, if you had said “While I am an nVidia man I am very sad that Fermi is such a turd” your post might carry some weight.

        • Kaleid
        • 9 years ago

        He was hoping that Fermi would have been better, which would have created a price war and made it cheaper to buy an ATI card.

    • VILLAIN_xx
    • 10 years ago

    I’m actually surprised PRIME1 didn’t respond to this blog….

    ….Maybe he did jump off a bridge.

    O_o

      • Meadows
      • 10 years ago

      He hasn’t been seen for a long time now. A week maybe? Two weeks? I don’t know. I bet he’s visiting his so-called “unbiased” sites like Guru3D (don’t laugh) right now.

    • Exo
    • 10 years ago

    I understand the need for this article, because most people who read something for the first time like to stick to that idea and spread it all over the place. People always feel the need to bash something to elevate themselves, and even more so when they want to own one or to show how great they are in this knowledge area.

      • outcast
      • 10 years ago

      Wow!!!…Thanks!!!…that was even worse than I thought…

    • StuG
    • 10 years ago

    4980 = typo

    • Convert
    • 10 years ago

    Meh.

    I haven’t seen any posts that really rail on the 480 as being so loud that, regardless of features and performance, it isn’t worth using. There has been plenty of talk comparing it to its closest competitor, sure, and there is nothing wrong with that.

    From your charts you have only illustrated that the 2GB 5870 is also loud, or more specifically that it’s possible to purchase louder cards. I’m not sure how that in any way changes the fact that a 480 can still be considered too loud for a lot of people AND given a choice between similar performance and quieter offerings that there are more desirable cards to be had. Though I suppose you did touch on this.

    As I’ve said before, with Fermi, when it comes to heat and noise it’s just bad news on top of bad news.

    As for applauding Nvidia for making the best out of a bad situation, um ok. That’s fine I guess. Personally I feel their marketing department wipes out any sympathy their engineering department deserves but I suppose that’s just me seeing Nvidia as a whole company that deserves scorn or praise when such situations arise.

    • rodney_ws
    • 10 years ago

    Unless the earlier reporting on Fermi’s noise levels was inaccurate, I really don’t see the need for this article.

    • lycium
    • 10 years ago

    i can’t believe meadows is *still* not banned; he’s a million times more rude (and by far worse: smarmy) than many of the infamous TR trolls, all of whom have been banned.

    the signal-to-noise ratio on TR would go up many decibels if that impossibly arrogant troll were given the boot he’s been begging, daily, on his hands and knees, to receive.

      • yogibbear
      • 10 years ago

      Suck my fat meadow’s induced hard on. 🙂

        • Jigar
        • 10 years ago

        Ranger Smith: [showing a “Do Not Feed The Bears” sign to Yogi] Read this sign.
        Yogi Bear: [deliberately reading incorrectly] Uh, “No Smoking In The Forest”?
        Ranger Smith: You know what it says, Yogi, and it applies to *all* the bears, especially you!
        Yogi Bear: Uh, yes, sir.

      • StuG
      • 10 years ago

      I beg to differ, he can be highly abrasive at times but compared to the past TR trolls he has one thing that the others lacked.

      A large forum presence. Though I know him from his bantering on the front-page comments, I know him more from his forum participation, and I can honestly say he has given me valuable advice on more than one issue.

      This is one thing that snakeoil or any other TR troll has yet to do, and it is the defining factor in why he is kept within TR. PRIME1 is the same type of troll as Meadows. Granted, they both have their times where they get “trolly,” but at the same time I respect both of their opinions as knowledgeable, though possibly biased.

    • can-a-tuna
    • 10 years ago

    Someone got a brown envelope with an Nvidia logo slipped under the door :).

    Perhaps Fermi is not that loud at stock speeds, but when comparing overclocked cards’ noise levels, an overclocked Fermi is in a class of its own, as can be seen from TweakTown’s review.

    • yfital
    • 10 years ago

    This sounded like a freaking fanboy cry (correction: paid fanboy); the fact you even dared to mention the 2GB 5870 as louder was embarrassing.

    I will think twice about whether to trust your GPU reviews from now on.

    • bdwilcox
    • 10 years ago

    My guess is that some whiny PR b*tch from nVidia called Scott up crying that TR was spreading misconceptions about their newest product, reminded Scott about how much nVidia has helped them in the past and how good their relationship with TR is, and then requested a favor, asking Scott to write an article explaining it’s not as bad as everyone’s saying it is. Scott’s out-of-character emotionalism suggests that something doesn’t jibe here. This is pure conjecture on my part, but it’s usually how business works. And if it’s true, it isn’t anything scandalous or unethical. If Scott put on his Dr. Nick voice and claimed, “IT’S WHISPER QUIET!” then I would have a problem with it. But, as it stands, he’s spinning a bit but sticking to the facts.

    On the other hand, I think the product is a dog. It’s hot, loud, power hungry, and underwhelming for what it costs. Just because it only makes your ears bleed a little less than the competition isn’t great justification. If you want to dump your money into it, be my guest. Personally, I’ll wait for that better $200 solution that comes with a quiet fan.

      • BoBzeBuilder
      • 10 years ago

      Keep in mind that most of us who cry about the GTX 480’s noise levels never actually owned one, while on the other hand, Scott is testing these cards first-hand.

      It’s louder than the competition, but it’s no dustbuster. That’s the lesson here.

      • indeego
      • 10 years ago

      Sounds like a data room. His base system itself was extremely loud. I’ve never had a desktop that loud before.

    • Lazier_Said
    • 10 years ago

    You don’t need to go back 7 years to the dustbuster edition for a hot and noisy card.

    You don’t have to go back any further than the wildly popular HD4890, which pretty well owned the $200 market until just a few months ago.

    In TR’s testing the GTX 480 showed 10W lower idle power consumption and was 2dB quieter under load.

    That environmental envelope was barely worthy of comment then yet now is a dealbreaker in and of itself?

      • Ushio01
      • 10 years ago

      Yes, because it’s the current generation that matters, not year-old cards that have basically disappeared from the marketplace.

    • TaBoVilla
    • 10 years ago

    Thanks for the post Damage.

    In Brian’s defense, he has been pretty much the only one trying to set the record straight on the forums, as civilized as possible in the middle of the fanboy atmosphere and rant comments.

    • Shining Arcanine
    • 10 years ago

    How are decibel levels being measured? If they are not being measured 1 meter away from the card’s fan, the results can be inaccurate such that louder cards can be quieter and quieter cards can be louder according to the numbers.

    Edit: There is definitely something wrong with these numbers. Having two GeForce cards in SLI should not increase decibel levels by more than 3 decibels, however, the numbers have them increasing by 10 decibels, which is a factor of 10 in terms of how much noise is being generated.
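
      The “+3 dB for two sources” rule of thumb invoked here comes from adding sound power rather than decibel values directly. A minimal sketch of that arithmetic (this assumes two independent sources at the same level, which, as the replies note, ignores fans ramping up from extra heat and restricted airflow):

```python
import math

def combined_spl(levels_db):
    """Sum incoherent sound sources: convert dB to power, add, convert back."""
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Two identical 50 dB sources combine to about 53 dB, the "+3 dB" rule.
print(round(combined_spl([50, 50]), 2))  # -> 53.01
```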

      • Lazier_Said
      • 10 years ago

      SLI or CF isn’t a simple two fans instead of one, +3dB. The extra card raises the ambient temperature and impedes airflow.

        • Shining Arcanine
        • 10 years ago

        I am aware of that, but 10 decibels seems excessive when considering such effects because these cards have a rear exhaust through an empty PCI/PCI-Express slot, which should keep the effect on the ambient temperature low, especially on a test bench where there is nothing impeding the flow of air out the back of the case.

          • NewfieBullet
          • 10 years ago

          My guess is that the increased power draw caused the fan in the power supply to ramp up as well and this contributed to the higher noise.

      • dazzawul
      • 10 years ago

      Next time you have a high end card loaded up for more than half an hour, take your hand, and put it on the back of the chip.

      See how you could fry an egg on there?

      Now, imagine that, instead of empty space, there was another Very Hot video card sitting there, putting out heat of its own.
      And on top of that, the card above it has its intake restricted somewhat by the video card below it.

      Suddenly the fan needs to ramp up to accommodate the poor intake airflow, and you’ve got more than twice the fan noise of a video card on its own 🙂

      • BlackStar
      • 10 years ago

      Running in SLI/Crossfire causes some of the power-saving features to turn off.

      Similar to what happens when you attach multiple monitors (mem speeds stop downclocking).

    • VILLAIN_xx
    • 10 years ago

    This doesn’t change my mind about anything.

    I think this entry could have been saved for the next GPU round-up, once they’ve finally corrected the noise issues, along with more thoughts regarding power draw comparisons.

    • Kaleid
    • 10 years ago

    Even the 5750 with its default cooler is too noisy…

    • wira020
    • 10 years ago

    The ATI 5000 series is so efficient that it makes Fermi look real bad…

    P/S: I thought this was already yesteryear’s news… I mean, I see no point bringing it up now, to be honest…

      • Grape Flavor
      • 10 years ago

      Sure, it’s less efficient, but that doesn’t mean it’s a total failure. It fits a segment of the market. The power/noise levels are acceptable. Not great, but acceptable.

      Personally I’m planning to get a GTX 480. Yes, it’s a worse “value” than the 5870, but it’s (generally) faster and I can afford it. Plus you get PhysX (even though I wish it weren’t exclusive), and better tessellation which could be a plus in the long term.

      The GTX 480 fits my needs. Most people would say it’s excessive but I’m willing to make the tradeoff. The 5970, on the other hand, selling at $700, is well out of even…

    • Krogoth
    • 10 years ago

    Sorry, Damage.

    The GTX480 is a crude, unrefined product. I don’t deny that the architecture behind it has tons of potential; the GTX480 is just a rushed example of it which got hampered by manufacturing difficulties. In that respect, it is quite similar to the 2900XT at its own launch.

    Fermi has a good chance to follow the 2900XT’s example: several reworkings and a die shrink could make for a more remarkable product, like the 3870 and 4870.

    I would recommend that Nvidia fans hold onto their GT2xx cards and wait for the Fermi refresh.

      • StuG
      • 10 years ago

      I hope that they can turn Fermi into a workable card. I see no reason why not, and it could bring a lot of good progress to the graphics scene. Fermi was a great concept, just poorly executed.

        • Krogoth
        • 10 years ago

        IMO, Nvidia just needs to pull off another 7600GT or 8800GT.

        A solid performer for a reasonable price tag without being hot and obnoxiously loud.

          • LawrenceofArabia
          • 10 years ago

          Well we’ll be waiting until GF104 hits this summer. A 256 shader part with good clockspeeds at the $250 price point anyone?

            • Krogoth
            • 10 years ago

            If Nvidia can manage to get thermals down while providing more performance than the 5770 and 5830, “GF104” has a good shot at nipping at AMD’s heels in the mid-range segment. Then you get some competition, which means price wars. The customer wins in the end: affordable, good-performing DX11-class GPUs. 😉

        • HisDivineShadow
        • 10 years ago

        Usually what happens in this scenario, when nVidia is behind and they’ve got a lot of chips failing to even qualify for the next-to-highest-end card, is that they release a more cut-down card that has the potential, in some ways, to approach the speed of the premium card at a less-than-premium price.

        Ti4200, GTX 260 Core 216, etc.

        I think those rumors of a 460 are probably true. I also expect a cut down Fermi part to arrive in the Summer before Fall Back to School time with a much improved heat and power profile that will eventually (6 months?) make its way into the high end Fermi part.

        Yields on current parts must be horrible if they haven’t released a 490 (fully unlocked part) because that seems like an easy way to make another $100-400 from people who want the absolute best.

      • Meadows
      • 10 years ago

      Would you shut up with your condescending muted arrogance already? You begin every post of yours as if you were the all-seeing oracle of Delphi or something.

      If Damage says something, then I’m well equipped to believe it.

        • Krogoth
        • 10 years ago

        U MAD?

        Here is a hint, look in the mirror.

          • Meadows
          • 10 years ago

          Most of the time, you look…

            • indeego
            • 9 years ago

            If you were looking at a pool you’d see a reflection. If you were looking in a pool you’d see fish. amirite

            • Meadows
            • 9 years ago

            We’re talking about mirrors here.

            • Krogoth
            • 9 years ago

            The troll is strong with this one.

        • lycium
        • 10 years ago

        if only your condescending arrogance were muted… grow up and stop trolling on TR over petty things, it’s long overdue.

      • outcast
      • 10 years ago

      Thanks for your honesty…you and LoneWolf15 were very helpful…

    • mrksha
    • 10 years ago

    LOL

    How much did you get from nvidia for this “article”?

      • Jigar
      • 10 years ago

      RTFA, the last line said…

      • kuraegomon
      • 10 years ago

      Dude,

      Lurk a _loooooooonnnnng_ while longer before questioning Damage’s integrity. Seriously. You’ve been on this site nine days. Geez.

      • KamikaseRider
      • 10 years ago

      It’s not an article, it’s a blog post, which means this is the place where Damage can express his PoV.
      If he were really taking Nvidia’s side here (even in his blog posts he is still very reasonable), it would be much more obvious.

      I think that both mdk77777 and Damage have a good point. The 480 isn’t Nvidia’s best product launch, but it was the best they could do.

        • mrksha
        • 10 years ago

        The whole “Blog post” looks like it’s written by PRIME1.

        All the other websites on Fermi: it’s too hot, noisy, draws too much power, and loses in some games to a much cheaper Radeon.

        Scott Wasson on fermi: it’s not as loud as dustbuster, buy nvidia.

          • Jigar
          • 10 years ago

          No, I don’t think he said anywhere to buy Nvidia. All he said is that this discussion about the noise, power, and other issues has been blown out of proportion, and that because of it Nvidia had to send their PR to explain things on TR’s forum, which is definitely not required. I must say you need to read the blog again…

            • MrBojangles
            • 10 years ago

            In all fairness, Nvidia didn’t send “him” here due to overreactions about the hot, loud, and power-hungry Fermis. “He’s” been here lurking around the forums, trying to downplay any and every flaw in Nvidia’s products while simultaneously regurgitating the marketing bullet points, long before Fermi was ever launched and reviewed. So claims that Nvidia had to break out the PR whoop-a$$ stick due to overreactions to the Fermi launch are just a wee bit inaccurate. At least in this case.

            • Kurkotain
            • 10 years ago

            actually, he kind of said “buy ATI,” in the sense that he said his personal preference would be the 5870 in terms of price and performance, but people always see what they want to see…

            • MrBojangles
            • 10 years ago

            I agree people do see what they want to see… I never said he had a completely anti-ATI stance. I have also seen him recommend ATI products on a few isolated and rare occasions. My main point was that insinuations that he just recently showed up as some form of crowd control, due to rampant ATI fans falsely accusing the Fermi cards of evils they don’t really commit, are completely false. He’s been in the forums downplaying any problems with Nvidia hardware or software, as an Nvidia PR mouthpiece, since 2007, not just recently since the Fermi launch.

            • Jigar
            • 10 years ago

            I hope you know that dude works for Nvidia.

            • MrBojangles
            • 10 years ago

            I know he is affiliated with Nvidia and gets compensated with merchandise for his efforts as a focus group member (whatever it is a focus group member actually does). My only point is that the “Folks seem to be missing that fact, which has caused Nvidia to send out its viral drones to spread the message. Such silliness shouldn’t be necessary.” comment/link in the blog is a bit unnecessary and, most importantly, highly inaccurate. Unnecessary because, IMHO, the people running the site should not be singling out a forum member like that, especially not on a front-page post. Inaccurate because it suggests that all this “false hate” for the Fermi noise and temps has gotten so bad that Nvidia had to send people to the forums just to play PR crowd control. That’s not the case at all; he has been a semi-active forum member for three years. He didn’t just sign up a month ago at Nvidia’s request to try and counter out-of-control Fermi flaming and trolling. Keeping that in mind, the claim that overreactions to Fermi “caused Nvidia to send out the viral drones to spread the message” is a pretty big spin on the facts, especially when the “drone” in question has been here for three years, not only preaching Nvidia’s good name but also posting in non-Nvidia/ATI-related threads as well.

            • ericfulmer
            • 10 years ago

            Are you saying that Damage is a shill for Nvidia?

            That hardly seems fair considering how many reviews here on TR have praised ATI/AMD cards over the years.

            Also, other than advertising dollars, I don’t see how he could have time to shill for one company. And there are definitely AMD advertising dollars on TR.

            • Jigar
            • 10 years ago

            Trust me, you don’t understand the market. The TR forums are considered one of the genuine places where knowledgeable people come to discuss hardware issues and buying advice. And right now the whole internet, including the TR gerbils, is unhappy with Fermi (heat, power draw & noise). What did Damage do? He saw the issue had been blown out of proportion, went ahead, and gave his part of the understanding. He also noted that Nvidia’s drones, which have been active for many years promoting their products via forums, aren’t required.

            • MrBojangles
            • 10 years ago

            “Trust me you don’t understand the market”

            I’m glad to see you have personal knowledge of me, my overall level of intelligence, and what I am capable of understanding. Thanks for clearing this up for me. Cause heaven knows you couldn’t possibly be the one who’s potentially wrong and doesn’t understand.

            • Jigar
            • 10 years ago

            Well, the reason I was sure that you don’t understand the market is because you don’t understand the basics. Plus, I didn’t want to be rude; if it were Meadows, he would have written, “Stupid, you don’t understand ****.” 😛

            • MrBojangles
            • 9 years ago

            Your comment, however not rude, was still arrogant and misplaced. What makes you think you have a superior knowledge of “the market,” as you put it? My opinion differs from yours, therefore I don’t understand even the basics? Not saying it as rudely doesn’t change the overall arrogance of the tone.

            Both my points still remain completely valid.

            They still shouldn’t be singling out forum members in a front-page post (as stated before, this is imho). Even if it was done in a slightly indirect manner, it was still clear enough for even a preschooler to figure out who he was pointing a finger at, and to me that shows a previously unseen level of unprofessionalism.

            Claiming that this Fermi heat/power/noise thing has blown so out of proportion that Nvidia had to “send its drones” is still a complete spin on the truth. The truth is he would have been there praising the good and downplaying the bad regardless, and was. He was there before the launch, he was there on day one of the reviews (before it got “blown out of proportion”), and he is still there now. My point is that singling his presence out as an indication that we have reached some new level of mass overreaction to otherwise minor flaws is a complete grasp at straws. Yes, I do agree that Nvidia spokesmen patrolling the forums is a bit ridiculous, but it is something that this site and others allow to happen, and it has been going on for years. It is completely irrelevant to the current Fermi situation, and no, fanboy/mass overreactions are not the cause. Otherwise we would have Intel/ATI/AMD reps patrolling the forums en masse on this and other review sites as well. Their presence is purely another one of Nvidia’s beloved marketing projects, one that has been here for years.

            • mdk77777
            • 9 years ago

            l[

            • Jigar
            • 9 years ago

            X2, cannot argue any more, as my point stands corrected…

          • A_Pickle
          • 9 years ago

          Yes, “buy Nvidia” is exactly what Scott said.

          From TFA:

          “Yes, I would prefer the Radeon HD 5870’s overall combination of attributes, especially in terms of price and performance.”

    • flip-mode
    • 10 years ago

    Good to know Mr. Wasson. Thanks. It doesn’t seem like a bad card to have.

      • ssidbroadcast
      • 10 years ago

      Well… if you can /[

        • flip-mode
        • 10 years ago

        Price is a separate issue. It’s the fastest single GPU out there. I totally agree with what you’re saying, though; I shop at the $200-ish level myself. So GTX480, GTX 470, 5870, 5850 all share one thing in common: too expensive for me.

          • ssidbroadcast
          • 10 years ago

          Bingo.

    • Anonim1979
    • 10 years ago

    3 more dB is 2 times louder.
    More or less.
    6dB is 4 times louder.

    I find noise distracting. Not gaming much these days, so my GPU is passively cooled. The CPU has a big radiator + fan with temp. management, and the PSU is modified with a second fan, both spinning now at 5V.

    PS.
    As I hate CCC – ATI drivers suck – I would choose Nvidia… next revision, WHEN the noise will be more bearable.

    • douglar
    • 10 years ago

    Nvidia did buy some nice coolers for the Fermi boards. Let’s hope the high-quality coolers trickle down to the cards that sell in volume. It would be good for the industry.

    Fermi does deserve the bad rap for heat and power in dual-monitor setups, though. Fermi idle heat and power get really out of line when you enable that second monitor, and the cards have a much lower tolerance for borderline DVI cables compared to ATI & Intel DVI ports.

      • StuG
      • 10 years ago

      I feel that both card vendors need to work on their dual-monitor support. The HD 5800 series doesn’t like having two at the same time and idles at 400MHz/1200MHz instead of 150MHz/300MHz. That’s a sizable heat/noise gain at idle compared to a single-monitor solution.

      Also, I would rather see Nvidia use coolers on a lesser scale and drop prices more. Cheaply priced Fermis are what will drive the better HD 5800-series cards down in price, and thus more people will buy them.

      • HisDivineShadow
      • 10 years ago

      Well, didn’t the dual slot cooler from the FX5900 trickle down to all the premium video cards after the leafblower incident? Before that, I think single slot card coolers were being used regardless of the noise or heat implications.

      That’s if my memory serves.

    • Ushio01
    • 10 years ago

    Comparing to older cards is irrelevant; it’s the current competition that matters, and in that regard Nvidia fails miserably.

    From TR’s own review: 10 watts higher power draw at idle, 80 at load; 2.4 dB louder at idle, 5.4 dB at load. That is a very noticeable difference, and as Jigar points out, a 480 in SLI is even louder than the Dustbuster. Totally unacceptable, especially when it’s also 9.2 dB louder than 5870s in CrossFire.

    That’s not even mentioning the significantly higher price while losing to the 5870 in 6 of the 15 game-centred graphs of TR’s review.

      • Palek
      • 10 years ago

      I *[

        • BlackStar
        • 10 years ago

        The GTX 480 makes nearly two times the noise in idle and four times the noise under full throttle compared to the 5870.

        Yes it *is* significantly noisier than the competition.

          • Palek
          • 10 years ago

          I stand corrected on the use of the word “significant.”

          Now I would really like to hear the difference between 44.5 dB (5870) and 49.9 dB (GTX 480). I’m sure I would be able to tell the two apart, but I suspect the noise gap would not be as jarring as it was between the Radeon 9800 and the GeForce FX 5800 back in the day. 47.9 dB to 58.8 dB is a much, much bigger jump.

          (P.S. full disclosure: ATi/Radeon user here since the 9700/9500 Pro days, currently running an HIS Radeon 4850.)

          • Voldenuit
          • 10 years ago

          As has been stated before in this thread, the decibel SPL measurement measures sound pressure, not perceived loudness (see http://www.sengpielaudio.com/calculator-levelchange.htm). Of course, 1.5x louder is not insignificant, and it is up to the individual to set their noise tolerance levels (my ears are pretty sensitive). Just wanted to clear up some of the misconceptions that I see flying around.
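The power-vs-loudness distinction being drawn here can be sketched numerically with the usual rules of thumb (my own illustration, not from the thread): every +10 dB is a tenfold increase in sound power, but by the conventional "+10 dB sounds about twice as loud" approximation it is perceived as only about a doubling.

```python
# Rule-of-thumb sketch. Assumption: the conventional "+10 dB is perceived as
# roughly twice as loud" approximation; the exact perceived ratio varies by
# listener and by the spectrum of the noise.

def power_ratio(delta_db: float) -> float:
    """Acoustic power ratio implied by a dB SPL difference."""
    return 10 ** (delta_db / 10)

def perceived_loudness_ratio(delta_db: float) -> float:
    """Approximate perceived-loudness ratio (+10 dB ~ 2x as loud)."""
    return 2 ** (delta_db / 10)

# The load-noise gap cited above: 5870 at 44.5 dB vs. GTX 480 at 49.9 dB.
delta = 49.9 - 44.5
print(round(power_ratio(delta), 2))               # -> 3.47 (sound power)
print(round(perceived_loudness_ratio(delta), 2))  # -> 1.45 (perceived)
```

So a 5.4 dB gap more than triples the measured sound power while sounding only about 1.5x as loud, which matches the "1.5x louder" figure mentioned above.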

            • Sunburn74
            • 10 years ago

            1.5x is quite significant to me, especially if you’re reading a nice book or something.

    • TurtlePerson2
    • 10 years ago

    The 480 launch actually reminded me a lot of the 2900 XT launch rather than the 5800 launch. A really great product had been out for a while ahead of time (8800 & 5800) and the competitor promised that they were going to bring out something amazing. Both the 2900 and 480 arrived late, ran hot, and didn’t perform as well as expected. The 2900 was a lot worse off than the 480, but I think the comparison still works.

    Unfortunately the 8800 prices stayed high for a long time because ATI wasn’t really offering a competitive card compared to the 8800. Hopefully that’s not the case with 5800 pricing.

      • StuG
      • 10 years ago

      I have to say I agree with your position on this, though for some reason when Intel/Nvidia do something wrong, they still have a huge base of supporters that will blindly follow them to hell and back.

      AMD/ATI seem to have less of that, and thus when they make a failed product they get hurt by it more. I have a feeling Fermi won’t hurt Nvidia as badly as it should in the sales department (as the cards become more available).

        • MadManOriginal
        • 10 years ago

        Uh, wait, you’re saying that AMD has fewer ‘fanbois’ than Intel? (I will give you that NV fans are pretty well unmovable.) ‘I’ll buy AMD because I root for the underdog’ (or: support a company even if it is not objectively the best purchase) is a pretty common statement and a pretty pure example of fanboi thinking.

    • indeego
    • 10 years ago

    Wow, “Nvidia focus group.” That is crazy. Do any other manufacturers have labels for tools as such?

      • Palek
      • 10 years ago

      My first thought was “that’s a bit much,” but I think it should be said that being upfront about his connection to nVidia is actually a really decent move on his part. At the very least it makes “fanboi” accusations redundant, thereby improving the forum SNR.

    • mdk77777
    • 10 years ago

    l[

      • Mourmain
      • 10 years ago

      do you mean “drole”?

      (I don’t randomly correct people on the web, I just liked the rest of your comment)

        • douglar
        • 10 years ago

        Do you mean “Droll” ?

          • Mourmain
          • 10 years ago

          Damn.

      • Damage
      • 10 years ago

      With respect, I think you’ve totally missed my point. Reread, and I noted they were constrained by power and thermals early on. It was key to my point, in fact, which is about the noise levels and power draw of the card they chose to produce.

        • mdk77777
        • 10 years ago

        No, I understand:

        l[

          • Damage
          • 10 years ago

          So your contention is that no engineering, and indeed no engineering trade-offs, went into the graphics card Nvidia eventually shipped? That not only did they not hit their targets with the GPU, but having failed to do so, they somehow chose not to consider yield rates and power consumption on the ASICs when selecting clock speeds and voltages? That they didn’t select the cooler for its efficiency or tune its fan profile for noise levels?

          Isn’t that.. ridiculous on the face of it?

            • mdk77777
            • 10 years ago

            My contention is that it failed so completely that the real engineering work is required for the next spin.

            Your contention is that they made the best of a bad situation.

            I see no merit in that.
            Anyone can hand-massage extreme limited-edition versions. Given the lack of any credible profit motive (they know they are going to lose their shirt on every card sold), they would have been foolish to have done otherwise.

            • Damage
            • 10 years ago

            So it’s kind of like…

            Me: “Well, at least it’s not totally unreasonable about noise; let’s be clear.”

            You: “BLAARGH! TOTAL FAIL IS TOTAL!! You’re saying the AIDS virus doesn’t taste bad!”

            • mdk77777
            • 10 years ago

            Well..OK…

            I thought I communicated a more cogent train of thought.

            But we agree that it is a total failure.

            The thing I do find misleading about such apologist reviews: this part will never be mass-produced. You certainly know this. If you were serving your readers, rather than speaking of the excellence of the stopgap engineering, you would be telling them to wait for the 3870 version that will follow this 2900 XT.

            That you are so emotional indicates to me that I have struck a nerve on this matter.

            • kuraegomon
            • 10 years ago

            I see real merits to both positions, actually:

            – Given the actual technical capabilities of the card, and the size and complexity of the silicon, the real-world noise/heat penalties are _not_ outrageous

            – Unfortunately, they just _cannot_ build with sufficient yields to sell profitably.

            You may think it’s too early to make my second point, but think about just how much time they’ve had to get the yields to something workable. And this is what they’ve been able to do in the two/three weeks since launch. I think that’s what mdk’s trying to say. Ultimately, this card generation’s only purpose will be to point the way to the next Nvidia generation that is (hopefully) manufacturable.

            Looks like Beyond3D’s GrooTheWanderer hasn’t been too far off the mark this round (and I’m sure you know who’s behind that nick, Damage ;). Now if you’ll excuse me, I’m going to go throw up in my mouth for having had to give him this much credit.

            • willyolio
            • 10 years ago

            i have to side with mdk on this one.

            Your argument is that Nvidia made a reasonable decision in polishing a turd, and once polished, that turd is somewhat shiny and probably quite good as far as turds go, so Nvidia should be praised for that.

            mdk is complaining that they even began with a turd in the first place and needed to polish it to make it marketable. The fact is that Fermi was designed for even higher power draw and heat levels, but Nvidia was forced to, not chose to, turn it down to make it marketable.

            • wira020
            • 10 years ago

            Glass half full vs glass half empty?

            • Damage
            • 10 years ago

            Not at all. His point does not logically oppose mine in any rational sense.

            My point is that the card isn’t horribly loud. His point seems to be that the GTX 480 is so awful in every sense, no good can come from it.

            One may, of course, believe that the GTX 480 is both not horribly loud and a rather disappointing product overall. That is my general view, as one might gather from what I’ve written. There is no contradiction there.

            If you believe in facts and nuance, in making distinctions and seeing things in full color, my point about the noise stands, regardless of your feelings about GF100 overall.

            • douglar
            • 10 years ago

            The cooler on the 480 has helped improve the state of the art in moving heat at a reasonable volume. That’s a good thing. Let’s hope that the Nvidia subsidized cooler encourages the other vendors to stop using loud sloppy cheap solutions and to start using better cooling devices.

            But honestly, PC noise & heat don’t bother me so much anymore since I put my PCs in the basement and just ran the cables up through the floor. But video cards that can’t drive a 1650-resolution signal over a 15-foot DVI cable are annoying.

            • ThorAxe
            • 10 years ago

            Nice blog. I think you hit the nail on the head, Scott. The noise issue is being blown out of proportion. Now, before I get flamed as an Nvidia apologist, I’d like to point out that I currently run a 4870 X2 & 4870 in CrossFireX, so don’t talk to me about loud.

            • Ditiris
            • 10 years ago

            I’m siding with mdk7777 on this one.

            Your point is that the card isn’t horribly loud and has disappointing performance as the result of engineering tradeoffs.

            His point is that the card has these attributes because it was poorly engineered. The card is louder, draws more power, and fails to deliver substantially more performance to justify those attributes.

            In ASIC production, these issues are ALWAYS a tradeoff. NVIDIA missed the mark here, plain and simple, as evidenced by the numbers in your original review. This follow-up, quite frankly, does nothing to change those numbers, and everything to damage the credibility of TR’s site.

            • Voldenuit
            • 10 years ago

            My reading of the development of Fermi suggests that nvidia’s engineers ignored a lot of chip design guidelines provided by TSMC throughout the design and certification process, resulting in a chip that was too big and complex for this process.

            Whereas ATI collaborated closely with TSMC, and aggressively axed features (like Sideport) that they felt would decrease their yield and schedule targets.

            So it’s possible that nvidia’s hubris directly led them to this downfall.

            ATI executed smartly, and the presence of their Southern Islands backup plan suggests that they may have had similar fallbacks for Cypress that were unneeded. Whereas if nvidia was as bullish about their next chip as they were about Fermi, things could get *really* bad for them next-gen, when 32nm won’t materialize on schedule as forecast when they began designing it several years ago.

            • Silus
            • 10 years ago

            And that “reading” comes from what, exactly? Where did you get the idea that NVIDIA didn’t follow TSMC guidelines but ATI did for the 40 nm process? And even if you did read that somewhere (I can only think of one place that could come up with such a ridiculous conclusion), where’s your common sense to discard that thought completely, given that it makes absolutely no sense?

            That Fermi had a problematic “birth”, there’s no doubt, but to somehow assume that NVIDIA ignored any guidelines, ‘just cause’ is beyond silly really.

            Also, there’s no 32nm for graphics. 28nm is the next process to be used for GPUs.

            • Voldenuit
            • 10 years ago

            Yes, I know about TSMC ditching 32nm. My point was that 32nm would have been very much on the roadmap when the successors to Fermi and Cypress were on the drawing board.

            There’s been plenty of rumour mill about Fermi and its associated delays (many of which turned out to be true in hindsight, though were very speculative at the time). I can’t pinpoint exact articles, since my impressions have been formed by gestalt over the course of the debacle, but Charlie’s articles at SemiAccurate have certainly coloured my opinions. And say what you will about his irrational hatred for nvidia, a lot of his Fermi ‘predictions’ and ‘hypothesizing’ did pan out.

            • mdk77777
            • 10 years ago

            l[

            • grantmeaname
            • 10 years ago

            That’s a wonderful analogy. Limiting the Hummer to 30mph would change its usage model, though, and removing 1/12th of GF100’s render hardware (or whatever) doesn’t. A better analogy would be taking the Hummer’s fuel line out and replacing it with one that has 5% smaller diameter.

      • Ushio01
      • 10 years ago

      The heat and power levels were so high that they had no choice but to disable parts of their chip.

      I don’t believe they disabled parts of their own chip for thermal and power reasons; they released cards with incomplete chips because they can’t get any complete chips working.

        • mdk77777
        • 10 years ago

        Both reasons: even with extremely poor yields, they could have launched limited-edition versions with the full complement if the thermals weren’t so bad.

          • clhensle
          • 10 years ago

          I agree with Ushio01, except the full 512-core chips do exist but are being put in the Quadro cards (maybe, haha). I have a GTX 260 in my computer, only because I got it for cheap, but every computer I have built or compiled a parts list for friends has been with either a 5770 or a 5850 🙂

    • Jigar
    • 10 years ago

    GTX 480 SLI beats the FX 5800 Ultra in the noise test… 😛

      • derFunkenstein
      • 10 years ago

      From what I’ve read, that has a lot to do with card positioning. Nvidia’s “recommendations” are to put the cards three slots apart (two empty ones in between) so that there’s enough room for the top card to suck in some air.

      I’m temporarily running with onboard audio for the same reason – my new Radeon 5770 gets very hot and very loud with the sound card in my only free PCI slot, which is two slots down from the new card (one slot in between, the only other PCI slot on my board). We’re talking about a 12C difference in temperature and fans running much, much louder. This issue is not particular to nVidia.

        • MrBojangles
        • 10 years ago

        I’ll second that. I recently got an 8800 GTS back from a friend of mine after giving him a 4850 I had lying around. I tried to use it as a PhysX-only card via a workaround but ended up scrapping the idea. After putting it in the PCIe slot underneath my 5770, the temps for the 5770 went from 40C idle and 60-ish load to 53C idle and 70-ish load. Noise volume also went up a lot, but I concluded that was mostly due to the GTS being added to the case. It’s apparently quite a bit louder than my 5770.

          • StuG
          • 10 years ago

          I think the HD5770 is more susceptible to this problem because the internal heatsink it has is much smaller than the HD5830, HD5850, or HD5870’s internal heatsinks.

        • Meadows
        • 10 years ago

        g{
