Report: some Diamond Radeon HD 3800s are failing

Nvidia may not be the only company with GPUs failing at abnormal rates. TG Daily reports that AMD partner Diamond Multimedia shipped 15,000-20,000 Radeon HD 3800 graphics cards suffering from "design/manufacturing defects" between January and July.

The affected cards reportedly include all Radeon HD 3850s with 512MB of memory sold in that time frame, along with a "substantial number" of Radeon HD 3870 512MB and Radeon HD 3870 X2 offerings. TG Daily claims the 3850s have "quality issues with poor soldering and integrated memory problems," while the 3870s suffer from a bad resistor value that can "result in computers not starting up and system crashes."

The issues came to light in Alienware systems, where over 10% of Radeon HD 3870 X2s, over 2% of 3870s, and nearly 8% of 3850s failed. As the story goes, Alienware ended up returning 2,600 graphics cards and cutting business ties with Diamond because of the problems. Interestingly, Diamond didn’t actually manufacture the boards—it instead purchased them from ITC, a company that also sells its own cards under the GeCube brand. However, problems may not have struck graphics cards without Diamond stickers just yet.

TG Daily quotes CEO Bruce Zaman as saying his firm indeed encountered an isolated problem with "one vendor" that used inadequate power supplies. Diamond asserted in a separate statement, "We do not have any extraordinary customer call reports for HD 3850, 3870 512 MB boards." TG Daily says its investigation backs those claims, since Alienware was supposedly using sub-par 750W power supplies in affected systems, and the flaws "apparently affect very few users." Folks who are encountering problems should have no trouble getting their cards replaced, the site adds.

Comments closed
    • rivieracadman
    • 11 years ago

    #149 … I agree. I don’t think it’s the power supplies, but it might have been the person who spec’d them. Cost constraints are engineering and consumer nightmares in the making. Someone always gets burnt by it.

    #145 … No, actually that’s the opposite of what I said.

    The fact is that I had lost a few, and that I know others who have lost a few, where we had never lost a few before. I don’t know what the problem is, or who caused it. I do know that we are not alone. I do know that there are people taking action for the failures. I know that Nvidia is going to great lengths to make us all happy, which is great, and we’re happy that they are taking a stand to correct the problems. And I do know that we usually lose a card every couple of years at most, but we’ve lost 7 in the last two. All of which were Nvidia 8800-series and discrete. I really like some of the 8-series cards personally. For instance, I have a pile of MSI fanless 8600GTSs that have been chugging away for the last 2 years. They’re slow, and can’t be used in SLI on most boards, but they have never given me the slightest issue.

    When you have someone breathing down your neck to finish a 600MB TIFF for a project and your machine fails in the middle of it, you look bad, your boss looks bad, and everyone in the office feels the tension that much more.

    The “speculation” seems very accurate to me personally, but I’m not an electrical engineer, so I couldn’t tell you what caused our problems. I just know they’re dead when they shouldn’t be dying. This costs us time and money, and makes our hardware decisions that much harder to justify to the GM. There are better cards on the market today, from a company with a better internal track record, so why should we waste our time?

    • Fighterpilot
    • 11 years ago

    LOL. Out of 1.9 million shipped since November ’07, it turns out 188 were faulty.
    You “Nsist on Nvidia” types want some ice cream with that humble pie?

    • Silus
    • 11 years ago

    Anyway, this is getting off-topic. This is about Radeon failures.

      • Silus
      • 11 years ago

      Are you seriously saying that because you lost some cards, that means there is a problem with 8800s as well?!…

      Also, that article on TG Daily only talks about market speculation and what is “guessed” to be the probable cause of the high failure rate of mobile chips (assuming that what NVIDIA itself said wasn’t accurate). There is no proof that it affects desktop GPUs as well. If there were, we would know about it already, especially since G80s have been out for over two years now.

    • swaaye
    • 11 years ago

    Why does tech chat have to get so personal every single time? Some of the people here really need to look up what “anecdotal evidence” means and stop taking their pooter hardware so damn seriously. All of it becomes obsolete and inconsequential in a couple of years anyway.

    • derFunkenstein
    • 11 years ago

    I love that when I clicked on this article, I saw an ad for a Diamond 4870.

    • rivieracadman
    • 11 years ago

    I started reading all the messages, then gave up halfway. Let’s look at the facts:

    The problems were with a specific vendor, not ATI’s chips, which is one reason why ATI doesn’t like to sell its chips to independent board manufacturers. Not only that, but GeCube went on record as saying that there was an incorrectly installed capacitor on the cards, and that the power supply units were incorrectly rated for the job. So unless ATI is making the capacitors and power supplies, this is not any of their concern.

    Secondly, Nvidia is the company currently having problems with faulty chips. Not the cards, not the capacitors, the chips. How do I know this? I have one. Two, in fact, if you include the laptops I deal with at work. Nvidia is being sued for these chips, and their failure rate is more than twice that of the ATI cards. That being said, there are some things I like about Nvidia cards, now and in the past, but I wouldn’t touch one right now.

    I am a professional. I am a Sr. Graphic Artist and Draftsman with over 15 years’ experience. I sit here typing on a dual quad-core Xeon workstation with 2 Quadro 3450s and a pair of 24″ Samsung monitors. I also purchased 12 4870 X2s yesterday for department upgrades. Why? Lower failure rate, better performance, better DirectX 10.1 support, better display quality, better driver support, and finally, they were simply cheaper. The only advantage to going Nvidia at the moment is the new CUDA features in Adobe CS4, but ATI is involved with the meat of that too. Just my 2c.

      • Silus
      • 11 years ago

      Lower failure rate in what? Laptop chips? Because that’s the only known problem with chips failing at a higher rate than normal.

      I just hope you’re not basing your opinion about failure rates on anything more than some laptop chips, or on Charlie’s rants, because those are completely unfounded. There’s no proof to substantiate them.

      And cheaper? You bought 12 HD 4870 X2s, which are the most expensive graphics cards at the moment…

    • MadManOriginal
    • 11 years ago

    I’d like to know what an acceptable failure rate is for cards.

    • Silus
    • 11 years ago

    Well, this doesn’t change much. I really don’t understand corporate loyalty…

    I’m still wondering though: Where is Charlie reporting this ?…oh right, he’s too busy fabricating rumors against NVIDIA.

    #16, LOL at your fanATIsm…

    • vojc
    • 11 years ago

    nvidia chips are falling like rain 🙂

    • kvndoom
    • 11 years ago

    When I saw 123 comments, I said to myself “either a lot of gerbils have bad cards, or a flame war hath erupted.”

    Didn’t take long to get an answer. :-\

      • VILLAIN_xx
      • 11 years ago

      Fo sheezy.

      Is this a troll patrolling??
      I don’t have time to read all these comments.

      New profile, join date today, for Mr. Asif.

        • ludi
        • 11 years ago

        It’s a pretty good bet. If you spiked a white-chocolate mocha with two shots of tequila and a gram or so of crystal meth, fed it to an Irish Setter, and then gave that animal access to a shotgun and a paintball field stocked with a couple thousand squirrels, the results could not possibly be more entertaining than what this clown is up to. As such, it’s a pretty good bet he’s not serious.

    • Convert
    • 11 years ago

    Fighterpilot: Looking at Asif1924, maybe you will see how silly you seem most of the time. Just an FYI: eye-openers like this don’t come around too often, so I didn’t want to pass it up 😉

    Asif1924: 2K a week isn’t much, as you yourself stated. One of the places I work at does that a week strictly in hardware for systems and servers and a small but still significant percentage of that is actually high end systems which include high end video cards.

    Failure rate from both camps is almost nonexistent. Really the only time we have problems is from crappy fans that die (both camps suffer from this) and trying to be cheap and buy the least expensive cards we can find. Dirt cheap Nvidia/AMD PCI cards for dual head support in operatories almost always meant we were going to have a problem of some kind. Neither of these instances usually apply to the high end video cards.

    Really out of all of the high end gaming machines we have put together I would say the most telling issue of all is that some Nvidia cards seem to have this weird stuttering problem. It tends to be with specific games and drivers have zero impact. The games can even show a high FPS but are almost unplayable. Not saying it was frequent but if we have any problem with a card it’s usually that. After an RMA it usually clears up too so I have never really wanted to pin it to other components or the software. Throwing an AMD card in its place clears it up too.

    If anything, seeing the failure rates from a system builder is more telling than your experiences; hopefully that clears it up for you. Perhaps you have had bad luck with your home systems. I also don’t want to step on the toes of other Nvidia fanboys. This is a pretty rare occurrence, but since someone wanted to get down to brass tacks when it comes to reliability, I wanted to share my experiences.

    • robspierre6
    • 11 years ago

    I don’t know what you are trying to prove, Asif1924. But I can tell you that your way of discussing things is pretty openly selfish and arrogant.
    Your opinions about Nvidia/AMD are …….BS…and worthless.
    I have my PowerColor 4870 clocked at 865/1080 with the fan set to 35%, and the card loads under 60C.

    Best wishes for less stupidity, from Robspierre6.
    Ohhh, and I hate your nick, Assif.

      • PRIME1
      • 11 years ago

      Like looking in a mirror eh Robbie….

      • TechNut
      • 11 years ago

      Thanks! Now I know why my passively cooled Gigabyte 8500 GT started stuttering during HD movie playback a few months ago. Vista would say the driver failed and restarted…. I had wondered whether I’d been bitten by that bad card issue. It seems fine for normal 2D, but give it any high-res content and it starts to choke.

    • Krogoth
    • 11 years ago

    “It is time to kick ass and chew bubblegum… and I’m all outta gum.”

    -Duke Nukem going postal on a certain poster. XD

    • Fighterpilot
    • 11 years ago

    “Nsist on Nvidia”,”NVidia FTW” …lol
    Guess us poor old ATI owners will just have to soldier on with what we have and hope they don’t blow up.
    This new troll Asif is such a good partner for Prime1 that I could hardly have organized it better myself 🙂

      • PRIME1
      • 11 years ago

      How are you lovers Robspiere and Ew doing for you? :-p

        • ew
        • 11 years ago

        Excellent. You’ve insulted three people at once. This thread is heating up, but it isn’t on fire yet. I have to deduct points for not poking holes in grammar/spelling, though. You also didn’t give much for your opponents to respond to. We all know that the secret to a good flame is to make it impossible not to respond. Keep trying!

          • PRIME1
          • 11 years ago

          Did I? Seems more like 1 and the same.

    • Asif1924
    • 11 years ago

    GPU Temp at LOAD:

    GTX280 – 76 C
    4870 – 84 C

    Looks like your attempt to counter me failed 🙂

      • ChronoReverse
      • 11 years ago

      Sure, since the fan speed of the 4870 was deliberately set low, because those chips can take the higher temperatures (and this is ignoring setting the fan speed higher than 30% using a profile). But it’s heat output that really matters, and the GTX280 is higher there.

        • PRIME1
        • 11 years ago

        They set it low because (as Anandtech put it) it sounds like an airplane engine.

          • ChronoReverse
          • 11 years ago

          And I agree =)

          One wonders why the current generation ATI coolers are so poor (relatively). You’d think the dual-slot 4870 should have plenty of room for a nice cooler.

            • PRIME1
            • 11 years ago

            It’s not the cooler so much as how hot the 4800s run. They are clocked very high relative to their cooling solution. If they had them running cooler to begin with (by lowering clocks) the cooler would not need to work so hard.

            • ChronoReverse
            • 11 years ago

            The thing is, the GTX280 is dissipating more heat than the 4870 yet runs cooler when under load. This indicates that the GTX280 has a better cooler than the 4870 by a significant degree.

            Therefore, the 4870 stands to have a better cooler.

            What makes it even odder is how the 2900XT also has to dissipate more heat than the 4870, yet it too runs cooler under load compared to the stock 4870. Since the heatsinks on the two are very similar, it’s simply because the fan speed on the 4870 is set too low.

      • robspierre6
      • 11 years ago

      Ok, that’s because the fan speed is set to run at 13%. Once you raise the fan speed up to 35%, my PowerColor 4870 loads at about 55C.
      Another stupid attempt to prove your point. Which is ??????
      If you don’t like ATI/AMD, then go buy your Nvidia cards and stop talking BS about their cards. Cuz AMD’s offerings now are the best.
      SB..

    • PRIME1
    • 11 years ago

    The 3800’s were garbage anyway.

      • poulpy
      • 11 years ago

        Err, are you feeling lonely and looking for a troll of your own now?
        Not sure you’ll get 80+ posts with that, though; seems a bit light, as all the rage is in the other troll thread ATM..

        • PRIME1
        • 11 years ago

        They were destroyed in both price and performance. That’s a fact.

        I would call your statement more in line with trolling.

          • ew
          • 11 years ago

          Great, now when he responds you’re going to want to throw a personal jab somewhere in your response. Make sure you also insult his grammar or spelling or something completely off-topic like that. That never fails to keep things going.

          • poulpy
          • 11 years ago

          Yeah, right, my bad then; calling a line of products like the 3800 “garbage” is clearly fine analysis and not trying to get attention. You’re right, sir.

          edit: just saw ew’s post, I’m afraid I won’t be able to follow suit when you throw the personal stuff and all that, could we postpone till tomorrow?

          edit2: ATi FTW (for good measure)

    • robspierre6
    • 11 years ago

    This news comes from the Inquirer. It has been reported that a shipment of Diamond 3870s was found to fail at relatively high rates, 2-10%. It’s this new shipment.
    Diamond doesn’t make cards; they buy cards from large board makers. This is very common. Most GPU sellers in fact do not make their own boards; they contract out to large Taiwanese OEMs or, in some cases, rival board makers.

    The boards appear to be made for Diamond by GeCube, so these are not reference ATI boards.

    Some blame GeCube for sub-standard boards. Others blame Diamond for installing a substandard BIOS.
    But again, it’s not the whole 3800 series; it’s a new batch of 3870 and some 3850 cards that Alienware got from Diamond. And those boards were manufactured by GeCube, not ATI.

    • ssidbroadcast
    • 11 years ago

    Man, this Asif guy is a real piece of work. You spend $2k a *week* on hardware, huh? Good for you. Everyone, let’s all give a round of applause and salute Asif’s enormous e-peen.

    These kind of people make me sick.

      • Asif1924
      • 11 years ago

      “Man, this Asif guy is a real piece of work. You spend $2k a *week* on hardware, huh? Good for you. Everyone, let’s all give a round of applause and salute Asif’s enormous e-peen.

      These kind of people make me sick. ”

      Your personal opinions aside, it means literally nothing to me what you think.

      In another post, I mentioned that I spend a lot on hardware per week. If you know the demands of a Data centre, you’d know that $2k per week on hardware is actually nothing. Bigger companies spend a lot more. Our company is medium-sized yet we deal with terabytes of data. So yes, SANs, Rack-mounted server setups, SQL Server 2005, Analysis Services, Oracle, Visual Studio are the normal routine around here.

      I don’t give a damn if that makes you sick. This is a tech forum. Grow up and learn to contribute positively, or get the F** out and go puke in the toilet if you’re sick, you self-righteous, assuming juvenile!

        • durt_b1ker
        • 11 years ago

        I don’t want to be a dick or nothin’,

        but your OP was a self-important statement of opinion based on your *ahem* vast experience with hardware. Therefore your OP did not contribute positively to this hardware site. If posting scathing diatribes on someone’s product, based loosely on what you ate that day or your personal guesses, is growing up and contributing, then I would like to remain a child for a while longer.

      • no51
      • 11 years ago

      He seems like a textbook troll. I mean, if he really were an IT pro or something, it makes you wonder how effective he is at his job, seeing as he has free time to go on the internet and start flame wars.

        • TheEmrys
        • 11 years ago

        At this point, Asif has 0 credibility. Way to e-thug it up.

    • oneofthem
    • 11 years ago

    the irony here is deliciously melting

    • MadManOriginal
    • 11 years ago

    Oh man I SO want to see a three-way anonymous internet posting fight between Adisor, Krogot and this Asif1924 jackass. That would be the rumble of the century!

      • Asif1924
      • 11 years ago

      Ok, pick a topic

        • MadManOriginal
        • 11 years ago

        The topic is ‘Why is Asif1924 a flaming blind fanboy?’

      • nerdrage
      • 11 years ago

      Don’t forget to throw in Porkster, Shintel, and Proestersomething to make this the ultimate steel-cage flamewar of the trolls! You could charge admission and everything!

        • ew
        • 11 years ago

        Like WWF meets the Internet!

        • l33t-g4m3r
        • 11 years ago

        glorious would beat all of them.

          • poulpy
          • 11 years ago

          I’d play hellokitty, sure he could do the job!

      • Krogoth
      • 11 years ago

      I do not care for any company.

      I get whatever works best at the time of purchase.

        • Asif1924
        • 11 years ago

        Same here (More than 10 characters is better reading)

          • BiffStroganoffsky
          • 11 years ago

          You guys call that a fight? I want a refund! At least bite his ear off or something.

            • equivicus
            • 11 years ago

            “Use the chair! the chair!!!”

    • l33t-g4m3r
    • 11 years ago

    I was going to buy a 1GB diamond 4870, but now I wonder if I should go with another brand.

      • MadManOriginal
      • 11 years ago

      It’s basically a crapshoot if you’re worried about a problem like this; lots of cards are just rebadges. If you want the best warranty, though, there’s always VisionTek.

    • Draxo
    • 11 years ago

    I am waiting for the first company that makes a card go into a low-power mode, say 25 to 50 watts at idle, and no more than 75 while on the desktop. I couldn’t care less for cards that pull 120+ watts while sitting idle at the desktop, and all I have seen so far is the power requirements going up so all you gerbils can argue over who is better and/or faster. Where are the innovations? Because if you think about it, no one card will ever be fast enough.

      • DreadCthulhu
      • 11 years ago

      I suggest you look at lower end cards. The Radeon 46×0 cards offer pretty much what you are asking for – the 4650 draws ~50 Watts at full load, the 4670 ~60 Watts. Performance-wise, they pretty much match AMD’s high end cards (3850 & 3870) from last year, and can handle any game out there at pretty good resolution & detail levels.

    • robspierre6
    • 11 years ago

    Eventually, this is a problem that only Alienware has had. That is, if they have had it at all. Cuz these days you have to take everything with a lot of salt.
    What if the whole article from TG Daily is false, and it was/is planned or directed by Nvidia to draw attention away from their faulty chips?

    • Krogoth
    • 11 years ago

    If Radeon chips are also suffering from chip-related problems, it is most likely a problem with TSMC.

      • grantmeaname
      • 11 years ago

      why would radeons be suffering from chip related problems?

    • PRIME1
    • 11 years ago

    Where’s Fighterpilot to pull his “Minister of Information” act?

    Nothing to see here… the US troops are nowhere near us.

    LOL

      • Jigar
      • 11 years ago

      Where are you when something goes wrong with Nvidia? 😉

    • Usacomp2k3
    • 11 years ago

    /me looks at his Diamond Stealth 3d 2000 video card

      • ew
      • 11 years ago

      A Diamond Monster 3D was my first 3D video card. Ah, those were the good old days.

        • DrDillyBar
        • 11 years ago

        I think Diamond made my Voodoo2 12mb

        • netkcid
        • 11 years ago

        I had a 4MB Diamond Monster too…

        *tear

      • PRIME1
      • 11 years ago

      The Diamond Monster Fusion was one of the best cards I have ever had. Diamond is a good company, and this is just what happens to virtually all companies. Shiz happens.

      • Krogoth
      • 11 years ago

      This company is just Diamond Multimedia in name. 😉

    • Asif1924
    • 11 years ago

    I have always maintained that ATI/AMD sucks. To be more precise, they might have a small lead over Nvidia in the market right now, in spite of needing 2x their GPUs to beat Nvidia’s top GPU, but eventually those ATI chips will fail. I’ve had problems with overheating many times. Since then, I always “Nsist on Nvidia.”

      • ew
      • 11 years ago

      If you’d read the summary, you’d realize this has nothing to do with ATI specifically and is entirely the fault of one board manufacturer.

        • Asif1924
        • 11 years ago

        Yes, I read the summary, fanboi. And yes, I realize it’s the fault of one board manufacturer, but my post was highlighting a larger problem I’ve noticed with ATi boards for a long time, and this just reaffirmed my decision to stick with Nvidia, son!

          • Jigar
          • 11 years ago

          I think you are the fanboy here, kid, cause he never said Nvidia sucks, but you did… Stop the moronic acts; they just show your caliber, which is very pathetic.

            • Asif1924
            • 11 years ago

            What is with you people and misinterpreting?

            Look, man, I don’t care who sucks, ATI or Nvidia. All I’m saying is what I’ve experienced. As someone who drops 2K per week on hardware, I need stuff that works.

            So far, like it or not, you imbecile, Nvidia is better. End of story; go wipe your tears now!

          • poulpy
          • 11 years ago

          Geez, have a break, mate..

          Given that:
          – I’ve never had, nor ever heard of, an ATi GPU failure around me
          – this problem is not due to ATi
          – Nvidia is the one with class-action suit threats about their shoddy GPUs

          I really can’t see where you’re coming from, but eh, everyone’s entitled to their opinion, I guess.

            • Asif1924
            • 11 years ago

            The whole industry is going through some turmoil. As I’ve noticed recently, Nvidia had a slew of defects in their chips, and it’s now ATI’s turn.

            Whatever is happening, prices will fall. And as for the majority of both ATI and Nvidia products that are normally good (80 to 90% of them), that’s what I’ll pick up.

            In the end, yes, everyone has an opinion. But not many people can voice that opinion without sounding subjective, like you did.

            Thanks mate.

            • ChronoReverse
            • 11 years ago

            The defect isn’t in the chips. You’re simply stretching to include ATI.

            • poulpy
            • 11 years ago

            Well, if I may be so bold, you did sound quite subjective, mate 🙂

            I mean, on one hand there were/are a large number of GPU defects directly imputable to NVIDIA, and on the other hand there is this board maker which has apparently screwed up its boards but is unrelated to ATi’s design or manufacturing.

            Yet you decide to see this as a sign that you were right to choose Nvidia over ATi, so I’m a bit confused, but that’s ok; I can live with that.

            • Asif1924
            • 11 years ago

            In that case, everyone’s statements are subjective, because we all have our own personal experiences.

            Ultimately, my experiences with ATI have been bad, and since I don’t like wasting time, I’ve gone with Nvidia ever since. I have never had any issues with heat or performance, and never any issues with drivers.

            Rock-solid products.

            • ChronoReverse
            • 11 years ago

            No. Opinions about a company are subjective.

            But stating that ATI is on the same level as Nvidia, because one vendor made defective cards whereas Nvidia had defective chips, isn’t. Nvidia may very well be “better” than ATI, but in these two particular cases, Nvidia has a serious problem whereas ATI doesn’t appear to be at fault.

            • Asif1924
            • 11 years ago

            This discussion is getting academic. Have fun!

          • ew
          • 11 years ago

          It pains me so much to be called a Fanboi by someone who is a member of such an obviously superior “professional” class.

            • Asif1924
            • 11 years ago

            It pains me to have to explain my position so many times 😉

            • ew
            • 11 years ago

            I know it must seem like we are all a bunch of idiots because we just aren’t getting what you’re saying. Must be tough. How do you cope?

            • Asif1924
            • 11 years ago

            It’s tough. Scented candles really help! 🙂

          • MadManOriginal
          • 11 years ago

          Maybe if you took Jen-Hsun’s e-peen out of your mouth for long enough to make a coherent post that didn’t sound like it was coming from a hormonally challenged 14-year-old, you’d remember that NV recently had to do a massive recall that cost them between $180-200 million. That’s some solid product they’ve got, alright!

            • cegras
            • 11 years ago

            Can someone with weight to their words, like Meadows, tell this guy to go away?

            • Meadows
            • 11 years ago

            This is a tall favour, and I was going to largely skip this news post, but since you asked so nicely… 😉

          • Meadows
          • 11 years ago

          So tell us about this porous, ethereal “larger problem” that ATI seems to generally have. Before you reply, note that I’m an nVidia “fanboy”.

      • robspierre6
      • 11 years ago

      Sucks!! I’ll tell you that your Nvidia sucks. I’ve used only ATI cards in the last 3 years, and they are all working perfectly. And the 4870 X2 beats 3 GTX 280s in SLI; check the review of the card again, you B*. The 4870 is the card that competes against the GTX 280.
      My bro has been using a Diamond 3870 for almost 6 months, and it’s working without any problem.
      I have a 4870, and yes, I adjusted the fan speed and set it to 35%, and it loads in the mid-60s.

      Troll elsewhere……*

        • chasscF1
        • 11 years ago

        It beats 3 GTX 280s because the drivers aren’t able to properly scale performance; a lot of the time, 2 GTX 280s will outperform 3. I doubt you can honestly say that, at the hardware level, a 4870 X2 is better than 3 GTX 280s. And at high resolutions, the 4870 X2 was almost always beaten by GTX 280s in SLI. But I don’t think either of you gave a good reason for AMD or Nvidia sucking.

        • PRIME1
        • 11 years ago

        A 9800GX2 beats a 4870X2 in Crysis.

        We can all play this numbers game.

        Now go wipe the spit off your monitor and take your meds.

          • Jigar
          • 11 years ago

          Can you please post a comparison from any other game? I am sure you won’t find one.

          • Asif1924
          • 11 years ago

          Haha, spot on!

          And yes, we still do need ATI in the market because, like I have said, Nvidia’s prices would not have been so affordable lately without competition from ATI.

        • Asif1924
        • 11 years ago

        Listen, fanboi, I wasn’t trolling… I am a professional who uses top-quality components on a regular basis, so my decisions to purchase hardware depend on reliability and duty cycle. Nvidia is better than ATI. Given that it takes the 4870 X2 to beat a GTX 280, it shows that it takes 2 (count ’em, 2) of ATi’s chips to beat 1 of Nvidia’s.

        Nevertheless, I am not discounting ATI’s usefulness in the industry. We do need them; otherwise we’d be paying a premium for Nvidia products. Competition forces prices down, which benefits us, the end users.

        So yeah, I wasn’t trolling. Next time, learn to interpret information accurately without blowing your top 🙂

          • Jigar
          • 11 years ago

          So professionals use these cards?? WOW, no more replies to you… Thanks, you proved it again.

          • leor
          • 11 years ago

          You might want to have a look at the Quadro and FireGL lines in that case, pro.

            • Asif1924
            • 11 years ago

            Yup, I’ve seen them, but I don’t need something that high-end. Plus, they’re out of my budget range. Cards in the $600 price range are what I’m after.

            I’m actually waiting for the new GTX 350 to come out… I heard/read rumours of it recently.

          • robspierre6
          • 11 years ago

          You were trolling. Who said we need Nvidia? I bought an Nvidia card once, and I will never do that again. It was a disaster; it failed after 3 months of regular usage. After that, I have tried 3 ATI cards: an RX9550, an RX1650XT and the 3870. Two were from Diamond and one from MSI. I never had any problem with any of them, and they are still working to this day.
          When you have a problem with a certain card, it doesn’t mean that all the cards are failing… these cards are manufactured in large numbers, so 5% of the chips being faulty is normal.
          And yes, the 4870 beats the GTX 280 in 5 out of 7 games at 24″ res. Check the review again, B*.

            • Asif1924
            • 11 years ago

            Get a life. I was NOT trolling, so do NOT accuse me of trolling. I stated an opinion. If you have a problem with it, you have stated it; now keep that to yourself and stop the accusations.

            This is not a discussion about who’s trolling; it’s about the facts and/or opinions, which we’ve both stated. Now, if you could state your opinion without accusing someone, then we’d actually have achieved a decent conversation.

            Again, I repeat: do not accuse me of trolling, otherwise you will reveal your own trolling nature to the public, and I’m sure you don’t want that.

        • Asif1924
        • 11 years ago

        And btw, it’s not “my” Nvidia…lol

        I just happen to think their products are solid, which they are. Time has proven this.

        And yes, manufacturing problems will occur, but that is not the fault of the designer (AMD/Ati or Nvidia). So that aside, the designs from Nvidia are usually better, cooler and faster.

          • robspierre6
          • 11 years ago

          Time has proven this!!!!
          What are those proofs???
          Kid, learn to talk before writing.

            • Asif1924
            • 11 years ago

            Kid? I bet I’m older than you, son!

            And learn to punctuate, spell and use proper grammar before thinking and speaking.

            And learn to stand up before you can breathe.

            LOL

            • ew
            • 11 years ago

            I think he was referring to maturity rather than age.

            • DrDillyBar
            • 11 years ago

            An impressive show for his first day.

            • Asif1924
            • 11 years ago

            An impressive show for who?

            • YvonneJean
            • 11 years ago

            You. It is easy to claim you are some uber computer user who spends money. However, this is the internet where anyone can claim anything. Any credibility you might have gained by that statement was promptly erased by you personally attacking somebody by calling them an imbecile. Your persistent use of the word fanboy, capitalized and specialized spelling doesn’t give the impression of a professional individual. A person doesn’t prove themselves to be a fanboy in one or even two posts. It takes a persistent stand to earn that title, but you tossed that label in a nasty manner after a couple sentences.

            You know who I am? I am a nobody. A reader with some computer knowledge who comes here to be educated. I have been reading this site for years, and enjoyed its high quality. There is only one cesspit of ignorance left on this site and it is these front page comments.

            This is a general call out to all the people who persist in fouling this part of the site. Pages of flame wars consisting of only foul language and name calling need to cease. Put downs and insults have no place in intelligent debate or discussion. To all of you who do: Get off the site. No one respects you for posting. No one cares about you. You gain nothing by doing these childish acts. All you do is annoy your betters. And if that is your only aim, all you would do is drive them away from the comments section and then you will have no audience for your antics.

            And yes, I know I have the option to not read the comments. However, it would be a shame if any of Tech Report’s readers had to avoid a section of the site that occasionally contains posts of true knowledge and interesting points due to a few horrible individuals. To all in the past who posted and continue to post relevant content: Thank you. I have learned so much in the past, and it is part of what keeps me coming back to this site.

            • Asif1924
            • 11 years ago

            Kudos to you for such an intelligent post. Forgive me if I used foul language, but I have seen a lot of posts by people using similar language and only retaliated in the same manner.

            I prefer to discuss facts and not flame people, but since I started posting there have been many ignorant comments directed at me which were unnecessary.

            Yes it is easy to claim anything on the internet and no one needs to prove him or herself to anyone here. I speak from experience and I will always state things from what I know.

            I do indeed spend a lot on hardware every week. Our company runs a data centre. We do Business Intelligence and Data Mining. I set up servers and configure machines for processing terabytes of data. Like I have mentioned before, reliability and duty cycle are important for me in my business, and that importance carries through to my personal endeavours in setting up gaming rigs.

            That’s why Nvidia is always at the top of the list every time. Anyone who has a dispute with that or my credentials can go F** themselves.

            Finally, thanks for your post…

            • ChronoReverse
            • 11 years ago

            Erm. A data center for business… using Nvidia GPUs? What?

            Intel. Enough said.

            • ew
            • 11 years ago

            Obviously they are using some sort of cutting edge GPGPU software for data mining. Geesh.

            • Asif1924
            • 11 years ago

            Read my post again several times to clarify your thinking process and you will see that I clearly highlighted the following:

            1. Our company does Business Intelligence and Data Mining with our Data Centre
            2. Our Data Centre requires top-notch hardware and software and of them I mentioned a few to give you an idea: Rack-Mounted Servers, SANs, SQL Server 2K5, Oracle, etc.
            3. Purchasing top-notch hardware and software is done by me. My decisions are based on past experiences with various products, including Nvidia’s GPUs, which are used in our desktops and some of our servers. I build all these systems myself.
            4. Since we have been using Nvidia hardware for a long time, it was only natural for that decision to influence my own personal purchases. To put it in layman’s terms, whatever I decide to buy for the business I also buy for home use.
            5. Hence I buy Nvidia GPUs for personal use (i.e. my own personal gaming rigs and systems I build for my own small business clients)

            I did not say we use Nvidia GPUs for our data mining tasks. That was an assumption by someone. But our R&D department is involved in using any possible means to speed up our data processing for our clients, so GPGPU is one of our approaches. Our typical approach is 64-bit computing.

            • ChronoReverse
            • 11 years ago

            Yes I got that the first time. And if anyone ever recommended that business desktops should have Nvidia GPUs in them, they should be laughed out. Much less servers.

            • Asif1924
            • 11 years ago

            Well, can you qualify that statement?

            We have had *no* issues with our desktops or servers. None whatsoever.

            • ChronoReverse
            • 11 years ago

            Then clearly you haven’t been using the 6150 chipset and integrated graphics in your desktops or haven’t been using large numbers of them.

            I suppose you could have been using older Nvidia chipsets and integrated graphics which were indeed reliable enough. But then you include AMD in your blanket statement…

            But they’re all integrated anyway so extrapolating that to discrete GPUs doesn’t make too much logical sense.

            • Asif1924
            • 11 years ago

            “Then clearly you haven’t been using the 6150 chipset and integrated graphics in your desktops or haven’t been using large numbers of them.”

            Most of our servers use the built-in chipset. Either Intel or Ati. No problems with the Ati integrated graphics so far. But no problems with any of our laptops that use integrated GeForce either.

            Some servers have discrete GeForce products which I’d drop in as an upgrade. For the desktops, it’s always a mid-range GeForce product (an 8800 GT, for example). And for gaming rigs (i.e. for my own personal customers), always GeForce. No problems.

            “I suppose you could have been using older Nvidia chipsets and integrated graphics which were indeed reliable enough.”

            Everything is new. Or whatever has been new for the past 5 years or so.

            “But they’re all integrated anyway so extrapolating that to discrete GPUs doesn’t make too much logical sense.”

            I’m not Vulcan, and logic has its fallacies. I’m rational; my decisions are rational. When something works, don’t fix it. Simple as that. Nvidia.

            • ChronoReverse
            • 11 years ago

            So basically you’re saying that it doesn’t matter if logically ATI is pretty much as good as Nvidia because logic fails in the face of what you insist.

            The arguments end here then.

            • ew
            • 11 years ago

            Well, I thought the post you mentioned was supposed to demonstrate that, due to your profession, you have some deep insight into why ATI cards are teh suck. Now you’re saying your elaborate experience is with business desktop graphics. It also sounds like you haven’t even used ATI hardware for a long time. Not very convincing.

            • Asif1924
            • 11 years ago

            No I think you read me wrong. Let me clarify.

            My profession is in BI/Data Mining/Analytics. But I build systems as a side business for customers. I’ve been doing so for over 10 years. I used to use ATi products in the past. We’re talking around 10 years ago. Back then I had many issues with ATI’s products, their drivers and their chips heating up. So I dropped them completely and stuck with Nvidia.

            One very sweet card was the Ti4400. It was a damn good card back in the day. Today one of my gaming rigs uses a 9800 GTX+. Another one uses a GTX260. No issues with Nvidia. None whatsoever.

            My insight into why ATi cards suck comes from my experiences with them in the past. Today I would not want to waste time with them based on those bad past experiences. Nvidia works for me, my business and my customers. When something works, you usually don’t switch.

            I’ve said this statement over a thousand times…its all based on my experience. It seems like you people still cannot understand that, yet my efforts to further qualify and clarify by mentioning the kind of work I do, and the urgency in my business to make good business decisions seems like its going nowhere. So I’ve assumed you people either cannot understand what I write or you’re just too young to understand where I’m coming from…

            And finally, I’m not trying to convince anyone to buy Nvidia. I’d switch to Ati, if their products were better, cooler, less expensive. Buy what you want. This is just my own personal preference.

            • ChronoReverse
            • 11 years ago

            Good grief.

            Radeon 9700 vs Geforce 5800. 3D goodness versus dustbuster.

            Over the years the heat advantage has swapped back and forth between Nvidia and ATI. A blanket statement like “Oh, ATI has been hotter” is just cherry-picking to stack your own argument.

        • Meadows
        • 11 years ago

        Singularity calling the pot black.

        You’re the other end of this ooze-infested stick, there’s no call for your fanboyism either. You should both crawl back under a rock and deal with it in person.

        • robspierre6
        • 11 years ago

        Faster and more solid?
        Only a technician or a fanboi would say that. You’re definitely not a technician, cuz you seem too stupid to be one.
        If you checked the 4870 review on this site, you’d know that the card is in the performance league of the 280gtx (“matches/beats”) at 50% of its price (Newegg).
        So, if you love your Nvidia, keep your L* to yourself.

          • Asif1924
          • 11 years ago

          The 4870 is similar in performance to the GTX280, so Nvidia had to lower prices to stay competitive. That’s why Ati had to debut their Crossfire-on-a-stick solution (4870X2) to beat it, because they couldn’t do it with just one GPU.

          Secondly, even with GDDR5 and 800 stream processors, the 4870 is lackluster compared to the GTX280’s 240 stream processors and GDDR3 memory. That says a lot about the design of the RV770 vs the GT200.

          Third, the GTX280 produces less heat than the 4870 and definitely less than the 4870X2.

          ATI’s chips are competitive, but in the long run they will burn out due to higher heat dissipation. Nvidia’s chips are better designed and will last longer.

          pwnd

            • Krogoth
            • 11 years ago

            GT2xx and R7xx architectures have little in common other than that they both comply with DX10 standards.

            It is foolish to do an apples-and-oranges comparison between the two.

            IMO, the GT2xx architecture is just underwhelming considering that it could potentially have easily outrun the R7xx family. Instead, it performs like a scaled-up, tweaked G80.

            I hate to break it to you, but both the GTX 280 and the 4870 generate practically the same amount of heat when loaded. Neither GT2xx nor R7xx is good at power consumption, a.k.a. running cool. You have to go to the previous generation for that.

            The whole argument about product longevity is laughable, as both GPUs will become obsolete long before the effects of thermal stress and electromigration take their toll.

            • Asif1924
            • 11 years ago

            “GT2xx and R7xxx architectures have little in common other than they both comply with DX10 standards.
            It is foolish to do an apples and oranges comparison between the two.”

            This is an apples-to-apples comparison because they are both high-end GPUs targeting the niche section of the market reserved for “extreme gamers”. I’m surprised you didn’t go deeper and state that even at the DX10 level they differ, because only the RV770 supports 10.1 while the GT200 doesn’t. But even so it would not matter, because it is a fair comparison.

            “IMO, the GT2xx architecture is just underwhelming considering that it potentially could easily outrun R7xx family. Instead, it performs like a scaled-up, tweaked G80.”

            Agreed. That’s essentially what the GT200 is. And yes, I agree it is underwhelming. Yet, in spite of this, its performance is on par with the RV770, which contains faster memory and more than three times as many stream processors.

            “I hate to break it to you, but both GTX 280 and 4870 generate practically the same amount of heat when loaded.”

            Key word there: practically. From the many performance benchmarks I’ve seen, the 4870 was slightly hotter. And of course the 4870X2 (the GTX280’s direct competitor) was much hotter; two GPUs will generate considerably more heat than one. Which further underscores my statements about Nvidia being a better and cooler product.

            “Neither GT2xx and R7xx are good at power consumption a.k.a running cool. You have to go to the previous generation for that.”

            Granted. But I’ll wait for the GT350 on the 55nm process to come out before I purchase. I never buy the first generation of any product, whether it’s cars or computer hardware. I take a wait-and-see approach and use that product’s maturity in the market to influence my buying decision. The GT200 is a new core and they will improve on it. First will be the die shrink; then we’ll see what Nvidia has up its sleeves to increase performance. It’s exciting and I’ll be watching how things develop very closely.

            “The whole argument of product longevity is laughable as both GPUs will be become obsolete long before the effects of thermal stress and electromigration take their toll.”

            But it’s not laughable if you’ve used ATI products and had them consistently burn out over the years due to overheating. Even my past customers, for whom I’ve built systems, have had to switch to Nvidia because their 4- or 5-year-old ATi cards were causing system instability and burning out. Yet I’ve never had a problem with any Nvidia card I’ve used for any of my clients, or any of my gaming rigs for that matter.

            I’ve said it before and I’ll keep saying, Nsist on Nvidia!

            (Thanks for the civil post. Much appreciated given the series of flaming ignorance I’ve been seeing here lately)

            • ChronoReverse
            • 11 years ago
            • Asif1924
            • 11 years ago

            “You know he gave you an inch on the heat issue and you take a yard.”

            Thanks for the chart. Now let me use those numbers against you:

            GTX 280 – 122 watts at idle
            4870 – 156 watts at idle

            Why the hell would I put a card into a system that uses more power at idle than a Geforce card?

            “The 4870 generates less heat than even the GTX260 when under load.”

            Yes, it consumes less at load, and that is why I will wait until the next generation. As I said before, I never purchase the first generation of any product. That is why I’ve got the 9800 GTX+. I will wait for a performance-optimized die-shrink of the GTX280 (i.e. the GTX350).

            Your counter-post only validates my claims.

            “And in case you don’t know, power consumption is heat generation.”

            I suppose you studied Engineering as I have. If so, you would know that your statement is false. What you mean to say is that power consumption is **directly proportional** to heat generation. It is not **equal** to it. And secondly, let’s see a temperature chart so we can discuss the actual numbers.

            “Temperature is a different issue (and greatly influenced by the cooling solution besides the power consumption).”

            I see where you’re **trying** to go with this, but that is a very weak case for your argument. Very, very weak. Show me the temperature charts and let’s talk numbers.

            • ChronoReverse
            • 11 years ago

            In the case of electronics, it actually is equal. Seriously. The energy has to go somewhere and it’s heat.

            As for idle power… the difference is like 30 watts. That’s less than a lightbulb. Plus these are high-end video cards; they’re only in systems drawing hundreds of watts. Simply using a more efficient power supply would already dwarf the difference at idle.
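
            [Editor’s note: the idle-power gap is easy to put in perspective with some quick arithmetic. A minimal sketch using the idle figures quoted upthread (122W vs 156W for the whole system); the 24/7 idle duty cycle and the $0.10/kWh electricity price are illustrative assumptions, not figures from the thread:]

```python
# Rough sketch: what a ~30 W idle-power gap amounts to over a year.
GTX280_IDLE_W = 122   # system idle draw with the GTX 280, per the figures quoted above
HD4870_IDLE_W = 156   # system idle draw with the HD 4870

diff_w = HD4870_IDLE_W - GTX280_IDLE_W   # 34 W gap
kwh_per_year = diff_w / 1000 * 24 * 365  # assuming the box idles 24/7
cost_per_year = kwh_per_year * 0.10      # assuming $0.10 per kWh

print(f"{diff_w} W gap -> {kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.2f}/yr")
```

            [At those assumed rates the gap works out to roughly $30 a year, which is why a few points of power-supply efficiency on a multi-hundred-watt system can swamp it.]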

            • Asif1924
            • 11 years ago

            “In the case of electronics, it actually is equal. Seriously. The energy has to go somewhere and it’s heat. ”

            “Seriously”???

            What, are you trying to convince me now? Where did you study?

            • ChronoReverse
            • 11 years ago

            Okay, tell me this then:

            Energy goes into the system. At a rate of 100W, let’s say.

            Where does this energy go? The only way it gets out is as heat, at that same 100W rate.

            It’s not exactly rocket science.

            • Asif1924
            • 11 years ago

            Dude, I’m not going to teach you engineering on a stupid tech forum. Spend the money like I did and get an education lol 😉

            Topic done. Over and out. Peace 9000. Hungry for food.

            • ChronoReverse
            • 11 years ago

            This isn’t engineering (although I am an electrical engineer by training).

            This is pure basic science. Conservation of energy.

            All you have to do is tell me where the energy goes if not heat. This should be an easy answer. Perhaps modern GPUs glow or DVI interfaces draw large current? Maybe the RAMDACs generate enormous microwave energy?

            • ew
            • 11 years ago

            It’s not worth it, man. You’re talking to a stump that has figured out how to work a keyboard.

            • anjulpa
            • 11 years ago

            Being directly proportional was probably enough of a rebuttal, anyway.

            • ChronoReverse
            • 11 years ago

            A lot of things sound weird until you put enough thought into it.

            It seems odd that the power consumption of a chip is equal (technically almost equal; obviously very slightly less) to its heat dissipation. But when you think about where the “consumed” energy went, it makes sense, since there’s pretty much nowhere else the energy could have gone except as waste heat.

            • anjulpa
            • 11 years ago

            Electromigration maybe 🙂

            • cegras
            • 11 years ago

            Oh yes, because a large amount of the energy input goes there, considering 9800 Pros are still in use after 5 years…

            • anjulpa
            • 11 years ago

            Kindly delete if I’m not allowed to post links, but here’s a good discussion. I totally forgot that capacitors take energy too:

            http://forums.anandtech.com/messageview.aspx?catid=50&threadid=2133425&messid=27832930&parentid=0&FTVAR_FORUMVIEWTMP=Single

            • ChronoReverse
            • 11 years ago

            Capacitors take energy to charge but that energy is either released back into the circuit (and thus not measured as power consumed) or… it’s wasted as heat.

            It’s thermodynamics. Not only can’t you win, you can’t even break even. And heat death overtakes everything.
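
            [Editor’s note: the capacitor aside can be put on an energy scale with a one-liner. A minimal sketch; the capacitor value and voltage below are illustrative, not taken from any actual card:]

```python
# Energy stored in a capacitor is E = 1/2 * C * V^2: a one-time charge,
# not a continuous drain like the GPU's power consumption.
C = 1000e-6   # 1000 uF bulk capacitor (illustrative value)
V = 12.0      # charged to 12 V (illustrative value)

stored_j = 0.5 * C * V**2              # joules held once charged
seconds_equivalent = stored_j / 100.0  # time for a 100 W load to use that much energy

print(f"stored: {stored_j*1000:.0f} mJ, i.e. {seconds_equivalent*1e6:.0f} us of a 100 W load")
```

            [Once charged, the capacitor hands that energy back to the circuit (minus resistive losses, which are themselves heat), so in steady state essentially all consumed power still leaves as heat.]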

            • Asif1924
            • 11 years ago

            Listen man,

            It’s been fun. You just proved what I was saying. Argument over.

            I gotta go grab some food. Peace to all.

            Nvidia FTW

            • Meadows
            • 11 years ago

            You may have studied engineering, but it was still _[

      • asdsa
      • 11 years ago

      Wow, you started some serious flamewar. A couple of Diamond boards fail and suddenly “ATi sucks”, even though Nvidia has far more serious problems on its hands. I guess this was inevitable, but I’m surprised Meadows isn’t here waving the Nvidia flag. You may still have some hope left, Meadows ;).

        • Asif1924
        • 11 years ago

        I didn’t realize that people take their hardware so seriously.

        Saying Ati sucks is just a personal opinion, something everyone is entitled to. At the end of the day it’s just a company selling a product, and their products need to be criticized because it benefits the consumer, i.e. you and me.

        It’s the people on here who have taken words personally that have made this into a flame war, not me.

          • ChronoReverse
          • 11 years ago

          Except you tied it to something it’s not related to.

          Hurricane Ike did a lot of damage. Therefore Asif1924 sucks.

          • clone
          • 11 years ago

          I’ve owned 58 video cards, 48 of them ATI and Nvidia… some I’ve owned for years, some I’ve owned for weeks and then flipped once I was done benching them.

          Here is what I’ve learned… both companies make very nice cards that work quite well. ATI was slow to get its drivers in line, but that was fixed when Catalyst was introduced, and now both companies are essentially even.

          I’ve never had an ATI video card failure. I’ve sold about 200 ATI add-in cards and none have failed.

          I have had Nvidia cards fail. I’ve sold about 180 Nvidia graphics cards, with 6 of them failing.

          I don’t begrudge Nvidia for these failures. Two were MSI video cards, a 4600 and a 5950; the other 4 were BFG video cards with lifetime warranties that were honored, and the replacements are working fine in systems.

          The dead BFGs were 2 x 6800s, 1 x 6600 GT and 1 x 8800 GT OC 512MB.

          I have friends who sell as a hobby like myself, and combined they’ve seen 2 ATIs fail and 3 Nvidias.

          I’m good with either company, although for Nvidia I buy only BFG and/or EVGA over warranty concerns. With ATI I avoid Sapphire at all costs and try to stick with Gigabyte or Asus for the strong warranty support… Sapphire’s support is horrendous by any stretch of the imagination; vendors are spotty on who supports the warranty, and once it’s up to Sapphire, good luck getting ahold of them.

          Your opinion that ATI sucks is your own and not indicative of my experiences or those of the people I know.
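
          [Editor’s note: the counts above can be turned into rough failure rates. A quick sketch using only the numbers in the post; samples this small carry wide error bars, so treat the percentages loosely:]

```python
# Failure rates from the sample sizes quoted in the post above.
ati_failed, ati_sold = 0, 200
nv_failed, nv_sold = 6, 180

ati_rate = 100 * ati_failed / ati_sold   # 0.0 %
nv_rate = 100 * nv_failed / nv_sold      # about 3.3 %

print(f"ATI: {ati_rate:.1f}%  Nvidia: {nv_rate:.1f}%")
```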


            • Asif1924
            • 11 years ago

            Now that is certainly interesting.

            That is almost the opposite of my experience.

            I appreciate your post and the insight into your own experience. All I can say is, I’ve never had any nvidia card fail.

            BTW, did I mention I live in Canada? ATi is headquartered here; in fact they’re a 20-minute drive from my business.

            Nevertheless, interesting post.

            • clone
            • 11 years ago

            I just wanted to note that the x800 series and some of the 6xxx series suffered from clogging of the GPU heatsinks along with fan failures. The newer cards are too new to tell yet, but generally the older series, GeForce 2s and 3s and Radeon 7s and 8s, have either been forgotten or have disappeared into yard-sale heaven, with many still running to this day.

            I’ve got 4 32MB G2 GTSs and a Radeon 7000, along with an AIW 7500, all still good to go… I refuse to get rid of them, nostalgia and all, having collected them from trade-ins.

            I’m hoping to find an 8500 64MB or 128MB in the future; I know where one is, but it’s still being used.

          • Krogoth
          • 11 years ago

          Pot calling the kettle black…..

      • BoBzeBuilder
      • 11 years ago

      Someone tell me why trolls get all the attention in the world.
      57 replies? First time I’ve seen such a thing. Nice.

      • YeuEmMaiMai
      • 11 years ago

      LOL, what an asinine comment by someone blinded by greenboy fanboyism… this is NOT AN ATI DEFECT, IT IS AN AIB MANUFACTURING DEFECT COUPLED WITH A CRAPPY PSU DEFECT

        • clone
        • 11 years ago

        While I agree it’s an AIB manufacturer’s issue, I’m not sure power supplies should be held accountable… that smacks more of deflecting blame than anything.

        There are plenty of end users running computers with the worst of power supplies who are not experiencing the issue.

    • DrDillyBar
    • 11 years ago

    While it’s not proof, I did run into POST problems when my Diamond HD3870 was in my system previously. I went in lots of circles troubleshooting memory and BIOS settings during the boot process. Eventually I got an HD4870 for different reasons, but this makes me wonder.
