Are Nvidia G92 and G94 parts failing, too?

Charlie over at the Inquirer hasn’t been on the best of terms with Nvidia for, oh, at least a year or so, it seems.  So when he reports widespread Nvidia parts failures, he does it with an awful lot of gusto.  That doesn’t, however, mean the story isn’t true.  He now says Nvidia’s chip failures aren’t just confined to mobile GPUs, as the firm contends, but extend to desktop variants of the G84 and G86 and to the G92 and G94.  If true, failures of those last two will affect PC enthusiasts quite a bit more than any G84/G86 failures, since the G92 and G94 are used on the GeForce 8800 GT, 8800 GTS 512MB, 9800 GTX, and 9600 GT, among others.

So the question is: Is it true?  Are they failing at abnormally high rates?  We could go back and forth with the Inq and Nvidia all day long collecting competing assertions and denials, but I figure we can gather some information ourselves, too, as we did with the IBM GXP fiasco.  If the failures are really widespread, we should hear a lot about them, right?

So here’s the deal.  If you have one of these Nvidia GPUs, please vote in our poll.  If you’ve voted about a failure, please follow up by posting a comment explaining what failed on you and how.

This ain’t exactly science, but if something huge is happening, we ought to get a sense of it, I’d think.  Let’s see what happens.

Update: Voting is now closed. Final results analysis is here.

Comments closed
    • Fighterpilot
    • 11 years ago

    Looks like the poll wasn’t too far off after all…
    http://www.nvnews.net/vbulletin/showthread.php?t=114925&page=17

    • Usacomp2k3
    • 11 years ago
    • moritzgedig
    • 11 years ago

    25% failed according to this poll.
    Assuming that multiple = 2,
    I get 25% failed shipments.

    I have to say that I had 100% (5/5) Asus failures, so I’d be interested in a poll in that direction.
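moritzgedig’s arithmetic can be sketched in a few lines. The vote counts below are hypothetical stand-ins (the comment doesn’t give the raw numbers), chosen only to show how the "multiple = 2" assumption turns poll votes into a per-shipment failure rate:

```python
# Hypothetical vote counts -- illustrative only, not the actual poll totals.
votes = {"no failures": 240, "one failure": 48, "multiple failures": 16}

# Assumption from the comment: a "multiple failures" vote means 2 failed
# units, and therefore also 2 shipped units for that voter.
failed_units = votes["one failure"] + votes["multiple failures"] * 2
total_units = votes["no failures"] + votes["one failure"] + votes["multiple failures"] * 2

failure_rate = failed_units / total_units
print(f"{failure_rate:.0%}")  # → 25%
```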

    • zgirl
    • 11 years ago

    I know I had an 8800GTS 512 fail on me, but mostly due to the driver not spinning the fan up under load. eVGA replaced it, and with lots of fussing and RivaTuner I was able to get the fan to move air correctly; it has been fine ever since.

    Still that should not be a problem out of the box.

    • WaltC
    • 11 years ago

    I have to say that I’m a bit puzzled by something. Every time AMD reports a loss, TR is usually right on the story, splashing it with a headline. But yesterday nVidia reported a $120M quarterly loss and a drop in revenue, and so far I haven’t seen anything mentioned about that at TR. Well, it was only yesterday, and this may be nothing but a timing issue for the site’s personnel. In looking at a number of other sites, though, who also seem quick to headline AMD losses, I also can’t find the story. Here’s the link:

    http://www.marketwatch.com/news/story/nvidia-reports-huge-loss-stung/story.aspx?guid={E137EFF1-8220-40EC-B324-366D99578629}&dist=msr_1

    Or, you can just Google "nVidia quarterly loss" and see where it shows up.

      • PRIME1
      • 11 years ago

      A.) It was only 10% of what AMD lost & B.) TR did post about it 😉

    • Fighterpilot
    • 11 years ago

    Perhaps a run through with Driver Cleaner?
    Reinstall with a previous set of known good drivers for your card?

    • xtremevarun
    • 11 years ago

    Does anyone here know how to put a permanent end to the nv4_disp.dll BSOD error on my 9600 GSO? It’s horrible; other than a single 5-hour gaming session, I haven’t been able to play anything for over a week, despite updating to 177.83!

    • miles2go
    • 11 years ago

    The Inquirer was known for its low standards even before this nVidia thing; they publish multiple articles, first speculating, then contradicting themselves, covering every possible outcome, so they can always say "we told ya, you read it here first."

    It used to be more Intel vs. AMD, but there’s not much there now. Last I read, they claimed the AMD Phenom would end up faster than Intel. I don’t follow it anymore…

    • Thorburn
    • 11 years ago

    I’ve had a few different NVIDIA cards fail on me at work, but I use so many cards it’s hard to say whether it is anything unexpected (say 2 or 3 cards out of the 30 or 40 I’ve used over the past 6 months or so).

    • kmansj
    • 11 years ago

    My (G80) 8800 GTS 640 is sweet. But I’m thinking my next could definitely be a 4870.

    • Fighterpilot
    • 11 years ago

    OMG..

    • FubbHead
    • 11 years ago

    My 8800GT hasn’t failed on me, yet, but I won’t buy another Nvidia product again for a very long time. Their driver team is completely useless.

    Not only is their control panel so bloated and horrible that I find myself even missing the good old CCC, but I haven’t had a single driver without some really stupid bugs in different places for a good while now. And their installer puts several annoying context menus (control panel, play on my TV, etc.) in place as well, some of which can’t be disabled without going into the registry.

      • axeman
      • 11 years ago

      I picked up an 8800GT recently because it was a steal of a deal, but that’s only because the price/performance of recent AMD cards has caused prices on NVIDIA kit to drop like Freddie Mac’s share price. I hope the card doesn’t take a dump, I’m still waiting for the MIR. And although the drivers seem to work okay, I don’t know what the #$%! they’re doing on the driver front. The control panel blows chunks, and they release updates once a year. BLECH.

      • odizzido
      • 11 years ago

      Same goes with me. Nvidia’s drivers are so terrible right now that even if their cards were faster I would probably still go ATI.

        • TREE
        • 11 years ago

        I’ve got an 8800 GTX and have had it for a long while; it has been replaced once since I bought it due to a strange overheating core. But I can say it has got to be one of my best graphics card purchases, as it still outperforms the G92.

        However, I can also sadly say that ever since the 169.xx drivers from Nvidia I’ve started to see a decrease in graphics quality. I’m thinking they’re doing this to boost performance.

        One driver example: when playing a game like Mass Effect, GRID, or Crysis there are some very strange texture or shading issues… They seem to appear from a certain angle of view and then disappear when that angle changes.

    • Forge
    • 11 years ago

    Scott – An idea: Next time you post a potentially inflammatory poll, please keep the results closed until the poll is. I don’t think anyone here would, but showing the results after each vote might encourage ballot-stuffing on a fanboy-sensitive issue like this one.

      • Fighterpilot
      • 11 years ago

      lol @ “fanboy-sensitive issue”…nice!

      • clone
      • 11 years ago

      My first thoughts pointed towards potential "ballot stuffing," but only time will tell. What can be said is that there were a number of "quiet" recalls on revision 1 factory-overclocked 8800 GTs due to higher-than-normal failure rates, which led to lots of talk in online retailers’ forums as well as at BFG’s and eVGA’s. The arguments weren’t just about failures but also about shipping costs for replacements and, occasionally, brokerage fees. That said, most of those threads seem to disappear from BFG’s and eVGA’s forums once they are resolved, so if you’re not directly involved you likely won’t notice.

      The 7900s went through the same process of high failures and a quick brushing under the rug, and in some cases 7900 GTs became 7900 GTOs. I had two of them and a friend still has one.

      As for the BFG 8800 GT OCs, I bought two revision 1s, and upon hearing about the failures I immediately put Arctic Cooling Accelero heatsinks on them, with one 120mm fan in one case and a pair of low-rpm 80mm fans in the other. I’ve since sold them and have heard nothing back, but they are only four months old and have a lifetime warranty, so if they do fail I expect I’ll hear about it from the customers eventually.

      I hope not. Having sold my 8800s, I wound up buying a Gigabyte 4850 512MB. I got decent coin for the 8800 GT OCs, which was the reason I sold them (not failure concerns), and with that coin I was able to upgrade to the 4850 for a pittance.

    • cegras
    • 11 years ago

    Does anyone know what the exact problem is?

    Someone from ATi told me that because the substrate (PCB) solder is RoHS-compliant, but the solder balls on the chip still have high lead content, constant thermal cycling causes them to merge and crack over time.

      • MadManOriginal
      • 11 years ago

      Your source is as good as anyone’s, it seems.

      I had a thought: if thermal cycling is the major culprit here, then it’s possible that by lowering idle temps on cards while still having higher load temps we may actually be doing some harm rather than good. Maybe a stable idle-to-load temperature differential isn’t bad after all, even if it is higher than we’d like and the idle seems high.
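The worry above (that lowering idle temps widens the temperature swing) can be made concrete with the Coffin-Manson relation, a standard model for solder-joint fatigue in which cycles-to-failure scales as ΔT to the power −q. The exponent and the temperatures below are illustrative assumptions, not measured values for these GPUs:

```python
# Coffin-Manson-style estimate: cycles to failure scale as dT ** -q.
# q ~ 2.0 is a commonly cited exponent for solder joints; the model's
# constant cancels when comparing two scenarios, so only ratios matter.
def relative_life(dt_a, dt_b, q=2.0):
    """How many times longer joints last under swing dt_a than under dt_b."""
    return (dt_b / dt_a) ** q

# A card idling at 60 C and loading at 90 C sees dT = 30.
# The same card modded to idle at 40 C with the same 90 C load sees dT = 50.
print(relative_life(30, 50))  # > 1: the smaller swing lasts longer
```

Under this (assumed) model the 30-degree swing gives roughly 2.8x the joint life of the 50-degree swing, which is exactly the counterintuitive point the comment raises.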

      • Forge
      • 11 years ago

      The failing part is the joint between the actual silicon and the package, not between the package and the PCB.

      The tiny silicon bit is pulling up off the green plastic with the solder on the bottom, not the green plastic bit with the solder on it pulling up off the board itself.

      Sorry for the repetition, just trying to be clear.

      • ludi
      • 11 years ago

      Having lingering lead content in one or the other component’s solder ought to …

      • lolento
      • 11 years ago

      RoHS has a special exemption allowing lead in flip-chip type packages.

      High-lead solder is much more reliable than eutectic Sn/Pb, and way more reliable than Pb-free.

      In fact, Intel (the first to establish a Pb-free bumping process for flip chip) changed the build-up of the Pb-free solder structure altogether in order to meet reliability requirements (per JEDEC).

      So that should answer your question.

      And I also have close friends at Nvidia, and I was told that the C51 chip is the culprit for all the defects. The defects are specific to pins on the PCIe interface, and the cause is overheating; laptop fans do not monitor chipset temperature. That’s why you see all the failures related to the G84 and G86 AND wireless issues (wireless is connected via mini PCIe). Also, you don’t see this problem where the C51 is used as an IGP.

    • Khorgano
    • 11 years ago

    Obviously, this poll is not scientific, and there is no way of knowing if someone is spamming it either.

    Further, 12% failure rate is pretty high for a consumer product.

      • Khorgano
      • 11 years ago

      whoops, meant to reply to #94

    • Maddog
    • 11 years ago

    My 9600 GT runs 24/7 … never misses a beat. I’m happy with it.

      • stix
      • 11 years ago

      They fail when turned on and off a lot; that is probably why.

        • Maddog
        • 11 years ago

        Suppose it is good that I have a nice UPS on the PC; hopefully I won’t have any extended power cuts!! 😉

        (It’s a permanent eval unit anyway… no big loss if it goes pfft!)

    • Prototyped
    • 11 years ago

    My desktop 8600 GT (G84) is still going strong after a year of use.

    • alex666
    • 11 years ago

    If you take out that last option, then based on the current returns: 17% report problems, 83% report no problems. All things considered, including the population being sampled here, I wonder if that represents a genuine trend.

    • kilkennycat
    • 11 years ago

    Please note that the first 8800GT (G92) BIOS release only ran the fan at a fixed 29% of maximum speed until the GPU core was above 90 degrees C. This was a ridiculous mistake, corrected by nVidia and (at least) eVGA and BFG about a month after shipment. The updated BIOS is available for download from the manufacturers’ websites. With the updated BIOS, the fan speed rapidly increases above a GPU core temp of ~73 degrees C, in line with the fan-speed profile of previous nVidia GPU families. Remember that all commercial-grade silicon is standardized for long-term reliability at a maximum case temperature of 70 degrees C, typically about 80 degrees C core temp, regardless of whether it is ATi or nVidia. All of the ATi and nVidia silicon comes from TSMC and complies with TSMC’s design and thermal-dissipation rules.

    Remember that the failure of the nVidia parts in the laptops was due solely to the failure of the bonding material between the chip and header, not to the silicon itself. How widely that bonding material was used, only nVidia knows. From the information currently available, it seems nVidia was aware of a potential problem with the substrate bonding well over a year ago, but collecting all the information to compute a $200 million write-off in conjunction with the laptop OEMs on a time/temperature-related failure would have taken a lot of time after the problem was first confirmed. Very likely none of the latest 8xxx/9xxx families were still using that particular bonding material.

    Scott, it must have been the back-ache that clouded your judgement with regard to even considering taking anything from the (National) Inquirer at face value. Your informal survey is also likely to be contaminated by failures due to user overclocking, or, in the case of the 8800GT (G92), the failure by users to update the erroneous BIOS in the first shipments.
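As a rough illustration of the two BIOS fan profiles kilkennycat describes: the 29% floor, the 90 C trip point, and the ~73 C ramp start come from the comment, while the ramp slope and the 95 C full-speed point are invented for this sketch.

```python
def fan_speed_original(core_temp_c):
    """First-shipment 8800GT BIOS as described: fixed 29% until core > 90 C."""
    return 100 if core_temp_c > 90 else 29

def fan_speed_updated(core_temp_c, ramp_start=73, full_speed_at=95):
    """Updated profile: ramp up from 29% above ~73 C (slope is assumed)."""
    if core_temp_c <= ramp_start:
        return 29
    if core_temp_c >= full_speed_at:
        return 100
    frac = (core_temp_c - ramp_start) / (full_speed_at - ramp_start)
    return round(29 + frac * (100 - 29))

# Compare the two profiles at a few core temperatures.
for t in (60, 80, 92):
    print(t, fan_speed_original(t), fan_speed_updated(t))
```

The point of the comparison: at 80 C the original profile is still stuck at 29% while the updated one has already ramped up, which is exactly the gap that made the first BIOS a "ridiculous mistake."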

      • flip-mode
      • 11 years ago

      Are you for real? When did Scott take it at face value? I don’t see it, and in fact, the question Scott asked is q[

      • cegras
      • 11 years ago

      Very likely that none?

      More like very likely that they did.

      But they just haven’t gone through enough thermal cycles to fail. Even notebooks haven’t been failing in droves, yet.

    • ADRENALIN
    • 11 years ago

    I got my 9600GT in April, and so far so good (knocks on wood).

    • Perezoso
    • 11 years ago

    Right now it’s 62/317. A ~20% failure rate sounds too high to me.

    • leor
    • 11 years ago

    a cooling algorithm? i want to find the bit of math that makes my chips run cooler!

      • eitje
      • 11 years ago

      it’s located deep in the RDF. 😉

      • PRIME1
      • 11 years ago

      It’s called the BSOD algorithm.

        • Convert
        • 11 years ago

        So Windows ME was on to something after all.

          • PRIME1
          • 11 years ago

          A lot of WinME PCs saw zero use and thus very low CPU temperatures 😉

    • FubbHead
    • 11 years ago

    Put an Accelero S2 cooler and a slow 80mm fan on my 8800GT almost immediately, and it’s at around 55C when working. No problems yet.

      • Ethyriel
      • 11 years ago

      Heh, that’s where my 9800GTX+ runs with the stock cooler, I can’t wait to see what happens when the TRad2 is released. That’s a far cry from the 8800GTS 640 it’s replacing, which ran 25C hotter in the same case with much higher fan speeds. That thing really scared the hell out of me.

        • FubbHead
        • 11 years ago

        Well, I dunno about your stock fan, but this solution is practically silent.

          • Ethyriel
          • 11 years ago

          That’s why I’m eagerly awaiting the TRad. The stock cooler at about 30% is quieter than my VelociRaptor suspended out of its sled, but it gets pretty loud under load. It’s still a hell of a lot quieter and cooler than my 8800GTS 640; in contrast, that card bottomed out at 60%, and if I forced 30% it would hit 90C at idle.

          I hope the smaller TRad is as efficient as the HR-03 and your Accelero, but it’s really the only choice while still using the expansion fan on my Lian Li case.

      • asdsa
      • 11 years ago

      So, you say that all these chip problems go away, everything is cool, and no refunds, if you go buy yourself an Accelero S2? And here I was expecting that nvidia is screwed… 😛

        • Ethyriel
        • 11 years ago

        It can only help: since the delta between hot and cold is less extreme, there’s less stress on the packaging.

    • ucisilentbob
    • 11 years ago

    I own a G80 8800GTS and it failed on me in less than a year. But that wasn’t an option on the poll.

    • elty
    • 11 years ago

    One of the XPS laptops at work failed with video issues. The graphics card was replaced twice.

    • slaimus
    • 11 years ago

    My BFG 8800GT had a BIOS update released in the beginning of the year to increase the fan speed. Hopefully that will keep it alive long enough until it gets upgraded if it was defective.

    • flip-mode
    • 11 years ago

    AMD GPUs = hot

    Nvidia GPUs = fail

      • BobbinThreadbare
      • 11 years ago

      Via GPUs = ???
      Intel GPUs = profit

        • eitje
        • 11 years ago

        well played, sir. that was awesome.

    • greeny
    • 11 years ago

    My desire to make my next update a green one is dying by the minute!

      • Chrispy_
      • 11 years ago

      Make it a red upgrade instead, then watch as your 90 degrees C radeon cooks itself within six months

        • dragmor
        • 11 years ago

        I’ve got a passive ATI 9550, that’s been running at 90c idle and 110c load for the last 4-5 years without any problems.

        • greeny
        • 11 years ago

        I will make it a red upgrade, and a cooler upgrade too, and it will still be cheaper

        • Kaleid
        • 11 years ago

        Why use stock coolers? They’re pretty much terrible.

    • gtoulouzas
    • 11 years ago

    I wonder when the inevitable lawsuit is going to hit Charlie. It’s one thing when his rants are confined to the inquirer’s cesspool, and another thing altogether when they spill over to respectable sites such as Techreport, as they have lately.

    Unless the rumors are correct after all, and nVidia is afraid of taking it to court for fear of a class action lawsuit or public stock meltdown.

    I’m still having trouble taking anything coming out of Charlie at face value, though. His nVidia rants sound more like something out of a borderline psychopath than your run-of-the-mill harmless techie hack with an axe to grind.

    • tocatl
    • 11 years ago

    My cheap 9600GT with a passive cooler is working perfectly fine, even after some unnecessary overclocks 15% above factory numbers. The same goes for my 8600GT, which was OC’d up to 15% as well; not a single problem…

    • 5150
    • 11 years ago

    Personally I hope Charlie keeps sticking it to them until they come clean. In the mean time, I wish I had gotten the Intel X3100 graphics instead of the NVIDIA I got in my Latitude D630. At least then I wouldn’t have to worry about this thing crapping out on me when I need it most.

      • Flying Fox
      • 11 years ago

      Not to mention that if you value battery life over 3D eye candy, the Intel chip beats the Nvidia one by quite a bit.

        • 5150
        • 11 years ago

        Bingo.

    • moose17145
    • 11 years ago

    I have an Asus F3-series notebook with an 8600M GS in it (not sure of the code name; I don’t really follow mobile chips a lot, but I’ll assume it is in the G84/G86 series). So far it has been running admirably. Granted, I mostly play EVE anymore, but it handles the premium content just fine and without lag. I’ve owned it for a bit over a year now, I think, without issue (other than having to take the bottom cover off the laptop to blow dust and hair out of the heatsink fins and fan… but that is to be expected 🙂)

    Since all this has been happening, though, I have been keeping a closer eye on it to make sure it doesn’t start acting weird. I hope it’s not an "affected" one, if there is a problem.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16834220163

    Been a very good laptop so far!

    • PRIME1
    • 11 years ago

    "Charlie over at the Inquirer"

    I stopped reading after that.

      • MadManOriginal
      • 11 years ago

      While it’s true that anything from the Inq, and especially Charlie talking about NV, needs to be taken with hefty skepticism, he was right about the mobile parts when everyone pooh-poohed him.

        • PRIME1
        • 11 years ago

          Actually, no, he was not right, as he said "every single chip" was bad, including desktops.

          • Meadows
          • 11 years ago

          Prove that he was wrong.

            • StashTheVampede
            • 11 years ago

            Of all the people involved, the one DENYING the issue is the MAKER of the cards. Who do you really believe: the OEMs getting the cards back because they died under warranty, or the company that supplied the potentially defective card?

            Those cards were all made using very similar processes. Desktop and notebook parts are under different forms of heat stress, making the notebook parts more prone to failure. The same issue *could* afflict the desktop parts; we simply may not know until more months go by.

            • yogibbear
            • 11 years ago

            All it really proves is that some engineer wrongly assumed the melting-point standards for the solder material in desktop parts also applied to laptop parts (or something as simple as that; I obviously don’t fully understand the issue at hand, so please flame away).

            This does not mean that "desktop parts are failing too." They might fail if we put our desktop cards in a notebook, but that doesn’t mean they are defective. If anything, it may mean that if you overclock your card on stock cooling it has a greater chance of failure, but that has always been the case.

            • ozy666
            • 11 years ago

            From your link:

            “Could NVIDIA be lying? Yes, it could, but there’s no concrete evidence to support such a conclusion.”

            How is that proof that Charlie was wrong?

            • Meadows
            • 11 years ago

            That is not proof; it’s from nVidia.
            I guess if someone said Intel processors crap out at extraterrestrial rates and Intel said "no, no, just please keep buying them," you would believe Intel.

            • PRIME1
            • 11 years ago

            So proof is some guy at a crappy rumor site?

            OK, well, let me link you an article about the crop circles Hitler implanted in Paris Hilton’s butt. It’s from the National Inq… er, Enquirer, so it must be true!

            • cegras
            • 11 years ago

            Logical fallacy at its finest.

            If a != b, a == c

            Nice.

            • moose17145
            • 11 years ago

            LOL! Wow, for whatever reason, that made me laugh. Thank you!

            • ssidbroadcast
            • 11 years ago

            “… Prove me wrong, children! Prove me wrong!”

            • Reputator
            • 11 years ago

            If you want proof that Charlie is wrong, look at all the freakin’ G9x users who are doing just fine.

            The poll is a good example of this. If you want to talk about logical fallacies, it only takes one instance of contradiction to disprove a generalization.

            • Scrotos
            • 11 years ago

            So 12% (from the current poll numbers) is an acceptable, not abnormally high, failure rate? Or are you dismissing all those stupid enthusiasts who are overclocking and therefore getting their just deserts?

      • ssidbroadcast
      • 11 years ago

      "Charlie over at the Inquirer … I stopped reading after that."

      • lolento
      • 11 years ago

      I agree with you; this guy is fishing for info and forcing nV’s hand.

      Like I said before, the C51 chipset is the culprit. Even the G84 and G86 weren’t affected; they just got bundled with the C51 chipset.

        • grantmeaname
        • 11 years ago

        do you have any evidence of this?

      • CasbahBoy
      • 11 years ago

      Why do some people have such vitriolic hate for the Inq? Is it because they’re occasionally wrong, don’t ‘do’ NDAs, or what? I don’t really understand.

      I’ve been reading The Register a lot and have really enjoyed it, but I haven’t paid much attention to the Inquirer.

    • ThomasDM
    • 11 years ago

    Does NVIDIA-hater Charlie Demerjian have any proof to back up his claims? Why does he hate NVIDIA so much?

      • 5150
      • 11 years ago

      You’re right, he should shut up so NVIDIA can get away with this.

      • eitje
      • 11 years ago

      they ate the last of his peanut butter, and didn’t buy more.

      • grantmeaname
      • 11 years ago

      Well, other than the fact that nVidia took a charge right after he said they were going to, the fact that the mobile parts started failing right after he said they would, the fact that his logic about thermal cycling makes perfect sense, and the fact that he had the oh-so-common Taiwanese insiders tipping him off about it… no, he doesn’t.

    • pyrophreak
    • 11 years ago

    Currently on my third Nvidia video card on my computer in a year. Ati is looking good for the first time for me since that crappy 8500 I had.

      • TheEmrys
      • 11 years ago

      Way to ignore that year and a half of 9700/9800 dominance…. those were good times for ATI.

        • JustAnEngineer
        • 11 years ago

        Yep. Radeon 9700Pro was two generations ahead of NVidia when it was released. I do not expect that we will see that sort of domination again in the near future.

    • Hattig
    • 11 years ago

    I had an 8600GTX fail on me after a week last year.

    I have no idea what code name it was though.

      • Meadows
      • 11 years ago

      That’s an interesting new videocard model.

        • asdsa
        • 11 years ago

        That just points out that nVidia’s naming scheme sucks. There are so many GSes, GTs, GTXes, rebranded 8800s… that only enthusiasts can keep track of them.

          • Kurotetsu
          • 11 years ago

          Or, y’know, you can go through the Device Manager, Display Properties, GPU-Z, look at the box the video card came in, or open up your comp and look inside to see what video card you have. Those are all methods that don’t require you to remember anything.

          Of course, not being able to remember what you spent money on is not really the fault of Nvidia or anyone else.

            • d2brothe
            • 11 years ago

            Err…no…

            An overly complex naming scheme *IS* nvidia’s fault….

            • Meadows
            • 11 years ago

            Read the comment you replied to.

        • Hattig
        • 11 years ago

        Oh, 8600GTS. Sorry. Well, I’m not, because nVidia could have stopped faffing around with their brand names and just sold it as the 8700.

    • Corrado
    • 11 years ago

    About 20% so far… not a good sign.

    /me hugs his 3870.

    • no51
    • 11 years ago

    I’m on my third G92GTS. The first one was borked out of the box. Kept getting driver errors and pink screens. RMA’d that. Second one was also borked. No better than the first one. RMA’d that and the problems were gone. The first two times I thought it was a driver issue, but then by the third one, I was leaning towards hardware issue. But knowing Nvidia, could be both.

    • deruberhanyok
    • 11 years ago

    I don’t own any of the newer nvidia parts. In fact, I haven’t bought any nvidia products since I was burned by the mess with the GeForce 7900/7950 cards. I eventually got a working 7950GT that is still chugging along but the whole experience has turned me off of them completely.

    I have found this whole thing an amusing throwback to those very same 7900 issues.

    • Roscoe
    • 11 years ago

    I’ve got an 8800GT that’s running fine in my desktop PC.

    The 8600M-GT 256MB (DDR2) in my Dell Vostro 1500 laptop is definitely flaky, though, and has been since the moment I first switched it on. The symptom is vertical banding of changed colours across the desktop (2D image). It’s only intermittent, though, and has appeared less since the Dell technician updated the BIOS and video drivers 3 or 4 months ago. (The fan only really kicks into high gear when playing 3D games, though.) I can’t actually remember the last time it happened… must be quite a few weeks ago.

    I’m very happy with the performance of the parts, I just wish I didn’t have to worry about them dying on me 🙁

      • adisor19
      • 11 years ago

      That’s what happened with my XPS1330. Get the mobo changed and the problem will go away. For a while. Then it will start again, and you’ll have to get the mobo changed yet again. Fun.

      Adi

        • eitje
        • 11 years ago

        I think it’s pretty disingenuous to tell him what his issue is based on your own experience. You’ve done absolutely no troubleshooting!

    • jjj
    • 11 years ago

    We might find out more about this today in the Q2 earnings conference call, but at the very least we’ll see how much this cost Nvidia.

    • lex-ington
    • 11 years ago

    I own a 7600GS and I am having trouble leaving my machine on overnight. No matter what I do, the video drivers fail each and every night, and have for the past 5 months now. The card works fine in my Linux machine, but not in my XP machine. It is REALLY annoying to have to restart my working machine every morning.

      • UberGerbil
      • 11 years ago

      Tried hibernating? Or is the machine actually running a job all night? (In which case I’d suspect overheating)

      I assume you’ve been updating drivers trying to fix this issue.

        • lex-ington
        • 11 years ago

        No hibernating. No jobs running, unless you call AVG a job. I always let the monitors turn off, but that’s it.

        It started off sporadic, but now it’s every night. I have never had this problem with anything, so it is confusing me a lot.

        I’ll be getting a 3870 soon anyways – I keep seeing instant rebates on them to drop the price to around $139, so I’ll bite soon.

          • d2brothe
          • 11 years ago

          Agreed, that would be annoying, but why not hibernate your machine anyway and save on power, if it’s not doing anything?

    • Jigar
    • 11 years ago

    My system didn’t boot today and I didn’t have much time to mess with it… so I don’t know if my 8800GT is dead or not, but I will check when I’m home. I guess the heavy rain at my place might have something to do with it.

    • Krogoth
    • 11 years ago

    It seems that the whole problem, if any, is probably linked to a certain batch of chips.

    I want to see the dates on "defective" chips, if there are any. That is where you can start to make some correlations.

    I suspect this problem lies with TSMC, and perhaps Nvidia’s QA being too lenient on yields.

      • HighTech4US
      • 11 years ago

      ./[

        • DASQ
        • 11 years ago

        The XP/Thunderbird exploding is not analogous to this situation: that was simply no cooling combined with a lack of internal throttling measures, as opposed to the heat sitting at a high median for extended periods of time.

        • Krogoth
        • 11 years ago

        Ahem, I believe TSMC is responsible for packaging the chip itself, which is why I suspect the problem is mostly their fault.

        Semiconductor fabs don’t just make the silicon; they also package the chip assembly on site.

    • DASQ
    • 11 years ago

    Is the last poll option really necessary?

      • Fighthouse
      • 11 years ago

      yes……………..

      • Meadows
      • 11 years ago

      No, it completely distorts the poll and overshadows its purpose.

        • DrDillyBar
        • 11 years ago

        I think it’s completely relevant based on the question.

          • DASQ
          • 11 years ago

          How does "I don’t own one" possibly work into a figure detailing the failure-to-ownership ratio of the G92/G94 series of GPUs?

          It’s statistically retarded, actually.

            • DrDillyBar
            • 11 years ago

            It’s easy to exclude those numbers if you want. And at least non-owners get to click "vote" that way.
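Excluding the null option really is a one-liner. The tallies below are hypothetical stand-ins for the poll’s actual numbers, just to show the arithmetic:

```python
# Hypothetical tallies -- stand-ins, not the actual poll results.
poll = {"working fine": 250, "failed": 55, "i don't own one": 120}

# Drop the null option, then compute failure rate among owners only.
owners = {k: v for k, v in poll.items() if k != "i don't own one"}
failure_rate = owners["failed"] / sum(owners.values())
print(f"{failure_rate:.1%}")  # → 18.0%
```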

            • DASQ
            • 11 years ago

            That’s true, but at least make it something like ‘Bring Back Wild Mountain Chicken Sandwiches at Wendy’s’

            • tray56
            • 11 years ago

            I say we have a poll on the poll

            • A.C.Sanchez
            • 11 years ago

            Wouldn’t you rather have an “I don’t own one” option rather than a bunch of non-owners choosing random responses and REALLY screwing up the poll results?

            Makes sense to me.

            • DASQ
            • 11 years ago

            No, because there wouldn’t be an actual difference. The people who are just going to screw around and randomly pick answers aren’t going to be voting for the ‘Null’ option anyway. They’re going to f*ck up your poll no matter how many options you include.

        • eitje
        • 11 years ago

        luckily, this is a dictatorship, not a democracy – your vote (and mine) don’t matter!

          • DASQ
          • 11 years ago

          It doesn’t matter whether this is a fascist dictatorship or an open democracy, because we’re not deciding anything through the votes; it is merely polling the populace to gather statistical data.

          Take your angry politics elsewhere? 😀

            • eitje
            • 11 years ago

            psch – you read angry politics, i was showing my religious fervor for all that is TechReport. 😛

            take your misguided assessments elsewhere! how about them apples? 😉

    • adisor19
    • 11 years ago

    2 Dell XPS 1330s failed in less than 1 year. Bravo nVidia, BRAVO!

    Adi

      • DASQ
      • 11 years ago

      Why aren’t you blaming Microsoft and then praising Mac? I’m confused.

        • adisor19
        • 11 years ago

        LOL

        Well, I wish I could, but the subject matter just doesn’t give me anything to relate it to 😉

        Adi

          • MadManOriginal
          • 11 years ago

          Come on, get creative, I’m sure you can do it!

          • eitje
          • 11 years ago

            I bet Macs use a better cooling algorithm, or superior case design, so the cards simply don’t overheat and fail when installed in Apple computers.

            • ucisilentbob
            • 11 years ago

            Doubt it. My sister is on her third MacBook Pro in 4 months.

            • grantmeaname
            • 11 years ago

            no, the JDF cools them so they won’t fail.

            • adisor19
            • 11 years ago

            Sadly, it doesn’t help the situation. It’s still the same NVidia crap chip inside :s

            Adi
