Charlie’s latest rant says GF100 is a lost cause

As we step into the latter half of the first-quarter release time frame for Nvidia’s GF100, Charlie Demerjian of SemiAccurate has published another scathing article that labels the upcoming graphics processor as "unworkable, unmanufacturable, and unfixable."

According to Demerjian, Nvidia is facing silicon yields in the single digits with the current GF100 spin—and that’s with the GPU clocked down to 600MHz with 64 of its 512 stream processors disabled. After three spins, the GF100 is allegedly not doing much better than the first iteration Nvidia got back in September. (Word is that spin topped out at 500MHz with similarly low yields.)

The problem, in Demerjian’s view, is that Nvidia failed to do its homework: rather than working out the kinks in TSMC’s 40-nm process on smaller, previous-generation GPUs first and then carrying the workarounds over to the GF100, it went straight for the big chip. While AMD was making "salable quantities" of its 40-nm RV740 in April of last year, he claims, Nvidia only managed to ship its 40-nm GT216 and GT218 GPUs to PC vendors in August, and the cards only hit retail in October. Nvidia had enough trouble getting those low-end GPUs with footprints around 100 mm² out the door, yet it immediately turned around and tried to push a 550 mm² design through the same process. AMD, on the other hand, had time to implement workarounds learned from the RV740 in its Evergreen series of GPUs.

The result, he claims, is lower-than-expected performance, very high power consumption, and extremely poor yields for the GF100. If a single GF100 wafer costs Nvidia $5,000 and only contains 10 usable GPUs, Demerjian points out, then each chip will cost the company $500. That’s not counting the bill of materials for the rest of the card, either. By contrast, the article suggests AMD’s high-end Cypress GPUs only cost around $50 a pop.
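
For a concrete sense of that math, here is a minimal sketch of the pay-per-wafer cost model; the $5,000 wafer price and the good-die counts are Demerjian’s unconfirmed figures, not audited numbers:

    # Pay-per-wafer foundry economics: the customer pays a fixed price per
    # wafer, so each working chip costs the wafer price divided by the
    # number of good dice. Figures are Demerjian's estimates, not audited.

    WAFER_COST = 5_000  # USD per wafer (Demerjian's estimate)

    def cost_per_good_die(good_dice: int) -> float:
        """Effective cost of one working chip on a pay-per-wafer contract."""
        return WAFER_COST / good_dice

    print(cost_per_good_die(10))   # 500.0 -- the GF100 scenario above
    print(cost_per_good_die(100))  # 50.0  -- roughly the Cypress figure cited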

To do things right, Demerjian reckons Nvidia would have to retool the GF100, do a "base layer respin," and follow up with "at least one metal spin on top of that." That process would take six months, and even if Nvidia got started today, the re-architected GF100 would show up mere months before the arrival of the first 28-nm graphics processors. In the end, the only Nvidia DirectX 11 GPUs he expects we’ll see before the end of this year could be limited numbers of GF100 cards, all sold at a loss. (GF100 derivatives reportedly "haven’t taped out yet.")

Demerjian is known for mercilessly slamming Nvidia, of course, so you may want to take this report with a grain of salt. Still, Nvidia could be in trouble if the situation is even half as bad as he claims.

Comments closed
    • anotherengineer
    • 10 years ago

    Either way it just goes to show how we all take silicon for granted. We just expect everything to work and don’t realize the incredible challenges involved in making a fully functional chip with great yields.

    It’s taken years of time, billions if not trillions of dollars, and more people with PhDs than probably any other field to get where we are in the silicon age.

    Hopefully the advancement continues as such.

    • thomasxstewart
    • 10 years ago

    Charlie’s Mentor, Thomas Stewart von Drashek MD, both Live in SE Minneapolis, MN on the University of Minnesota Campus; both are Engineering Students. Plus Mad Mike Magee: both Thomas & Mike are located in Oxford, although Thomas Stewart Spends Much of his time in Washington, DC on Bolling Air Force Base. They Are the Oddest Team & an Accurate Team At That of Predicting the Future of Computing, Today’s Results & How It Might Work.

    Mad Mike has gone On to Simply Plant himself inside Intel & Party, Finding Out more in a year Than in a Lifetime of Reading. Charlie’s Dive Into How Everywhere Semiconductors Are used Is Enormous.

    Charlie, Tom & Mike Share the Same Women, Smoke the Same dope & Rant the Same Rant. The Leading Question Is: Is Charlie Mike & Mike Drashek, or Vice Versa?

    If the Three Musketeers, along with a Few Females In tow, were North American natives, It’d Be the same Scene.

    Mike & Charlie Write, Thomas Comments. Sometimes they type so Blazing HOT, it Lights the Screen On FIRE. From Murder, Mayhem to Science & Imagination, Bursting Seams & Playing the public a finer Lute than Ever Scorn’d earth BeFore.

    STeWie

    • Sahrin
    • 10 years ago

    Posted in the wrong place

    • moritzgedig
    • 10 years ago

    deleted (somehow failed to reply to the thread)

    • anotherengineer
    • 10 years ago

    Well, I don’t think it’s a lost cause, just a bit late, that’s all. Better to have something late that works than something right away with bugs.

    In other news, looks like it may be a good year for ATI drivers 🙂

    • Sahrin
    • 10 years ago

    I don’t know any of nVidia’s internal engineering data about GF100; and, logic aside, Charlie does tend to foam at the mouth a bit when talking about nVidia.

    All that said, at this point it really seems like their best option is to write off GF100. The most encouraging thing for nVidia is that they made money in Q4, despite the fact that ATI was crushing them competitively. I wish I knew what special sauce keeps nVidia in the black when ATI struggles with such an obviously superior product.

    This sucks from a consumer standpoint, though, because the promised price drop of Evergreen will not be forthcoming if nVidia launches a single high-end card for $600 that is only 15% faster than Cypress at $400.

    *frowny face*

    I wonder where they are with a 40nm shrink of GT200b? That could be a *very* attractive part in the mid-range if it was priced right.

      • moritzgedig
      • 10 years ago


        • Sahrin
        • 10 years ago

        Cancelling GF100 only makes sense if it’s unfixable *or* if the cost and time of fixing it exceed the length of the product cycle. As AMD learned against Intel the hard way with Barcelona (and Intel learned against AMD with Prescott), being late and broken can be worse than not showing up at all.

        If the roadmap holds, AMD will have RV970/Northern Islands out in 7-10 months. If Fermi is even 60% faster than RV870/Cypress, as nVidia claims (and it sounds like that will most assuredly not be the case), RV970 at 32nm looks to be 100+% faster than RV870. nVidia will be ‘showing up’ with a last-gen part; then the issue becomes how they get back in competition. AMD will have launched two new GPUs on successively newer processes; on top of that, nVidia won’t even have a DX11 part to migrate to 32nm (and thus test out a proven product on the new process).

        I’m not trying to say “OMG nVidia Doom”; I’m saying that at some point, nVidia should seriously consider writing off Fermi, respinning GT200b at 40nm and selling it in the midrange against the 5770/5850 (which it can be competitive against), and focusing engineering resources on a top-to-bottom launch of GF200 (the Fermi successor, whatever you want to call it).

        I’m rooting for nVidia to do a big launch with GF100 in March – but if that doesn’t happen, I’m willing to ‘write off’ this generation from a performance and competitiveness standpoint in the interest of seeing them compete in Fall 2010.

          • moritzgedig
          • 10 years ago


            • Sahrin
            • 10 years ago


    • moritzgedig
    • 10 years ago

    Shouldn’t TSMC take a lot of the blame?
    It seems like they promised reliability they can’t yet deliver.
    Or someone made the very dumb decision to fabricate a huge chip on a process with high defect density.

      • flip-mode
      • 10 years ago

      Yes, TSMC deserves a lot of flak for all of this, but the buck stops with Nvidia. Nvidia is the one that decides how big its chip will be, what steps to take to handle defects, and how much to trust a new process.

      Nvidia has been designing enormous chips since the G80, then the GT200, and now Fermi, leaving themselves no room for error at all. Realistically, they got lucky with G80, and they already took a hit with GT200, except with GT200 they had G92 as a much more acceptable fallback until they got GT200b out. I’m surprised the GT200 didn’t scare them enough to avert this. No doubt, this Fermi debacle will influence company decisions for years and years to come, much as the FX5800 debacle did.

      • clone
      • 10 years ago

      I don’t really believe they do… TSMC got the ATI silicon working; it was Nvidia that chose to go with a huge monolithic design. They chose to start fresh from the ground up, and they chose to ignore ATI’s direction entirely, along with the lessons that were open for all to see following the triumph of ATI’s 4xxx series.

      Even had things gone better, it’s not as if Fermi wasn’t going to be a bear to produce, or as if yields wouldn’t always have been an issue.

      Nvidia was overly ambitious. It’s funny now that I was saying Nvidia wouldn’t have working cards out before Christmas… I’m surprised it’s been this long, with the wait apparently growing much longer.

        • moritzgedig
        • 10 years ago


          • Sahrin
          • 10 years ago

          This is all based on Charlie’s rumor mongering, but if his yield numbers are even within a factor of two of accurate, then nVidia’s problems are an order of magnitude worse than ATI’s were (TSMC reported 40% yields in Q3; Charlie claims nVidia is getting low single digits).

          • clone
          • 10 years ago

          I’m not sure anyone said that 40nm was going to be easy… Nvidia and ATI both were “betting” that 40nm would be up and running by the time their designs were good to go… Nvidia bet far more heavily on it than ATI by going with such a huge transistor count, and seems to have lost this round quite badly.

            • moritzgedig
            • 10 years ago


      • Sahrin
      • 10 years ago

      Not really; TSMC passes almost all the risk along to the customer. You pay per wafer (meaning that your cost is fixed, plus mask sets and other materials), so if you get one good die out of a wafer, you paid $10,000 (or $5k, something like that) for that die… if you get 1,000, you paid $10 per die.

      They also pass along their spec for the process (what characteristics it will have), and they will promise very, very little in that spec. It’s the job of the designer to figure out how conservative TSMC is being in the spec, because if the design assumes the spec is true, the designer will lose a colossal amount of money.

      TSMC is a foundry; they develop process technology to the best of their ability, and it’s up to the designer to make effective use of it. nVidia’s design isn’t well adapted to TSMC’s process – ATI’s is.

      Now, there was the initial issue, which Charlie ascribed to a miscalibrated tool in production – that is something that is TSMC’s fault, because TSMC wasn’t producing to spec. This is not the norm, however. Wafer defects, for instance, are *not* TSMC’s responsibility; they are considered an artifact of the manufacturing process.

        • moritzgedig
        • 10 years ago

        But the customer can’t affect the defect density?
        nVidia can’t change the detailed geometry or electrical characteristics? All they can do is choose a process variation with a different optimization emphasis?

          • Sahrin
          • 10 years ago

          Not really, no. Think about it – how would feature size/density/whatever affect defect density (without requiring a negative adjustment)? Every wafer has defects; manufacturing’s job is to minimize them.

          The customer’s job is to plan for the defects and make the design as resilient to them as possible.
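
          To see why die area dominates that planning, consider the textbook Poisson yield model; this is a rough sketch only, and the defect density below is an illustrative value, not TSMC’s actual number:

              import math

              def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
                  """Probability that a die of the given area contains zero random
                  defects. Real foundry models (Murphy, negative binomial) differ
                  in detail, but yield still falls off exponentially with area."""
                  return math.exp(-die_area_mm2 * defects_per_mm2)

              D0 = 0.004  # defects per mm^2 -- an illustrative value only

              # A ~100 mm^2 low-end die vs. a ~550 mm^2 GF100-class die on the
              # same process: the same defect density hits the big die far harder.
              print(f"100 mm^2: {poisson_yield(100, D0):.0%}")  # ~67%
              print(f"550 mm^2: {poisson_yield(550, D0):.0%}")  # ~11%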

            • moritzgedig
            • 10 years ago


      • darkpulse
      • 10 years ago

      why should tsmc take the blame??

    • bogbox
    • 10 years ago

    Charlie is right, and Nvidia somewhat confirms it.

    This April-May launch will be a small downgrade from the original design.

    In an article I read, Nvidia said the next-gen Fermi (not GF100) will launch in the fourth quarter, by Xmas this year.
    Maybe the next-gen Fermi, GF101, will be a real product, while April’s will be a stopgap spin.

    • nsx241
    • 10 years ago

    I’ll say this: There’s Charlie and then there’s Nvidia Marketing. The truth is probably stuck in the middle there somewhere.

    • clone
    • 10 years ago

    I bought an HD 5770 early on and was hoping to wait for price adjustments before buying a second, or selling it and upgrading to whatever Nvidia offered or a 5870, depending… no price adjustments and no Nvidia silicon.

    sniff.

    • mcnabney
    • 10 years ago

    Single digit yields?!?!

    Is there a precedent for this level of fail?

    • albundy
    • 10 years ago

    well, they would have released it around when amd released their 5xxx series, but i guess things didn’t go too well for fermi. the problem lies in selling it if it’s released later this year, as amd already has an armada ready to sell. it’s really hard not to consider a chip that’s future-proof, and unfortunately, no NV card falls in that category these days. it’s not that it’s a lost cause, but how long will the catch-up game last, and how much of a loss can NV sustain during that time?

    • tejas84
    • 10 years ago

    Demerjian and Nvidia = Senator Joseph McCarthy and Communism

    Demerjian is an uneducated idiot journalist. Come back to me, Charlie, when you have a BSc and an MSc in Computer Science like me and are actually educated in the field that you cover.

    Having said that, Fermi is a total disaster, and Nvidia probably knows that this generation is utterly lost. ATI Cypress is just too good, and I should know, as I’m sporting one right now.

    Long live Tech Report and Scott and the crew

      • Vhalidictes
      • 10 years ago

      Holy missing the point, Batman! If Charlie were a Computer Scientist, he’d be working at ATI… on driver software.

      Setting aside that you’re citing entirely the wrong degree (it would be Electrical Engineering, for those not in the know), Charlie gets the info right because he is actually a reporter, and *reports* what he finds out on his blog.

      In my opinion, he seems to know enough about the field to get by, as we are all discussing his work.

      • Vaughn
      • 10 years ago

      Wait, so because Charlie doesn’t have a Comp Sci degree he is not allowed to have an opinion?

      Do I have to be an engineer before I say Honda Civics are shit cars?

        • anotherengineer
        • 10 years ago

        “I say Honda Civics are shit cars?”

        really?!?!

        Guess I will stick with my 1976 Chevy 3/4-ton pickup; the old 350 V8 still runs flawlessly. If only I could find cheaper body parts; the salt up here eats steel body parts for breakfast. It’s so old the gauges are in imperial, and I’m in Canada, lol. I think GM came out with metric gauges up here around ’78?

      • PrincipalSkinner
      • 10 years ago

      No, you did not use your post to brag about your degrees. Not at all. Talk about ego…

      • SNM
      • 10 years ago

      You know, Damage’s degrees are all in religion. You’re one hell of an education ass, and coming from me that really means something.

    • NeXus 6
    • 10 years ago

    Anybody got a long list of DX11 game titles coming out in 2010 to share?

      • 2x4
      • 10 years ago

      there are a few, metro 2033 being one of them. this one sounds good – no HUD, no indicators, no radar; if you want to live, you have to manually check the oxygen level in your gas mask, etc. can’t wait

      • albundy
      • 10 years ago

      here you go… enjoy.

      http://www.digitalbattle.com/2009/10/27/top-10-upcoming-directx-11-games/

      you should seriously google before you post.

    • FuturePastNow
    • 10 years ago

    That’s about as devastating an assessment as one can make about a company’s product. Charlie has been on-point about Fermi to date, so…

    Hopefully this will not make ATI complacent, or greedy. Prices have already gone up a bit in the last year.

    • no51
    • 10 years ago

    So… why did Nvidia concentrate on Fermi instead of making GT200 derivatives again? Seems reminiscent of the jump from the X1xxx to HD2xxx series when they should have gone HD2xxx to HD3xxx.

    • PRIME1
    • 10 years ago

    There’s no hurry for Fermi to get here. With how poorly the 5xxx series is doing, NVIDIA is actually gaining market share with video cards. Using last gen parts. Now that has to hurt ATI a lot and their fans even more.

    Funny how Charlie never reports facts like that.

    http://jonpeddie.com/press-releases/details/astounding-year-to-year-growth-in-pc-graphics-quarter-to-quarter-also-beats/

      • Vaughn
      • 10 years ago

      lol, I think it’s time to get off the meds, Prime!

      “How poorly the 5xxx series is doing”??

      Is that why demand has far outstripped the supply of these cards?

      Are you serious or seriously delusional?

      I usually like reading your posts, but sorry, -1 rep!

      • MrBojangles
      • 10 years ago

      Did you miss the part in that same article where it states that AMD lost ground in discrete due to supply constraints? I.e., they were selling the cards faster than they could make them. I suppose you also missed the chart showing year-to-year market growth of 91.5% for AMD vs. 47.3% growth for Nvidia.

      Funny how PRIME1 never reports facts like that, even from his own article links.

      • Krogoth
      • 10 years ago

      Nvidia is riding on its OEM sales for this round. (Mostly through Apple.)

        • PRIME1
        • 10 years ago

        Apple equips very few of its systems with video cards. Not nearly enough to make a big difference in market share.

          • glynor
          • 10 years ago

          What are you talking about? ALL Macs now have a minimum of a GeForce 9400M GPU, including the Mac Mini, iMacs, and base-model MacBooks (which are by far their best sellers).

            • PRIME1
            • 10 years ago

            The 9400M is an integrated chip; it’s not a separate video card.

            • mcnabney
            • 10 years ago

            I thought we were counting GPUs and not discrete cards? Nvidia’s numbers are bolstered by all of their integrated GPUs, an area where ATI is nowhere near a dominant player.
            However, Tegra 2 is pretty damn cool.

            • glynor
            • 10 years ago

            Nevermind….

      • Game_boy
      • 10 years ago

      Read my post #27. Charlie said this round of Nvidia financials would be very positive.

    • Krogoth
    • 10 years ago

    Fermi is already the next FX.

    The delays from manufacturing difficulties, the hush-hush on real-world performance, PR releases that amount to “wait for us!”, an architecture that is too overambitious and falls short of what it can do on paper.

    I just hope Nvidia can respin the design into a more suitable form. The second batch of the FX series wasn’t nearly as bad.

      • Meadows
      • 10 years ago


        • Krogoth
        • 10 years ago

        Nvidia’s hush-hush on actual performance figures spells trouble. They just keep throwing out marketing speak and placing massive emphasis on Fermi’s applications in GPGPU-related tasks. That is another bad sign.

        In the past, Nvidia “leaked” performance figures when they knew that their next-generation product was going to rock the market. The only time they didn’t do that was with the launch of the FX series.

          • Silus
          • 10 years ago

          Actually, the last time that happened was with G80. There was almost no worthwhile information on it until 2-3 weeks before launch.

          As for “Charlie” and his rants, it gets tiresome to see what this moron writes on sites like TR. Unfortunately, it’s still posted on the front news page as if it were actual news…

            • flip-mode
            • 10 years ago

            Again with this nonsensical notion that you get to decide for the rest of us what gets posted on the site and what does not. Why don’t you leave it up to the TR staff – it’s their site… you can always choose not to read any given posting.

            • Silus
            • 10 years ago

            You’re right. It’s “nonsensical” and has nothing to do with what I said. I simply stated how unfortunate it is to see it posted here, given the type of “article” this is. Never was I deciding what gets posted here. You need to read better next time.
            And I do have the right to express my opinion here, even if that were the case. It’s certainly not you that’s going to stop it, despite your “Charlie protection” efforts, for whatever reason.

            • flip-mode
            • 10 years ago

            Likewise, your post has nothing to do with what I said; I simply said it is unfortunate that you continually criticize the posting of certain types of articles here. But, yes, you have the right to express your opinion, and I have the right to express the opinion that your opinion is…

    • beck2448
    • 10 years ago

    BTW, Charlie predicted the 280 would be a failure. How did that work out?

      • Game_boy
      • 10 years ago

      What are the specific claims you are disputing (with quotations)? Ignore the ‘opinion’ parts of his articles; those are exaggerated. I mean statements which are provable (like ‘GT200 will use more power than x,’ not ‘GT200 will suck’).

      • glynor
      • 10 years ago

      Actually, he repeatedly said it would be big, hot, difficult to manufacture, and lead to profit and margin declines. Conveniently, Nvidia just released their Q4 and year-end results for 2009 a few days ago.

      • jdaven
      • 10 years ago

      So being wrong some of the time in your world means being wrong all the time? Oh, and FYI, it’s worked out great, since Charlie was right this time.

    • Fighterpilot
    • 10 years ago

    Looks like the Green car is about to crash and burn… ouch… poor old InFermi.
    Brian_S, Maroon1, Z-Man and Primey… ATi fans salute you 🙂

    • dklingen
    • 10 years ago

    I find Charlie’s articles to be very interesting and technically well written, and I don’t recall seeing many holes in how things have played out. I do agree that he definitely does not like Nvidia, and the bias shows.

    I do find it interesting how many people respond against Charlie and give Nvidia a total pass on their PR, their lying, and the FACT that they still don’t have a product (not to mention the wacky rebranding). If you want the Nvidia fan page, read Fudzilla (their articles and support of Fermi have been nothing short of living in denial).

    Anyhow, we need the competition to bring the pricing down. AMD is likely laughing all the way to the bank at the moment, because I would bet their original model already had the 5850 < $250 and the 5870 < $350 by now. But without competition, they are milking the profits (which they probably desperately need). I would also bet the 5890 is just sitting in the wings awaiting Fermi…

    Whatever happened to the 5950, BTW?

      • Game_boy
      • 10 years ago

      There never was a 5950.

      • Preyfar
      • 10 years ago

      A 5890 may be nice, but it would depend on cost. The 5870 is expensive enough as it is. And as nice as the 5970 is… so few people can actually use it, due to the huge length of the card. Anybody with a mid-size tower is sort of SOL.

      ATI’s definitely in a good place, though; I just hope future cards aren’t so freakin’ long.

      • anotherengineer
      • 10 years ago

      “I would bet their original model already had the 5850 < $250 and the 5870 < $350 by now. But without competition they are milking the profits (which they probably desperately need).”

      I think the 5850 originally retailed at $299 US from ATI, which it still does, as I can get one for $310 CAD. I think that’s a decent price, considering I bought my 4850 for $220 several months after it came out, and the 5850 is almost double the card, with more features to boot.

      I don’t know why everyone complains about the pricing; I remember paying $300 for my old X850 Pro AGP card with only 256MB of RAM.

      The 5850 is on par with, or slightly above, the GTX 285 in most things, and the GTX is often $100+ more expensive. Considering that, I think $300 for a 5850 is a good deal.

        • toyota
        • 10 years ago

        Well, the 4850 had an MSRP of only $199 at launch and quickly became cheaper. The 5850 raised that price point to $259 at its launch and then got more expensive, hitting $299 or more on average. IMO, a 5850 only looks like a decent deal compared to the severely overpriced GTX 285. Other than that, all of the 5000-series cards are much worse deals than the 4000 series ever were. The 4000 series had a vastly superior bang-for-buck advantage.

          • ihira
          • 10 years ago

          Why should they price the 5850 at $199 when it’s better than a GTX 285 that was selling for, what, $350ish then?

          Pricing is (usually) relative to performance, not model numbers.
          The 4850 had to compete against $200-$300 G92s at the time, so AMD priced it at $199.

          The 5850 is priced very well compared to nVidia’s offerings, IMO.

            • toyota
            • 10 years ago

            Cards usually fill in about the same price points as before while increasing performance at those price points. If we always paid more for increases in performance, then we would be paying several thousand dollars for video cards and CPUs by now.

            I already said the GTX 285 is severely overpriced, so that’s the only reason the 5850 is a good deal. Heck, all cards are overpriced at the moment compared to a year ago. Even the GTX 260 costs more now than when I bought one over a year ago. Right now is a very bad time if bang for buck is important.

            • rhema83
            • 10 years ago

            Which is why, while I believe Charlie’s rant (and Anand’s article on the via and channel-length issues), I still hope that NVidia can pull off a miracle. Otherwise I will be stuck with my 4850 512MB (not a bad card by any count, of course) for another year.

    • JokerCPoC
    • 10 years ago

    Nvidia, bring back production of the GTX 200-series chips and boards until Fermi is firmly in production. Yer shooting yerself in the foot otherwise.

      • clone
      • 10 years ago

      Weak performance, weak feature set, expensive to manufacture… no point; it would be a loss leader for the sake of a fickle market. Volume production would lead to a huge write-down later in an effort to clear out the pipeline, and the write-down would steal future sales… a lose-lose situation.

      Pass on the market share now and take the stock hit, then look all the better later and grab it back when you get something out to earn it.

      One thing about the graphics market is that it’s fickle… these aren’t huge purchases, so regaining a position isn’t that hard.

    • guest85
    • 10 years ago


    • PRIME1
    • 10 years ago

    Sorry, Charlie, but the chip is already in production and has been for a while.

    http://www.engadget.com/2010/01/07/live-from-nvidias-ces-press-event/

    What a douchebag. He should stick to writing what he knows about… nothing.

      • flip-mode
      • 10 years ago

      As accurate as he has been up to this point, I can understand why you bristle at this. Don’t take it personally.

        • jdaven
        • 10 years ago

        PRIME1 is our resident Nvidia fanboy. It’s in his/her best interest to support his/her worldview by ignoring everything Charlie has said in the past and resetting his/her beliefs every time Charlie gets another prediction right.

        • MadManOriginal
        • 10 years ago

        If one is to take NV’s demonstration and production statements as true, it’s not unreasonable to bristle at it, fanboy or not.

      • MrBojangles
      • 10 years ago

      99% of that press conference seemed to be focused on Tegra and 3D Vision. The only remarks listed about GF100 were:

      “2:43PM The new GF100 GPU is in production, it’s “ramping very hard,” and will be on display here at CES.”

      “2:45PM He still didn’t tell us when GF100 would ship, and it’s over! What a tease. At least he’s aware that people are curious, very sensitive of him.”

      I’m just confused by how that link was meant to discredit Charlie’s prediction in any way. IMHO, it just kinda gives it more ground to stand on, seeing how release dates and production were subjects avoided to the very end, with essentially nothing concrete said about either one.

      Note: both comments came in the last five minutes, presumably just before the speaker hightailed it off stage. Or at least I would have at that point.

      • glynor
      • 10 years ago

      Nice straw-man argument, PRIME1. Charlie never argues that the chip isn’t in production. In fact, he acknowledges as much at the start of his second paragraph.

      • ClickClick5
      • 10 years ago

      Guys, stop feeding the troll.

        • PRIME1
        • 10 years ago

        Yes, stop feeding Charlie.

          • flip-mode
          • 10 years ago

          You need to get over it. Charlie has been right on the vast majority of his coverage of Nvidia, from “bumpgate” through his ongoing coverage of Fermi. And there’s little he says in the article that wasn’t known already: Fermi is large, hot, power hungry, and the yields are terrible. The cards are going to be very expensive for Nvidia to produce and may have to be sold at a loss. The boldest claim Charlie makes is that Nvidia basically has no handle on TSMC’s fabbing problems – no equivalent of the “double via” solution that ATI has – and I imagine that is a claim that will never be fully proven or refuted. Nvidia has not been making any noise about Fermi – no product demos, no backroom demonstrations; they ran one demonstration of the card at CES, as far as I know.

          You love Nvidia; Charlie hates Nvidia; Charlie just has more material to work with than you right now. No matter how you slice it, ATI is feeding Nvidia a can of whoop-ass: power consumption, performance in DX10 and DX11, single-chip performance, single-card performance, multi-card performance, visual quality, product availability, desktop, and now mobile. Nvidia’s just having a really, really bad generation with Fermi, a.k.a. another FX5800. What’s saving them right now is how well they’ve done on the software side of things with CUDA, drivers, game profiles, and TWIMTBP, plus the fact that ATI has a huge gaping $150 hole between the 5770 and the 5850 (how ATI managed to leave a hole in the most important price slot is beyond me) where Nvidia can credibly sell GTX 260s.

          Nvidia will get back on its feet and inevitably the tables will turn at some point and ATI will have another bad generation. Of course, it would be nice if both companies could execute flawlessly in perpetuity… but that’s just not the way it goes. Even Intel will likely misstep again and give AMD another opening if we all wait long enough….

            • jdaven
            • 10 years ago

            No matter how reasonable you sound, flip-mode, PRIME1 just can’t hear you. He can’t, or the choices he has made in life, aligning himself with a hardware corporation of all things, would be for nothing. He must not hear you. Therefore, in his head, everything Charlie has written must be wrong: there never was a bumpgate, no Fermi delays, no fake samples.

            On a lighter note, the rest of us here are fine. We hear you, flip-mode, and realize that these are just corporations, and we’ll buy whatever works well and makes us happy with our disposable income.

            • PRIME1
            • 10 years ago

            Actually, no. Chuck was saying the bump issue extended onto desktop parts, which turned out to be complete bullpoop. I can see why ATI fans such as yourself polish his knob, but to the rest of the world he is just a tabloid monkey looking for shots of Paris Hilton’s shaved kitty.

            • ClickClick5
            • 10 years ago

            Speaking of fanboys, has anyone seen the movie ‘Fanboys’?

            This is just getting too hostile here; time for some comic relief.

            • Fighterpilot
            • 10 years ago

            Which “world” would that be, then…? Wally World?
            The one we presently live on thinks Charlie has it right about this.
            Witness the tone of the postings here…
            Just suck it up… you’re gonna need the practice 😉

            • flip-mode
            • 10 years ago

            No one polishes his knob; you just can’t see that, because you hate him so much for speaking out against your beloved, accurate or not.

            Similarly, I keep telling you I’m not an ATI fanboy, and past comments will prove it, so go start digging if you care enough.

            • PRIME1
            • 10 years ago

            I’ve read your posts, you are a hypocrite.

            • flip-mode
            • 10 years ago

            Heh, OK then. Thanks for playing. (By the way, you should have used a semicolon there….)

            • VILLAIN_xx
            • 10 years ago

            Flip, you know who Prime is. There’s no reasoning with it. lol.

        • ew
        • 10 years ago

        PRO TIP: Click on the little plus minus symbol on the bottom right of a comment to easily ignore the whole thread.

    • cegras
    • 10 years ago

    I would believe him this time, but most of his article seems like a lot of fluff around two key points that Anand pointed out, namely the vias and the transistor variance. I’m not sure if he knew about this before Anand’s article, and now that it’s out, we’ll never know.

    • Illissius
    • 10 years ago

    I like nvidia (not a giant fanboy, just like them), but Charlie’s been completely and utterly correct so far, so I feel inclined to believe him. Well, actually, he’s not…

      • Game_boy
      • 10 years ago

      Do we know PolyMorph is a success yet?

      In particular, how does it function under non-synthetic tessellation [that means not Unigine], and how well does it scale to lower-end GPUs? (Does scaling with shader count represent adequate or inadequate capability for the entry-level cards? Is AMD’s fixed performance across the lineup a better option?)

      We can’t tell until first reviews.

        • Illissius
        • 10 years ago

        We can’t tell how well it works, but we can tell that it was an intentional effort at innovation and not a hack born of necessity, as Charlie claimed.

    • MrBojangles
    • 10 years ago

    #19 “Intel GPUs are like the prizes in Crackerjacks, they’re not very good and no one would have them if they weren’t included. ”

    I literally loled

    • blitzy
    • 10 years ago

    saw this coming a mile away

    • Jigar
    • 10 years ago

    Looks like we will see one more profitable quarter from AMD 😉

    • VILLAIN_xx
    • 10 years ago

    This doesn’t sound good. I’m nowhere near an Nvidia or ATI fan, but I just don’t like news that Nvidia won’t be putting pressure on its red competitor to make better future products. Well, I hope AMD doesn’t get too complacent, the way it did just before Intel came out with Conroe.

    Perhaps being the temporary speed king of video cards and charging higher prices would help out their financial problems a little. O_o

    • beck2448
    • 10 years ago

    According to Charlie, Nvidia should be out of business by now, but they’re still going strong. They’ll come out with a great product before too long.

    • spigzone
    • 10 years ago

    Actually, Charlie has been right.

    • glynor
    • 10 years ago

    All of this points to Nvidia planning a classic paper “launch,” where the press is sent hand-picked chips individually selected to run at the absolute lowest power draw and highest clock speeds, with vaguely promised retail availability “in a few weeks.” Then the retail cards show up late, hotter, and slower than the ones the press got (maybe with functional units disabled and a lower SKU being the only one available at retail). If they are even available at retail at launch at all, I wouldn’t be surprised to find that the initial stocks evaporate instantly and that after-launch availability ends up being even WORSE than Cypress’s was when it first came out.

    With this in mind, I’d like to ask for TR to do two things for us when/if Fermi ever does actually launch:

    1. If Nvidia announces two or more SKUs but only sends out the top-tier SKU for review, please call them on it and follow up with a full review of an actual RETAIL shipping card when they are available. Doing this, rather than using a “downclocked” or “simulated” version of the lower SKUs, will prevent them from playing power-consumption games with the hand-picked press cards.

    2. Either way, plan ahead of time to do a follow-up article reviewing actual retail samples of the cards when they eventually become available. I would especially like to see performance per watt numbers and performance per dollar analysis with actual market pricing.

    I really hope that Charlie is completely wrong. Good competition in this space is absolutely required, not just to keep AMD pricing in check, but to keep the PC Gaming environment vibrant and alive. If game publishers see the market leader flailing, it will only serve to further cement their impressions of a drowning PC gaming market.

    Unfortunately, I don’t think he is completely wrong. I’m sure he is spinning his information in the most anti-Nvidia way possible, but that doesn’t change any of the facts he is reporting. And all of the public evidence out there perfectly fits with his story.

      • spigzone
      • 10 years ago

      ‘Unfortunately, I don’t think he is completely wrong’ = he has factually been completely right.

        • glynor
        • 10 years ago

        No. I’d guess he is probably mostly correct. I like reading Charlie’s stuff, but he does spin hard against Nvidia. He got angry with them over the “bumps” stuff and hasn’t ever forgotten.

        I don’t blame him, frankly… And I don’t trust ANY corporate PR, but Nvidia does seem to have been on a terrible run lately. Their efforts to slow sales of AMD’s 5000 series cards by leaking vague and irrelevant, but well-timed, info on Fermi have been completely transparent and distasteful.

          • JustAnEngineer
          • 10 years ago

          Where has Brian_S been lately?

    • kpo6969
    • 10 years ago

    It’s all part of the cycle, just like Coke or Pepsi.

      • spigzone
      • 10 years ago

      That’s the best you could come up with?

        • derFunkenstein
        • 10 years ago

        Seriously. It’s not like you’re on a timer to come up with a witty comment. Take your time! Do it right!

    • wira020
    • 10 years ago

    He might have been too hard on Nvidia, but so far most of his predictions and “rants” have proven true… so taking it with a grain of salt isn’t really possible for me…

    Nvidia’s been quiet for a while; I’m guessing they’ll show something on their Twitter page soon… maybe a robot powered by Fermi doing dances or something… or, just to prove Charlie wrong, a picture of a room full of Fermi silicon… then we’ll have those “is it fake?” discussions again…

    • jdaven
    • 10 years ago

    Wow, after reading the comments thus far the attitude towards Charlie has turned surprisingly positive. When he posted his first rant towards Nvidia almost a year ago, the internet lit up in flames towards Charlie and his apparently biased opinions.

    A lot has happened since then and many of the things Charlie predicted have come true. Good job Charlie. Keep up the good work.

    Oh and don’t worry about competition. For most users, video cards as far back as the Radeon HD 3xxx and Geforce 7xxx series are still good and there are lots of them available from the likes of Newegg. My HD 4670 is still happily chirping along.

      • ClickClick5
      • 10 years ago

      He has been oddly accurate.

      • mcnabney
      • 10 years ago

      Charlie is still a pr&*^k, but nobody cares because he has made shocking claims that all turned out to be dead-on. You get to talk smack when you can back it up…

    • Vrock
    • 10 years ago

    Nvidia, ha. Did they let the 3dfx engineers design a chip again?

    • swaaye
    • 10 years ago

    Both companies have had their own “delayed chip due to latest process” scenario. NV had NV30, ATI had R520 and perhaps R600 fits there too. It seems like it’s just a matter of time before they run into major snags.

    • Arag0n
    • 10 years ago

    I talked with Nvidia Spain/Portugal at MWC, and they were talking about a 40-60% performance gain over the GTX 295 with a single-chip design. They also talked about sending cards to the press next month, and they will hold a conference in Madrid in a few weeks to talk about Fermi.

    So, unless the Nvidia guy at MWC lies very well, we may expect some benchmarking in the next month.

      • Game_boy
      • 10 years ago

      The great thing about tech rumours is that we can confirm or deny them with hard numbers upon launch.

      If the GTX480 launches with 1200MHz shader clock, 448 shaders, performs no more than 12% better than the 5870 and is hugely supply-constrained we’ll know to believe him next time.

      If it launches with 512 shaders at 1500MHz and destroys the 5970 I won’t even visit his site any more.

        • glynor
        • 10 years ago

        What I suspect will happen is:

        Nvidia will “launch” it with 512 shaders at 1500MHz. These cards will go almost exclusively to the press: hand-picked, and produced almost exclusively for review. There will be a handful of these at retail, but they’ll evaporate in hours, and then the actual asking price (if you can ever find one at all) will make the initial markups on Cypress look like a fantastic deal. These will never actually ship in volume.

        The vast majority of retail cards will actually be a lower SKU of Fermi that falls more in line with what Charlie is predicting. However, unlike most “cut down” parts, these won’t run any cooler than the top-end SKU (maybe even hotter than the review samples) and may STILL be extremely hard to find. These will eventually ship in small volumes (mid/late summer), but really they’ll just get replaced with a B1 respin of the silicon. That probably won’t be until back-to-school season or holiday season, though.

    • pogsnet
    • 10 years ago
    • flip-mode
    • 10 years ago

    Charlie has been pretty accurate up to this point.

    Haven’t heard a peep out of Nvidia about GF100 in a while. Any noise that has come out of Nvidia has been bemoaning the 40nm process.

    At this point, GF100 is sooo late to market that it does seem like the GF FX5800 all over again. If that pattern holds, the GF100 will release in small quantities and will be very quickly followed up by the respin.

    Dunno. Regardless, it can’t be happy days at Nvidia HQ.

      • spigzone
      • 10 years ago

      By ‘very quickly’ you mean 6 months minimum?

      • crsh1976
      • 10 years ago

      Nvidia honestly still has some time; DX11 isn’t a big deal yet, as games won’t be shipping en masse until the second half of 2010 (if not late 2010, even). Where it’s hurting bad is that their next-gen chip is basically a nightmare to manufacture, and they apparently don’t have a plan B.

      Speaking of which, what are their options? Refab Fermi on the 55 nm process until they can stabilize the 40 nm? Or just rebrand the higher-end 200 series as the “new” top cards… I’d be scared if that happens.

    • pogsnet
    • 10 years ago
    • anotherengineer
    • 10 years ago

    maybe Charlie should go to TSMC and fix everything since he is so pro lol

      • WillBach
      • 10 years ago

      He’s not saying that anything is wrong with TSMC’s process, but that NVIDIA designed for it the wrong way.

        • KikassAssassin
        • 10 years ago

        Well, he is saying there’s something wrong with TSMC’s manufacturing process; it’s just that ATI managed to work around TSMC’s problems, and nVidia didn’t.

      • Game_boy
      • 10 years ago

      The 5xxx series seems to be working fine, yielding properly, and in good supply. It’s not TSMC’s fault, as Nvidia could have designed around the issues just like AMD managed to.

        • StashTheVampede
        • 10 years ago

        AMD, in this instance, may not have taken as many risks with its chip (smart). Nvidia, on the other hand, is probably taking huge risks with its design.

    • wiak
    • 10 years ago

    “Demerjian is known for mercilessly slamming Nvidia, of course, so you may want to take this report with a grain of salt. Still, Nvidia could be in trouble if the situation is even half as bad as he claims.”

    most sites bitch about ATI and AMD instead, so why not let Charlie rant at Nvidia ^^

      • NeelyCam
      • 10 years ago

      Charlie has been right about a bunch of things: bumpgate, faked Fermi boards, etc. He must have some connections to insider info…

      He might be exaggerating just a little bit, but so far his predictions have been incredibly accurate.

      NVidia is toast.

        • wira020
        • 10 years ago

        Toast as in Nvidia turned up very profitable last quarter?

          • Game_boy
          • 10 years ago

          Charlie has been saying they would have a profitable quarter this time. I can point you to a few posts where he says their margins will increase (and they did).

          He thinks next quarter will be a lot worse. AMD did not release the lower-end or mobile 5xxx parts in this financial quarter, for example, so Nvidia was the only one with 40nm mobile parts, and they got some design wins.

          They also restricted supply of GT200 heavily, so they didn’t lose any money on that (because selling them against the 5700 and 5800 would have been unprofitable).

    • TurtlePerson2
    • 10 years ago

    I suppose this sort of thing is possible. Deep sub-micron integrated circuits have had a history of yield issues. Nvidia keeps using large chips, which only increases the yield problems.

    The fact that we haven’t heard from nVidia in a long time and haven’t seen many leaked performance numbers might mean that the product is further from market than we may have believed.

    It would be a shame if this were all true because the competition between nVidia and ATI in the graphics division is what keeps prices going down and performance going up. The 5xxx series cards are just now starting to hit their MSRP.

      • WillBach
      • 10 years ago


        • NeelyCam
        • 10 years ago

        Intel’s IGPs are in a completely different class. Fermi would’ve been competing with 58xx/59xx cards.

          Once the dust settles, Northern Islands-based cards will be priced north of $500, and NVidia will have quit the graphics business.

          • anotherengineer
          • 10 years ago

          #17 lol, are you Charlie in disguise??

          I don’t care much for Nvidia either, since they mothballed ULi, but as long as they make money they will be around, which in the end is best for consumers.

        • TurtlePerson2
        • 10 years ago

        Intel IGPs are for people who don’t know what GPU means. You can’t get an Intel GPU without getting it on a motherboard. Intel GPUs are like the prizes in Crackerjacks, they’re not very good and no one would have them if they weren’t included.

          • Thorburn
          • 10 years ago

          Now that’s not entirely true – I have a Radeon HD 5870 in this system and an 8800M GTX-equipped laptop, but I also have a GMA X3100-equipped ThinkPad, and both my media centers are Intel integrated, one GMA 950, the other GMA X4500 HD.

          In the media centers the main concern is low noise (and, in the X4500 system’s case, Blu-ray playback), while in the ThinkPad it’s battery life. Intel’s integrated graphics is better than a discrete graphics solution in both these cases.

          • NeelyCam
          • 10 years ago

          You’re dead wrong – Intel IGPs are perfectly fine for most people. What “good” means depends on the application (I’d hate to have my laptop battery sucked dry by a fast GPU I don’t need).

          Your argument is like saying people who buy Toyotas are idiots – a Toyota is not a real car; what one needs is a much more expensive, gas-guzzling Ferrari/BMW X5/Porsche Panamera/what have you. Otherwise one’s trip to the grocery store just sucks, and life is generally worthless.

      • willyolio
      • 10 years ago

      Well, Intel won’t be any competition at all for a while, and technically Nvidia isn’t the only competitor in the video card market.

      Consoles are probably the biggest competitor. Lots of people are switching over to consoles instead of PC gaming, mostly because the initial setup is cheaper and the hardware lasts longer, even if the graphics are turned down. Video cards don’t have much of a market outside of gaming (professional cards don’t count; they have their own ridiculous pricing) and have to stay competitive with console gaming.

      This is especially true when more and more games are becoming multi-platform, with fewer PC exclusives and more console-exclusive games.

        • NeelyCam
        • 10 years ago

        PS3 FTW! (*

        *) Blu-ray player included, free of charge

    • SecretMaster
    • 10 years ago

    It seems like he just read the Anand article on the 5xxx series and used it to predict failure, IMO

      • UberGerbil
      • 10 years ago

      So then after reading the Anand article he went back in time to write all his other, earlier pieces where he expounded on nVidia’s problems?

      Tech journalists share many of the same sources; the information just gets combined and published (and sometimes spun) in different ways.

      • pogsnet
      • 10 years ago
    • can-a-tuna
    • 10 years ago

    This Charlie seems to know his stuff. I guess in a month or so we’ll know whether he was right or not.
