Nvidia denies reports it will soon cancel GTX 200 series

Soon, AMD will complement its Radeon HD 5000 lineup with $100-200 cards based on the Juniper graphics processor, and Nvidia will find itself up against a whole range of relatively small, 40-nm, DirectX 11 GPUs. What does that mean for mid-range and high-end 55-nm GeForces? The rumor mill has some ideas, but Nvidia says those products aren’t going anywhere just now.

Let’s back up a little first. Two days ago, DigiTimes reported that supply of 55-nm graphics processors had run tight. More interestingly, word was that Nvidia didn’t plan to do anything about it, even though its 40-nm graphics processor based on the Fermi architecture probably won’t be out until next year. Charlie over at SemiAccurate had an explanation: Nvidia was faking the shortages to clear out inventory before a massive price cut, using online publications to spread the false rumor.

Yesterday, Charlie updated his prediction, saying the shortages are actually genuine and herald a complete pullout from the mid-range and high-end desktop graphics markets by Nvidia. Quoting sources “deep in the bowels” of the company, Charlie said the GeForce GTX 260, GTX 275, and GTX 285 have all reached or will soon reach end-of-life status, with the dual-GPU GeForce GTX 295 “almost assured to follow.” The GTX 285 has already bitten the dust, he asserted, while the GTX 275 will follow within a couple of weeks, and the GTX 260 will be discontinued by the end of the year.

Independently, we’ve also heard that the GeForce GTX 275 may be reaching the end of its run soon.

Why would Nvidia kill off these cards? Charlie believes the firm “can compete [with AMD] on performance, but not at a profit,” so it could choose to focus on lower-end, cheaper-to-produce parts until its next-generation DirectX 11 GPU, based on Fermi, shows up. That does sound pretty plausible, especially considering AMD’s Radeon HD 4000 lineup already forced Nvidia to cut prices across the GTX 200 series last year. Nvidia could simply wait for Fermi before re-entering the mid-range and high-end markets, or it could wait even longer for a smaller, Fermi-derived GPU.

We asked Nvidia to weigh in this morning. Is the GTX 200 series really doomed? Nvidia spokesman Ken Brown responded quickly and told us flat out there is “no truth” to the discontinuation rumor. Ostensibly, then, the company intends to keep using current-gen products to compete with AMD’s DirectX 11 parts until its own DX11 GPUs arrive.

While we’d take all of the speculation with a grain of salt, we wouldn’t be so quick to discount Charlie entirely—despite his history of badmouthing Nvidia. SemiAccurate got a fairly good estimate of Fermi’s release schedule way back in July, and the die-size estimate wasn’t too far off from our own. We’ll wait and see.

Comments closed
    • tejas84
    • 10 years ago

    I hate Charlie and he is generally a FUD propagator BUT…

    He has been ominously correct with all of his Fermi stories. Although I am an Nvidia GPU guy, I am shocked that Nvidia think they can allow ATI to rape them for 3 to 4 months and think it won't affect them.

    Frankly Nvidia are too arrogant and they will pay for it by gifting precious market share to ATI. Cypress is an amazing GPU, and by the time Fermi comes, it will come up against the 5890 and/or possibly the 68xx series.

    What the hell were they thinking, having Fermi so HOT and LATE?? As Snakeoil would say, "What is wrong with Nvidia?"

    It saddens me to see Nvidia like this but to be honest they deserve what they are getting due to their complacency. Come on Nvidia get your act together or you will lose support of even your most loyal fans…

    • paulhamm
    • 10 years ago

    Charlie only stated that the g200b based cards are being EOLed. Not the entire line of g2xx based cards. Though I believe there is only 1 part available atm that is not g200b based.

    paulhamm slaps PRIME1 on the back of the head. Minnesota is part of the US; if you believe otherwise, I recommend you go back to your grade school and demand that they actually teach you.
    http://www.semiaccurate.com/2009/08/18/t-mobile-wont-obey-law/

    Silus et al. picking nits and throwing fecal matter should be left to feral carnival monkeys; it is decidedly unbecoming in a species that claims to be more advanced.

    Stop! To all the "Nvidia denies, so Charlie is full of BS" crowd: Ken Brown is Nvidia PR. That means everything he says should be taken with a large salt cellar within arm's reach. When the paycheck has Nvidia printed on it, you have to accept that those statements will be biased for Nvidia. BTW, until Nvidia announces that the g200b cards are EOL, "There is no truth to Charlie's statements" is factually correct while saying nothing about when or if they will be EOLed in the near term.

    Charlie is obviously biased against Nvidia. You take that into account while reading his articles. If you take the two articles as a whole, you will understand why Nvidia will almost certainly EOL g200b. It is all about cost. AMD's Juniper kit has a $35-or-better pricing advantage over Nvidia's g200b kit. I estimate the Juniper HD 5750 kit at ~$70 and the HD 5770 kit at ~$85, compared to the Nvidia GTX 260-216 kit at ~$110. In the $100-200 price range, that is huge. The pricing advantage is in the BoM, which means the retail advantage is even greater. The GTX 260-216 is going to be facing off against the HD 5750 at $129. In this segment price is king; Nvidia cannot compete. Getting the g300 and its half-scale derivative out must be the priority for Nvidia.

    This article is the more interesting of the two:
    http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/

    • deinabog
    • 10 years ago

    I have to say this is all meaningless. If Nvidia is going to cancel its GTX 200 line of GPUs then we’ll find out soon enough; if not then it’s moot.

    What I find a bit annoying is all this fanboy slobber over which next-gen architecture is going to emerge supreme. Fermi isn’t out yet and until it’s released no one really has any idea how it’ll perform against AMD’s Radeon 5000 series. Better just to wait and see what actually happens than to get stoked over maybes, what-ifs, and all other prognostications.

    • beck2448
    • 10 years ago

    Fermi will crush ATI like roadkill once again.

      • flip-mode
      • 10 years ago

      I think Fermi will probably beat Cypress by a good margin. If it does not, Nvidia is in an extremely, extremely, life-threateningly-bad position, mainly because of die size. In terms of die size and price/performance, I expect that ATI should be in a very good position. By the time GT300 launches, ATI will have launched its entire lineup, and yield improvements will have been realized. Nvidia may win the performance battle, but that is only one piece of the puzzle.

      • pogsnet
      • 10 years ago

    • berrydelight
    • 10 years ago

    Charlie has been generally right lately. It is obvious Nvidia will eventually cancel the GTX 200 series, but the question is whether it will happen as soon as Charlie says.

    • jinjuku
    • 10 years ago

    What happened to the days of nVidia stomping ATI’s collective brains out? nVidia is starting to remind me of 3Dfx.

      • OneArmedScissor
      • 10 years ago

      What happened is that things finally reached a point with games where enough is enough. It happened for pretty much any other sort of home computer use, long ago.

      Think about other everyday things the average person does on a computer. Unless you’re playing a game, how often are you actually slowed down by even a $300 laptop? Games were pretty much the last wall to come down, and it took longer, but it was inevitable.

      It won’t be long before integrated graphics are plenty for small laptops, and low power discrete cards make their way into everything else.

      There will always be specific professional applications that benefit from specific hardware setups, but that’s apples and oranges, compared to home/entertainment use.

      • berrydelight
      • 10 years ago

      Nvidia only dominated ATI in 2007-2008 (e.g., the 8800 GT). Since then ATI has been competitive in all market segments. Now, with the 5870, ATI basically has the performance crown.

      • SomeOtherGeek
      • 10 years ago

      NVidia let it get into their heads that they were king and got lazy. They are paying the price now.

    • Rza79
    • 10 years ago

    Well the shortages are genuine.
    I have had GTX 285s on backorder for a while now and none came through. ATM I'm cancelling the backorders and replacing them with HD 5850s (which, surprisingly, are available).

    • bLaNG
    • 10 years ago

    I don’t quite understand what the problem is with this news. OK, it is based on Charlie’s article…so what? Yes, he is anti-Nvidia. But he writes interesting articles. Is everything he writes true? No. Is he right sometimes? Yes.

    And even if he is not right with his conclusions, you’ll find some interesting pieces of information in his work.

    I actually wish there were more writers like him. He tries to bring background information into his articles and to be investigative. The only thing I miss is someone who is anti-ATI and writes the same way Charlie does.
    Anyway, this is muuuuch better than the usual press-release articles you find on so many tech-related websites.

    No one forces you to simply believe what he is writing. Don’t switch off your brain while reading his stuff, and draw your own conclusions. But simply turning down everything that comes from him is a bit like putting your fingers in your ears and going “lalala”.

    EDIT: Sorry for bad grammar, I am obviously not a native… 😉

    • trinibwoy
    • 10 years ago

    The hilarious thing about this is that Charlie is just a sad little man but still he has Nvidia running around refuting stuff he pulls out of his ass. It must give him great joy.

      • reactorfuel
      • 10 years ago

      Charlie’s not a sad little man, he’s a smart businessman. He rakes in the pageviews by publishing every outlandish rumor he hears, and every fanboy flamewar gets him more publicity, pageviews, and ad impressions.

      If he did balanced, sober analysis and only published verified stories, he’d be just another tech blogger. Instead, he throws outlandish rumors out there – some true, some false – and gets everybody talking about them and linking to his stuff. Which do you think ends up bringing in more dough?

        • MadManOriginal
        • 10 years ago

        Smart or not, is that a good way for news reporting, even if it is rumors, to be moving? Page hits and $$ >> proper reporting? I suppose I should look at it as ‘infotainment’ like Jon Stewart (who I do think is funny, but that’s it) and not be surprised that the unwashed masses lap it up.

          • reactorfuel
          • 10 years ago

          I’ve got a problem when it involves policy decisions like whether to go to war or what to do about healthcare, and when opinion’s presented as Fair and Balanced news.

          However, we’re talking about video cards, not civilian lives in a far-off country or the healthcare of millions of uninsured people. The worst thing that could possibly happen would be a drop in somebody’s stock price, if investors were stupid enough to base purchasing decisions on what Charlie says. Even that’s pretty unlikely, though – the only real result is a storm of fanboy rage that ultimately affects absolutely nothing.

    • noko
    • 10 years ago

    What concerns me is that Charlie’s premises are, well, true. ATI/AMD’s next generation blows Nvidia’s current tech out of the water, and it appears that will hold from top to bottom: cheaper GPUs, more powerful, cheaper boards to build around them. Who will actually want to buy large quantities of nV GPUs? Plus, Nvidia not having a complete lineup top to bottom, with the top delayed for who knows how long, doesn’t sit well. So in a nutshell, Nvidia is in big trouble now from a business standpoint of being able to make profits in the near future.

    • Wintermane
    • 10 years ago

    I was expecting this bump in production for a while now, as I expect both nV and AMD are moving some of their old lineup to 40 nm to cut power and cost.

    I'd expect to see 40-nm variants of the 250, 260 and, yes, the 285 and 295. And definitely variants of the 240, 230, and 220, depending on how well the 40-nm process is working for smaller chips.

    • WaltC
    • 10 years ago

    I think the interesting thing here is that even if nVidia had planned to cancel production, I don’t believe they’d want that known until they were ready to do it, so I think they’d deny it in the short run even if cancellation is in the cards longer-term. They’ll need to keep selling whatever product they can until they have something to replace it with, obviously, and if they admit to cancellation plans, they might not be able to move what they need to move before their next product is ready for market.

      • MadManOriginal
      • 10 years ago

      So it should be something like…

      “HEADLINE: nVidia to stop production of old chips, introduce new chips! Tune in for more completely obvious statements tomorrow!”

    • Convert
    • 10 years ago

    What is the deal with you people complaining that TR posted this news item?

    Did you even bother to read it?

    They take Charlie’s claims, add third party information to it and go so far as to ask Nvidia about it as well.

    Cyril even states: “While we’d take all of the speculation with a grain of salt, we wouldn’t be so quick to discount Charlie entirely—despite his history of badmouthing Nvidia. SemiAccurate got a fairly good estimate of Fermi’s release schedule way back in July, and the die-size estimate wasn’t too far off from our own. We’ll wait and see.”

      • Silus
      • 10 years ago

      Fairly good estimate ? LOL don’t make me laugh…

      So Charlie says GF100 won’t see the light of day this year, while on the other hand NVIDIA, the creator of said hardware, says it will and it’s a “fairly good estimate” ?

      As for die size, Charlie spews 530 mm², while a “fairly good” speculation @ Beyond3D, you know, with ACTUAL known facts combined with pieces of known silicon (GT215), put it at roughly 480 mm².

      But hey, Charlie said it so it’s a “fairly good estimate” and everything else must be ignored!

      Hilarious 🙂

        • Convert
        • 10 years ago

        According to the article linked it could see the light of day in December, isn’t December this year? It’s not like Nvidia is all that good about release schedules anyways, are they talking fiscal? I didn’t see the article where Nvidia stated the release date for Fermi.

        As for the differences in die size, the phrase “Fairly good estimate” did not apply to it, Cyril mentioned the release schedule was a fairly good estimate. The stated die size on the other hand wasn’t that far off from their own and even less than Beyond’s. The difference isn’t that much.

        You are complaining about comparing an estimate to Charlie’s “source”, until Nvidia releases the product we won’t know and you are just making a lot of noise over nothing.

    • Silus
    • 10 years ago

    And TR continues with the Charlie based “stuff”…Quite amazing…

    The title of this has “NVIDIA denies reports it will soon cancel GTX 200 series”, when the only “report” (which is not even a report, it’s just the usual wild speculation) comes from Charlie.
    “Wild Speculation” became “reports” in TR’s news title…wow…

    Thanks for continuing to give hits to that moron’s site TR. He certainly appreciates it, with all the ads and banners for AMD stuff he has there…

      • SomeOtherGeek
      • 10 years ago

      Wrong, I didn’t go to his website and I don’t think derFunk among others did either.

        • Silus
        • 10 years ago

        And I think so too. “Some” didn’t go, but as you can see in this very thread, the majority are Charlie lovers and probably did. They love seeing AMD ads 🙂

    • spigzone
    • 10 years ago

    Charlie’s trying to HELP Nvidia.

    Hitting Charlie for pointing out the needful is classic shoot the messenger.

    Help happens when someone accurately points out F-ups, shining light on the bad in an attempt to drive change for the better. Same principle applies across life.

    Charlie doesn’t hate Nvidia, he hates what Nvidia (Jen-Hsun) is doing. As with all of us he surely WANTS Nvidia to be successful and competing to keep driving the tech forward and prices down. Everyone wins.

    Those who DON’T shine light where it needs to be shined are the ones failing us and those who protest those who shine the light where it needs to be shined are retards.

      • MadManOriginal
      • 10 years ago

      It’s not what he does, it’s how he does it. He has a childish and imo unprofessional way of going about things, especially in the tone of his writing when it involves NV, and if he was just out to ‘help’ as you put it he wouldn’t have to be such an ass about things. I guess the trashy Brit tabloid style gets page hits but it’s not a good way to go about helping.

      And yes he does hate NV. It goes back to some press event or something where NV snubbed him and he got his panties in a twist. The sad thing is that it was years ago so he obviously has problems letting things go.

      • Silus
      • 10 years ago

      Well, I had already complained about that a while back and was greeted by the swarm of AMD fanboys and the anti-NVIDIA crowd, without any kind of comment from the site staff. Which surely indicates they don't really intend to change anything in that regard.
      Did the same at [H] and got a reply from the site staff, which was that if the person who usually posts news finds Charlie's stuff worthy of the front news page, then it's posted without problems, regardless of content, which is a funny concept IMO for a site that's supposed to be trustworthy.

      I actually made a suggestion to TR, which was to include a section with links to the known rumor sites and if people wanted, they would click on them. But definitely NOT include them in the front news page.

        • MrJP
        • 10 years ago

        So you just want reprinted press releases on the front page? Nothing too controversial? Really?

        Whether this is all true or not, it is interesting and entertaining, as evidenced by the number of posts on this article compared to all the rest of today’s news. Note that TR actually contacted Nvidia for comment and are reporting their response, so the approach has a fair degree more integrity than just reprinting rumours.

          • Silus
          • 10 years ago

            So what’s needed is controversy?
            And here I thought tech sites were about facts and proper news stories. Guess I was wrong…

          Also, TR talked to NVIDIA, which denied the “story” (and NVIDIA’s side of it took what…1 line) while describing Charlie’s “report” took the rest of the news piece and at the end added

          “While we’d take all of the speculation with a grain of salt, we wouldn’t be so quick to discount Charlie entirely”.

          So TR asks NVIDIA about it and then went on to say that despite NVIDIA flat out denying it, Charlie may still be correct…

          “Yeah I’m not taking sides, but I don’t think that particular one must be ignored.”…

            • flip-mode
            • 10 years ago

            Well, you can twist his words that way, but that’s not what he was saying. He was saying a story should not be avoided just because it is controversial.

            • blazer_123
            • 10 years ago

            There isn’t much of a difference between what the rumor mill says and what the PR people at both Nvidia and AMD say. The people in marketing and public relations continually print heavily distorted ‘FACTS’.

            During the days of the Radeon 2000s, the AMD people were specialists at making black white, with a great deal of PowerPoints showing why their product was just as good as Nvidia’s. Nowadays, Nvidia is beaten on both price and performance. All they seem to say is CUDA, PHYSX, NVISION.

            Meanwhile, I have seen so many PR-proclaimed ‘life changing’ breakthroughs that never actually saw wide implementation that it makes me practically cry.

            Take the rumor mills with a grain of salt. Take the PR people with a pound of salt.

            • Silus
            • 10 years ago

            But that’s the whole point…Just look at TR’s pitch on the thing. First they called Charlie’s wild speculation “reports”, which is already ridiculous enough. A report is supported by facts, not wild speculation.
            Second, even though they mention that rumors should be taken with a grain of salt (they are now rumors, unlike the news title where they are reports…), they go on to say that we (TR) wouldn’t be so quick to discount Charlie entirely and use stuff he “wrote” as evidence to not discredit him: 1) Fermi release 2) Fermi’s die size.

            And I ask, how can TR use that as something to “credit” the idiot, when that hasn’t even happened yet ? Is he “right” even without his “speculation” actually coming true ?

            As for what you said about PR in general, I can’t agree more.

            • flip-mode
            • 10 years ago

            It’s a rumor and it has been couched as such. TR has always reported rumors regarding Intel, ATI, AMD, Nvidia, Apple, Dell, Asus, nettops, desktops, video cards, CPUs, AT&T, iPhone, and pretty much any rumor that is related to computer hardware or electronic gadgetry. For my part, I hope they continue to do so. You taking exception to this one just because it bodes ill for Nvidia is kinda pathetic.

            • Silus
            • 10 years ago

            No. TR clearly “credits” Charlie for something that hasn’t even happened (if ever): Release schedule and die size.
            And as you can see in the “news” title, Charlie’s wild speculation is called “reports”. Report != Speculation/Rumor.

            And I’m not taking exception to this one. It’s more than pathetic that you see it that way, after I had already shown my opposition to these types of “articles” before. I object to any sort of rumors from a single source, without any sort of basis. And a biased one at that.

            • flip-mode
            • 10 years ago

            It’s a rumor, and it was couched as such.

        • WaltC
        • 10 years ago

        I have no objection to TR referencing what Charlie says so long as TR continues to state it’s rumor and outs the source as being Charlie. The only time I object to rumor is when it is disguised as real news and the sources for the rumor are not revealed.

      • jdaven
      • 10 years ago

      Wait a minute. The writing style and language of Hightech4US look awfully similar to those of a rabid nvidia troll by the name of SiliconDoc, who invaded and was subsequently banned from Anandtech.

      Hightech4US I deeply apologize if you are not the same SiliconDoc from Anandtech but if you are, don’t try your crazy troll stylings here.

        • HighTech4US
        • 10 years ago

        So posting actual facts (with links) here that counters a known nVidia biased source (that is quoted many times in the article) is considered trolling. Yea right.

        When an article is as bad a job of journalism as this one, it deserves to be called out.

        It seems like you have a problem with the actual facts I posted that refute charlie.

        Please feel free to refute the facts I posted with links from sources that aren't charlie or sites that just quote him.

        As for your so-called slanted apology: Pfft.

          • jdaven
          • 10 years ago

          There is nothing wrong with your post as long as you are not this SiliconDoc character from Anandtech. He is absolutely crazy. Again if you are not him/her I apologize even though you seem to take such things callously.

            • glynor
            • 10 years ago

            “So posting actual facts (with links) here that counters a known nVidia biased source (that is quoted many times in the article) is considered trolling. Yea right.”

            I note that in the livid response to your original postulation (that HighTech4US is SiliconDoc), he never actually denied that fact. Classic misdirection… Get all upset over something related, but not exactly germane, but never actually deny the core argument.

            Still doesn’t necessarily mean anything (and, to be fair to HighTech4US, it is very difficult to defend against an argument that you “might be” someone else on a web forum). Still, fascinating observation though.

      • SomeOtherGeek
      • 10 years ago

      No, I think this is good cuz it gives us something to think about and to research. But I think it is also a warning that something is up and we are to be cautious.

      Remember, “all the world’s a stage…”

    • Kent_dieGo
    • 10 years ago

    Lately the tech news sites have been quoting SemiAccurate more than I ever remember. It seems Charlie has more credibility than nVidia nowadays.

    • SomeOtherGeek
    • 10 years ago

    From Semi-Semi-Accurate: This just in: nVidia is not losing their GTX200 line, the real fact is that the whole company is going belly-up!

      • Kent_dieGo
      • 10 years ago

      If Semi Accurate really said that then you could bet it is true. May not be too long at the rate they are going. At least they can re-label a few GT260’s to GT300 to keep going for a while longer.

        • SomeOtherGeek
        • 10 years ago

        Need to read my post again, “Semi-Semi-…”

    • MadManOriginal
    • 10 years ago

    The ‘shortage’ probably isn’t a rumor and it’s not limited to NV. It may be somewhat artificial though in that it’s not a manufacturing issue. Supposedly both companies don’t like having 90 day supplies because card vendors wait until the end of a quarter and order a huge amount of GPUs and expect a discount for doing so, then the vendors have their stock for the next quarter. AMD and NV are virtually forced to sell at that point in order to move inventory and show revenue for that quarter. Keeping just a 30 day supply means that vendors will buy fewer GPUs per order more often and not have such an advantage by negotiating prices just at the end of a quarter. In other words, it’s all about business dealings between vendors and GPU companies. I think this information was in a recent shortbread…

    As for the GTX series, they might be discontinued but probably only to be renamed as part of the ‘300 series’ 😀

    • Phimac10
    • 10 years ago

    Charlie is the real deal I say.

    • PRIME1
    • 10 years ago

    I wonder how much AMD pays Charlie?

      • SubSeven
      • 10 years ago

      As much as Nvidia pays you.

      • BailoutBenny
      • 10 years ago

      Bloggers are now required to disclose paid affiliations. If you think he is being paid, and not disclosing it, I’m sure you can bring up some charges.

        • PRIME1
        • 10 years ago

        Charlie is not based in the US. Also the ads on his front page are evidence enough.

          • flip-mode
          • 10 years ago

          The ads you post in TR comments are evidence enough that you are a pot shouting at a kettle.

      • SomeOtherGeek
      • 10 years ago

      I get paid 60K just to stay in the middle… Does that help?

    • rodney_ws
    • 10 years ago

    Who actually knows what the Nvidia GPU costs to manufacture? When the 280GTX first came out, wasn’t Nvidia all eager to charge $600 a pop? Now that the ATI 5870 has the top spot, wouldn’t it make sense for ATI to charge $600? Or was ATI aware of the costs Nvidia incurred and opted to price their card at a point where Nvidia couldn’t compete?

    Call me crazy, but I don’t think Nvidia is able to make money at a price point that competes with the current ATI batch of cards.

      • wiak
      • 10 years ago

      well, AMD knows how much they pay for their chips at TSMC, so they can calculate it :p

      btw i love reading charlie's articles; most other sites say what NVIDIA wants them to say

      AMD knows how to start a war, heck they have been in a war with intel for many decades now ;P

    • Vasilyfav
    • 10 years ago

    I have to agree, even though I'm an NV fanboy sometimes. Charlie did get quite a few things right (although I wish he would stop writing like a pretentious douchebag).

    I’d like to see him predict Fermi performance if he’s got so much foresight.

      • Game_boy
      • 10 years ago

      I too wish he would stop writing like that – it’s making people angry for no reason. I agree with what he’s saying, and he’s so much more reasonable-sounding on his forum posts.

    • dermutti
    • 10 years ago

    I can’t believe that nvidia would completely leave the mid- to high-end market. AMD and Nvidia go back and forth all the time with who’s got the best value, but they never quit all together until they can reclaim the crown. They’d much sooner operate at zero profit for a few months (until Fermi comes out) than quit their industry.

      • Game_boy
      • 10 years ago

      It’s that attitude that caused Nvidia to lose money over the last few quarters – cutting the GTX2xx’s prices to hold marketshare at all costs.

      This is actually a good move, a sign that Nvidia is willing to drop the prestige of the crown for a viable business.

    • ThomasDM
    • 10 years ago

    He has a grudge against NVIDIA, that’s for sure. Everything he writes about NVIDIA has a very negative tone and a lot of his calls were very wrong.

    One thing I still remember is that he called the GeForce 8800 series a total disaster. IIRC he said “they really botched that one up”.

    • flip-mode
    • 10 years ago

    I really do like Charlie. He keeps it interesting. And, by my count, he’s been far more accurate than people around here will ever admit.

    • Game_boy
    • 10 years ago

    If they /[

      • reactorfuel
      • 10 years ago

      Charlie’s like the National Enquirer – he publishes any crazy rumor he comes across. Sometimes, it’s total made-up BS. Sometimes, it’s shocking, completely true, and a scoop that nobody with a more strait-laced approach to journalism would ever get. His website has a good title – it’s a shame some of his more passionate supporters can’t see that even he admits he’s wrong quite a bit of the time. Unfortunately, his site doesn’t seem to index articles more than a month or so old, and it’s not in the wayback machine, so it’s pretty difficult to go back and get a good idea of his record absent confirmation bias.

      I do remember he was screaming about desktop 8-series chips developing the same issues as the mobile 8600M GTs. He’s broken some big stories, to be sure, but he’s also published stuff that would have been huge and never actually panned out.

      As for actually EoLing the chips, that won’t happen until Fermi products are out. Even if Charlie’s totally correct, you’ll see sort of the inverse of a paper launch: the parts are still theoretically available for order, but nobody knows when they’ll get more in. Officially dropping the current flagship product would likely do awful things to the stock price.

      • flip-mode
      • 10 years ago

      No, they’ll keep hating, mainly because Charlie really does have it out for Nvidia, and people have gotten tired of his relentless crusade – he’s been at it for a couple years now.

    • derFunkenstein
    • 10 years ago

    Surprise! Charlie Demerjian said something inaccurate about nVidia! News abounds!

      • Farting Bob
      • 10 years ago

      Yes, everything he says is guaranteed to be wrong!
      Which is why nvidia is currently producing market-leading DX11 GPUs that outpace ATI's chips by 50% at just half the price.
      Oh wait, no, nvidia are struggling right now to compete with ATI, which is the sort of thing Charlie was saying earlier in the year.

        • derFunkenstein
        • 10 years ago

        He has such an inaccurate record that you can't take anything he says at face value in either direction when it concerns graphics.

        And up until about 2 weeks ago, they were competing with AMD just fine. A couple of weeks isn't going to make or break anybody.

          • jdaven
          • 10 years ago

          I’m sorry to say, derFunkenstein, but Charlie has been right about a lot of things with regard to nvidia. He probably has an inside mole or just really good sources/intuition. I like nvidia and I’m not trying to badmouth them while sticking up for Charlie, but reality is reality.

          Besides this very story we are commenting on states that Charlie was right a year ago and includes links. You can’t say he is inaccurate when readers can simply look above your comment and read the TR story and see that he was accurate on some things. That doesn’t make sense.

            • derFunkenstein
            • 10 years ago

            He’s just not accurate enough for me. I don’t frequent his site as a result. He’s got a chip on his shoulder. That’s fine that he can be right once in a while, but his underlying bias makes him unreadable unless you’re an AMD fanboy.

            Also: this is yet to be proven.

            Also also: all high-end parts get eliminated eventually, and in the case of something really expensive to make that's replaced by something much nicer, they're eliminated almost immediately upon being obsoleted. Prescotts (and, for that matter, Pentium Ds) in the same price range as Core 2 Duos disappeared pretty quickly. So did the Radeon 2900 XT upon the 3870's release.

      • sweatshopking
      • 10 years ago

        something inaccurate? he seems to be pretty close to the truth… i would imagine that there isn't much they can do right now. anyone but a fanboy looking to buy a card today would be out of their mind to go green. ati has them beat on all fronts, and juniper will only extend that lead.

        • derFunkenstein
        • 10 years ago

          my point is that this has been true for all of about (exaggerated) 15 minutes. Up until the 5870's release, they were pretty solid. They'll be solid again with the next release – DX11 isn't important for the new generation of cards and won't be important until long after these cards are too slow to be worth using with a DX11 game. Another way to say that is, by the time DX11 games are ubiquitous, this card and the 5870 will be outdated and will have been replaced anyway. So another DX10 card isn't a big deal in this release cycle.

          • asdsa
          • 10 years ago

            You seem awfully defensive of nvidia. If you don't own an nvidia card I'll eat my hat.

            • derFunkenstein
            • 10 years ago

            You’re a moron – and it’s time to eat your hat. See my forum signature. Radeon 4870, Phenom II, AMD 790X motherboard.

            I can just recognize when a journalist is stretching the bounds of the truth to grind an axe, that’s all.
