Nvidia on GF100 derivatives: Our current-gen GPUs are fabulous

Yesterday, we linked to a rant by Charlie Demerjian of SemiAccurate, who claimed mainstream derivatives of Nvidia’s GF100 graphics processors had yet to move beyond the drawing board. Today, X-bit labs quotes a statement by Nvidia head honcho Jen-Hsun Huang that seems to corroborate the alarmist wing of the rumor mill somewhat.

According to the CEO, Nvidia is doing just fine with its current line of mainstream graphics products:

All of that just depends on 40 nm supply and we are trying to finesse it the best we possibly can. For the entry-level products, the truth is that the new architectures […] are probably not extremely well appreciated anyhow. People, who buy the new architectures, tend to be early adopters and they tend to be the game enthusiasts, workstation designers or creative artists or – there are very specific reasons why it really enhances their experience. Our current-generation GPUs are fabulous and all the things that mainstream consumers would use their computer for.

Huang went on to predict that his company’s current-gen GPUs "are going to continue to do quite nicely in the marketplace," although he vowed to transition to newer products "as fast as we can."

To the CEO’s credit, few games out there have DirectX 11 support at this point, and Nvidia has managed to stay roughly competitive on the performance front as far as sub-$200 desktop graphics cards are concerned. However, at the higher end of that market, the company is still forced to compete using larger, more power-hungry graphics processors made using 55-nm process technology, which may be cutting into its margins.

Comments closed
    • maxxcool
    • 10 years ago

    Sweet Jebus…. 255 posts ?? This is worse than the Snakeoil i7 review debacle.

    • MrBojangles
    • 10 years ago

    This is from the upgrades section of Anandtech’s latest system guide. It’s some of the most level-headed advice for buying a new GPU I’ve heard in a long time. Just thought all you peeps living in denial about Nvidia’s current state of affairs would get a kick out of it.

    “While it’s true that some games can be limited greatly by the power of your CPU, hardcore PC gamers know that for the most part, the true heart of a gaming PC is its graphics card. What that means is that, up to a certain point, you can pick a powerful GPU and get greatly increased graphical performance in even an entry-level Pentium Dual-Core system like the one on page 3.
    If you plan to run titles that aren’t particularly strenuous or you don’t mind running at reduced graphics settings, the same $100 Radeon HD 4850 we recommended for our Mainstream rigs will provide plenty of power to one of our 1440×900 entry-level PCs. If you need even more power, however, a $160 Radeon HD 5770 should be able to provide enough horsepower for 1920×1200 gaming—in addition to Eyefinity and DX11—with minimal bottlenecking. If you’re considering an upgrade for one of our Mainstream configs, there’s once again the Radeon HD 5770, the Radeon HD 4890 provides excellent 1920×1200 performance for $200, and the Radeon HD 5850 hits the ceiling of reality at $300. We can’t really justify the price for cards more expensive than the 5850 due to diminishing returns.”
    §[<http://www.anandtech.com/guides/showdoc.aspx?i=3739&p=6<]§

    • jdaven
    • 10 years ago

    I really don’t understand what people like Hightech4US and PRIME1 are trying to convince us of. Nvidia and ATI are both great companies that sell very good products. Why do we have to convince each other that one is better than the other, or that one should be put out of business? What would one gain from making such convincing arguments? Will you somehow make money if more people buy one company’s products over the other’s? Do you have stock in one company that will hopefully go up if you convince enough people? Will your happiness in life increase when you successfully win some perceived argument or viewpoint?

    Now don’t get me wrong. I understand the whole fanboi-enthusiast commenting mindset. I do it myself and have been commenting on these forums for years. But I never really step back and ask what the point is of convincing some stranger over the internet that X company is better than Y company. What do we get out of doing it?

      • Fighterpilot
      • 10 years ago

      Both of them have done far more damage to Nvidia’s reputation than would have occurred naturally, by turning off potential customers who otherwise might have bought an Nvidia card.
      Congrats dudes…. you are ATi’s best players.

      • MadManOriginal
      • 10 years ago

      Rabid fanboys get +10 epeen.

      p.s. Apple tax!

    • Bensam123
    • 10 years ago

    S3 used similar logic with their Chrome graphics processors (from what I remember). They felt they were feeding the current market what it needed for the things people needed it for.

    At the time I made the same comment: it never pushes anything ahead when all people concentrate on is ‘now’.

    For the most part, software developers develop their software for the current technology in the marketplace, and this includes games; if new technology never enters the marketplace, we end up with a chicken-and-egg problem.

    Traditionally, hardware makers have always pushed things forward, because, well, that’s what they’re there for. Intel doesn’t design chips specifically for today’s tasks; they design them for tomorrow’s, next week’s, next year’s, and next decade’s tasks.

    I’m noticing that more and more software and hardware companies alike are concentrating purely on ‘today’ and not ‘tomorrow’. It can be seen in the hardware we use and in the software; video games are a prime example. They are now all about sucking every last ounce of cash out of people rather than making something people just want to play. It’s all about ‘stuff’ rather than just having a fun experience.

    This indirectly ties into the zeitgeist presented here. I know Nvidia is just having yield issues and this isn’t completely their ideology, but it’s important to keep the two separate and not fall into the former. I doubt it will turn out this way, but I hope Nvidia won’t turn into the next S3.

    Now, excuse me while I go load up my xbox with more points so I can buy the latest Modern Warfare 2 booster pack.

    Remember back when DLC wasn’t called DLC, it was called a patch, and it was always free?

    • HighTech4US
    • 10 years ago

    nVidia GAINED desktop market share during 4Q. ATI LOST market share in the same quarter. So nVidia is correct in that they still have a healthy market for their cards.

    All that AMD/ATI chest thumping of 2 million DX11 cards sold didn’t make any difference as nVidia sold 1.33 million more cards than the previous quarter.

    AMD/ATI only sold 375 thousand more so that 2 million cards killed off 1.625 million of their older cards.

    §[<http://investorvillage.com/smbd.asp?mb=476&mn=167141&pt=msg&mid=8483074<]§
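    The quarter-over-quarter arithmetic above can be sanity-checked in a couple of lines (the figures are the commenter’s own claims, not verified market data):

```python
# Sketch checking the commenter's arithmetic; the input figures are
# the commenter's claims, not verified market data.
dx11_cards_sold = 2_000_000    # claimed AMD/ATI DX11 card sales in the quarter
net_quarterly_gain = 375_000   # claimed AMD/ATI net increase vs. the prior quarter

# If total sales rose by only 375k while 2M units were new DX11 parts,
# the difference must have displaced older-generation sales.
older_cards_displaced = dx11_cards_sold - net_quarterly_gain
print(older_cards_displaced)  # 1625000, matching the "1.625 million" figure
```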

      • swaaye
      • 10 years ago

      AMD has to work on consumer mind share. Their current lineup is probably their best in history, though, so let’s hope for the best. They were selling all the product they could build for a while there, but I think production has picked up and stock is better now.

      • PRIME1
      • 10 years ago

      The 5zzz series failed. Hard.

        • flip-mode
        • 10 years ago

        Sorry PRIME1, I missed your reply to #213. I think it’s appropriate that you provide a reply after all that finger-pointing over making up or ignoring facts. If there is anything in the world that has Failed. Hard. it is your credibility.

      • flip-mode
      • 10 years ago

      q[

        • PRIME1
        • 10 years ago

        You place far too much weight on DX11, flippy. Devs don’t care, and the severe lack of games proves that. If anything, more devs are flocking to PhysX than DX11, and that may be just another reason for the 5zzz series’ incredibly poor sales.

          • SHOES
          • 10 years ago

          did someone just play the Physx card?

          • Meadows
          • 10 years ago

          PhysX is currently being abandoned as a hardware feature for the PC. Note the dismal number of notable PhysX titles compared to the dismal number of Dx11 titles.

          Except Dx11 has been out for 4 months. PhysX? Years.
          Which has failed then for certain? PhysX. Dx11 is supported by Microsoft as well as practically the /[

          • flip-mode
          • 10 years ago

          You overplay the importance of PhysX and disregard its corrupt use by Nvidia, all while you downplay the really important issue: Nvidia can’t get its new card out the door. Never mind DX11; Nvidia’s cards all get beaten in DX10. And you still can’t respond to #213?

            • PRIME1
            • 10 years ago

            No, you downplay the importance because your company is so full of fail. NVIDIA is launching their DX11 card next month. Will ATI have physics or 3D by then? Nope, they will be at the back of the bus, as always.

            As for DX10 performance. My card plays DX10 games just fine. You don’t need to waste money on a 5zzz card when any ole DX10 card will do. Especially if a 5zzz lacks a lot of features that other cards have supported for years.

            • Meadows
            • 10 years ago

            You keep ignoring us because we defeated you.

            • PRIME1
            • 10 years ago

            I keep ignoring you because it’s what most people do.

            • Meadows
            • 10 years ago

            You’re one pathetic retard.

            • flip-mode
            • 10 years ago

            If you’re actually saying that your opinion is better than my opinion, then I won’t argue, but you still haven’t replied to #213 and #224. You’re the one in love with a company, not me, which is why Charlie D hurts your feelings so much, as if he called your girlfriend names.

            • PRIME1
            • 10 years ago

            Meadows response and yours are precisely the reason I am happy to stand on the other side of the fence from people like you and him.

            When all else fails, call people names. That will make you look cool.

            Any argument that places me away from you, meadows, fighterpilot, spigzone, etc. is exactly where I want to be.

            Truly read the responses from these people.

            • flip-mode
            • 10 years ago

            “Truly read” how Prime1 cannot muster the courage to reply to #213, #224 and concede his own fact fail. Spineless. Indeed, please stay on the other side of the fence with the rest of the zoo creatures.

            • PRIME1
            • 10 years ago

            Actually, I responded to #213 a few hours ago. It must be exhausting to be wrong so often. The other one needs no reply, as he ignored so many facts that he may as well have been the information minister for Iraq as US troops were knocking on his door.

            • flip-mode
            • 10 years ago

            PRIME1, watch this, here’s something you don’t know how to do:

            Oops, missed that, my bad.

            As for what you posted in #259… I’m not sure what you are trying to say with that. The link has very little to do with ATI, it’s all about AMD.

            Fact is that you made a claim: that ATI was on the verge of bankruptcy when AMD bought them. That claim has been debunked.

            So, while you did respond to #213, your response was nonsensical. Congratulations?

            As for #224, if he’s so obviously out of line, you’re the only one here that sees it that way, so please respond to him point for point. Otherwise, it seems that he demonstrated that you’re just plain ignorant.

            • PRIME1
            • 10 years ago

            Just because you are in such deep denial does not make my statement “debunked”.

            I’m not trying to convince someone as clearly biased as you are. I’m just pointing out the truth. Shame you are so wrapped up in attacking me that you can’t take a second to see that.

            • flip-mode
            • 10 years ago

            What truth? That Nvidia could buy AMD? See #224. That PhysX is better than DX11? That’s called opinion, not truth. What else you got?

            • Shining Arcanine
            • 10 years ago

            PhysX and DirectX 11 are APIs for two separate things. One cannot be better than another. :/

            If it were OpenGL 3.2 versus DirectX 11, I would have to say OpenGL 3.2. Feature-wise they are similar, but I think OpenGL is better because OpenGL is cross platform while DirectX is not.

            • Shining Arcanine
            • 10 years ago

            Just for the record, of the two posts in this long argument that I have actually read, #265 and #266, I think PRIME1’s post is more convincing.

            • flip-mode
            • 10 years ago

            You’re one of those people who says he’s read a book (an incredibly lame book, in this case) after just having read the last page. I mean…. can’t help you with that…. that’s your problem.

            • Shining Arcanine
            • 10 years ago

            You are the one here that sounds like a lawyer. :/

            • flip-mode
            • 10 years ago

            Then you dealt with some pretty shoddy lawyers in your time… yikes!

      • designerfx
      • 10 years ago

      from the same guy who just posted saying that theoretical benchmarks are reliable? pshaw. I hope you leave the website.

    • spigzone
    • 10 years ago

    Charlie D > verification: the Unobtanium GTX480 has a ~5% gaming performance edge over the 5870.

    §[<http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/<]§ Talk about your nightmarishly worst potential outcome realized.

      • spigzone
      • 10 years ago

      Holy Smoked Unobtanium GTX480 Batman!

      Same performance ~ 10x the cost = Epic Pwnage of Another Major Disaster.

        • JustAnEngineer
        • 10 years ago

        q[

      • geekl33tgamer
      • 10 years ago

      Let’s all point and laugh at Nvidia. They have definitely lost this round…

      …On the other side, you could save a few pence on your heating in the long term with one of these. The room your PC is in won’t need any!

      • Krogoth
      • 10 years ago

      If those figures hold.

      Fermi is officially FX 5800U 2.0

      • Sunburn74
      • 10 years ago

      What a scathing article. Excellent read. Makes me want to make moves to grab that 5850 right now… Still waiting for it to be only 10 bucks over MSRP

      • moritzgedig
      • 10 years ago

      /[

    • thermistor
    • 10 years ago

    I have to admit that the 2xxx series and the 3xxx series represented ‘bad’ and ‘getting better’ for ATI…because they didn’t have the ‘halo’ product at the time (though the 3850/70 were at least competitive), I managed to scoop up a bunch of ATI goodness pretty cheap at online retailers.

    I can foresee in 3-6 months nVidia being in a similar situation, fire-selling their midrange stuff for dirt cheap just to keep their vendors going.

    GTS240 for $50 I’m hoping! 9600GT for $60…c’mon Jen!

      • FuturePastNow
      • 10 years ago

      I was thinking more like GTX260 for $150… at which price Nvidia would surely be losing money on them, but that’s not my problem.

        • MrBojangles
        • 10 years ago

        That might end up being sooner than you think. My local Microcenter just had a fire sale on a BFG GTX 260 for $151.99 and a GTX 275 for $198.99. Understandably, they’re already sold out of both. They do still have a BFG GTS 250 for $117.99, though.

          • MadManOriginal
          • 10 years ago

          That’s nice and all to be able to get those prices, but it’s kind of pathetic in a way too – for the GTXes, those prices were all over the place well over a year ago.

    • spanky1off
    • 10 years ago

    Interesting article highlighting the cynical and unethical marketing tactics employed by Nvidia’s marketing gurus:

    §[<http://www.theinquirer.net/inquirer/news/1051123/nvidia-cuts-reviewers-gts250<]§ Now I’m no software engineer or computer expert, but as a consumer I want to know the truth about a product, for better or worse… but it seems even that right is being taken away. It does make you wonder who you can trust when it comes to product reviews.

    • cegras
    • 10 years ago

    Has anyone noticed yet that whenever hardware performance is mentioned, PRIME1 simply ignores the subject and vomits out the same tired rhetoric mixed with semi-digested chunks of no-longer-relevant content (LOL ATI BUYOUT LOL)?

      • Meadows
      • 10 years ago

      …Yes, I can’t help but admit it’s pathetic.

    • Krogoth
    • 10 years ago

    This is what I see when the die-hard fanboys are in denial.
    §[<http://tinyurl.com/ydnk6ly<]§

      • pogsnet
      • 10 years ago

    • DrDillyBar
    • 10 years ago

    As a CEO, it’s not like he can say “Meh” and be done with it.

    • cavedog
    • 10 years ago

    Reading PRIME1’s comments conjures up images of the “internet tough guy” in my head. I don’t know why. Here is a link on what an internet tough guy may look like. 🙂
    §[<http://i38.tinypic.com/if1zsj.jpg<]§

      • spigzone
      • 10 years ago

      ROTFLMAO … … truly, my sides are hurting.

    • anotherengineer
    • 10 years ago

    Holy flamers

    take this

    SPAM

    and Prime1 #163 “All AMD needs to do is file bankruptcy to get rid of their crushing debt and NVIDIA could buy them with cash and gut ATI. ”

    Highly unlikely in Canada and USA with the laws and rules of monopoly, fair competition, etc.
    And I wasn’t aware ATI was on the verge of bankruptcy, I guess you must be an accountant or CFO there or something??

    Please keep your hate and trolling to yourself Prime1

    Thanks

      • PRIME1
      • 10 years ago
        • flip-mode
        • 10 years ago

        Are you saying Nvidia wants to buy AMD? If not, then what’s your point… why are you going on about this? If it’s just to say Nvidia’s market cap is higher than AMD’s then just say so and be done.

        Beyond that, what do the relative values of either company have to do with anything? Is it just a topic because it’s an area where Nvidia is still the winner? I guess that’s something for you to feel good about but I don’t see how it helps Fermi…

        Also, see post 196… according to him, you’re totally full of crap anyway.

          • PRIME1
          • 10 years ago

          §[<https://techreport.com/discussions.x/8474<]§ Right before AMD bought them their stock was tanking. Imagine what would have happened if ATI had to ride out the R600 disaster without AMD. Your support of spig only reinforces my assessment of you. Especially since neither of you use facts.

            • flip-mode
            • 10 years ago

            You’re weird. Yes, R600 may have been rough on ATI, but since you bring up those pesky “facts”, it is in no way a fact that ATI was headed for bankruptcy, as you so confidently claim.

            • PRIME1
            • 10 years ago

            So you also ignored that link about financial woes.

            • cegras
            • 10 years ago

            Congratulations, links that are over a year old are certainly relevant to the discussion at hand.

            • PRIME1
            • 10 years ago

            Do you even know what we are talking about? ATI’s financial situation before AMD bought them. How would there be a new link about that?

            ATI fans are horrible at discussing facts. They get lost so easily.

            • cegras
            • 10 years ago

            I know that you are basically following a tangent that you probably started in the first place because you’ve run out of things to say.

            • MrBojangles
            • 10 years ago

            Check out #213 if you’re so concerned with “facts”. I just want to see if you actually reply to it intelligently or just start spewing out the words “Nvidia pwns all” from a different angle.

            • Roffey123
            • 10 years ago

            Remove thy brain from just beyond your rectum and stop trolling, I hear its a nice sunny day outside of the bridge you’re hiding under…

            • Kurotetsu
            • 10 years ago

            I can’t believe I’m posting in this trainwreck, but that Techreport link you’re referencing was from June 2005, and the financial results it links to are from Q3 2005. The AMD/ATI merger was announced in July of 2006 and closed in October of the same year. Now, obviously, the process was started way before then, but if you really want to see how ATI was doing “right before” AMD acquired them then you need to look at their Q2 2006 results:

            §[<http://www.ati.com/companyinfo/ir/2006/atiq206.pdf<]§ They hit RECORD revenues that quarter, and, assuming I'm reading the results correctly, their net income was 34.1 million. Unlike the Q3 2005 results which showed a loss to net income. Long story short, spigzone was absolutely right. You're full of crap.

            • flip-mode
            • 10 years ago

            Fact Bump. PRIME1 Fact Fail.

            • PRIME1
            • 10 years ago

            34 million in a multi-billion $ industry. How does that compare to their losses in 2005 or even the previous quarter? Don’t forget this is the year AMD bought them.

            Read this and tell me that ATI was doing well.
            §[<http://www.hardwaresecrets.com/article/471/2<]§ ATI's value has gone down several billion since AMD bought them. This is according to AMD's own public filings. So if you want to argue with anyone argue with them.

            • Sahrin
            • 10 years ago

            So has the value of just about every single company in the world – including nVidia.

    • wira020
    • 10 years ago

    r[

      • rhema83
      • 10 years ago

      Nothing to do with pro-NVidia. We are just tired of paying $300 for a $250 HD5850.

    • PRIME1
    • 10 years ago

    Reply to #157

    ATI was on the verge of bankruptcy, just like AMD is now.

    ATI is to AMD like AOL was to Time Warner.

    In fact NVIDIA is in a position where they could actually buy AMD. Go look at the Market Cap for each company and how much money NVIDIA has in the bank.

    All AMD needs to do is file bankruptcy to get rid of their crushing debt and NVIDIA could buy them with cash and gut ATI.

      • Game_boy
      • 10 years ago

      A few problems.

      1. No company could buy AMD outright. Even with the recent cross-license changes I doubt their x86 license is transferable.

      2. Nvidia would not be allowed to buy AMD anyway, since that would create a monopoly on high-end graphics. Intel could theoretically come in with Larrabee but for now there are only two such vendors.

      3. AMD is not on the verge of bankruptcy. They are profitable in the real world (GF’s finances don’t affect whether AMD can pay its day-to-day expenses which is the real criteria for bankruptcy).

      4. ATI fixed AMD’s development process. ATI managed to turn their own R&D pipeline around with RV670 and then RV770, and their executives taking over many AMD operations gave us the new process that led to good execution on Shanghai, Istanbul and now Magny-Cours. ATI also gave AMD reliable chipsets, and Fusion, which is the only hope AMD have of becoming competitive in the mobile sector ever again.

        • PRIME1
        • 10 years ago

        1.) AMD should have lost their license when they sold off their foundries, so it seems that may not be an issue. Especially with Intel worried about further monopoly lawsuits.

        2.) Intel is the largest GPU manufacturer, with ATI in 3rd. NVIDIA can claim that in order to compete it needs AMD for the CPU business. There are still several other GPU companies out there, like Matrox and S3. No one cares if there is a monopoly in a niche market like high-end graphics.

        3.)http://finance.yahoo.com/tech-ticker/article/336235/Ten-Big-Companies-That-Are-Veering-Toward-Bankruptcy?tickers=AMD,LVS,S,M,GT,MYL,HTZ

        If not for an Intel lawsuit and selling off their most valuable asset, they would already be bankrupt.

        4.) ATI did not fix anything. They have not been there long enough. AMD has made solid chipsets in the past. In fact ATI’s chipsets had lots of problems (read past reviews) and were far behind the nForce platform in sales.

        So yes, my theory stands firm.

          • Lans
          • 10 years ago

          1.) I can’t argue with whether or not AMD “/[http://www.nytimes.com/2009/11/13/technology/companies/13chip.html<]§ "/[

          • Sahrin
          • 10 years ago

          l[

            • Meadows
            • 10 years ago

            “One up” to this man. Prime1, die.

            • jdaven
            • 10 years ago

            I too agree this comment should get a +1. Most well written and researched comment I’ve read in a long time. I thoroughly enjoyed reading it. Sahrin you are right on. PRIME1 please leave.

            • geekl33tgamer
            • 10 years ago

            Sahrin 1 – 0 Prime1 (Troll) – Well said 🙂

            Looks like the troll has gone into hibernation now, thankfully. Either that or Nvidia just cut Prime1 from their payroll/fanboy alliance…

      • wira020
      • 10 years ago

      That wont happen… it’s anti-competitive for the gpu market…

        • PRIME1
        • 10 years ago

        There are plenty of GPU makers.

        S3, PowerVR, Matrox etc.

          • wira020
          • 10 years ago

          Get real… if you look at any GPU market share pie charts, I’m pretty sure those companies aren’t even mentioned, probably just included in the “others” slice… Nvidia acquiring AMD/ATI would mean a monopoly, in which case those charts would show Nvidia owning 95+% market share… nope, won’t happen… I can guarantee you this… unless another competitor appears, that is…

            • PRIME1
            • 10 years ago

            Intel owns the largest share of the GPU market right now. Do some research.

            • wira020
            • 10 years ago

            Not if it’s dedicated GPUs… Intel’s only no. 1 because those charts choose to include embedded graphics and chipsets too…

            • Meadows
            • 10 years ago

            You’re making it very hard for people to converse with you without an unstoppable urge to badmouth your person. I’m serious.

            • PRIME1
            • 10 years ago

            It’s all the facts you are having trouble with isn’t it Meds.

            • Meadows
            • 10 years ago

            You’re having trouble with the term “fact”.

          • Meadows
          • 10 years ago

          None of them are in the gaming hardware business, and it’s in nobody’s interest for anything to be a monopoly (i.e., the hypothetical case of AMD failing), contrary to what you might be erroneously thinking.

            • SPOOFE
            • 10 years ago

            /[

      • spigzone
      • 10 years ago

      Abu Dhabi’s pockets are vastly deeper than Nvidia’s, and Abu Dhabi not only owns Global Foundries, it also has an 8% stake in AMD.

      Not to mention Nvidia is currently face-planting, while AMD needs welder’s goggles, its future looks so bright.

        • Game_boy
        • 10 years ago

        You’re making AMD fans look irrational. AMD may not be dead, but its future is in no way clear or an easy ride.

          • spigzone
          • 10 years ago

          Au Contraire mon frere …

          With GloFo poised to increasingly provide timely node competition with Intel, and AMD having the inside track at GloFo (what with Abu Dhabi owning a sizeable stake in it and all), AMD is perfectly positioned with its coming Fusion and GPU products to be fully competitive in the laptop and desktop spaces from top to bottom, if not outright own some of those segments, and to start making serious inroads into the GPGPU market.

          Looks like it’s also eyeing the mobile segments, from phones to tablets to netbooks et al.

          There IS an advantage to AMD buying out ATI, that being it is now poised to bring CPU<~>GPU solutions to the market over the next few years nobody else can even engineer, much less compete with.

      • Mat3
      • 10 years ago

      ATI was never on the verge of bankruptcy. Where do you come up with this garbage?

        • spigzone
        • 10 years ago

        PSOOTA ~ Pulled Straight Out Of Their Asses.

        I looked up the financial history of ATI = financially sound when AMD bought them.

      • duffy
      • 10 years ago

      Perhaps Nvidia could buy GM then.

    • Shining Arcanine
    • 10 years ago

    Scrap what I was saying in my previous comments, I just saw in the article that Fermi was pushed back again.

    It is time for me to wait for Fermi II, as things do not look good for Fermi:

    §[<http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/<]§ Hopefully I will be able to get my hands on a Fermi II by the time I graduate. I am itching to program it.

    • spigzone
    • 10 years ago

    Nvidia doesn’t need those sunglasses.

    A COMPLETE evergreen refresh (across the entire 5000 series) in ‘2H 2010’ was promised by an ATI exec. in an interview.

    Nvidia will have to do a base layer respin, followed by one or more metal layer respins, to make Fermi manufacturable at reasonable yields. That will take 6-10 months, depending on when they made the decision to do the base layer respin. Progress on cut down variations of Fermi will necessarily await the testing results of the base layer respin and their availability window is likely at least a year away – making it likely they will never see the light of day – a conclusion buttressed by JHH’s comments.

    Of course, even at ‘reasonable’ yields, 60-80%, it still won’t be cost/performance competitive with even the current Cypress chips, which should be yielding at 90%+ by then, much less with whatever the refreshes bring to the table.

    A year from now, when Fermi II is up and running (IF it is up and running), AMD will be readying its next generation at 28 nm and will be well positioned to have at least a 6-month lead at that node.

    Nvidia doesn’t need those sunglasses; its future isn’t that bright.
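    The yield claim above can be made concrete with a back-of-the-envelope sketch. All the numbers below (wafer cost, dies per wafer, yields) are illustrative assumptions, not verified figures; the point is only that a bigger die and a lower yield compound each other:

```python
# Illustrative cost-per-good-die sketch. Every input here is an assumption
# chosen for illustration, not a verified manufacturing figure.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Spread one wafer's cost over the dies that actually work."""
    return wafer_cost / (dies_per_wafer * yield_rate)

wafer_cost = 5000.0  # assumed cost of one 40 nm wafer, in dollars

# Assume the larger chip fits fewer dies per wafer and yields worse.
big_chip = cost_per_good_die(wafer_cost, dies_per_wafer=100, yield_rate=0.60)
small_chip = cost_per_good_die(wafer_cost, dies_per_wafer=160, yield_rate=0.90)

print(round(big_chip / small_chip, 2))  # 2.4 – the gaps multiply
```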

      • Shining Arcanine
      • 10 years ago

      Nvidia has multiple development teams working on multiple projects in parallel. Just because 1 project is slowed down by something does not mean that another will be. I am sure that Fermi’s successor is on track, but regardless of what Fermi’s successor will be, I think we should wait until next month before jumping to conclusions.

        • spigzone
        • 10 years ago

        Generalities in lieu of addressing specific points doesn’t git ‘er done.

        What specifics in my post do you disagree with and what are your reasoned counter arguments?

          • Shining Arcanine
          • 10 years ago

          Here is a specific point:

          “Nvidia doesn’t need those sunglasses, it’s future isn’t that bright.”

          My counter argument is my previous post.

    • ub3r
    • 10 years ago

    Chillingly, the GF100 reminds me of this :(.
    §[<https://techreport.com/articles.x/4966<]§

      • lethal
      • 10 years ago

      Funnily enough, that review showed the FX looking somewhat decent performance-wise. As soon as DX9 titles showed up, it all went downhill… I recall the HL2 charts where the Radeon 9600 spanked pretty much all the Nvidia cards.

        • Lazier_Said
        • 10 years ago

        The HL2 charts where the R9600 pwned the dustbuster showed up right around the time Valve signed a marketing deal with ATI.

        A year and a half later when the actual game came out, about all you could say was antique card 1 has prettier water than antique card 2.

        Future proofing in PC hardware was and remains throwing money away.

          • JustAnEngineer
          • 10 years ago

          A year and a half later, the Radeon 9700Pro still played every game available at maximum detail.

            • Shining Arcanine
            • 10 years ago

            This is true. I know that I did this with Battlefield 2. The only problem is that I had 10 FPS. I upgraded to an XFX GeForce 7950 GT and my frame rates quadrupled.

            This was back when I still played video games.

            • Lazier_Said
            • 10 years ago

            Yep, I had a 9700 around that time.

        Wouldn’t even run BFV playably at 16x12 with AA, let alone BF2.

            High res, AA, decent framerate, choose any two.

            • swaaye
            • 10 years ago

            Why don’t you play video games anymore?

            • PetMiceRnice
            • 10 years ago

            I had an ATI 9800 Pro and got a solid 42 months out of it in my primary gaming machine of the time. I’m not gaming on the PC anymore, but that card was truly a great last hurrah.

          • swaaye
          • 10 years ago

          Hey, I played HL2 on a 9700Pro. That baby outclassed the entire GeForce FX series, even the fastest refresh hardware, in just about any game that used shader model 2.

          The GeForce FX cards were so awful at SM2 that HL2 runs them in DX8.1 mode and feeds them SM1.4. Some sites did some tests with the cards forced into DX9 mode and it wasn’t pretty. 🙂

          The only card I’ve had longer than my 9700Pro is my current 8800GTX. Many aspects to the modern gaming industry are causing technological change to slow….

      • Madman
      • 10 years ago

      Funny, but even midrange cards are 13″ dustbusters nowadays 😀

        • JustAnEngineer
        • 10 years ago

        Radeon HD 5850 is only 9″.

    • flip-mode
    • 10 years ago

    Everyone is overreacting a bit. ATI recovered from the 2900. Nvidia recovered from the FX5800.

    Nvidia will be fine. They’ll recover. They likely will continue to hold the market share lead. They’re a good company that is just going through a rough patch.

    AMD will almost certainly hit another rough patch some time in the next couple of years when things just don’t work out right.

      • spigzone
      • 10 years ago

      Maybe, maybe not.

      In the technology arena, the playing field itself is ever shifting and changing. What is true today will not necessarily be true tomorrow.

      For example, AMD and Intel are both moving to graphics on-die with the CPU, which, as performance increases year on year, will negatively impact sales in increasingly higher segments of the discrete graphics market, eventually killing off those segments.

      That Nvidia and ATI ‘changed places’ in the past does not predict they will continue to do so in the future.

    • duffy
    • 10 years ago

    I’m pleased to hear Nvidia is doing so well.
    I got rid of my last Nvidia chipset board over a year ago, which is also about the same time I tossed the last G-card from any of my PCs.

    They’re the new VIA, in my book.

      • srg86
      • 10 years ago

      nVidia’s Linux drivers are the best in the business. I’ve moved to their chips for precisely that reason.

      ATI drops support for older GPUs in their drivers far too early, and the open-source drivers have never felt as rock solid to me.

        • duffy
        • 10 years ago

        Linux and Nvidia? Yup, the new VIA.

          • Shining Arcanine
          • 10 years ago

          Nvidia’s Linux drivers have been the best as far back as I can remember. If you used Linux, you would likely know that they are the best from firsthand experience.

      • Shining Arcanine
      • 10 years ago

      So, I take it that you like to run all of your Linux machines from a remote terminal. How well does that work for you?

    • Triskaine
    • 10 years ago

    You are wrong. That was in Q3 2009. In Q4 AMD had graphics revenue of 427 million and a profit of 53 million.
    To get nVidia’s real GPU revenue for the same quarter, you have to exclude Tegra and chipset revenue, because AMD’s graphics revenue includes only graphics and no chipsets, which are filed under Computing Solutions. Then nVidia’s graphics revenue comes out to 730.8 million. And that was for a quarter during which AMD was heavily supply constrained. The next quarter will be the real interesting one, as AMD’s complete DX11 lineup will come to bear.

    • The Dark One
    • 10 years ago

    PRIME1 posts are like newly surfaced Osama Bin Laden tapes. Lots of religious hyperbole but the content is rarely topical.

    • clone
    • 10 years ago

    Nvidia is pinning a lot of hope on Fermi. They don’t have a CPU, and they are doing everything they can to get general-purpose computing into the game.

    If Fermi fails, it won’t just be that Nvidia lost some video card market share; it will be an ideology loss, a very bad one, and they likely won’t recover. Make no mistake: if Fermi fails badly enough, Nvidia could very well become part of someone else. They have money, which in this game is enticing. They can’t buy Intel, IBM, or Sony, but those three could buy them, while no one else can afford them. The FTC would block an Intel buy or an AMD buy, even if it were Nvidia buying AMD; IBM likely isn’t interested; and Sony has its own issues, leaving a perfectly viable company with little to no future, stuck as IP ready to be parted out. I wonder if the U.S. Defense Department wouldn’t start crying if a foreign interest like China ponied up the coin. Anyway, VIA has little to nothing to offer, and with no CPU and no platform, the clock started ticking the day after the announcement that AMD bought ATI.

    Fermi’s problems lie in its lack of focus: “too many eggs in one basket” led to a huge monolithic chip that is difficult to build, difficult to build in quantity, and offers problematic performance. And did I neglect to mention that it’s not just late at this point, but a firm release date isn’t even being considered at the moment? A very ambitious endeavor, to be sure, but it came at the expense of their GPU position. PhysX could be quickly eclipsed by Havok, especially if Havok were universally adopted once Nvidia disappears. PhysX is not compelling and has so far required far more investment by Nvidia than it has ever returned. Expendable in this industry; not even a notable IP of worth, IMHO.

      • spigzone
      • 10 years ago

      Lots of sovereign funds around looking to diversify, like Abu Dhabi buying Global Foundries.

      Depends on how far the stock craters after Fermi’s launch and its actual competitive/availability picture clarifies.

      Crater far enough and someone will make the shareholders an offer they can’t refuse.

    • Meadows
    • 10 years ago

    “Fabulous”, …now I’ve seen everything.

    • ddarko
    • 10 years ago

    Pretty soon, Nvidia will be touting how Fermi will be the top card to play Duke Nukem Forever when both are released.

    Let’s review the entertaining twists and contortions of the Fermi #1 Brigade.

      • Shining Arcanine
      • 10 years ago

      Actually, the metrics that matter are workstation graphics, general purpose computing and linux driver support. That is where the money is and ATI is far behind in all of those categories.

        • wira020
        • 10 years ago

        Most importantly, 3D support!!! Because you know, we all crave for it!

        • Game_boy
        • 10 years ago

        Those are the metrics that matter to YOU. You are confusing your needs with the wider market’s.

        You represent about 1% of the desktop graphics market’s needs. That’s fine, it’s just that by having inferior GPGPU and Linux support (which I would dispute anyway) AMD aren’t losing much of their market at all.

        AMD has never had much workstation share so nothing has changed there.

          • Shining Arcanine
          • 10 years ago

          Believe it or not, that is where the money is. That is why Nvidia is doing very well financially at the moment while ATI is barely making ends meet.

          • wira020
          • 10 years ago

          I was pretty sure he intended sarcasm… but I dunno for sure…

            • Game_boy
            • 10 years ago

            -[

            • wira020
            • 10 years ago

            Not you, I meant “Shining Arcanine”… I was pretty sure he was being sarcastic when he said only those things matter, and I was too… I’m pretty sure 3D as currently implemented can only give me headaches… 😛

            • Game_boy
            • 10 years ago

            Oh. Not sure then.

    • beck2448
    • 10 years ago

    AMD’s graphics revenue last quarter was 306 million with 8 million profit. Nvidia’s latest quarterly revenue was 982 million with 131 million profit.

      • spigzone
      • 10 years ago

      In two quarters the reverse might well be true.

        • PRIME1
        • 10 years ago

        In 2 quarters the gap will widen.

          • Meadows
          • 10 years ago

          Unlikely, I doubt we’ll even see GF100 that soon.

            • Shining Arcanine
            • 10 years ago

            It is supposed to debut next month.

            • spigzone
            • 10 years ago

            ‘debut’ = paper launch

            • Shining Arcanine
            • 10 years ago

            I read the article and you are right about that. Unfortunately, I will wait for it regardless of how long it takes, because I am more interested in being able to program a GPU than I am in being able to run games on it.

        • beck2448
        • 10 years ago

        Nvidia has much more cash, R&D, and marketing muscle than AMD. They’ll be fine.

      • spigzone
      • 10 years ago

      In two more quarters, Nvidia might be a division of Intel.

        • PRIME1
        • 10 years ago

        They won’t fail like ATI did and get bought out.

          • spigzone
          • 10 years ago

          ATI wasn’t failing, they got an offer they (the shareholders and forward looking management) couldn’t refuse.

          No such luck for Nvidia.

          Poor, poor Nvidia … *sigh*.

            • Shining Arcanine
            • 10 years ago

            They looked like they were failing when they were bought out.

        • Shining Arcanine
        • 10 years ago

        Intel suffers from NIH syndrome. There is no way that they would consider buying Nvidia.

        • outcast
        • 10 years ago

        I shudder at that … The last thing I want is consolidation. I want Intel, AMD/ATI and Nvidia to be major players in the high-end graphics department so that I pay a reasonable price for my hardware upgrades. I was so miffed when Intel canceled their GPU; now if Nvidia was forced to do the same …..

        For now at least AMD/ATI can regain their footing…….this is a good thing….

      • asdsa
      • 10 years ago

      One word: Quadro. If ATI gets its workstation act together, then Nvidia will be in a world of hurt.

      • clone
      • 10 years ago

      Only $131 million in profit on an investment of $851 million, while good for you and me, isn’t good for a corporation.

      AMD’s was worse, but they have been shedding costs, and they just won the Xmas season handily on even higher margins while Nvidia was stuck almost at a loss.

    • YeuEmMaiMai
    • 10 years ago

    Nvidia, you might want to interview 3dfx and S3 to see what happens when you screw the pooch……

    • OneArmedScissor
    • 10 years ago

    What I really wonder is why they don’t even have a 40nm, 128-bit, GDDR5 version of so much as the 9800 GT, much less a GTS 250.

    They have made that GPU on THREE different generations of manufacturing, and yet, they can’t seem to even shrink that to 40nm properly.

    Instead, over six months after AMD’s first 40nm, reasonably high performance GPU, the best they could muster was the GT 240. That speaks volumes of their 40nm issues.

    When they moved to 55nm, basically the first thing they did was outright improve an existing GPU, and now they can’t even match a three-year-old one.

    • spigzone
    • 10 years ago

    To TR writers:

    Rant …

    rant /rænt/
    verb (used without object)
    1. to speak or declaim extravagantly or violently; talk in a wild or vehement way; rave: The demagogue ranted for hours.

    If it doesn’t apply, it shouldn’t be used.

      • spigzone
      • 10 years ago

      Take note, BSN and Fudzilla.

    • PRIME1
    • 10 years ago

    Wow the card is not even here yet and the ATI fans are scrambling for cover. LOL

      • spigzone
      • 10 years ago

      AMD’s 4Q GPU sales growth was 2X Nvidia’s.

      What will its 1Q 2010 multiple be? … 4X … 5X … 10X …

      or more?

      ouch.

        • PRIME1
        • 10 years ago

        2 cards to 4 cards is a 2x increase. Look at the actual numbers and you will see the reality of the situation. They sell about 1/3 what NVIDIA does.

          • spigzone
          • 10 years ago

          That was 2X Nvidia’s SALES growth … as in eating Nvidia’s market share lunch.

      • Fighterpilot
      • 10 years ago

      “Wow the card is not even here yet and -[

    • JokerCPoC
    • 10 years ago

    And in the meantime, Nvidia’s high end suffers: there are no cards being made to fill the gap, since production of the 200-series chips was consigned to EOL (halted) before Fermi was even ready. So ATI is gaining on Nvidia. Those who do CUDA seriously won’t buy the low-end crap; we’d rather buy the GTX 295, and not just one, either. Nvidia has another idea, of course, since they want to make lots of money (greedy as they are): buy an inferior Tesla or Quadro card instead of GTX 295s in multiple quantities, or three in my case (I have one GTX 295 now; I’ll soon buy a second and then a third later in the year, when I have the money). For multiple Teslas and/or Quadros, all of which are currently single-GPU cards with about 4GB of RAM each, you’d need about $15,000 instead of about $2,000, plus two motherboards, CPUs, sets of RAM, etc.

      • mongoosesRawesome
      • 10 years ago

      what in the world do you use CUDA for that you need 3 GTX 295’s?

        • Grahambo
        • 10 years ago
        • JokerCPoC
        • 10 years ago

        Seti@Home or some other BOINC-related project that uses CUDA. Right now the project is down due to an external net issue, though; it will be fixed soon, as it’s the weekend. 😀

    • albundy
    • 10 years ago

    “To the CEO’s credit, few games out there have DirectX 11 support at this point”

    That’s pretty much a statement that reeks of desperation. Who would be dumb enough to shell out the same amount of cash for last year’s tech? The fact that it is already being used makes NV look like they are standing in the cold rain without an umbrella. Does common sense ever prevail?

      • JokerCPoC
      • 10 years ago

      Only those who need last year’s tech: people who run CUDA-enabled software, for one. And you can’t mix DX10 and DX11 hardware, I’ve read.

        • JustAnEngineer
        • 10 years ago

        NVidia doesn’t have any “current-generation” (DX11) hardware. They’ve recently introduced a few “last-generation” (DX10.1) GPUs and they’ve got a whole bunch of even older (DX10.0 and down) GPUs that their evil marketing geniuses keep renaming rather than replacing with current generation technology.

          • PRIME1
          • 10 years ago

          ATI lacks hardware physics and 3D support. This won’t change; they are on their way out.

            • ClickClick5
            • 10 years ago

            “…they are on their way out.” You are in more denial than I ever thought. Or you are just screwing with us all.

            • OneArmedScissor
            • 10 years ago

            I hope he means that hardware physics and 3D whatsit are on their way out.

            • JumpingJack
            • 10 years ago

            Gentlemen, meet the Sharikou of GPUs…..

    • tomc100
    • 10 years ago

    I love these fanboy flamebait articles. More entertaining to read the blogs than the actual articles.

    • outcast
    • 10 years ago

    Fermi Architecture GPUs Will Only “Hit the Full Stride” in Q2 – CEO of Nvidia.

    Mass Availability of GeForce GTX 400, Other Fermi Products Delayed Till Q2
    [02/18/2010 12:44 PM]
    by Anton Shilov

    Nvidia Corp. will finally start selling its highly-anticipated GeForce GTX 400-series graphics cards, as well as other products based on the code-named Fermi architecture and GF100 (NV60, G300, GT300) graphics processing unit (GPU), in the first quarter of fiscal 2011, but it looks like mass availability of appropriate products is only expected in Q2 of FY 2011.

    “Q2 [of FY 2011] is going to be the quarter when Fermi is hitting the full stride. It will not just be one Fermi product, there will be a couple of Fermi products to span many different price ranges, but also the Fermi products will span GeForce Quadro and Tesla. So, we are going to be ramping now on Fermi architecture products through Q2 and we are building a lot of it. I am really excited about the upcoming launch of Fermi and I think it will more than offset the seasonality that we usually see in Q2,” said Jen-Hsun Huang, chief executive officer of Nvidia, during the most recent conference call with financial analysts.

    Earlier the head of Nvidia said that the company would ramp up production of Fermi-based chips in Q1 FY2011. Nvidia’s first quarter of fiscal year 2011 began on the 31st of January, 2010, and will likely end on the 30th of April, 2010; the Q2 of FY 2011 will last from May till late July, 2010. At present many observers suggest that Nvidia will launch the GeForce GTX 470 and GTX 480 graphics cards in March or April, but it looks like the products will not be available in truly mass quantities right after the launch.

    Nvidia’s chief executive officer did not provide any concrete timeframes for the transition of the whole lineup to the new Fermi architecture, but said that since the owners of mainstream and entry-level graphics cards hardly demand new functionality, it is not crucial for Nvidia to update currently available “fabulous” graphics chips. In addition, the speed of the transition depends on the supply of 40nm chips by TSMC.

    “All of that just depends on 40 nm supply and we are trying to finesse it the best we possibly can. For the entry-level products, the truth is that the new architectures […] are probably not extremely well appreciated anyhow. People, who buy the new architectures, tend to be early adopters and they tend to be the game enthusiasts, workstation designers or creative artists or – there are very specific reasons why it really enhances their experience. Our current-generation GPUs are fabulous and all the things that mainstream consumers would use their computer for. […] I think the mainstream GPUs are really fabulous and has been enhanced recently with some really great features and so my sense is that they are going to continue to do quite nicely in the marketplace. Then we will just transition as fast as we can,” said Mr. Huang.

    In the fourth quarter of FY 2010 Nvidia posted revenue of $982.5 million. Sales of GPUs accounted for 58.3% of sales, or $572.9 million.
    §[<http://www.xbitlabs.com/news/video/display/20100218124405_Fermi_Architecture_Will_Only_Hit_the_Pull_Stride_Next_Quarter_CEO_of_Nvidia.html<]§

    Subtract 1 year … darn poster was lazy

    • VILLAIN_xx
    • 10 years ago

    I think Nvidia meant *[

    • geekl33tgamer
    • 10 years ago

    This was cryptic code for “Fermi is not coming out in March 2010”. I’ll bet money on it…

    • designerfx
    • 10 years ago

    The lack of DX11 games is because Nvidia has more graphics influence than AMDATI. Really, once that’s gone, people aren’t going to want to follow anything Nvidia.

      • Flying Fox
      • 10 years ago

      If you really want to use all 6 letters, at least do it properly: DAAMIT! 😛

        • outcast
        • 10 years ago

        I say that was very witty of you

          • mongoosesRawesome
          • 10 years ago

          FlyingFox ain’t so witty – that’s courtesy of theinq.com

    • neoanderthal
    • 10 years ago

    I don’t know if nVidia is having problems with Fermi or not, but I do know that their “go-to” products are really scarce on Newegg right now. I’ve been trying to help out my WoWtard brother in law with a better system to feed his addiction, and the selection of in-stock GTX 2xx hardware is pretty trim – three GTX260s, no GTX275s, and three GTX285s. The cheapest 285 is only $10 less than the cheapest HD 5870, which is pretty hard to swallow. A GTX260 is a better deal, I think, but I’m worried it’s not enough performance increase over his HD 4850 to make it worth his while.

    I had an EVGA GTX260-192 lo, these many years (well, a couple anyway). I just installed a pair of HD 5770s for crossfire last night and will have to see how that goes as compared to my old 260. I almost got teary-eyed when I pulled the GTX out of my system – I’m no fanboi for either camp, but that damned GTX would overclock like mad, and was a solid, trouble-free performer.

    I hope nVidia pulls a rabbit out of their hat and gets back in the enthusiast game, price/performance-wise.

      • spigzone
      • 10 years ago

      Why not just steer him to a 5850?

      The Powercolor AX is only $300, a mad overclocker, runs very cool and is whisper quiet.

      ??

    • xan13x
    • 10 years ago

    You know, I had been debating when to upgrade since the 5870 came out, and was really hesitant to commit until I saw what Nvidia had to offer. Last week I decided that I was tired of having NO info from Nvidia on Fermi and wanted to upgrade so I went ahead and got a Powercolor LCS 5870. Looks like I won’t be regretting that anytime soon :-p

    • outcast
    • 10 years ago

    q[<To the CEO’s credit, few games out there have DirectX 11 support at this point, and Nvidia has managed to stay roughly competitive on the performance front as far as sub-$200 desktop graphics cards are concerned.<]q

      • moog
      • 10 years ago

      Yeah, how much did nvidia pay you guys for that last para?

      • Shining Arcanine
      • 10 years ago

      See the following report:

      §[<https://techreport.com/discussions.x/18449<]§

        • outcast
        • 10 years ago

        If Cyril was referring to sales performance, then I can easily buy that for now, but when it comes to price versus benchmark performance I am not convinced.

          • Shining Arcanine
          • 10 years ago

          He said sub-$200 graphics cards. I think that correlates fairly well to sales performance.

    • Ardrid
    • 10 years ago

    If those comments aren’t telling, I don’t know what is. If you weren’t sure who won this generation, you should be now.

      • outcast
      • 10 years ago

      Nvidia’s GPU design is just too complex … making it harder to manufacture and harder to write drivers for.

      I knew this would happen, after they tried to fool investors.

        • Shining Arcanine
        • 10 years ago

        If that is the case, why do Nvidia’s drivers work perfectly on Linux while ATI’s drivers are complete garbage?

          • outcast
          • 10 years ago

          You jumped the gun … the reply was about the Fermi GPU.

            • Shining Arcanine
            • 10 years ago

            Fermi hardware is not available for you to make statements about the quality of its drivers. You have no basis to discuss the difficulty of writing drivers for Fermi.

            As for complexity, that is true, but honestly, Fermi does much more than just graphics. In my book, the wait for Fermi is worth it, because unlike ATI’s Cypress chips, it will still make sense to program Nvidia’s Fermi chips to do useful things outside of the scope of graphics 10 years from now.

            If you think otherwise, try programming for Cypress and tell me whether you would still be willing to do that 10 years from now, when run-of-the-mill CPUs from Intel outperform it. Unlike the original Pentium chip, which still has programs made for it because they can be written with ease, programming on Cypress is hell on earth, and the only reason you will see people programming it is that they are forced to.

            • outcast
            • 10 years ago

            Since you agree that Fermi is more complex: as complexity increases, it’s not hard to envision difficulty in writing drivers (it should increase proportionally). As for the quality of the final driver, I made no reference one way or the other (good driver or crappy driver) in my original statement. I only made an inferred statement about the challenges faced by Nvidia’s driver team.

            • MathMan
            • 10 years ago

            Take it from this hardware engineer: complex hardware doesn’t necessarily result in complex drivers. In fact, it can very much be the opposite and it often is. That’s really not hard to understand. Software guys will very often pester the hardware guys to make their designs more sophisticated to make their life easier.

            In the case of Nvidia’s Tesla-class GPUs, it’s definitely the case for the shaders. They throw quite a bit more hardware at scheduling and register scoreboarding, which makes writing a compiler significantly easier.

            Unfortunately, it also costs more area, so the perf/mm2 will suffer.

            • Shining Arcanine
            • 10 years ago

            Today, it seems that anyone who can assemble a bunch of computer parts and install Windows thinks that automatically makes them an expert on how computer engineering works when it does not.

            It is nice to see someone here who knows how computer engineering works.

            • Shining Arcanine
            • 10 years ago

            If you took a class in computer hardware, you would know that is not the case. Designing a CPU that does 128-bit floating point arithmetic makes the hardware more complex, but programming it is no more difficult than it was for a 64-bit CPU.

          • asdsa
          • 10 years ago

          It seems that now that ATI has the best-performing cards on the market, Nvidia fanboys resort to the same old Linux driver argument. Who gives a rat’s arse about Linux drivers except total geeks and some corporate branches (server use)? You don’t need 2 teraflops of GPU power on a platform that is meant for running Tomcat or JBoss inside a locked closet.

            • Shining Arcanine
            • 10 years ago

            I am a computer science student and I run Gentoo Linux on my laptop. Having my own Unix system helps me in my classes, which makes Linux driver quality very important to me.

            Linux is for more than just corporate servers. I suggest you try it. It is really great when you have all of the development tools needed to make programs in an academic environment integrated with your system. It is even nicer when all of that comes without the bloat of Windows.

            • Game_boy
            • 10 years ago

            AMD drivers on Linux are fine. I can run games with no issues.

            And they are the only ones to provide open drivers.

            • Shining Arcanine
            • 10 years ago

            Tell that to my classmates at school that have laptops that are 3 to 4 years old. Getting a Desktop Environment working on that hardware is hell on earth for them. The more knowledgeable ones among them cite ATI’s tendency to drop driver support for older products as the cause because the kernel and X11 interfaces tend to change as time passes.

            By the way, Nvidia provides the source to the software that acts as a bridge between the kernel/X11 interfaces and their binary driver, which makes it really easy to update the driver to support the latest kernel revisions. ATI’s open drivers might have the same benefit in theory, but Creative has open drivers too, and as far as I know, being open has not really helped that situation very much. The only reason they released the source code was to get the vocal people who asked them to support their hardware properly to shut up.

            • w00tstock
            • 10 years ago

            q[< who gives a rat’s arse about Linux drivers except total geeks and some corporate branches <]q

            Way to totally prove him right…

            I am, however, at a total loss as to why people using Gentoo would be having an issue with ATI binary drivers, as Gentoo still uses the older X.org, which is fully supported by AMD/ATI drivers. Maybe if you said Ubuntu, but you didn’t. As for Ubuntu, I have a system running integrated ATI and a 4850 with no issues whatsoever… Arguments over new hardware and ATI Linux drivers don’t make sense anymore.

            • Shining Arcanine
            • 10 years ago

            What do you consider to be old? I am running x11-base/xorg-server-1.7.5 on my machines. According to Wikipedia, Ubuntu 10.04, which is bleeding edge as far as Ubuntu goes at the moment, is still running X.org server version 1.7.3, which is outdated software:

            §[<http://en.wikipedia.org/wiki/List_of_Ubuntu_releases#Version_history_of_common_programs<]§

            If I wanted, I could opt to upgrade to one of the release candidates of xorg-server 1.8.0, but that is a bit too new for me. What are you running on Ubuntu?

            By the way, if you knew much about Gentoo, you would know that many of its users run it with ACCEPT_KEYWORDS="~x86" in make.conf, which ensures that they have all of the latest stable software available from the upstream developers, even if Gentoo's package maintainers have not cleared it yet for use on Gentoo. This puts Gentoo a full six months to a year ahead of Ubuntu in terms of having the latest software.
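            For anyone unfamiliar with the keywording mechanism described above, the relevant make.conf entry looks roughly like this (a sketch of a typical Gentoo setup, not a recommendation):

```shell
# /etc/make.conf (Gentoo)
# Accept "testing" (~arch) keyworded packages system-wide.
# ~x86 marks packages that upstream considers stable but Gentoo's
# maintainers have not yet verified; stable-only users omit this line.
ACCEPT_KEYWORDS="~x86"
```

            Per-package keywording via /etc/portage/package.keywords is the more conservative alternative, but the poster is describing the system-wide setting.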

            • outcast
            • 10 years ago

            The new drivers released by ATI for Linux don’t give me any problems. They just work like a champ with Compiz and whatever I throw their way, but then I don’t have to contend with an old, unsupported ATI GPU in Linux. I can, however, see why other Linux “geeks” would love to have an easier time installing ATI’s drivers without having to fix the xorg.conf file or do other troubleshooting for video playback. Dropping support for older GPUs was not a positive development, so ATI got a lot of Linux “geeks” miffed.

            PS: You can play games on Linux… you know, some people (Linux “geeks”) use Wine and Cedega etc. to play MS Windows games.

            • Shining Arcanine
            • 10 years ago

            According to my more unlucky classmates at school that have laptops that use ATI graphics cards, it is hell getting them to work properly. Unlike Nvidia, which maintains an old binary driver that they update to be able to interface with the latest kernel and X11 interface revisions, ATI drivers will only work with kernel and X11 versions from around the time they were produced.

            Getting ATI’s linux drivers for hardware older than say, 2 years old, to work with the latest Linux software is hell and that is the situation that all of my classmates with ATI graphics cards have.

            • esterhasz
            • 10 years ago

            Man, with that kind of output I really hope for you that they pay by word…

            • Shining Arcanine
            • 10 years ago

            I am not being paid anything. I just think things like good Linux driver support and GPGPU are more important metrics than the ability to play games, both of which are things that Nvidia has which ATI does not.

            • neoanderthal
            • 10 years ago

            I haven’t had these problems you refer to regarding to ATI and linux. I have a crossfired pair of 5770s (just got ’em actually), and the current Catalyst for linux works fine. I enabled Crossfire under linux and Compiz, Warsow and OpenArena all worked without a hitch. I didn’t have any problems with Warsow and OA with Compiz enabled, something I could never manage with my GTX260.

            I’ve used ATI drivers under linux with graphics chips as old as Mobility X300 (Thinkpad T43), and haven’t had any issues in a long time. My main workstation for the longest time had an X1950 Pro and it was rock-solid.

            Hyperbole helps no one. We’ve got one PRIME1, we don’t need another.

            • Shining Arcanine
            • 10 years ago

            Would you provide me with details on your hardware and Linux configuration? I would like to pass them on to my peers at next week’s Linux users’ group meeting.

            • kc77
            • 10 years ago

            I’ve got a Mobility 1250 that works well. The ATI drivers have problems with Video acceleration in particular. I will admit that if I had a choice it would be Nvidia drivers hands down though. Overall I’ve been able to enable 3D support on anything from the 3 series on quite easily. Ubuntu is pretty good for detecting the needed driver and automatically installing it for you.

    • maxxcool
    • 10 years ago

    To the CEO’s credit, few games out there have DirectX 11 support at this point, and Nvidia has managed to stay roughly competitive on the performance front as far as sub-$200 desktop graphics cards are concerned.

    Ummm, didn’t 3dfx say that about 32-bit color games???? Trend??

      • BabelHuber
      • 10 years ago

      True or not, this is just Marketing BS anyways.

      ATi has a full line of DX11 cards. Hence they could say ‘We have DX11 from top to bottom. This is the state of the art. Everything else is outdated.’

      Since Nvidia does not have DX11 cards out on the market, they must claim that the new features are not relevant for mainstream users.
      What else could they say to try to keep customers and OEMs quiet?

      OTOH: He says: q[

      • Game_boy
      • 10 years ago

      Yield problems with the die (rather than the process) can only be fixed by a new stepping from Nvidia, not by TSMC.

      If we are to believe Charlie, two minor spins haven’t worked. So a major spin is needed, which will take about six months.

    • stmok
    • 10 years ago

    Where’s *[

      • PRIME1
      • 10 years ago

      What’s to defend? They are still beating ATI. So I don’t see much reason to comment beyond that.

        • Game_boy
        • 10 years ago

        Beating ATI? Is the 5970 not faster than the GTX 295?

          • PRIME1
          • 10 years ago

          There are more things to measure than just speed.

            • Game_boy
            • 10 years ago

            Name some.

            AMD has: DX11, Eyefinity, lower power consumption, lower noise, lower prices than similarly performing Nvidia cards
            Nvidia has: PhysX, 3D Vision

            They both have some sort of GPGPU code, though AMD’s is underused. The hardware potential is the same [AMD has greater FLOPS, Nvidia is easier to program].

            I’d say it’s all about even on features.

            • derFunkenstein
            • 10 years ago

            market share?

            • flip-mode
            • 10 years ago

            zing

            • Game_boy
            • 10 years ago

            OK, Nvidia’s marketing is better.

            • indeego
            • 10 years ago

            And Intel’s was better than AMD’s during netburst. /[

            • TO11MTM
            • 10 years ago

            Or how some people now think Japanese cars are technically superior.

            No, but seriously, I do see some of where PRIME1 is coming from. My last few video cards (both laptop and desktop) have been Nvidia, because I know their cards do certain things well that the ATI cards just didn’t do right at the time. I’m still not sure if ATI’s HDMI outputs 768p on newer cards…

            • MrBojangles
            • 10 years ago

            My 4870 does 1360×768 perfectly. However, 1280×720 does not fit the screen properly. But I’m pretty sure it’s due to the fact I use a 42″ 1080p HDTV as a monitor instead of a dedicated PC monitor.

            • shaq_mobile
            • 10 years ago

            I had an issue until I turned off overscan on my tele. The stock settings in ATI’s drivers are what’s at fault, not the cards (to my knowledge). The only reason I use a GTX 275 is because my friend talked me into it after my 4890 had that weird electronic buzzing noise that comes from current fluctuating in the capacitors when they don’t get coated with some chemical or something. In retrospect I should have just purchased a different 4890. My GTX 275 has a resurrected 5800 Ultra cooler on it, and it drives me up a wall.

            • MrBojangles
            • 10 years ago

            I have a similar problem with my 4870. If I try to use my DVI-to-HDMI cable, I get a loud buzzing feedback noise from my TV. I fixed it by just using the DVI-to-HDMI converter that came with the card, along with a regular HDMI cable.

            • JustAnEngineer
            • 10 years ago
        • Skrying
        • 10 years ago

        Why are you so irrational?

          • TurtlePerson2
          • 10 years ago

          Don’t kick the nVidia loyalist while he’s down. Just because nVidia is losing on price, performance, and power consumption right now doesn’t mean that they’re out of the game for good.

            • Skrying
            • 10 years ago

            It’s a clear case of someone being so irrational and biased that they can’t think straight. How this happens to people is a great mystery to me.

            PRIME1 constantly does this. He isn’t doing this to troll. He actually believes what he says!

      • anotherengineer
      • 10 years ago

      right there at #15 lol

      I’m a fanboi girl, in a fanboi world, wrapped in plastic, it’s fantastic. Come on, Jen-Hsun, let’s go party 😉

      sry, couldn’t help it 😀

      #15 “They are still beating ATI”

      not at DX11 cards ZING

      sry, couldn’t help that one either 😉

      • SubSeven
      • 10 years ago

      He has HighTech4US substituting today…..

        • geekl33tgamer
        • 10 years ago

        LOL – This topic brings out the fanboys in masses…

          • outcast
          • 10 years ago

          True LOL

          I want Nvidia to succeed so that we can have round 2 on the GPU wars… that is the only way we will get a fair price.

    • rodney_ws
    • 10 years ago

    $259 XFX 5850 FTW! Had it since October the week it was released… haven’t regretted it once. Nvidia just didn’t have anything in that price/performance range worth considering. I guess they’re going to push back the launch of Fermi yet again? Sure hope it’s a good match against 6000 series cards!

      • JustAnEngineer
      • 10 years ago

      NVidia has had nothing competitive at any price point since the end of September, 2009. It’s sad and disappointing.

      Where are those lying shills and fanboys who swore back then that Fermi would be available in stores in volume by the middle of January 2010?

        • insulin_junkie72
        • 10 years ago

        Above $100, quite true.
        Below $100, it’s not so cut and dried, and depends really on what your needs/focus is.

          • thecoldanddarkone
          • 10 years ago

           It’s not just muddled, it’s downright terrible. I’m looking for a card in that range, and right now it looks like I might end up getting a 9600 GT for $75 after shipping.

      • swaaye
      • 10 years ago

      Radeon 6000 series probably won’t show for over a year. If history is any indication, ANYTHING can happen. 😀

        • entropy13
        • 10 years ago

        Aren’t they aiming the 6000 series for 3Q 2010 earliest, 4Q 2010 latest?

          • Game_boy
          • 10 years ago

          They could have a mid-cycle refresh just for the high-end. In fact, I’d expect them to; they did it with X1900, with RV670 and with RV740/RV790.

          That would normally come about 6-8 months after the original launch, so on Fermi launch day would be good.

      • cybot_x1024
      • 10 years ago

      Codename: Hecatonchires FTW!

    • HighTech4US
    • 10 years ago

    More C R A P reporting from Tech Report.

    Why do these reporters never seem to do any work of their own? And when they do, they hand-pick a paragraph to justify char-lie’s FUD.

    Read the entire 4Q2010 conference call here:
    §[<http://seekingalpha.com/article/189155-nvidia-corporation-f4q10-qtr-end-01-30-10-earnings-call-transcript?source=yahoo<]§

    and you will see these Q&As:

    ----------------------------------------------------
    Doug Freedman – Broadpoint AmTech
    And then my last one is really, the Fermi architecture, how quickly do you guys think you are going to be able to take that architecture to the full product line? What is sort of the – what expectation should we have as far as getting those into the mainstream and if my understanding is correct, you are going to launch that at the high end.

    Jen-Hsun Huang
    You know, all of that just depends on 40 nm supply and we are trying to finesse it the best we possibly can. You know, for the entry-level products, the truth is that the new architectures, the very, very entry-level GPUs are probably not extremely well appreciated anyhow. And so, you know, the reasons why people buy the new architectures tend to be early adopters and they tend to the game enthusiasts or workstation designers or creative artists or – there are very specific reasons why it really enhances their experience. Our current generation GPUs are fabulous and all the things that mainstream consumers would use their computer for. You know, all of the high-definition videos, even plays 3-D Blu-Ray. We are the only GPU that processes 3-D Blu-Ray completely in the GPU. All of them have CUDA and they are all compatible with the Fermi architecture, and now, all of them have Optimus. Right? So I think the mainstream GPUs are really fabulous and has been enhanced recently with some really great features and so my sense is that they are going to continue to do quite nicely in the marketplace. And then we will just transition as fast as we can.
    ----------------------------------------------------

    Seems to me the answer is that mainstream Fermi parts will become available when the 40 nm supply issues are resolved (quoted to be 2H2010); in the meantime, the DX10.1 parts are selling just fine for Nvidia. In fact, Nvidia is selling everything it gets back from TSMC.

      • Game_boy
      • 10 years ago

      There are no major 40nm supply issues. Nvidia’s own G 210 through 240 seem to be fine and in good supply, as well as the entire 5xxx series except the dual card.

        • maxxcool
        • 10 years ago

        They are referring to the yield problems with the Fermi die itself, not 40 nm as a whole. It (Fermi) is much more complicated than the ATI part in terms of design, power usage and leakage.

        • HighTech4US
        • 10 years ago

        You are wrong.

        And you seem to have problems reading the transcript so I will quote it for you (and others here who seem unable or refuse to READ it).

        Alex Gauna – JMP Securities
        Thanks very much. I was wondering if you could clarify a little bit on your supply constraints facing for the first quarter. I am assuming that means that you are going to see seasonality in legacy products and you can’t get enough of the new to offset it or maybe wrap some color around that, please?

        Jen-Hsun Huang
        Seasonality question, you know, we – the seasonality in the PC industry, Q1 is generally a down quarter, and if you take a look at – if you look at what most companies have guided, they have guided down sequentially. We are guiding flat, and if we had a sufficient supply to support the ramps of the products that we are going into production with right now, we would guide further up. You know, our company is currently in a growth phase. There are new products in many new markets, as well as going into a new product generation. So there is a lot different reasons why we are expecting to grow and our suppliers are working as hard as they can for us and their yields are improving. However, we remain constrained.

        Suji DeSilva – Kaufman Brothers
        Thanks. Can you guys help us understand what the impact was on the quarter from the constraints and perhaps on the guidance as well just to give a sense? I just wanted to understand where the 40 nm constraints come off at this point.

        Jen-Hsun Huang
        Well, it is hard to figure out exactly how much was constraint volume, you know, it is probably on the order of a couple of hundred million dollars. But, the big picture is that 40 nm supply is constrained for the world. You know the market snapped back a lot more quickly than people expected in the beginning of the year and it surely appears that Windows 7 is doing very well and I think that surely appears on top of that that GPU adoption is increasing. You know, people are – we have always believed that as you know, that we think that more and more people are enjoying their PC experience with GPUs and more and more PC manufacturers recognize the benefits of adding GPUs to their system. And so we are seeing a pretty strong demand across the board, whether it is desktop PCs or notebooks PCs, and we are seeing an increased demand in our workstation business, and on top of that, we are in the process of ramping a new architecture called Fermi that everybody is really excited about. And you know, Fermi is going to sweep across GeForce and our Quadro business and across our Tesla business. And so, we just have a lot of growth drivers going right now, and we could surely use some more supply.

        Daniel Berenbaum – Auriga USA
        Hi, good afternoon, and thanks for taking the call. Jen-Hsun, I want to make sure I understood a comment that you made a little bit earlier when you were talking about the supply constraints, you were saying that the order of the supply constraints, if you didn’t have the supply constraints, you could have done another couple of hundred million dollars in revenue. First question, did I hear that correctly and second of all, can you help us understand a little bit is that really just focused in GPU where you could have had that extra revenue and can you help us give us any color on who is not getting parts, you know, is it notebook, is it the channel, can you give us a little bit of granularity there, if I did understand that correctly?

        Jen-Hsun Huang
        Well, I was trying not to give you an exact number, and I probably shouldn’t have given you a number at all. Because we really don’t know when supply – we knew that during the quarter, channel inventory was too low. When David says that channel inventory was five weeks, five weeks average in fact means gap outs in many regions around the world. And you can only measure channel inventory really in the final analysis to what the average is, but we know that certain regions are much harder than other regions and it is hard to keep them from stocking out.
        And the second thing is OEMs, without exception, notebook OEMs were handing out the whole quarter and we worked quite hard to keep them from gapping out throughout the quarter, and we weren’t successful most of the time. And so, there were challenges in fulfilling the demand that we did have throughout the quarter.

        Nicholas Aberle – Caris & Company
        Good afternoon. My question is, so given the scenario where 40 nm supply is constrained throughout all of 2010; my guess is you guys have to make some pretty tough decisions on how to allocate the supply that you do have. Given you guys are transitioning GPU, desktop, and notebook over to 40 nm, Tegra 2 is 40 nm, Tesla 2 is 40 nm; how do you guys make those decisions as we roll through the year?

        Jen-Hsun Huang
        Well, I said that supply was going to be constrained; I didn’t say supply was going to be constant. We are expecting a big year, but frankly, if we had more supply, we would have a bigger year. And so, we just have to keep working at it. You know, [inaudible] is doing a fabulous job improving their yields and if you just look at from August when we first started ramping to now, the yield improvements have been done and the execution that their teams have gone through is just fabulous. This is a really extraordinary company and what they have done with 40 nm yields over the last several months is really fabulous. And so, if the trend continues, things are going to look a lot better and so we are hoping for it to continue, we are working hard together to make it improve and so I am hoping that supply will abate, I think that was the work I used earlier, sooner than later, because we can certainly use the wafers to drive our growth.

          • StashTheVampede
          • 10 years ago

           An interview with the head of the company isn’t necessarily the best source of true information on the product. The fact that he’s happy with DX10 cards (while the competition is at DX11) is a good indication (to me) that his DX11 cards aren’t ready.

          We also don’t have ANY fermi OEM board leaks, period. No pictures, no benches, no screenies, period. Other Nvidia and AMD boards have had leaks, Fermi hasn’t had any (Taiwan is poor at holding secrets).

          Combine praise for shipping product (duh) and no leaks of Fermi perf #s and then the rumors of poor yield make sense (not to the degree that was stated on SA). When was Fermi supposed to ship anyhow?

            • HighTech4US
            • 10 years ago

            It’s not an interview. It’s the financial conference call for the 4Q2010.

            • sluggo
            • 10 years ago

            It is an interview. Analysts get to ask questions on the product and financial performance of the company. Call it whatever else you want, it’s still an interview.

            And the amount of factual information that changes hands during these things is determined SOLELY by the interviewee; he gets to put what spin he likes on the answers, and he gets to leave out whatever he likes. His purpose there is to paint for the analysts the most optimistic picture of the company’s prospects that he can while matching it up with the current data.

            If you make no effort to read between the lines on an analyst’s conference call you might as well not be listening.

          • glynor
          • 10 years ago

          All of that looks to me like it either confirms everything Charlie and X-Bit said, or at-best, certainly doesn’t flatly deny it. Their words are spun in PR marketing-speak, of course, but that doesn’t change the content of what they were saying.

          Basically, the investors were asking “Why did Q4 stink, and why did you post a loss for 2009?” And their response was “supply, blah, market came back faster than we could have ever anticipated, blah, supply blah, 40nm yield issues, blah, better supply is coming next year”.

          How does any of this contradict what Charlie said? Huang says NOTHING firm about why supply is constrained. There are certainly some vague implications made, but the channel seems to be having no trouble now getting 40nm ATI 5000 series parts across the board, so I don’t buy that as the complete story.

          Sounds like:

          1) They made a bad call about Q3 and Q4 demand and cut their orders too deep. You can’t just pop a bunch of 55 nm chips out of nowhere when demand suddenly surges and catches you with your pants down.

          2) Fermi isn’t yielding like they hoped, which pretty much confirms what Charlie is publishing. Now, it likely isn’t as bad as Charlie makes it seem (the sky probably isn’t falling), but it also isn’t all roses. They are talking to INVESTORS here, not engineers. They don’t care about the why. They want to know what business steps are being taken to fix it.

            • HighTech4US
            • 10 years ago

            Wow, I have now seen someone who can twist facts even more than charlie.

            Alex Gauna – JMP Securities
            Okay, thanks very much. Congratulations, nice quarter.

            Craig Berger – FBR Capital
            Hey, guys. Nice job on the results.

            Doug Freedman – Broadpoint AmTech
            Great, and thanks for taking my question. Congratulations on a strong quarter.

            • MrBojangles
            • 10 years ago

            Wow, were you just imagining your own transcript while reading? Because the one I read consisted of Huang repeatedly being asked about constraints, at which point he would dance in circles around the subject, only stopping on occasion to tell them things were “fabulous.”

            • green
            • 10 years ago

            Yes, because as history has dictated, it’s ALWAYS a good idea to bite the hand that feeds you, especially when the hand feeding you has a limited supply of food available.

            • maxxcool
            • 10 years ago

            LOL… no no no… the true master of facts is Mr. Jobs … aka Agent Smith. THERE is a man that can spin ANYTHING…

            Customer-x: “Mr. Jobs? My poop you sold me is crunchy…”
            Agent Jobs: “That’s a feature, Mr. Anderson… a… feature… You like features, right?… We like features…”

            Anyhoo… only time will reveal the true verdict. It’s an interesting show in the meantime, but you gotta admit Huang is dancing quite a dance 😉

          • Game_boy
          • 10 years ago

          OK, read all of it.

          There are no real 40nm supply issues. There are plenty of 5xxx; plenty of G21x chips; are you saying they don’t exist?

          There are Fermi supply issues. This is Nvidia’s fault, for engineering a huge die without regard to making it work on 40nm.

            • HighTech4US
            • 10 years ago

            You read but you don’t comprehend.

            • StashTheVampede
            • 10 years ago

            You read it and took it all at face value. Read between the lines: they didn’t have enough supply in Q3-Q4 and Fermi yields aren’t enough for a shippable product.

            The CEO’s comments in financial calls aren’t to be believed. Believe the Taiwan leaks; those are more trustworthy because they involve actual product.

            When was Fermi going to ship?

            • HighTech4US
            • 10 years ago

            So you believe that AMD’s CEO also lies during their CCs?

            • StashTheVampede
            • 10 years ago

            Nice try.

            §[<http://en.wikipedia.org/wiki/Straw_man<]§

            Go look it up. AMD’s CEO delivered product in volume throughout Q3-Q4, with plenty of OEM and retail boards on shelves. They didn’t have a Q3/Q4 problem, NOR did they have problems at 40 nm with a new chip. Their 5xxx line of chips was also leaked in Taiwan, with pictures and benches.

            Do I believe AMD’s CEO lied in the last CC? No. Do I believe that Nvidia tells the complete truth? No.

      • maxxcool
      • 10 years ago

      So if TR is crap, why are you here? Besides, the issue is pretty clear: they’re blaming TSMC for their yield problems. How is that inaccurate for TR to publish or comment on? If you don’t like tech journalism that is 80% fact prior to full product availability, you might as well shove your head into a sandy hole… that’s all pre-release tech journalism is: a gathering of comments, PDFs, blurry cell phone shots and semi-anonymous tips.

      get over it.

      Basic common sense applies here:
      They did not make last year’s “deadline.”
      They have affirmed a respin and multiple process tweaks.
      The chips that are going out are going to Quadro and Tesla partners, not consumers.
      Bottom line, it all adds up… speculate what you will, Nvidia is lying.

      And then they try to justify it by telling us that we (mainstream) people don’t need, or aren’t asking for, DirectX 11 for under $600 a GPU.

      Lies, more lies, lame justification… and fanbois… meh. I have a PNY GeForce 260 and love it, but this kind of lying does not build any confidence.

      But as I said, stick your head in the sand, pal… I will bookmark this and come back in four months when the damn thing is reviewed by TR prior to “release” and we all see the 300 W monster for what it is: a dud. NV30, anyone?

        • HighTech4US
        • 10 years ago

        You are wrong. R E A D the transcript. nVidia has nothing but praise for TSMC.

        As for why I read Tech Report: they have, for the better part, very good articles. Where they fail is when they post or link to one of charlie’s spews and do no work researching the story. This Tech Report story is a prime example of poor reporting.

          • maxxcool
          • 10 years ago

          Since this is a reprint from X-bit labs, you are actually slandering them too. And since I am sure Damage and Cyril, vets in the industry, probably contacted Anton Shilov for source data and more information, I am calling BS.

          Like I said before, I have this article bookmarked, and when August of 2010 rolls around and we BARELY have anything under $600 from the GTX 400 series, then we will see who was right. And to be honest, charlie doesn’t really need to use any imagination whatsoever, since most everyone else sees the reality distortion field Nvidia is borrowing from Apple.

          • maxxcool
          • 10 years ago

          Additional note: I read the blurb you have, but it does not have the quotes X-bit and other sites have. Have a look below. Anyways, time will tell, but you cannot deny that there is something fishy.

          /[<Fermi Architecture GPUs Will Only "Hit the Full Stride" in Q2 – CEO of Nvidia. Mass Availability of GeForce GTX 400, Other Fermi Products Delayed Till Q2

          [02/18/2010 12:44 PM] by Anton Shilov

          Nvidia Corp. will finally start selling its highly-anticipated GeForce GTX 400-series graphics cards as well as other products based on the code-named Fermi architecture and GF100 (NV60, G300, GT300) graphics processing unit (GPU) in the first quarter of fiscal 2011, it looks like mass availability of appropriate products is only expected in Q2 of FY 2011.<]/

      • flip-mode
      • 10 years ago

      Jen-Hsun tells nothing but the truth – whole, unvarnished, and completely detached from any passion he has for the standing of his company….. riiiight.

      • SomeOtherGeek
      • 10 years ago

      Oh no! Another person who believes everything he reads. You, my good sir, have shot yourself in the foot, so to speak.

    • moshpit
    • 10 years ago

    That had to be the most defeat I’ve ever read between the lines of any Nvidia statement. That worries me. Charlie is usually a ranting idiot, and I don’t want to see him be right about Fermi. But there’s that nagging NV30 in the back of my mind that even he brought up, confirming I’m not the only one thinking “history often repeats itself, and often catastrophically so.”

    • flip-mode
    • 10 years ago

    The Lord hath spoken… let it be so. DX10 and proprietary lock-out add-ons are the way forward.

      • BiffStroganoffsky
      • 10 years ago

      Oh look, I happen to be atheistic!

    • chunkymonster
    • 10 years ago

    This just reinforces why I dumped nVidia for my Sapphire 4890…

      • Hemotoxin
      • 10 years ago

      After this 8800GT I am not buying Nvidia again if I can help it.

    • anotherengineer
    • 10 years ago

    55 nm cuts into people’s power bills, but I think mature 55 nm yields vs. 40 nm yields make the cut into margins marginal.

    Just say “we are having technical difficulties” or “we are working out bugs for a flawless product” or something; the buck-passing and finger-pointing get tiring.

    And BTW my HD 4850 radeon is fabulous too Huang 😉

    • Krogoth
    • 10 years ago

    They are correct business-wise. The DX10 generation of GPUs is still more than adequate for the majority of users.

    However, when the CEO is steering attention away from Fermi, you know there is something wrong with it.

      • Jigar
      • 10 years ago

      There are two people who won’t believe you…

      Prime1 & Huang. 😉

      • tviceman
      • 10 years ago

      You did not read the whole article then.

      “Q2…is going to be the quarter when Fermi is hitting the full stride. It will not just be one Fermi product, there will be a couple of Fermi products to span many different price ranges, but also the Fermi products will span GeForce Quadro and Tesla. So, we are going to be ramping now on Fermi architecture products through Q2 and we are building a lot of it. I am really excited about the upcoming launch of Fermi and I think it will more than offset the seasonality that we usually see in Q2,” said Jen-Hsun Huang, chief executive officer of Nvidia, during the most recent conference call with financial analysts.

        • Game_boy
        • 10 years ago

        Those products will all be from the same chip; they don’t indicate derivatives. “A couple” = the 480 and 470, and the S2050 and S2070.

        • Arag0n
        • 10 years ago

        I told you days ago… they were talking at Mobile World Congress about a full range of graphics cards based on Fermi, with the GTX 480 improving performance by 40-60% versus the GTX 295…

      • Preyfar
      • 10 years ago

      My GTX 260s eat up any game I can throw at them, and I’m completely satisfied with their performance.

      I’m never against having MORE performance, but I think we’re at a point now where there’s no huge push to upgrade. DX11 is nice, but for the most part it’s not a huge visual leap from DX9/10 (honestly, unlike the DX8-to-DX9 jump, you may not notice most of the DX11 differences unless they’re directly pointed out). There hasn’t been a game to really challenge the graphics market since Crysis, and until there’s that one game that blows everybody’s minds, I don’t think people will feel the need to upgrade.

      Granted, if you need a new card, the 5000 series is the way to go, but I’m oddly content with my GTX 260s (and I’m always the first guy around to jump to the new platform).

        • MrJP
        • 10 years ago

        But he’s actually talking about the architecture a generation before your GTX 260s – the G92 and its many close relatives. The 260s were/are great cards for the consumer, but bad for Nvidia because they’ve ended up having to sell them at much lower prices than intended and have had no success in shrinking or chopping them down for the mainstream market.

        If the rumours are anything like correct, then the situation is even worse with the latest generation which may not be financially viable even as a high end product. He’s got to promote the old architecture because that’s all they’ve got to rely on for revenue for the immediate future (as far as discrete graphics products are concerned, anyway).

          • spigzone
          • 10 years ago

          And the 5830 will leave ZERO competitive mid-to-high-range Nvidia cards.

          In a week, Nvidia will not have one single price/performance/feature-competitive card in its entire line-up.

          Nada.

          Zilch.

          Bupkis.

    • NeelyCam
    • 10 years ago

    Talk about blind leading the blind

      • dpaus
      • 10 years ago

      I think you mean “the bland leading the bland”
