Nvidia prepping Kepler in 2011, Maxwell in 2013

GTC — During his opening keynote address at Nvidia’s GPU Technology Conference today, Nvidia CEO Jen-Hsun Huang revealed the first details of his firm’s future GPUs.

 

Huang offered a brief roadmap sketch, with a few notable details about a couple of future products.  He projected performance using an unusual, GPU-computing-focused metric: double-precision gigaFLOPS per watt, which indicates power efficiency rather than raw peak performance.

The next major GPU architecture from Nvidia, code-named Kepler, is slated for release in 2011, with the first products likely coming in the second half of the year.  Kepler will be based on a 28-nanometer fabrication process, and Nvidia intends for it to be nearly three times more efficient, in DP FLOPS per watt, than today’s Fermi architecture.  Huang noted that such an improvement goes “far beyond” what process technology advances alone can achieve.  Changes in the chip architecture, design, and software will contribute to that advance, as well.

The Maxwell architecture will come next, in 2013, and will be produced on a 22-nm fabrication process.  Maxwell promises nearly an 8X increase in DP FLOPS per watt beyond today’s Fermi chips.  Huang noted that, in parallel computing, power is the primary constraint, which is why he chose that metric to describe future architectures.
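
For a sense of scale, here is a minimal back-of-the-envelope sketch of what those multipliers imply. The Fermi baseline below (a Tesla C2050-class board at roughly 515 peak DP GFLOPS and a 238 W TDP) is an illustrative assumption, not a figure from the keynote; only the 3X and 8X multipliers come from the roadmap itself.

```python
# Rough arithmetic for Nvidia's stated efficiency targets.
# Assumption: a Fermi-class Tesla C2050 board at ~515 peak DP GFLOPS
# and a ~238 W TDP; illustrative figures, not numbers from the keynote.

def dp_gflops_per_watt(peak_dp_gflops, board_watts):
    """Double-precision GFLOPS per watt for a given board."""
    return peak_dp_gflops / board_watts

fermi_baseline = dp_gflops_per_watt(515.0, 238.0)  # ~2.2 DP GFLOPS/W

# Improvement multipliers over Fermi, as read off the roadmap slide.
roadmap = {"Fermi": 1.0, "Kepler (2011)": 3.0, "Maxwell (2013)": 8.0}

for name, multiplier in roadmap.items():
    print(f"{name}: ~{fermi_baseline * multiplier:.1f} DP GFLOPS/W")
```

On those assumptions, Kepler's target works out to roughly 6-7 DP GFLOPS per watt and Maxwell's to the mid-teens.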

Between now and the Maxwell generation, Nvidia plans to introduce several compute-focused features into its GPU architectures, including pre-emption.

In between these major architectural generations, which are planned to track the introduction of new process technologies, Huang said Nvidia will continue to produce a “mid-life kicker,” an enhancement of its existing microarchitecture. The next such update will be a refresh of the Fermi-generation GPUs, coming next year.

As before, the firm aims to refresh its GPU lineup from top to bottom within about three months of introducing a new GPU generation.  Huang admitted the Fermi generation has taken longer than he’d like because of challenges in producing the first chips.

Comments closed
    • mzie25
    • 9 years ago

    oh damn…. i just bought a Quadro fx 3800.. i think i have wasted my money.. should have gone for an ATI FirePro

    • Nutmeg
    • 9 years ago

    I think they should definitely change to smaller processes even faster. Cos the last time just went so well, right?

    • Wintermane
    • 9 years ago

    Don’t forget, people: they are only talking about GPU compute, not gaming performance.

    • vvas
    • 9 years ago

    Here’s my guess: the “mid-life kickers” referred to are probably die shrinks. So, before Nvidia actually roll out Kepler, they’re first going to release Fermi derivatives on a smaller process at some point next year. Which sounds like a reasonable move.

    Charlie over at SemiAccurate claims that Nvidia have come to an agreement with GlobalFoundries to produce some of their chips there. Although such a deal (if it actually exists) is probably meant mostly for Tegra, could it be that we might see a GF102 next year on GF’s 32nm? That’d be pretty sweet. :^)

      • Shining Arcanine
      • 9 years ago

      Single precision performance is twice that of double precision performance on Fermi in the actual silicon.

      Anyway, 64-bit CPUs exist mostly for double-precision performance, which as you said, does nothing for games, but that did not stop people from being excited about it.

    • Silus
    • 9 years ago

    LOL, seems like Huang got what he wanted. All the AMD fanboys having a fit over chip codenames. Priceless!

    • Fighterpilot
    • 9 years ago

    Uh oh…Silicon Doc has morphed into a new name….I recommend an immediate ban.
    There’s also an eerie similarity in the hate speech to those awful PMs that were buzzing around here a few months ago.

    • Wintermane
    • 9 years ago

    Maybe 2009 was when Nvidia actually had its first working silicon? We know the fab issues delayed mass production, but they didn’t delay first silicon…

    • ronch
    • 9 years ago

    So no new architecture every year anymore?

      • Meadows
      • 9 years ago

      …?
      When was the last time you honestly saw a new architecture every year?

        • ronch
        • 9 years ago

        May, 2007 – ATI HD2900XT
        December, 2007 – ATI HD3870
        July 2008 – ATI HD4870
        September 2009 – ATI HD5870

        Note: These are the dates these cards were reviewed, so it couldn’t be far off from the respective card’s release date.

        Maybe Nvidia doesn’t roll out new architectures every year, but ATI pretty much does, even if it’s just increasing the number of unified shaders and a few tweaks and features here and there. And yes, I gave examples of ATI products because your question is whether I see new architectures every year or not, not specifically confined to Nvidia products.

          • Voldenuit
          • 9 years ago

          Aren’t those all evolutions of the same architecture?

          Not singling out ATI here, G200 was based on G80, and the 6×00 and 7×00 cards were based on the FX series.

          Just saying that new architectures aren’t born out of whole cloth frequently in either the GPU or CPU world.

          • Silus
          • 9 years ago

          Those are not new architectures. From RV770 to RV870 (Cypress), there were more changes than from R600 to RV670 and RV670 to RV770, but only R600 was actually a new architecture and it is from 2007…

          In 3+ years (since November 2006), NVIDIA had two new architectures, found in G80 and now GF100.

      • Silus
      • 9 years ago

      Maxwell is only targeted at 2013 because fabs won’t have 22nm before 2013. Only Intel will probably have 22nm ready before then, but Intel doesn’t open its fabs to other companies.

    • HisDivineShadow
    • 9 years ago

    So wait. They are a year late on Fermi because they couldn’t work out a new process combined with a new GPU, so… they turn around and introduce ANOTHER new GPU with ANOTHER new die shrink?

    nVidia, you’re really taxing credibility here. It just sounds pretty damn stupid.

      • Silus
      • 9 years ago

      Yes, because they will always fail to deliver from now on…

      Did you have the same logic for ATI, with R600 and R520 ?

        • Waco
        • 9 years ago

        Hey now, I liked my 2900XT. Poor card – everyone always hating on it. 😛

    • MadManOriginal
    • 9 years ago

    I was really optimistic about GPGPU taking over the ‘high-performance’ demanding consumer applications…2 years ago. Badaboom still doesn’t convert video at as high a quality as CPU implementations; getting faster speed by lowering quality is not a ‘win’ in my book, just an alternative at best. With dedicated silicon coming in CPUs for the last few highly demanding consumer uses, I see the brief opening for GPGPU to take over in the consumer arena closing quickly.

      • Voldenuit
      • 9 years ago

      If you’re referring to Sandy Bridge’s encode hardware, that is actually situated on the GPU, and is disabled (along with the rest of the GPU) when a discrete GPU is installed. It’s possible (or hoped) that Intel (or someone else) will make a workaround by the time SB comes to market.

      I will agree that GPGPU has been a lot of hot air and no real results. It requires a lot of hardware/software integration and developer effort. Adobe Premiere Pro CS5 has CUDA acceleration, but since it costs $799 for a license, it’s not exactly going to be embraced on a wide basis by the public.

      I expect Apple with iMovie and OpenCL to reach the consumer first. They have tight integration of hardware and software and have much to gain from improving the Mac platform.

    • MadManOriginal
    • 9 years ago

    YO JEN-HSUN WHY DIDN’T YOU USE THE r[

    • RtFusion
    • 9 years ago

    Fermi in 2009? What timeline does nVidia exist in? The reality distortion field now distorts your calendars.

    • TaBoVilla
    • 9 years ago

    guess what fermi looked like in a similar “optimistic” graph in 2008, compared to older generations.

    this metric is as useless as an ashtray on a motorcycle.

    we all know they can “aim” at a power envelope, but the final outcome tends to vary, a lot, especially for nvidia.

      • bottlenecker
      • 9 years ago

      Why you gotta be knockin’ my motorcycle ashtray? You jealous?

    • clone
    • 9 years ago

    talking about future tech while the current tech isn’t doing well is pretty routine.

    I’d have thought this not entirely necessary with the 460 and 450 in place, but the low end is bread and butter, so my guess is another profit warning from Nvidia.

    message to shareholders: “don’t leave we’ve got better stuff coming.”

    • DeadOfKnight
    • 9 years ago

    Retrospectively that graph would put Kepler’s performance in 2012 and Maxwell in 2015. That would make more sense to me and seem a lot more attainable anyways. I hope we all look back at this when they are actually getting 15 GFLOPS per Watt to have a good laugh.

    • Wintermane
    • 9 years ago

    As with Fermi, they have to announce roughly what the next compute-centric GPUs are going to offer a good while ahead of launch. Also, I believe Fermi came out in the 2009 BUSINESS year.

    All this means is the next products will either have a hell of a lot more compute power, or they will have a fair boost in power and a lower power draw.

      • JustAnEngineer
      • 9 years ago

      Fermi actually arrived in limited quantities at the end of NVidia’s financial 2011 Q1. That was calendar 2010 Q2, but NVidia has an unusual fiscal year and quarter system.

    • Disco
    • 9 years ago

    It seems that Fermi just came out. After all that lead-up and marketing, now that it’s finally starting to hit its stride, Nvidia are essentially tossing it aside to announce their next product. Seems a bit early to me…

    I guess they are just trying to offset the ATI/AMD goodness that will be available very soon. But I can’t really see how this focus on the future “who knows when it will actually arrive” Kepler will provide them any positive results, short term or long term.

    • kdashjl
    • 9 years ago

    go fermi go

    • Ryu Connor
    • 9 years ago

    Interesting the focus on double precision.

    A sign of a long term transition out of the enthusiast market?

    I suppose if one is out to trade niches, I’d take HPC over enthusiast too.

      • cobalt
      • 9 years ago

      I suspect the double precision emphasis (and the vertical axis being performance per watt) are an artifact of being presented at the GPU Technology Conference. GTC has a more HPC and research focus, and thus these features are more important than those you’d see emphasized at a consumer- or games-oriented venue.

      • MadManOriginal
      • 9 years ago

      NV needs to make one line of GPUs for HPC and one line for graphics if they want to keep up in graphics chip development. Having so many GPGPU features essentially wasting die space on a graphics card is killing them, but having two lines of chips would be rough too, especially since the HPC market is so much smaller in volume.

      • SiliconSlick
      • 9 years ago

      Ahh, so more redoubled doofusdung( R&D), just like last time. I listened to ten thousand raging riotous regurgitating red roosters spew like shrieking little girls that Nvidia had abandoned the gaming community with FERMI, yet even now the GTX480 stands as the fastest GPU for gamers there is, and the 460 has spanked ati into red crybaby oblivion ( 450 not bad either).
      Why is it for several years the red roosters shrieked ati could bury and destroy Nvidia with price drops when in reality ati was 6-8 billion in debt and Nvidia was profitable the entire time with money in the bank ?
      Why is the opposite of reality true in the world of raging red rooster fanboys ?
      Be careful, you only speculated, but before long, we’ll probably hear it as satan’s red deviled dragonic absolute word to all, and anyone not repeating it in wishful anticipation of a red rooster glory hole win will be banished from the quackdom rant topics (every one ) forever.

        • HighTech4US
        • 9 years ago

        ten thousand raging riotous regurgitating red roosters spew like shrieking little girls

        +1

        • cegras
        • 9 years ago

        Newly joined member: check.

        Huge bias and irrational love of company: check.

      • Krogoth
      • 9 years ago

      If that trend continues, Nvidia might end up becoming the next “Matrox”: a company that caters to prosumers and the enterprise market.

      It is quite possible. The market for discrete PC cards is about to hit its death knell, in the form of integrated graphics becoming good enough and ubiquitous with the Sandy Bridge and Fusion platforms. It gets worse as gaming consoles continue to overshadow PC gaming. The PC gamers who are willing to hold onto their platform and buy performance GPUs will continue to be a vanishing minority.

      Unlike AMD/Intel, Nvidia has no viable answer to Sandy Bridge and Fusion. The lack of a QPI/DMI license hurts them on the Intel front, while their integrated solutions are no match for AMD on its own turf.

      Nvidia is starting to see the writing on the wall. They are gambling on the more lucrative HPC and workstation markets, where the demand for more performance will always exist.

        • Voldenuit
        • 9 years ago

        The market for /[

          • Wintermane
          • 9 years ago

          Even a 410 would be a far more powerful GPU than anything crammed onto the CPU die and fighting with the CPU for memory, power, and heat…

          • Krogoth
          • 9 years ago

          Ask yourself these questions.

          How many users are willing to go out of their way to spend top $$$$$ on high-end cards? How many games actually need the power of said cards? The answers may surprise you. Here is a hint: you know it is a problem when an old-fangled 8800GT is still quite sufficient.

          Sure they can no longer handle 2560×1600 and 1920×1200 gaming at maximum in-game details. Again, how many gamers are at this level? Not enough to convince most developers to keep pushing the hardware envelope.

          What is the reason for all this? PC gaming is no longer the leader in “gaming”. It has regressed into a diminishing niche. It is all about MMORPGs, flash games, and console ports. There is hardly any exclusive PC content, and what little exists doesn’t demand top-of-the-line hardware for the full experience.

          The consequence of this for GPU manufacturers is decreasing demand for performance GPUs. Nvidia is in bigger heat than the other competitors because of the previously mentioned reasons. They know this and are acting on it.

            • Voldenuit
            • 9 years ago

            No exclusive content? I’d like to point out a little game called Starcraft 2, you may have heard of it, it’s only available on PC and has sold over 700k copies at retail, and over 1.5 million internationally. And there’s also the question of another little upcoming title called Diablo III that will probably be PC exclusive as well.

            Even for cross platform games, there remains a strong contingent that prefers certain genres on PC. The recent polls here at TR and ausgamer are testament to that, with an overwhelming majority of PC enthusiasts and a smaller but still significant majority of general gamers preferring the PC. Steam typically cycles between 1.5 and 2 million online gamers in my experience and I only expect the numbers to grow. Console makers can’t even dream of the attach rate of Steam customers – I can’t even count how many games I have in my folder, and I’m not even an avid gamer.

            Part of the barrier to entry for PC gaming is the steepness and complication of hardware requirements, not helped by OEMs that bundle bargain-basement graphics in their computers to hit low sticker prices. With more robust IGPs like Llano raising the bar, average hardware specs will rise and gaming will become possible for an even wider audience. And once more people get into PC gaming, quite a few might want to game at higher resolutions and detail settings.

            Ironically, Sandy Bridge may also help nvidia and amd since the graphics are good enough to ‘get into’ gaming but not good enough to provide an enjoyable experience in many games. People will see spending an extra $150-200 on a GPU to be a better deal than buying a console or spending $50 on Hyperthreading.

            I’ll believe PC gaming and GPUs dead when I see their cold, hard corpses. Because they seem to be kicking ass to me.

            • Krogoth
            • 9 years ago

            Starcraft 2 and Diablo 3 are exceptions to the rule. I suspect they are the last of the PC exclusive games from Blizzard-Activision.

            PC gaming is already a burnt-out husk of its former self. Gaming consoles are, more or less, dedicated gaming PCs with a predictable hardware cycle/platform. The PS4, Xbox 3, and Wii 2 will make that more obvious.

            • Voldenuit
            • 9 years ago

            I wouldn’t mind seeing a fixed hardware PC gaming platform (OK, technically, the Xbox was one, but I digress). It would allow for better optimization and (hopefully) less buggy software since there is a single hardware spec to write for.

            The big difference is that console game manufacturers up to now have not been successful at reproducing the gaming experience that PC gamers prefer. That is why many gamers with multiple platforms still prefer to play FPSes, RTS and/or RPGs on the PC.

            Valve could do it today with a ‘Steam Box’, maybe subsidising the hardware cost or offering a games discount if you use their platform. Phantom was truly ahead of its time. Especially with the lapboard.

            But since I use my PC for much more than just gaming (and for a larger proportion of its use), I’d probably still stick to the PC.

          • StashTheVampede
          • 9 years ago

          The writing is on the wall for discrete GPUs, period. The Nvidia-to-Matrox comparison isn’t TOO far-fetched, but Nvidia may not want to be in nearly any form of consumer device at all.

          Nvidia’s path could be PCIe cards with Fermi onboard, potentially a motherboard with a socketed Fermi (more likely now that CUDA has gone x86), and Tegra for all forms of mobile needs.

          Getting out of the consumer space is where they want to be.

            • Krogoth
            • 9 years ago

            Yep, Fermi was designed with GPGPU in mind, not gaming.

            That is why it never stood a chance against dedicated gaming GPUs like the 5xxx and, soon, the 6xxx series.

            Notice how this PR event focused on the GFLOPS-per-watt ratio, not on projected FPS for Just Cause 2/Metro 2033? That speaks volumes about Nvidia’s intentions.

            • MathMan
            • 9 years ago

            Maybe the fact that pretty much the whole conference (I went) was about high-performance computing has something to do with it as well?

            I, and many other attendees, frankly couldn’t care less about projected FPS numbers for whatever hit game of the day.

            Many of the sessions and all of the more than 100 submitted posters were about how to squeeze more useful FLOPS out of a GPU or how to implement particular algorithms etc.

            Last year was no different.

            AFAIK, Fermi is still a pretty decent chip for gaming too. But for HPC, there is nothing else on the market in terms of absolute performance, performance/Watt and development tools. It’s a very exciting product.

            If you need to deploy tons of FLOPS in a datacenter, Perf/W is one of the most important parameters. It’s no mystery why this was the focus of the presentation.

    • lex-ington
    • 9 years ago

    Maxwell should have out a new CD by 2013. He makes some good music.

    Then again, Maxwell may have out some dual-layer BD-RW’s by 2013 that won’t cost more than a car.

    The future is looking bright for all Maxwells.

    Doesn’t Maxwell make atrocious coffee as well????

      • NeelyCam
      • 9 years ago

      Maxwell’s equations are pretty hot, too

        • anotherengineer
        • 9 years ago

        Indeed.

        Nvidia’s scientist naming scheme seems random though.

        Tesla – focused on electrical
        Fermi – focused on nuclear physics
        Kepler – focused on astronomy
        Maxwell – focused on electrical

        Oh well

          • Forge
          • 9 years ago

          Maybe like this:

          Tesla required previously unheard of watts to run.

          Fermi generates nuclear heat.

          Kepler implements some new space-derived cooling tech.

          Maxwell – Deskside generator?

    • ultima_trev
    • 9 years ago

    It seems people forget that during the GeForce 6-7 / Radeon X000-X1000 era, it was nVidia’s offerings that were more power efficient, while ATI was just about raw power.

    At least Thermi, unlike the abysmal FX series (which still fills me with the lulz to this very day), is actually higher performing than the competition, even if less cost-effective to produce.

      • SiliconSlick
      • 9 years ago

      ATI recently had its best quarter EVER, with 440 million in card sales, the 5000 series with DX11 carrying the day.
      The problem is, in the same quarter, and for some time, NVIDIA sold 1.6 billion, or 1,600 million, 4X the sales of ATI – even as ATI had their “best quarter ever”.
      Just go look at the stock market caps. NVIDIA by itself BEATS amd and ati combined (since they are combined)…
      So the sad facts are this: NVIDIA IS FOUR TIMES THE SIZE OF ATI.
      NVIDIA HAS BEEN MAKING MONEY
      amd ati finally had one Q where they posted a tiny profit for the 1st time in years.

      There are more sad facts for ati/amd when you compare them to NVIDIA, but in the raging red rooster yekker points, the pop culture quackers spew this PR insanity bull in nearly 100% the opposite direction.

      Do any of you people who claim to know what production costs are actually know what they are ? NO, NOT EVEN CHARLIE/SA CAN TELL YOU.

      So in the end, we have lies and rumors, and then we have the known facts. Here’s a known fact as well. If you have a tiny core that is much smaller than your competitors, you GET PAID LESS FOR THAT TINY CORE.
      It’s a wonder after all these years that the red roosters FORGOT 100% about that market reality, even as they spewed like a million tiny, tinny CEO wannabes, always declaring their non-paid market research like so many quacking parrots (rather, red rooting roosters in this case).
      ATI is only 440 million a quarter; NVIDIA is 1,600 MILLION a quarter. No matter how hard ATI crunches down its non-existent or barely perceptible itsy-bitsy 33-million profit for 1Q, with 16 quarters of losses all larger than that, IT CANNOT DROP PRICES TO ANY DEGREE THAT CAN HARM NVIDIA AT ALL… P E R I O D !

        • Triskaine
        • 9 years ago

        Can I have some of the stuff you are smoking? Nvidia’s earnings in the last quarter were in fact 811 million, while they also lost gobs of market share.

          • Game_boy
          • 9 years ago

          Nvidia lost money last quarter.

            • Silus
            • 9 years ago

            And so did AMD! Shock!

            What? A recession making companies lose money ? Say it isn’t so…

            • TaBoVilla
            • 9 years ago

            ATI is not AMD! Worst-case scenario, they sell the AMD graphics division before driving it into the ground.

            • Silus
            • 9 years ago

            Yeah, you’re right, ATI is not AMD, because ATI doesn’t exist anymore. It’s just AMD.

            If you’re eager to break the company down into its own sub-divisions, as you seem to like to do with AMD, then maybe you should also divide NVIDIA’s earnings into its mobile (Tegra) sub-division, its HPC (Tesla) sub-division, and its discrete GPU (desktop and laptop) sub-division…which makes no sense, just like it doesn’t for AMD. They are only one company, ever since AMD bought ATI. The ATI brand lived on for a while, but ATI as a company hasn’t existed since then.

            • Game_boy
            • 9 years ago

            AMD’s graphics side made a profit, because they are ahead. AMD’s CPU side made a loss, because they are behind.

            AMD sells many more CPUs than GPUs, so it’s a net loss.

            • Silus
            • 9 years ago

            That’s not even correct, because AMD’s loss is mostly related to the GF deal, which is still costing them buckloads of money. AMD doesn’t seem to know how to make deals without putting themselves in the red (buying ATI, the foundry spin-off…), and their massive 3.5+ billion debt is still a sign of that.
            NVIDIA also had a loss, mostly related to additional charges for the mobile chip problems they had, which in total have already cost them over 500 million.

            That being said, why are you still dividing the earnings of a single company into sub-divisions…?

            We get that you want to show that the graphics division you like actually made money, but it makes no sense to divide the earnings into sub-divisions, because it’s just one company, and if that company lost money, it affects all sub-divisions, regardless of their performance. Especially in AMD’s case, where I can only imagine the interest on their massive debt…that certainly affects the budget for each project they work on.

        • Krogoth
        • 9 years ago

        In_elite is that you?

      • Voldenuit
      • 9 years ago

      The Geforce 6800 Ultra was pretty power hungry (and hot!). Part of that had to do with the fab (IBM), while the parts that were made at TSMC were a lot better off.

      I don’t think it’s easy to generalise about the two companies. Over the years, each has traded positions both in the market and in the community’s regard too many times to keep count.

      One is free to be a fan of either, but the truth is, the community (and the market) need both to keep trading punches so that there is healthy competition in the industry. Although it doesn’t help that Jen-Hsun is crazier than a fox on acid. 😛

    • darryl
    • 9 years ago

    I just want to know what the chances are of producing a GTX470 clone using the 104 chipset ( a purely graphics solution like the GTX460)? OOOPPSS I also want to know when.

    • YeuEmMaiMai
    • 9 years ago

    over optimistic they are…..lesson from ati they are taking lol

      • can-a-tuna
      • 9 years ago

      Wise words from master Yoda. 😀

    • Zorb
    • 9 years ago

    Just more vaportalk with some joke of a graph. Might as well hit the button and let the disco ball drop…….

    • sweatshopking
    • 9 years ago

    YOU GUYS ARE ALL WRONG. Maxwell is going to be stupid fast, and then you’ll all be eating crow!!

      • cygnus1
      • 9 years ago

      So’s your face

      • Disco
      • 9 years ago

      Is that you Jen??

      • Voldenuit
      • 9 years ago

      It will be as fast as light, but will require an infinite amount of energy to get there…

      • anotherengineer
      • 9 years ago

      crow, umm I refer to it as French Partridge 😉

      • huge
      • 9 years ago

      Prime1??

      • Jigar
      • 9 years ago

      Can i have one ? 😉

        • sweatshopking
        • 9 years ago

        Jigar, you can have 2. the rest of you get nothing.

    • dpaus
    • 9 years ago

    Uh-ohhhh…. Huang is starting to believe his own spin….

    • Helmore
    • 9 years ago

    How come Fermi is placed in 2009 on that time axis? You couldn’t buy any Fermi chips in 2009 as far as I know.

      • esterhasz
      • 9 years ago

      Well, let’s hope for NV that the 2011 Kepler isn’t plagued by the same calendar distortion…

      • can-a-tuna
      • 9 years ago

      Nvidia marketing. That’s why. 😛

      • MrJP
      • 9 years ago

      The dates mark the start of each generation’s full-on marketing blitz:

      e.g. http://www.techreport.com/articles.x/17670

      • anotherengineer
      • 9 years ago

      No doubt. It should be mid-2010, lol. Uh oh, Nvidia: you have four months until 2011 to get a 2.5x performance-per-watt boost over Fermi.

      I don’t think it’s happening; unless, since Fermi on that timeline was 2009, Kepler would really be mid-2012, and maybe then.

        • cphite
        • 9 years ago

        It still won’t run Crysis

    • maxxcool
    • 9 years ago

    That man is crazier than bat guano… until Tech Report can review it, I don’t believe a word that man says.

    • no51
    • 9 years ago

    I thought that was a ridiculous graph until I read the axes (axises?). Basically anything would be a huge (metric)/watt improvement over Fermi. Then the graph just becomes slightly over-optimistic.
