Poll: How will Kepler fare against the Radeon HD 7970?

If you’ve been paying attention to the GPU scene, you’ll know that AMD’s next-generation graphics architecture has already arrived in the Radeon HD 7970. Simply put, it’s a beast. Nvidia is due to answer with Kepler, its own architectural refresh, by the middle of this year. The rumor mill is already buzzing with predictions of how the two will stack up, and we’re curious what you think. Will Kepler be faster than AMD’s latest and greatest, will it lag behind, or will the two flagships offer comparable gaming performance overall? Cast your vote below or in the middle column on the front page.

Our last poll asked you to name the best PC game of 2011. Skyrim was the overwhelming winner, capturing 31% of the vote ahead of my personal favorite, Battlefield 3, at 18%. 14% gave the nod to Deus Ex: Human Revolution, while only 10% cast their ballot for Portal 2. Valve’s latest opus came out way back in April, and I suspect memories of it faded in the flurry of holiday releases. The Witcher 2, Batman: Arkham City, and Modern Warfare 3 round out the next three spots in the rankings with 6%, 4%, and 3% of the vote, respectively. The other games on our list failed to move the needle more than 1%.

Comments closed
    • DeadOfKnight
    • 8 years ago

    I bet some of you feel dumb right now.

    • AssBall
    • 8 years ago

    I still don’t understand why video cards are so polarizing. And this high-end segment is like trying to decide between a Lamborghini and a Ferrari. The only right answer is “Yes please!”

    – I have had poor experiences with my Nvidia cards, but my Dad and Brother swear by the ones I put in their systems.
    – I have had great experiences with my ATI cards, but my Mom and Fiance have had trouble with the ones I put in their systems.

    To each their own, but it is not something to really get that worked up about. I am happy to just pick a reliable, decent, efficient card for the price range.

    • jdaven
    • 8 years ago

    I read the results of this poll as follows:

    < 50% of TR readers think that Kepler will beat the 7970.
    > 50% of TR readers don’t care, say the two will be equal or think the 7970 will beat Kepler

    Interesting.

    • Chrispy_
    • 8 years ago

    I think I’m going to say “meh”

    When a $250 GPU can hit vsync in almost every game on high detail at 1080p with AA enabled, you have to wonder what the point of buying more power is these days, other than to brag about the size of your epenis.

    Resolutions above 1080p are going to be running on either multiple displays or laggy, blurry IPS panels.

    Multiple displays are an option, but despite having had three screens on my desk at one point, I continued to game on one: even in games that supported crazy aspect ratios properly, the interruption of the bezels really ruined the immersion.

    If you’re gaming on an IPS display, you’re likely getting G2G response times that make 40 fps about the best your screen can handle, once it’s added its own input lag to deaden the whole effect.

      • tom299
      • 8 years ago

      Very ignorant statement. Do you own a true 120Hz monitor like the Samsung S27A950D? Games running at 120 fps at 1080p on monitors like these are a totally different gaming experience. There is no going back, and I need every FPS I can get.

    • jihadjoe
    • 8 years ago

    The 7970 is about the same as the 580 clock-for-clock; it’s faster because it’s built on a new process and thus clocks much higher.

    A mere die shrink of the GF110 should be sufficient to put Nvidia on parity with AMD, so unless Kepler actually goes backwards in terms of performance per clock/watt Nvidia will win this round of the GPU wars pretty handily.

      • sschaem
      • 8 years ago

      So without changing anything else, the next nvidia card would need to use a 2+ GHz shader clock.
      But the issue is that games need more than just shaders (today’s games are not shader-limited).
      So it’s everything that needs to go up 30%: memory bandwidth, ROPs, texture units, etc.

      So a die shrink to a 28nm process and pushing the HW to 2 GHz is not the way nvidia will reclaim the top spot.

      People have reported easy overclocking of the 7970, upping their FPS by 20% in games.

      And for compute: in the raytracing test the OC’d 7970 scores 24200 to the GTX 580’s 12000.
      The 7970 is over 2x faster.

      DX11 fluid computation: 91 vs 163…
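      (A quick sketch of the arithmetic behind the scores quoted in this comment; the numbers are the ones posted here, not independently verified, and the assumption that 91 is the GTX 580’s fluid score and 163 the 7970’s follows the surrounding context.)

```python
# Raytracing scores quoted above: OC'd 7970 vs GTX 580
raytracing_ratio = 24200 / 12000
print(round(raytracing_ratio, 2))   # 2.02 -- "over 2x faster" checks out

# DX11 fluid computation, assuming GTX 580 = 91 and 7970 = 163
fluid_ratio = 163 / 91
print(round(fluid_ratio, 2))        # 1.79
```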

        • ElMoIsEviL
        • 8 years ago

        Bitcoin Mining has my 7970s at around 800Mhash/s each vs 115Mhash/s for a GTX 580.

        The GTX 580 is the superior hardware rasterizer. nVIDIA has always been ahead of the curve in that dept. But compute wise… it is dwarfed by the mighty 7970. This ain’t no VLIW4/5 that requires CPU cycles in order to power the software Compiler and attempt to efficiently make use of the ALUs available… The 7970 is an SIMD/MIMD monster.

        nVIDIA may win in terms of FPS in games.. which is what matters to most… but they won’t win in terms of Compute Power. GCN > Fermi (Kepler) in terms of Compute Power and I feel entirely comfortable making that claim.

    • Meadows
    • 8 years ago

    I only voted because of the last option.

    • Draphius
    • 8 years ago

    I’m sure it will be faster, but I’m wondering what the memory specs will be on it. I have a feeling AMD is still going to be holding the crown for multi-monitor support.

    • faramir
    • 8 years ago

    Kepler isn’t due out for another quarter and will likely take approx. 6 months to arrive in sufficient numbers in more than one segment -> therefore: Krogoth not impressed, and me neither. If Nvidia wanted to compete, they should have released their stuff at the same time; anything else is hardly a sensible comparison. It’s like comparing any other computer parts from quarters apart (where prices change, specs change, availability changes, etc.).

    Offtopic: Geoff, I’ve been wondering about this for quite some time now – how does one pronounce your surname correctly: is it “gassy-or” or “gay-see-or” or something entirely different ?

    • wierdo
    • 8 years ago

    Well that’s interesting:
    [url<]http://hardocp.com/news/2012/01/25/nvidia_accused_using_marketing_shills[/url<]

    Hmm, I guess the joke’s on me. Seems this kind of marketing works pretty well – just don’t get caught lol.

    • albundy
    • 8 years ago

    By the time NV releases it, AMD will likely be another step ahead with a newer, faster GPU, thus putting the 7970 at a cheaper price point.

      • Airmantharp
      • 8 years ago

      We can only hope!

      I’m still a little miffed at how badly my HD 6950s run BF3 at 2560×1600; selling them for HD 7950s looks like a proper path, but not before I see what Nvidia has to offer!

    • burntham77
    • 8 years ago

    So you are taking a poll about our opinions of an upcoming fact? This is stupid.

    • trek205
    • 8 years ago

    Only a moron would think Kepler will be slower than the 7970. All it would take is just 30% more performance than the GTX 580 to beat the 7970.

      • khands
      • 8 years ago

      I expect somewhere around 60-80% faster is within easy reach for Kepler.

      • entropy13
      • 8 years ago

      You can even put it this way…the GTX 560 Ti is 30% behind the 7970. If Nvidia’s next “mid-high end” card succeeds in improving 30% over the GTX 560 Ti…

      The 560 Ti had a launch price of $250. Even if they double that for the new mid-high card, it’s still cheaper than the 7970.

        • mczak
        • 8 years ago

        Get your facts straight: you need about a 65% improvement from the GTX 560 Ti to catch up with the HD 7970 (with your numbers the GTX 560 Ti would be as fast as the GTX 580…).
        That said, the question of whether Kepler is faster is silly. Almost certainly nvidia will have a Kepler chip out which is much bigger than Tahiti, and unless they screwed up majorly it should be faster. Can nvidia improve their second-fastest chip by 65% to catch the 7970? Sounds not quite impossible, but improbable (especially since it most likely needs to live with only minimally higher memory bandwidth than the GTX 560 Ti). It would probably be quite an achievement if nvidia’s second-fastest Kepler chip were around the same performance as the HD 7950 already, which sounds more doable (it only needs a ~50% improvement over the 560 Ti).

          • entropy13
          • 8 years ago

          My facts aren’t “crooked” unless TechPowerUp’s tables are lies?

          [url<]http://tpucdn.com/reviews/AMD/HD_7970/images/perfrel.gif[/url<]

          It’s even 71%-100% actually. So it’s 29% and I’m wrong!!! LOL

          GTX 580 is only 8% behind the 7970. GET YOUR FACTS STRAIGHT.

            • mczak
            • 8 years ago

            The results are real, but you’re playing nice tricks here by using the results which include low resolutions, which really don’t matter for such cards. OK, I’ll play fair and not use the really-high-res chart (which is way more relevant than 1024×768), so 1920×1200 it is: the GTX 560 Ti is at 67% and the HD 7970 at 100% in TPU’s review. Now you may say that’s only 33%, but for people who understand percentage calculations it still means the GTX 560 Ti needs +50% to catch the HD 7970.
            That is still lower than my ~65%, probably due to game selection and because it doesn’t include the 2560×1600 resolution (I took my results from ht4u; as far as I can tell, TPU shows about the lowest overall lead for the HD 7970 over the GTX 580, and hence other nvidia cards, of all the reviews).
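            (The percentage arithmetic in this exchange trips people up, so here is a short sketch; the 67/100 and 71/100 figures are TPU’s normalized chart numbers as quoted above, and `required_gain` is just an illustrative helper name.)

```python
def required_gain(score_a, score_b):
    """Percentage improvement card A needs to match card B, given both
    scores on the same normalized performance scale."""
    return (score_b / score_a - 1) * 100

# GTX 560 Ti at 67%, HD 7970 at 100% (TPU's 1920x1200 chart):
print(round(required_gain(67, 100)))   # 49 -- roughly +50%, not +33%

# Using the 71% figure from the overall chart instead:
print(round(required_gain(71, 100)))   # 41 -- still well above 29%
```

            Subtracting chart percentages gives the gap in points, not the relative improvement the slower card needs.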

      • ImSpartacus
      • 8 years ago

      Only a moron would think Krogoth could be impressed.

        • Arclight
        • 8 years ago

        Damn straight. Idk what these people are smoking, but clearly nvidia has no chance to impress the Krogoth.

    • no51
    • 8 years ago

    Nvidia undercut AMD on pricing? Especially on the high end? Must be a nice dreamland you guys live in. If it’s faster than a 7970, they’ll probably price it around $600+.

      • Silus
      • 8 years ago

      Obviously NVIDIA high-end will be even more expensive than AMD’s, but you missed the part of the rumor that mentions NVIDIA’s mid-range coming close to AMD’s high-end.

      Still, I doubt NVIDIA will price their high-end very far from AMD’s high-end. AMD exaggerated quite a bit in price with the 7970s. It will be interesting to see if NVIDIA is willing to price their card, faster than the 7970, at the same price. AMD would be forced to lower their prices a lot.

        • HighTech4US2
        • 8 years ago

        > Still, I doubt NVIDIA will price their high-end very far from AMD’s high-end. AMD exaggerated quite a bit in price with the 7970s.

        If Nvidia’s high-end GK110 is 30-40% faster than the 7970, then it should be priced around 35% higher.

        With the 7970’s current price at $550, that would make the price of the GK110 around $699.

        $699 is way too high for the market, so Nvidia prices the GK110 at $499 and drives the price of the 7970 down to around the $300-$350 range.

        Capping AMD’s high end at $350 and eating at it from below with the $299 GK104 will hurt the ability of AMD’s GPU division to make a profit that could be used to fund future R&D, and Nvidia having the single-GPU high end completely to itself is like printing money.

        Nvidia is taking a page from Intel. Starve your competitor of capital.
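        (A quick sketch of the pricing arithmetic in this comment; all figures are the rumors quoted above, not confirmed specs. Note that a strict 35% premium on $550 works out a bit above the $699 mentioned.)

```python
hd7970_price = 550      # 7970 street price cited in the comment
rumored_lead = 0.35     # midpoint of the rumored 30-40% GK110 lead

# Price scaled strictly in proportion to the rumored performance lead:
proportional_price = hd7970_price * (1 + rumored_lead)
print(round(proportional_price))    # 742 -- slightly above the $699 quoted

# The $699 figure corresponds to a smaller premium:
implied_premium = (699 / hd7970_price - 1) * 100
print(round(implied_premium))       # 27 (percent)
```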

          • Farting Bob
          • 8 years ago

          AMD is rumoured to have a higher-clocked version of the 7970 in the works. I’m sure if they can bump clocks up by 15%, it will replace the 7970 at that price and we’ll see the original drop by $50 or so. We certainly won’t see the 7970 stay at its current level for long after Nvidia releases their cards, that’s for sure.

      • HighTech4US2
      • 8 years ago

      The single-GPU GK110 will be priced at $499; the mid-range GK104 will be priced at $299.

      The performance of the GK104 is very close to AMD’s 7970, so AMD’s high-end GPU will be capped at around the $300-$330 range and Nvidia will have no competition at the high end.

      As Charlie states, Nvidia wins hands down.

        • khands
        • 8 years ago

        There’s obviously a dual-GPU solution (and probably a 7980 or something that’s basically a slightly optimized and OC’d 7970) down the line from AMD once they can figure out the thermals. It’s been that way since the 3870×2.

          • kamikaziechameleon
          • 8 years ago

          6970… what was the overclock of that?

          • HighTech4US2
          • 8 years ago

          A dual-GPU card cannot compete with the single-GPU GK110 card. With the performance of AMD’s dual card being nearly equal to Nvidia’s GK110 single-GPU card, who in their right mind would pick the dual card and the problems that come with dual- vs single-GPU designs?

          And Nvidia is very capable of producing a dual-card that would place the AMD card 30-35% below it in both price and performance.

          Believe your god Nvidia wins hands down.

    • Kaleid
    • 8 years ago

    It’ll be somewhat faster, but that doesn’t matter to most gamers as most gamers will never care about the high-end cards.

    • stupido
    • 8 years ago

    Please excuse my ignorance, but what is “Krogoth”?!

      • entropy13
      • 8 years ago

      “Krogoth”…the legend, the myth, the fable, the epic, etc. who is definitely NOT impressed with you right now for not knowing the existence of Krogoth.

        • stupido
        • 8 years ago

        arrghh! I’m crawling back into my deep dark hole…

          • Arclight
          • 8 years ago

          Krogoth is just a user like you and me. But the legend goes he is not impressed.

            • burntham77
            • 8 years ago

            Thank you for being useless on the internet.

            • Arclight
            • 8 years ago

            Oh, you’re so very welcome. It’s hard sometimes, you know? Useful content does slip from me from time to time, but I try my best not to let it happen very often.

            • stupido
            • 8 years ago

            ah…
            so he is actually a person and not imaginary…

            Stupido not impressed…

            • ImSpartacus
            • 8 years ago

            Now you’re getting it. But you’re still not very impressive.

            • stupido
            • 8 years ago

            I’m not even trying… that is job for Krogoth…

      • ImSpartacus
      • 8 years ago

      It’s not impressed, that’s what it is.

    • ronch
    • 8 years ago

    This poll is about as sensible/useful as the poll about how Bulldozer will fare against Sandy Bridge. Yeah, as though this poll will affect how the final product does. Just wait for the thing to come out. Useless to start a poll asking what people’s predictions are.

      • Phishy714
      • 8 years ago

      It’s also pretty useless then for people to talk about an upcoming game and ponder as to what it will be like.

      It’s also pretty useless then for people to EXPRESS THEIR OPINIONS on subjects that they care about.

      However, your post is just about the most useless part of these shenanigans.

        • ronch
        • 8 years ago

          I think you’re being overly defensive or sensitive, which made you tag my post as useless as well. Unlike you, however, I’m not trying to be rude here. Asking folks to guess how an upcoming product will perform, however, is simply not a very logical conversation topic, is it? It’s purely speculative.

          • Phishy714
          • 8 years ago

          Then by your definition, speculating on how an upcoming President will perform is simply not a very logical conversation topic.

          And btw, saying something is useless is, in fact, being rude. If it’s useless in your eyes, congrats! Don’t become part of it. No need to tell everyone here that their opinions or speculations or HOPES for a new product are useless.

    • willyolio
    • 8 years ago

    as usual for the past few generational refreshes: a bit faster, a lot hotter.

    • clone
    • 8 years ago

    The overclocking results with the HD 7970 show that AMD played it conservative. I suspect they’ll have a refresh in 6 or more months when Kepler finally arrives. I do believe Kepler will be faster than AMD’s card, but the difference will likely be negligible, and given that the HD 7970 will be a fully mature product by that point, literally almost at end-of-life status, I believe AMD will be readily able to adjust pricing to compete favorably if forced.

      • Palek
      • 8 years ago

      Some people also appear to conveniently forget how much extra performance nVidia was able to squeeze out of Fermi after product launch through driver-side improvements. Fermi was a very different beast from GPUs that preceded it. I imagine it required quite a bit of learning for nVidia’s driver team.

      Wouldn’t it be fair to assume that we could see substantial performance improvements due to driver optimizations on the AMD side, as well?

        • Firestarter
        • 8 years ago

        One can hope, but basing your buying decisions on hope can burn you badly.

          • Palek
          • 8 years ago

          I wasn’t really talking about basing buying decisions on what may or may not happen, merely pointing out that Tahiti performance may not stand still between now and when Kepler arrives.

          PS. I buy games one or two years after they come out, and purchase two generations old hardware to match. Savings!!! (Still running a Radeon HD4850, and just finished Assassin’s Creed 2. Next up in my game pipeline is Bioshock – that’s right, the first one! πŸ™‚ )

        • entropy13
        • 8 years ago

        [quote<]driver optimizations on the AMD side[/quote<]

        That’s quite a paradox of a statement.

          • Palek
          • 8 years ago

          Wow, really? That tired old jab? Have you been living under a rock since the Radeon 8500 days?

            • Silus
            • 8 years ago

            Rage, Skyrim and other AAA games say hello to the AMD driver team!

            And that happened just a few months ago! Not to mention how they “optimized” their HD 6800 series by decreasing IQ. Who’s living under a rock then? Or are you in denial?

            • Arclight
            • 8 years ago

            What happened in Skyrim, stays in Skyrim…

            • Silus
            • 8 years ago

            Yeah, most likely AMD’s driver team took an arrow to the knee!

            And in Rage, they took several arrows, since including an older version of OpenGL in the drivers was quite the thing to do…

            • Firestarter
            • 8 years ago

            Don’t confuse the driver developers with the driver packagers. That they released a few screwed up packages doesn’t mean that the drivers will forever be substandard.

            • Palek
            • 8 years ago

            Oh never mind him. The confusion is intentional. Any opportunity for some AMD-bashing must be taken advantage of, no matter how irrelevant to the discussion.

            • Silus
            • 8 years ago

            And where did I say they will always be substandard?…

            And I didn’t confuse anything. Who releases driver packages? That’s right, the driver developers, and they included an extremely old version of OpenGL in them, creating big problems for those trying to run Rage with Radeons.

            • clone
            • 8 years ago

            The problem disappeared within a few days; the crying has never stopped.

            Such is the nature of AMD bashing: nobody talks about the fix, released as soon as the problem was noticed, they just keep crying endlessly about the moment someone encountered a problem, forever from that point forward.

            Nvidia makes the same mistakes, but anyone who mentions that gets accused of being an AMD fanboy…. lol.

            • ClickClick5
            • 8 years ago

            This. Thank you.

            • Palek
            • 8 years ago

            Look at our dear friends Silus and HighTech4US2 spin and spin and spew bile in the direction of Charlie in this thread:

            [url<]https://techreport.com/discussions.x/21248[/url<]

            How dare he suggest that Southern Islands will show up before Kepler? Baseless rumours, Char-LIE sucks, blah-blah-blah. Except, er, he was right; nVidia was late.

            • Silus
            • 8 years ago

            Yes, because asking that a rumor should be labeled as “rumor” instead of “report” is really pushing it…

            Get a life!

            • Palek
            • 8 years ago

            [quote<]Get a life![/quote<]

            Says the guy who expends considerable effort conducting his anti-Charlie campaign on this website (and probably others, I imagine). Ah, the irony.

            • ElMoIsEviL
            • 8 years ago

            Dude… you’re not funny. And you’re not cool enough to make the “took an arrow to knee” line funny. When you say it… puppies cry.

            • Palek
            • 8 years ago

            I’ll make this REAL simple for you.

            When nVidia first released GPUs based on Fermi the drivers were not fully optimized for the new architecture. There was a lot of performance headroom left, and with subsequent driver releases nVidia was able to boost Fermi gaming performance quite significantly.

            With this in mind, how would you rate the following scenario?

            “AMD’s Tahiti is a new architecture, a radical departure from their previous GPUs. It is quite possible that current drivers do not take full advantage of the capabilities of the hardware and that subsequent driver releases will wring more performance out of Tahiti.”

            A. AMD sucks, nVidia rules
            B. No way
            C. Possible
            D. Highly likely
            E. I lurv AMD

            (And please just answer the question. Everything else is subterfuge.)

            • Silus
            • 8 years ago

            Where did I talk about a lack of improvements? The only thing I mentioned regarding driver optimizations was the optimizations done when the 6800s were released, which decreased IQ. That should be a thing of the past for both GPU-designing companies, yet AMD still did it just a few months ago.

            I talked about errors in the most recent drivers for AAA titles. First they don’t have them ready before, or at least at the time, a game is released. And when a driver finally is released, it has errors (such as the Rage one, which was the most glaring) or very poor performance. I can’t answer your question when you ignore everything I mentioned in your quest to protect AMD, no matter how bad their track record is in these sorts of things. Good luck with that!

            • Palek
            • 8 years ago

            I’ll use short sentences and simple words.

            Stop moving the goalpost. Answer the question.

            • clone
            • 8 years ago

            Did anyone notice Nvidia released a driver that blew up their GTX 590 cards… or is that one not supposed to be mentioned?… lol.

            Did anyone notice Nvidia released a driver that burned out its graphics cards in StarCraft II last year?

            Did anyone notice there is a thread dedicated right now to driver problems with the latest Nvidia products, problems that have spanned multiple driver revisions for several months and affect multiple apps?

            Apparently shills must at all costs ignore reality when spewing their pathetic, hypocritical garbage, because far be it for a shill to show some character and admit that all companies have graphics issues while they whine and bash another company.

            • entropy13
            • 8 years ago

            So it’s untrue that CAPs for the HD 5000 and HD 6000 series weren’t released consistently (at the same time) as well? So the people complaining about the lack of significant improvements with their HD 5970 are lying when the HD 6000 people got some boosts but the HD 5970 owners didn’t? And there are several games where optimizations are practically absent, as Silus already mentions.

            • Palek
            • 8 years ago

            You implied – falsely – that AMD could not deliver performance improvements through driver updates. I called your jab ridiculous. End of story. Stop the spin, please.

            • entropy13
            • 8 years ago

            I implied, factually, that AMD could not deliver performance improvements through driver updates, since all improvements in Skyrim, for example, came either through mods or, just as miraculous as an AMD driver improvement, a patch from Bethesda.

            AMD’s miserable drivers are as much a constant as Bethesda’s reliance on mods to fix the bugs in their own games.

            • Palek
            • 8 years ago

            Yes, clearly your experience with a single game applies to all of AMD’s driver development efforts, past, present and future.

            Try again.

            • clone
            • 8 years ago

            Did anyone notice Nvidia released a driver that blew up GTX 590 cards… or is that one not supposed to be mentioned?

            Did anyone notice Nvidia released a driver that burned out its graphics cards in StarCraft II last year?… Oh yeah, I guess in the land of double standards we shouldn’t mention that one either.

            Did anyone notice there is a thread dedicated right now to driver problems with the latest Nvidia products, problems that have spanned multiple driver revisions for several months and affect multiple apps? Or in the land of double standards are we pretending that one isn’t ongoing… despite the reality that it is.

            All of the GPU players suffer from driver gremlins; only shills focus on one company.

            • ElMoIsEviL
            • 8 years ago

            Hear Hear!

      • flip-mode
      • 8 years ago

      Almost EOL in 6 months?

        • Airmantharp
        • 8 years ago

        Aside from process hiccups, isn’t this always the case?

          • flip-mode
          • 8 years ago

          Er, the 5870 lasted over 12. The 69xx is still going, 12 months and counting. The 5770 stayed around for a good 18 months. The GTX 580 and 570 have been with us over a year now. The GTX 560, almost 12 months.

          I think it is very, very rare that any GPU goes EOL after just 6 months. The 3870 was released 6 months after the 2900 and that was strongly driven by the massive blunder that was the 2900. Usually a GPU generation is more like 18 months or so.

            • Airmantharp
            • 8 years ago

            The last three generations have been a process hiccup, no?

            40nm has been around far too long, starting with the 4770.

            • flip-mode
            • 8 years ago

            Hmm, not sure if the 40nm process hiccup changed the generational cadence much. You’re really going to make me look this up. Okay, let’s see…

            Radeon 9700: 09/2002
            Radeon X800: 05/2004
            Radeon X1800: 05/2005
            Radeon 2900: 05/2007
            Radeon 4870: 06/2008 (pushed out ASAP because of 2900 ??? possible)
            Radeon 5870: 09/2009

            That’s a pretty regular cadence on the Radeon side of 12-18 months, 14 months on average, HD 7000 series not included.

            Let’s check the Geforce side:

            Geforce 5800: 04/2003
            Geforce 6800: 08/2004
            Geforce 7800: 06/2005
            Geforce 8800: 11/2006
            Geforce 280: 06/2008
            Geforce 480: 03/2010

            Also a pretty regular cadence of 12-18 months, and also, and slightly surprisingly, 14 months as an average.

            So the 7970 comes about 26 months after the 5870, and the long run of 40nm almost certainly caused a hiccup in the cadence, but it’s also not the first time the cadence got tripped up – 24 months wait for the Radeon 2900 and 21 months wait for the Geforce 480. But 6 months is most certainly not the normal generational cadence here – 14 months is the average.

            • Airmantharp
            • 8 years ago

            You’re right about generations, no disagreement there.

            I should have specified mid-generation refreshes as a part of the ~6 month EOL statement- as in, we should see a refresh that replaces the HD7970’s position in AMD’s lineup in about 6 months.

            Also, thanks for looking that up and laying it out nicely :).

        • clone
        • 8 years ago

        HD 7xxx will be fully mature in 3 months, let alone 6, and by 9 months it’s done, as HD 8xxx will have taped out by then… if all goes well. It’s usually 100 days from final tape-out to production, but by final tape-out the driver focus shifts to the next gen, production is focused on the next gen, you name it is focused on the next gen… hence the previous gen is at end-of-life status 3 months prior to the next gen’s launch.

        2nd point: a refresh with adjusted clock rates would seem a simple matter given how conservative AMD was with frequencies, if today’s overclocking numbers are any indication, let alone the huge lead time they have to get everything sorted (including picking colors for the executive bathroom) if Nvidia really is 6 months from shipping product.

        My guess is it’ll be 9 months before Nvidia gets Kepler on the market in quantity.

        Either way, if Nvidia were releasing product by February/March, I wouldn’t be calling them this round’s loser but instead waiting for the numbers. But given the signs are pointing to a June/July or later launch, the fight’s over, they forfeited, the people have left, the lights are out, it’s over.

        Worse still for Nvidia, the HD 7970 is a power miser let alone a performance monster; there really is no reason not to buy it today. The GTX 580 was always a whore on power, which arguably could give some pause, but HD 7xxx doesn’t suffer from that, consuming 80 watts less at load.

          • ElMoIsEviL
          • 8 years ago

          You wrote some positive things about the Radeon HD 7970. Therefore someone voted you down. No worries I’ve got your back πŸ™‚

          I love reasonable and rational arguments. Not that petty “fanboi” stuff most of the kids are into these days.

            • clone
            • 8 years ago

            Nvidia employs shills; this is a known fact. Nvidia also extorts websites by restricting access to their hardware unless certain comments are made and special focus is given to Nvidia-only features, regardless of merit or value.

            If my post is honest and genuine, I don’t really worry much about the thumbs-down, because as evidenced by much of the baseless criticism by shills or zealots who openly lie and lack integrity, the thumbs-down also is a lie and lacks integrity.

            Computer hardware isn’t like a sports team; there is no “home team.” Most of it’s made in Asia, so it boggles that some will sacrifice their morals and integrity for the sake of a faceless company only interested in the money they have and will make.

            For some ppl, stupid is as stupid does.

            • Waco
            • 8 years ago

            This is all total BS.

    • Arclight
    • 8 years ago

    I expect the GK112 to be a lot faster than the 7970, probably comparable or, dare I say it, faster than the 7990, while the GK104, from the rumors, will be able to match the 7970. Sure, it has a smaller bus width and 1 GB less RAM, but it has what, 200 more SPs than the GTX 580? Coupled with architectural advances, it could actually even outperform the 7970. I voted for “Faster than the 7970” ’cause from the specs, I assume it’s capable.

      • can-a-tuna
      • 8 years ago

      Nothing stops AMD from souping up their Tahiti lineup. They have PLENTY of time to tune the architecture before that high-end Kepler arrives. As always, nvidia is late to the game and AMD has up to half a year’s lead. Though it would be embarrassing if nvidia’s monster card, arriving almost six months later, did not beat current Tahiti.

        • Arclight
        • 8 years ago

        [quote<]Nothing stops AMD from souping their Tahiti lineup. They have PLENTY of time to tune the architecture before that high end Kepler arrives.[/quote<] I never said AMD couldn't do that; in fact, we already heard a rumor about it a month ago. There is no need to defend AMD; right now they have the performance crown. That's one of my points also: time. By the time GK 112 gets released, we will start reading leaked information about the HD 8000 series... nvidia is late again, but they were late last time too and still managed to do great financially, thanks to cards like the GTX 460 and the GTX 580, 570 and 560 Ti.

    • tbone8ty
    • 8 years ago

    AMD: “We excepted Nvidia to release Kepler…oh wait where is it?”

      • Arclight
      • 8 years ago

      Oh the smack talk has begun!

      • yogibbear
      • 8 years ago

      expected?

        • Mourmain
        • 8 years ago

        etcexped.

    • HighTech4US2
    • 8 years ago

    Why even have this poll? Charlie has already spoken, and he states the $299 Kepler GK104 will beat the AMD 7970.

    1. The short story is that Nvidia will win this round on just about every metric, some more than others. Nvidia wins, handily.

    [url<]http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner[/url<]

    2. The initial Kepler/GK104 cards will be priced around the $299 mark.

    [url<]http://semiaccurate.com/2012/01/23/exclusive-and-the-nvidia-keplergk104-price-is[/url<]

    So nVidia's mid-range part will beat AMD's high-end part. When the GK110 (nVidia's high-end part) comes this summer, AMD will have no answer for it.

      • pogsnet
      • 8 years ago
        • Silus
        • 8 years ago

        LOL, hilarious. When Charlie spews his usual BS about NVIDIA, the responses in here are “he’s right most of the time, so it’s probably true.”

        The first time the idiot says something good about an NVIDIA product, it’s “just a rumor and not a reality.” You AMD fanboys are truly pathetic.

        That said, Charlie’s rants are Charlie’s rants and should be ignored until actual products are launched, as usual. But it’s hilarious to see the double standards in action.

      • Palek
      • 8 years ago

      Hold on, weren’t you supposed to be a card-carrying member of the anti-Demerjian league? Suddenly he’s right because this time he says nice things about nVidia?

      Oh the humanity!

        • HighTech4US2
        • 8 years ago

        I’m just pointing out the gospel from your god stating that Nvidia wins hands down.

        If you want to become a hectic and go against belief in your god so be it.

        And as for being anti-Demerjian I still am because of his continuing agenda against Nvidia and his wishes for the destruction of the company. For him to post anything positive about Nvidia means that he can’t BS the facts away.

          • JMccovery
          • 8 years ago

          I think the word you’re looking for is ‘heretic’ not ‘hectic’.

          What is a ‘hectic’ (as in kind of person) anyways?

          • Palek
          • 8 years ago

          Just because in your world everything is black or white (or in this case, uh, lime green or, uh, dark green?), don’t assume that the rest of us think the same way. I have absolutely nothing against nVidia, and I actually stopped reading The Inquirer because I found Charlie’s style irritating.

      • Haserath
      • 8 years ago

      He never said the card would best the 7970 performance-wise. At $300, the GK104 looks more like it will best the 7970 in performance per dollar.

        • HisDivineOrder
        • 8 years ago

        Apparently, he said it on Twitter. 😉

        If an AMD fanboy as proud and card-carrying as SA is saying AMD’s lost this round, then I’d say it’s probably true.

      • can-a-tuna
      • 8 years ago

      Oh, now we start quoting Charlie’s rantings, eh? There is no way a $300 mid-range card can beat a $500 high-end card. This has never happened, even in HD 2900 XT or FX 5800 times. Crapvidia’s single-GPU high-end card will probably be faster, as it usually has been, but I think AMD’s dual-GPU card still takes the fastest-card title.

        • Silus
        • 8 years ago

        Oh, so he should only be quoted when he talks crap about NVIDIA and praises AMD’s products? The typical AMD fanboy double standard.

        He shouldn’t be quoted EVER. It’s actually interesting that when he talks trash about NVIDIA he appears on the front news page, but this particular rant of his (where for the first time he praises an NVIDIA product) never appeared on the front news page… coincidence? Don’t think so…

        Anyway, you’ll see very soon where GK104 stands against Tahiti. Will be interesting to see you spin that one!

      • clone
      • 8 years ago

      Nvidia lost this round a month ago.

      It’s embarrassing that you are pushing rumors of something that might happen in six months as proof of a victory today. The race is over, AMD took the trophy, everyone’s left the stadium, you are sitting in the bleachers all alone, it’s nighttime, the lights are out, and you are still waving the Nvidia flag.

      Apparently you’ll be doing this for the next six months or more, night after night, looking very silly in the process, depending on how late Nvidia is.

      very sad.

      • bwcbiz
      • 8 years ago

      Umm, you’re mixing two separate stories. The story about Kepler sample performance beating AMD gives no indication of whether the samples are high-end or mid-range. The story about the $299 cards makes no mention of performance benchmarks. Nothing in the stories implies that the high-performing samples map to the $299 silicon.

      Still, even if Kepler only gives a 10-20% boost over the 79xx series, that is a substantial performance boost from both companies over their current product lines. So no matter who wins the crown, we win.

      • Lans
      • 8 years ago

      Let’s see… ignoring the fact that you are taking rumors as facts, this still leaves a bit of room for debate:

      [quote<]Nvidia will win this round on just about every metric, some more than others[/quote<]

      Namely, [b<]even if everyone believes[/b<] the rumors, it is still valid to have options 2/4 (about the same/not enough to impress) and 3 (faster). Of course, it wouldn't make for a very good or useful poll to assume what people believe while formulating it...

      And I am not sure how many TR readers think Charlie is right 100% (or close to 100%) of the time. While I do believe he is right more than 50% of the time, I have never sat down to tally things up; I'd rather read his stuff for fun and judge each rumor on its merits. So let's do that...

      Can the GK104 be faster than the HD 7970? Sure. The HD 7970 is in some cases only around 25% faster than the GTX 570 (too lazy to find GTX 560* comparisons), and if Nvidia gets "proper generational increases," that would suggest the GK104 can be faster than the HD 7970.

      Will it be priced at $299? That part seems rather unlikely... at least unlikely to stick. The HD 7970 is faster than the GTX 580, so such a GK104 would be faster than the GTX 580, which sells for $470 (even the GTX 570 sells for more than $299). It doesn't seem like a terribly good idea to price the GK104 so low if it beats the HD 7970 selling for $560 and cannibalizes your own product line (at least I would think it would piss off a lot of retailers/partners who want to unload all the old parts first).

      And that is another long-winded way of saying why it is still useful to have this poll.

    • pogsnet
    • 8 years ago
      • HisDivineOrder
      • 8 years ago

      I heard that the successor after Kepler, codenamed Megaton, is being readied to launch 1 month after Kepler’s launch in June. nVidia is preparing the 999 series along with a cross promotion with Herman Cain called, “The Real 999 Plan.” It involves nVidia selling you a 999-based Megaton video card, a special early version of Resident Evil 6 with the 6 turned upside down, an Intel 39xx CPU, a copy of Herman Cain’s book and his autograph, and a specially edited version of the Hitler youtube video with your name in it and having you talk about how disappointing anything AMD is and how you can’t believe AMD would do this to you again.

      Remember, Megaton is coming one month after Kepler to spoil the AMD GPU that’s probably/maybe/probably not/maybe not/eh?/could show up 3rd or 4th quarter this year. Maybe. Or maybe first quarter next year. Or perhaps most of the line will show up 2Q like this year.

      But remember it because if you’re an AMD faithful, you’ve only got that to cling to for hope since even Charlie over at SA, as big an nVidia hater as there ever was, is saying AMD’s toast this round. Not hard to believe, really, when 7970 has so little to put it ahead of a 580 and Kepler’s sure to be tons faster than a 580…

      It’s almost enough to make you think AMD’s fleecing customers currently at those $550 prices. I guess we’ll find out in April if nVidia’s going to release a $300 card that destroys AMD’s $550 card. If so, we’ll know the answer to the question, “Is AMD fleecing its customers?” Because they’ll be dropping their card’s price down to match…

    • luisnhamue
    • 8 years ago

    Of course it will be faster than the 7970. After some math, I realized that AMD just managed to catch up with Fermi in terms of tessellation… and actually the 7970 is 10~15% faster in gaming. So if Kepler keeps Nvidia’s trend of performance leaps with each generation, the game will be the same as we saw last generation.

    The 7970 will be comparable to the cut-down version of Kepler.

    • guardianl
    • 8 years ago

    The only educated answer I can think of is this:

    Tahiti XT (7970) = 378 mm2
    Cayman (6970) = 389 mm2

    And approx. a 50% performance increase between the two…

    GF110 (580) = 520 mm2
    Kepler = ?? mm2 (but ~500 follows the trend)

    So let’s say a 50% performance increase between the two… except NVIDIA’s track record producing on new processes is a little suspect (producing these huge chips would be a problem for anyone, I’m sure), so they’ll likely have to disable some shader clusters and clock it lower than ideal, like the 480.

    Thus, ~25% increase over the 580. So, I’m calling it now: Kepler’s high-end part at launch will be about 10-15% faster than the 7970, but use closer to 300W.
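    The back-of-envelope above can be written out explicitly. This is a sketch only: the die sizes are the comment's own figures, and both the 50% generational gain and the 50% derate for disabled clusters/lower clocks are the commenter's guesses, not measurements.

```python
# Back-of-envelope estimate from the comment above. All inputs are the
# comment's own rumors and guesses, NOT confirmed specs.
tahiti_mm2, cayman_mm2 = 378, 389   # die sizes quoted in the comment
gf110_mm2, kepler_mm2 = 520, 500    # Kepler's size is a guess (~500 mm^2)

nv_ideal_gain = 0.50   # assume Nvidia gets the same ~50% gen-on-gen gain AMD did
derate = 0.50          # halve it for disabled clusters / lower clocks (a guess)

kepler_vs_580 = nv_ideal_gain * derate   # ~25% over the GTX 580
print(f"estimated Kepler gain over GTX 580: +{kepler_vs_580:.0%}")
```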

      • pogsnet
      • 8 years ago
      • BestJinjo
      • 8 years ago

      HD7970 is only 22% faster than GTX580. (http://www.techspot.com/review/481-amd-radeon-7970/page13.html)

      If High-end Kepler is 50% faster than GTX580, then:

      GTX580 (100%) —> HD7970 (122%) —> Kepler (150%)

      Therefore, Kepler would be 23% faster than HD7970.

      Bringing up the HD6970’s performance is irrelevant, since we only need to look at the GTX580 as the base = 100%.

      Either way, I think AMD’s AIBs will soon bring out much faster-clocked HD7970 cards. The HD7970 can overclock to 1175-1300MHz, so really, the 925MHz reference-clocked HD7970 won’t be relevant by the time the highest-end Kepler launches. I am sure AMD will respin the chip to make sure it’s competitive. It doesn’t look like NV’s high-end Kepler chip will launch before Ivy Bridge. That gives AMD plenty of room to do a second respin of the HD7970 (à la HD4870 –> HD4890), or get AIBs to hurry up with factory-overclocked cards.
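      The relative-performance arithmetic above can be double-checked with a quick sketch. The 22% figure is the benchmark claim cited in the comment and the 50% figure is a rumor, so both inputs are assumptions, not confirmed numbers:

```python
# Relative-performance arithmetic from the comment above, using the
# GTX 580 as the 100% baseline. The 22% and 50% figures are the
# comment's benchmark claim and rumor, respectively -- not confirmed.
gtx580 = 1.00
hd7970 = gtx580 * 1.22   # HD 7970 ~22% faster than the GTX 580
kepler = gtx580 * 1.50   # rumored high-end Kepler ~50% faster than the GTX 580

# Kepler's lead over the HD 7970 is the ratio of the two, not the
# difference of the percentages (150 - 122 = 28 points, but only ~23%).
lead = kepler / hd7970 - 1
print(f"Kepler vs HD 7970: +{lead:.0%}")
```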

    • Joerdgs
    • 8 years ago

    It better compete so it brings the damn prices down.

    • TaBoVilla
    • 8 years ago

    we are still in the console era, yawn

    • alwayssts
    • 8 years ago

    The question seems ambiguous, but…

    1. AMD came to market first; the 7970 is a known quantity, and nvidia will clock their part to beat it at 1920. Not a difficult task for under 225W.

    2. Many parts of the 7970 seem unbalanced unless gaming above 1080p and/or using a ton of compute/DX11/shader-heavy features… bandwidth and bus/buffer, for instance. The core clock and average power consumption are incredibly low at stock… yet it requires 300W of power connectors… on and on and on.

    Will nvidia deliver a more efficient design for 1080p? Probably. Will Pitcairn do the same and be proportionally cheaper, considering it will be a 24-ROP design? Probably. Will a faster-clocked Tahiti compete with whatever GK100 is at higher res while being cheaper? Probably.

    There isn’t a yes or no to the question. It is about markets, and from there, questioning both the stock SKU and the design (which factors in efficiency: power consumption and scaling while overclocking) in those markets.

    For instance, if Pitcairn and GK104 are both similarly efficient at 1080p and use similar power, but Pitcairn has fewer units and a greater clock and GK104 the reverse, which is the better product? GK104, because it will have greater absolute performance overclocked (and therefore greater absolute power consumption), or Pitcairn, because it would be a smaller die with less absolute performance/power consumption but cheaper?

    Is there a right answer?

    That said, all signs point to nvidia making a wise decision on the placement of GK104, so I voted faster. Point being… semantics and arbitrary truths.

    • Alexko
    • 8 years ago

    That’s a pretty strangely worded poll, given that Kepler is an architecture family, not a chip. But if you meant “Will there be a Kepler chip that’s faster than the HD 7970?” then my answer would be “Yes, most likely”.

    • Krogoth
    • 8 years ago

    The high-end Kepler will leap-frog the 7970 like the 580 leap-frogged the 5870/6970 from their throne.

    The interesting battle will be the mid-range where we might see some price wars going on.

    Otherwise, this generation is more evolution than revolution. Physics is making it difficult for the semiconductor guys to make massive boosts in transistor count and performance. The days of rapid performance boosts that come with every new generation are long over.

      • flip-mode
      • 8 years ago

      [quote<]Physics is making it difficult for the semiconductor guys to make massive boosts in transistor count and performance.[/quote<]

      Man, you were saying that billions of transistors ago.

      [quote<]The days of rapid performance boosts that come with every new generation are long over.[/quote<]

      What? ... What?... Like... what?

      Please see: 7800 GTX > 8800 GTX > GTX 280 > GTX 480. Please also see HD 2900 > HD 4870 > HD 5870 > HD 7970. These are current events, Krogoth. These gains are happening now and are not stories from the good old days.

      Ugh.

        • I.S.T.
        • 8 years ago

        You skipped a generation with the HD 6970. In addition, most of those jumps weren’t [i<]that[/i<] huge. Only the 7800 GTX > 8800 GTX and HD 2900 > HD 4870 jumps were gigantic, and those jumps were only so huge because the previous-gen cards were so screwed up.

          • Squeazle
          • 8 years ago

          The 6970 was a small enough jump that you could ignore that the 6000 series ever came out. Or it will be, once the 7000s permeate the market.

          • Silus
          • 8 years ago

          What? The 7800 GTX screwed up? You need to let me know what you’re smoking… it was first to market with a six-month advantage over ATI, had a huge performance boost over the 6800s, and brought new features. The performance gap over the 7800 GTX was huge because the 8800 GTX was a totally different architecture (non-unified vs. unified).

          With unified architectures, the performance jump between generations will be as predictable as it was between non-unified architectures (for graphics; this changes a bit when talking about compute features). Only if a new paradigm in graphics architectures arrives, totally different from non-unified/unified, will we see another massive jump in performance, as we saw from the 7800 GTX to the 8800 GTX.

        • Bauxite
        • 8 years ago

        Yep, the march has been steady.

        Go [url=http://i.top500.org/sublist<]here[/url<] and plug in June 1997 (#1 was a big step up back then). Then go [url=http://www.amd.com/us/products/desktop/graphics/7000/7970/Pages/radeon-7970.aspx<]here[/url<].

        IMO the best equivalent to Rmax is the double-precision compute. Look at the progression over time; there is no magical slowdown anywhere.

        • l33t-g4m3r
        • 8 years ago

        I think physics is starting to make a small impact now, with the TSMC problems, but IMO it’s the magnitude and complexity of these chips, along with driver and software problems. I was playing Batman: AC earlier and PhysX was causing a boss fight to go literally slo-mo, so I turned it off. Frame drops are one thing; slo-mo is a bug. Yet another reason PhysX should go open source.

        I also like the idea of a price war, since I don’t feel like paying $300 for an incremental speed bump. BF3 at 60 fps @ 1080p for $200: that’s the goal.

        Another point: Nvidia needs to up their QC, because too many defective or partially defective chips are slipping through and causing all sorts of problems. I think the TDR issues may be quite similar to Creative’s SCP issues. Basically, low-quality cards are being sold on the market. GPU-Z says my ASIC quality is 64.3%. I don’t really like that number.

        • Krogoth
        • 8 years ago

        Have you bothered to check the jumps in performance, features, and transistor budget in GPUs from 2001-2006 and compared them to 2007-2011? The numbers speak for themselves.

        It has been slowing down since 2007, and there hasn’t been much improvement in performance, features, or transistor count since the debut of Cypress and Fermi. I could even stretch it as far back as the GT2xx and RV7xx families.

        The only things that have been going up over the years are loaded power consumption and the need for larger HSF solutions.

          • Silus
          • 8 years ago

          I’m sorry Krogoth, I know you’re famous and all..but what is depute ? Maybe you meant debut ?

          • flip-mode
          • 8 years ago

          You’re too cracked to bother with.

      • FuturePastNow
      • 8 years ago

      Yep. I think the top-end Kepler will be about as much faster than the 7970 as the 580 was over the 6970. That’s enough to take the performance lead, but AMD has a lot more room to cut prices than Nvidia will. They should be making a huge profit on 7970s at $550; reducing that to a merely large profit will help them more than it will hurt them.

        • zzz
        • 8 years ago

        Agreed. I’m not a fan of ATI cards, but if that department is keeping AMD afloat, I’m happy to see them get some consumer love. Nvidia will produce a faster card at the expense of heat and power consumption. In reality, the people who actually buy these cards don’t care about those factors, since they’re premium cards and the only real metric is frames per second. In terms of price brackets, the usual price/performance battle will be a daily exchange in all but the top end.

        On topic: aside from the highest end, the newest ATI cards are pretty much rebrands of their old stuff. I really wonder about performance when talking about mature drivers vs. a modular design making driver development easier.

          • flip-mode
          • 8 years ago

          [quote<]Aside from the highest end, the newest ATI cards are pretty much rebrands of their old stuff.[/quote<]

          Not true, unless by "highest end" you mean high-enthusiast-mainstream. There will be 3 GCN chips that will result in *at least* 7 GCN cards, and maybe a few more:

          Tahiti (7970, 7950, and 7990, possibly 7890)
          Pitcairn (7870, 7850 - possibly 7790 or 7830)
          Cape Verde (7770, 7750)

          The 7770 is rumored to be a $150 product, so we are very, very far down from what anyone would call the "highest end". All the rebranding happens only at the lowest end, really, and everything is pretty meaningless down there anyway.

      • ImSpartacus
      • 8 years ago

      I agree, it’s really not going to be a very impressive generation. Maybe we’ll see something better next year.

      • LiquidSpace
      • 8 years ago

      What are you talking about, retard? The GTX580 is not faster than the 6970 in any way.
      The only time the GTX580 beats the 6970 is when AA is applied, and there’s a good reason for that, whether your rusty brain believes it or not: AMD uses MSAA as the default, while Nvidia dumped MSAA years ago with the 8000 series, and Nvidia now uses a watered-down version of AA so that the hit is less than 20%, while in AMD’s case it’s almost a 50% drop in fps with MSAA.
      One more thing: ATI’s quality is always better than Nvidia’s in every aspect. I read an article years ago, back in 2008, comparing quality in everything from AA to AF to transparent texturing to depth of field between Nvidia’s 8000 series and ATI’s old 1000 series, and guess what: the 1000 series, although some 2.5 times slower than the 8000 series, was better in every single qualitative aspect than the 8000 series with its broken “DX 10” back then.
      Nvidia even pushed Ubisoft to pull DX10.1 support from Assassin’s Creed 1 because it was performing much better on AMD cards and with better quality.

        • Airmantharp
        • 8 years ago

        I’m pretty sure benchmarks and screenshot comparisons the Internet over disagree with you (as do I).

          • LiquidSpace
          • 8 years ago

          No, they don’t. Without AA, the GTX580 is on par with the 6970 in most cases; in BF3 maxed out without AA at 1920×1080, the 6970 gets 70 FPS average and the GTX580 gets 71.5 FPS average.

            • Airmantharp
            • 8 years ago

            At 1080p, most games are CPU-limited when playing with either card, as Krogoth suggests below.

            Still, the GTX580 is faster and has more hardware resources available than the HD6970. And again, as Krogoth states, it has held the higher price to prove it.

        • Krogoth
        • 8 years ago

        What have you been reading?

        The 580 has always been faster than the 6970. The only time they are head to head, it is for another reason: they are CPU-limited in the given bench. It is also why the 580 has always sold at a higher price point than the 6970.

        In GPGPU- and tessellation-related benchmarks, the 6970 isn’t even remotely close to the 580.

          • LiquidSpace
          • 8 years ago

          No, it’s not CPU-limited, considering all these benchmarks are done on a 2600K or even the six-core 980X, most of the time overclocked, and these cards have 2+ gigs of GDDR5 RAM, which is more than enough up to 2560×1600.
          Secondly, the 6970 is not oriented towards GPGPU, though it has some GPGPU functionality, and honestly no one cares about GPGPU until we see some huge advancement in the GPGPU arena, maybe until we can run a whole OS on a GPU.
          Third, Nvidia always sells their cards at much higher prices: the 280 was on par with the 4870 and the 285 on par with the 4890, and Nvidia still sold those cards at $100+ over the competition. Nothing new here.
          About tessellation: Nvidia cards have higher tessellation output because that’s how they are designed, while their texture output is much lower. That’s why Nvidia does better than AMD in HAWX; the game was made with awful textures (actually, with everything awful) and a lot of tessellation around and inside the jet. It was practically a demo for Nvidia.
          On the other hand, AMD has a good balance between tessellation and textures, so they can output good fps in most cases.

          Ignorance must be bliss for you.

          • swaaye
          • 8 years ago

          Apparently AMD is totally amazing for Bitcoin, if that’s your thing. 😉 Magical integer performance for the cryptography processing there.

      • ElMoIsEviL
      • 8 years ago

      Oh wow… I feel like I just listened to Rush Limbaugh… brain hurts…

      Ok… your “sports commentary” intellect level aside… how do you define “leap-frogged”? Because I haven’t seen a GTX 580 “leap-frog” much of anything. I’ve seen it “faster” than a Radeon HD 6970, but not “leap-frogging” it.

      When set in CrossFireX/SLI mode, the 6970s compete rather well across the board.

      But that doesn’t really mean much. The 6970 was built on the VLIW4 architecture, a tweak of ATI’s 2007 VLIW5 architecture found in the Radeon HD 2900 XT. In other words, AMD has been able to use an aging architecture to compete with a brand-spanking-new “Fermi” SIMD architecture. That’s pretty impressive on its own.

      Why? Because the VLIW4/5 architectures require an efficient software compiler to ensure intelligent management of resources. AMD’s VLIW4/5 architectures do not handle instruction scheduling themselves; your system’s CPU does. So AMD’s GPUs have been harder on CPU usage. Run F@H for shits and giggles, or any PC game, and compare the CPU usage.

      Tahiti XT is based on GCN, which is a SIMD (MIMD) architecture. AMD has yet to release the full driver for it; I’m still using the beta launch driver.

      GCN is a revolution for AMD (not an evolution). You don’t realize it because you’re likely not that knowledgeable. I don’t mean to insult, but you’re probably just a “gamer” who is mostly concerned with “frames per second”. Well, gamers have a right to be disappointed with Tahiti (7970), because it contains nearly the same ROP and TMU arrangement (with tweaks) as the 6970.

      But that’s not why AMD released Tahiti. It is a compute monster. It destroys anything on the market today in that respect, but it is still limited in games by a subpar hardware rasterizer (compared to what nVIDIA offers) and far fewer ROPs (stuck at 32 still).

      And that’s that.

        • Fighterpilot
        • 8 years ago

        Very nicely reasoned post there, Elmols.
        NVidia’s high-end chip will defeat the HD7970 if it…
        1. Is noticeably faster in FPS across a wide variety of current and recent games.
        2. Has superior compute performance, i.e. Bitcoin hashing, password DB cracking, etc.
        3. Runs cooler at full blast.
        4. Uses less power at idle and full throttle.
        5. Is noticeably quieter while achieving the above points.
        6. Is cheaper by more than $20.
        7. Overclocks like hell and doesn’t die easily from cheap power circuitry.
        8. Arrives sometime in the next 2-3 months (otherwise it’s just plain late again).
        9. Includes a dedicated hardware decoder for video and delivers stereo audio similar to the 7970.
        10. …need I go on?

        Otherwise it’s just a vaporware dream for fanbois like Silus and Hightech to spam this (and other) websites with.

    • I.S.T.
    • 8 years ago

    Love the Krogoth meme.

      • can-a-tuna
      • 8 years ago

      I voted for that option because it sounded funny. I didn’t know he’s an nvidia douche.

      • WillBach
      • 8 years ago

      Yeah, it’s really catching on. I think some guy made a Krogoth account, but if he hasn’t I really should 😀

        • tfp
        • 8 years ago

        Yeah I think some guy made a Krogoth account years ago

      • ImSpartacus
      • 8 years ago

      I actually wasn’t impressed with it.

    • LiquidSpace
    • 8 years ago

    I want a card that will give me a minimum of 60 FPS in BF3 maxed out at 1080p or higher.

    • Geistbar
    • 8 years ago

    I don’t think these poll options will impress Krogoth.

    • Kharnellius
    • 8 years ago

    “How will Kepler far against the Radeon HD 7970?”

    Did you mean:
    “How will Kepler fare against the Radeon HD 7970?

      • BobbinThreadbare
      • 8 years ago

      Has anyone really been far even as decided to use even go want to do look more like?

        • derFunkenstein
        • 8 years ago

        +1 for making my brain ache.

        • Chandalen
        • 8 years ago

        β€œYou’ve got to be kidding me. I’ve been further even more decided to use even go need to do look more as anyone can. Can you really be far even as decided half as much to use go wish for that? My guess is that when one really been far even as decided once to use even go want, it is then that he has really been far even as decided to use even go want to do look more like. It’s just common sense.”

    • JoshMST
    • 8 years ago

    And we will all be amazed when it comes out as smaller, cooler, and faster than the 7970.

      • chuckula
      • 8 years ago

      I just downthumbed you even though I think there’s a good chance that it actually [b<]will[/b<] be all those things...

      • xeridea
      • 8 years ago

      Doubtful. It will be faster for sure, but the die will likely be huge like Fermi’s, and it will run hotter because of this; also, Nvidia tends to favor silence over cool temperatures.

        • NeelyCam
        • 8 years ago

        [url<]http://semiaccurate.com/2012/01/23/exclusive-and-the-nvidia-keplergk104-price-is/[/url<] [quote<]"Kepler/GK104 cards will be priced around the $299 mark. This should tell you quite a bit about how large the silicon is, but not necessarily what it will be marketed as."[/quote<]

    • yammerpickle2
    • 8 years ago

    Faster, but I voted for insufficient to impress Krogoth. Sure it will be faster, but as other posts ask, by how much? We should also be asking how much more it will cost, and how much more power it will consume to produce how much more heat. I’ve got two GTX 580s in this rig, and my wife’s has a single GTX 580, so I’m not an AMD fanboy. I can tell you from experience that my GTX 580s are like having a hair dryer set on max blowing on you. I never have to run any heat to the office, but in summer I run the AC like a madman to cool that room.

    • indeego
    • 8 years ago

    Krogothmehs can now replace frametime in all benchmarks.

      • 5150
      • 8 years ago

      I gotta go Krogoth. I couldn’t give a damn about top-end $500+ video cards.

        • gmskking
        • 8 years ago

        Yeah, there is no way I would spend that much on a video card that will be old in a year. It makes no sense. $250 is about my max. Then again, I do not game on my PC either, and a good reason for that is $500 video cards.

        • BobbinThreadbare
        • 8 years ago

        It’s still cool to see what will trickle down to the reasonable price range.

      • UberGerbil
      • 8 years ago

      So this is KrogothMeh-trics?
      (sorry)

        • ImSpartacus
        • 8 years ago

        [i<]*wipes tear*[/i<] Krogoth would be impressed.

    • allCarnage
    • 8 years ago

    Really? No one is going to point out the spelling error in the title? When did i become the spelling Nazi?

    Then again, without spell check i would have a half dozen spelling errors in just this post…

    =D

      • gmskking
      • 8 years ago

      I was looking to see if someone else caught it. I think most people do not care anymore; they already know what is meant. 🙂

      • PrincipalSkinner
      • 8 years ago

      I noticed. It just hurts my eyes; I must close one to be able to look at it.

      • indeego
      • 8 years ago

      I figured they left it because it would break their CMS/poll software or something.

      • Arbie
      • 8 years ago

      allCarnage – it’s “half-dozen”.

      • SomeOtherGeek
      • 8 years ago

      No, we were waiting for Meadows to take care of it for us.

      • BiffStroganoffsky
      • 8 years ago

      It is more fun to yank Cyril and Ronald’s chains. Geoff is too laid back to care…as the title still shows.

      • christopher3393
      • 8 years ago

      not corrected yet…don’t know if I’ll sleep.

      • MadManOriginal
      • 8 years ago

      You do have 2 errors anyway, not sure whether this counts as spelling or grammar, but ‘I’ is supposed to be capitalized.

        • allCarnage
        • 8 years ago

        I hate the capital i rule, it always feels unnatural to me… (lol or should that be un-natural? damn hyphens are annoying too!!)

    • TurtlePerson2
    • 8 years ago

    I would assume that it will be a larger chip, consume more power, and perform slightly better in the benchmarks than the 7970. That’s how it’s been for the last few generations of chips and I don’t expect it to change.

      • SomeOtherGeek
      • 8 years ago

      Therefore, Krogoth is not impressed.

    • chuckula
    • 8 years ago

    I’m torn between faster than 7970 and Insufficient to Impress Krogoth!!

    Seriously though, I don’t think there’s much doubt that Kepler will beat the 7970. The real questions are:
    1. By how much will it beat the 7970?
    2. By “beat” do you really mean games, GPGPU computing, or both (good chance GPGPU gets a bigger boost than games)?
    3. Will fast Kepler cards really be available in April, or will it take longer?
    4. Is the 7970 it from AMD for the next 6 months or is there a much badassier chip waiting to strike?

    P.S. –> I can’t actually register a vote, is the poll not turned on yet?

      • khands
      • 8 years ago

      For 4, we can pretty much assume that there’s a dual-GPU card down the line. Also, I wouldn’t be surprised if there were a 7980 or something that is basically an OC’d 7970, should Nvidia bring the competition.

        • Game_boy
        • 8 years ago

        There were some claims that the 7970 had lots of headroom and operated just fine above 1GHz. And you’re right on the dual-GPU card, because there’s no 300W TDP 7xxx yet.

        • glynor
        • 8 years ago

        I agree about the dual GPU card.

        I also think it is worth pointing out that AMD likely has significant pricing power with this since it is such a small piece of silicon, and their yields should be nicely up by the time Kepler does launch. They’re just making up for a flailing CPU division right now (and waiting for yields).

        I wouldn’t be shocked to see the 7980 drop to $399 or lower within days of Kepler’s NDA expiring.

    • maroon1
    • 8 years ago

    Even Charlie Demerjian has confessed that Kepler is going to win

    [url<]http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/[/url<]

      • Game_boy
      • 8 years ago

      WHAT

        • maxxcool
        • 8 years ago

        Yeah, I snorked my soda when I read that…

      • cegras
      • 8 years ago

      Oddly enough, I think his damning of GCN is rather foreboding.

      • tviceman
      • 8 years ago

      He also admitted in those forums that GK104’s top video card won’t be faster than the HD 7970; that will have to wait until Nvidia gets its high-end Kepler chip out (GK110 or GK112, depending on which rumors you want to believe). But he did, in a roundabout way, acknowledge that Nvidia has massively improved both performance per watt and performance per mm^2, to the point that GK104 will be more efficient at gaming than Tahiti.

      • Arbie
      • 8 years ago

      [Sorry if this is a double post]

      Charlie didn’t “admit” or “confess” it. Using such wording is simply unfair to him. He announced it, just like you would any credible piece of tech news. And AFAIK he announced it before anyone else. And he’s probably right, because he seems to get good information and make good use of it.

    • Farting Bob
    • 8 years ago

    Faster, but less power efficient and more expensive. In line with price/performance going down the lineup.

      • ew
      • 8 years ago

      Exactly!

      • Firestarter
      • 8 years ago

      Probably faster and proportionally more expensive. I guess power efficiency will be comparable though, considering GCN is less efficient than VLIW4/5. Matters it does not, buying HD7950 anyway I will.

        • ImSpartacus
        • 8 years ago

        No, faster and DISproportionally more expensive.

      • HighTech4US2
      • 8 years ago

      Charlie: The short story is that Nvidia will win this round on just about every metric, some more than others. Nvidia wins, handily.

      Please read that sentence again.
