Nvidia’s “big Maxwell” GPU appears online

Pictures of what seems to be a "big Maxwell" reference card have appeared online. Videocardz posted the shots, which show an Nvidia-branded circuit board with a GPU labeled GM200-400-A1. The chip looks larger than the GM204 GPU in the GeForce GTX 970 and 980, and it's accompanied by a generous 12GB of Hynix RAM rated for transfer rates up to 7 GT/s.

Videocardz suspects the card is the same as the Quadro M6000 that popped up in the GPU-Z database last month. Although the specs haven't been confirmed, the GPU utility reported 3072 shader processors, 96 ROPs, and a 384-bit memory interface—a 50% increase over the GM204 on all fronts. The 988MHz GPU clock is lower than the base and burst frequencies for the GTX 980, but that figure could climb in the final product.
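
Those ratios and the implied throughput are easy to sanity-check from the leaked figures. Here is a rough back-of-the-envelope sketch in Python, assuming the usual two FLOPs per shader per clock and the 7 GT/s memory rating from the leak; none of these numbers are confirmed.

```python
# Back-of-the-envelope math for the leaked GM200 figures quoted above.
# Clock and memory speed come from the leak/GPU-Z entry; treat the
# results as rough estimates, not confirmed specs.

gm204 = {"shaders": 2048, "rops": 64, "bus_bits": 256}   # GTX 980 (GM204)
gm200 = {"shaders": 3072, "rops": 96, "bus_bits": 384}   # leaked GM200 entry

for key in gm204:
    print(key, gm200[key] / gm204[key])                  # 1.5x across the board

mem_rate_gtps = 7.0                                      # 7 GT/s GDDR5, per the leak
bandwidth_gbps = gm200["bus_bits"] / 8 * mem_rate_gtps
print(f"memory bandwidth ~{bandwidth_gbps:.0f} GB/s")    # ~336 GB/s

clock_ghz = 0.988                                        # GPU-Z reported clock
sp_tflops = gm200["shaders"] * 2 * clock_ghz / 1000      # 2 FLOPs per shader per clock (FMA)
print(f"peak SP ~{sp_tflops:.1f} TFLOPS")                # ~6.1 TFLOPS at 988MHz
```

In other words, the leaked numbers are internally consistent with a straight 1.5x scale-up of GM204.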

Quadro cards are aimed at workstations, but they also provide a foundation for the ultra-high-end Titan series, which is due for a Maxwell refresh. The Kepler-based GeForce GTX Titan Black debuted last February, and its predecessor came out one year earlier.

An impending launch would also jibe with the GM200 timeline published in a research paper affiliated with Nvidia's Los Angeles operations. That paper projected the GPU's arrival for the end of 2014. The leaked shots are dated December 27, and the GPU-Z entry was submitted December 24.

If the next-gen Titan does arrive next month, expect it to be very fast—and very expensive. The Titan Black sells for nearly $1300 right now.

Comments closed
    • ronch
    • 6 years ago

    So maybe it’s time AMD did something to make us feel they’re not stuck with technology largely based on the HD 7000 series and rebrands, plus a few little things like TrueAudio tacked on?

    I plan to upgrade my graphics card in 2015 or so, so here’s hoping AMD will have a good lineup soon, otherwise it’s hello Nvidia for me.

    • internetsandman
    • 6 years ago

    Oh man I’m glad I haven’t upgraded yet
    I can save up for a bit longer and properly splurge on a maxwell Titan

    • BlackDove
    • 6 years ago

    I was excited by the next big chip, but this actually sounds hugely disappointing. GK110 is about 3.5x as powerful as GF110. This was largely because it was a new architecture and a new 28nm process.

    This sounds like GM200 will be 28nm planar just like GM204. The GK104 had more than twice the SP performance of GF110.

    I guess that’s why the 900 series and whatever they call this haven’t been nearly as hyped as the 600 and 700 series were.

    I expect GPUs to stagnate until 2016 when GP100 comes out with stacked DRAM and 16nm.

    A lot of people I know are picking up 780s and 780 Tis really cheap, at least.

      • geekl33tgamer
      • 6 years ago

      Best bang for the buck right now must be the discounted R9 290 series? They are under £200 here shipped. Its closest “green” rival is the GTX 970, and that comes in at a £250 starting price…

        • Pwnstar
        • 6 years ago

        I picked up a 290x for $280. Final price will be $260 after I sell the game that came with it.

      • MathMan
      • 6 years ago

      If you think a GK110 is 3.5x of a GF110, then the source of your disappointment is yourself.

      A GK110 is only a bit more than 2x faster.

        • BlackDove
        • 6 years ago

        Full GK110 is 5.2 TFLOPS. Full GF110 is 1.5 TFLOPS.

        2×1.5 is not 5.2 is it?
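
        (For reference, both of those peak figures fall straight out of shader count × clock × 2 FLOPs per clock; the argument here is about whether peak FLOPS translates into delivered performance. A quick sketch — the reference clocks below are my own approximations, not figures from the thread:)

        ```python
        # Peak single-precision FLOPS = shaders x 2 FLOPs (fused multiply-add) x clock.
        # Clocks are approximate reference values for the fully enabled chips.

        def peak_sp_tflops(shaders, clock_ghz):
            return shaders * 2 * clock_ghz / 1000

        gf110 = peak_sp_tflops(512, 1.544)    # GTX 580: 512 cores at the 1544MHz hot clock -> ~1.6
        gk110 = peak_sp_tflops(2880, 0.889)   # Titan Black: 2880 cores at ~889MHz base -> ~5.1

        print(f"GF110 ~{gf110:.1f} TFLOPS, GK110 ~{gk110:.1f} TFLOPS, ~{gk110 / gf110:.1f}x in raw FLOPS")
        ```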

          • MathMan
          • 6 years ago

          Ah, you’re one of those who judges GPU performance by pure Flops. Carry on then.

          (By your reasoning, AMD would have murdered the Fermi generation, but one way or the other they didn’t.)

            • BlackDove
            • 6 years ago

            I’m one of those people who realizes that people don’t just play games with GPUs. Hyper-Q and dynamic parallelism, in addition to a process shrink and a new SM design, did give much more than 2x the performance in a lot of applications.

            GK110 did turn its FLOPS into real world performance unlike AMD.

            My point, which you obviously missed, is that it’s disappointing that GM200 is 28nm planar and not 20nm FinFET, and that we won’t see a jump in performance like we did from GF to GK until GP100, which will be 16nm and have HMC/DDR4-based memory on a silicon interposer.

            AMD is trying something like that, but it’s got a 300W TDP, so it probably won’t be able to compete with GP100 or Knights Landing.

            Carry on lol.

            • MathMan
            • 6 years ago

            What’s extra funny is that, for GPU compute, the Kepler architecture is significantly worse than Fermi for many workloads.

            But, hey, if you can find the unicorn application where it’s 3.5x faster: knock yourself out!

            • MathMan
            • 6 years ago

            As I said, it’s very well known that it is quite a bit harder to extract compute FLOPS out of Kepler compared to Fermi.

            How much?
            Well, here’s a presentation that explores just that:
            [url<]https://www.petaqcd.org/IMG/pdf/Review_Junjie_LAI.pdf[/url<]

            It analyzes the performance of CUBLAS, among the easiest class of GPU programs to optimize for maximum FLOPS. It finds that Nvidia's library achieves 70% of peak for Fermi vs only 42% for Kepler. It then does an analysis of the theoretical maximum performance, taking into account the known warts of the architectures, and ends up with 82% vs 57%.

            If you look at tons of other benchmarks, this shouldn't come as a surprise. They were pretty mediocre for Kepler across the board. Maxwell is a significant improvement on this.

            (I still would love to see one, just one, example of the 3.5x!)
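
            (Putting those efficiency figures against the peak numbers quoted earlier in this thread makes the point concrete; a rough sketch, with the peaks taken from the post above and the efficiencies from the linked presentation:)

            ```python
            # Delivered GEMM throughput is roughly peak FLOPS x achievable efficiency.
            # Peak figures are the ones quoted above in the thread; efficiencies are
            # the CUBLAS-vs-peak numbers cited from the presentation.

            fermi_peak, kepler_peak = 1.5, 5.2      # TFLOPS (GF110, GK110, per the thread)
            fermi_eff, kepler_eff = 0.70, 0.42      # fraction of peak achieved by CUBLAS

            fermi_real = fermi_peak * fermi_eff     # ~1.05 TFLOPS delivered
            kepler_real = kepler_peak * kepler_eff  # ~2.2 TFLOPS delivered

            print(f"delivered ratio ~{kepler_real / fermi_real:.1f}x")  # ~2x, not 3.5x
            ```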

            • BlackDove
            • 6 years ago

            Your link doesn’t work for me, but DGEMM efficiency with GK110 is about 70% while GF110 was 60%. It was easier to turn GK110’s FLOPS into performance than GF110’s if you made use of things like Dynamic Parallelism, Hyper-Q and GPUDirect. Not all code does.

            The 780s had their DP performance nerfed worse than the 580s, which is the only case I can think of where GF110 worked better than GK110. I’m only talking about fully enabled GK110s and GF110s in Quadros, K20s, K40s and Titans. GK210 improved on that.

            • VincentHanna
            • 6 years ago

            wait, your prediction is that Nvidia is going from 28nm in 2015 to 16 nm in 2016? wut?

            you high bro?

            • BlackDove
            • 6 years ago

            That’s what they have to do. TSMC can’t produce GM200 on 20nm FinFET from what I’ve read.

      • Pwnstar
      • 6 years ago

      What do you expect nVidia to do? The new processes aren’t ready until November at the earliest and the current 20nm process isn’t suitable for high performance GPUs.

    • LoneWolf15
    • 6 years ago

    As much as I like to see nVidia having a design win…

    Maxwell owners (myself included) are having plenty of GPGPU issues with Folding@Home, issues nVidia has known about for some time now. Bugs in their OpenCL implementation in the driver mean that 970/980 owners are having the client fail, or deliver really substandard performance.

    Still waiting to see this resolved. I’d love to consider bumping up from a 970 to a 980, but this has given me pause; I might wait to see what the Radeon 3xx series puts out. In comparison, my Radeon R9 280x cards might have run hotter, and taken more power, but they never failed in Folding.

      • xeridea
      • 6 years ago

      Nvidia doesn’t care about OpenCL because they want people to use CUDA. It’s sad that there are driver issues; I know historically Nvidia has been a lot better for folding than AMD cards (hardware technicalities; AMD was way better at cryptocurrency). I remember doing folding, though the last time I did it was on an HD 4850.

    • Meadows
    • 6 years ago

    Pardon me, how much RAM again?

      • Pwnstar
      • 6 years ago

      It’s a professional card. They need lots of RAM.

        • Meadows
        • 6 years ago

        You don’t know what it is. It’s unreleased.

          • Pwnstar
          • 6 years ago

          Are you suggesting nVidia’s next consumer card will have 12GB of RAM?

          That’s what makes me think this is a professional card, not consumer.

        • VincentHanna
        • 6 years ago

        with the way games are going, 12GB of ram might not be that out of the ordinary in a couple of years.

    • TwoEars
    • 6 years ago

    I’m excited but the price has to be reasonable…

    You can easily get 970 SLI for $700 these days and that’s a *heck of a lot of performance* for the money. Most likely more performance than a Titan II will offer. And SLI works great these days, no reason to be afraid of SLI. My 670 SLI setup has been working great.

    I could personally maybe stretch to $999 for a Titan II but any more than that and I’ll definitely buy 970SLI instead.

      • Meadows
      • 6 years ago

      It’ll be reasonable. Probably only a little over $1000.

        • Prestige Worldwide
        • 6 years ago

        $1000 for a GPU is not reasonable.

          • TwoEars
          • 6 years ago

          I love you too.

          • Meadows
          • 6 years ago

          I approached the issue with his level of seriousness.

          • kuraegomon
          • 6 years ago

          To you. That’s an almost entirely subjective value judgement.

            • VincentHanna
            • 6 years ago

            You could say the same thing if Nvidia asked for a pound of flesh or your right arm.

      • f0d
      • 6 years ago

      tried sli with the only game i really care about for performance – planetside 2
      it just didn’t work, no matter what i did there were constant jitters/stutters
      i tried the default profile
      i tried profile 1
      i tried profile 2
      i tried different drivers and settings

      no matter what happened i was getting horrible stutters and jitters multiple times a second
      this is with 2x gtx 670, but as far as i know there aren’t any problems with sli on 670s?

      anyways i decided to sell my second card and buy a 970, then sell my first card, then the australian dollar dropped like a rock from 1:1 us dollars to 0.8:1 us dollars and a gtx 970 now costs around $500+ 🙁

        • renz496
        • 6 years ago

        I don’t know if it’s my setup, but to me the problem might be from the game itself. I think the problem appeared after last week’s updates. Before that my 660 SLI worked just fine. Right now, even when I’m alone in a less crowded area, my FPS can drop below 30. In some fights my fps can go as low as 15. Before this, even in big battles my fps rarely dropped below 30.

    • Krogoth
    • 6 years ago

    Don’t get your hopes too high. It is probably a Quadro/Titan put into place to counteract AMD’s next big release.

    Expect a $699+ MSRP at launch unless AMD’s counterattack tries to undersell the competition.

      • the
      • 6 years ago

      Yeah, the big variable here is how much the R9 380X will cost and how it will perform. Using HBM memory isn’t cheap, but it will also help make it very, very fast. I don’t think it’ll cost more than a Titan, as they need the market share right now. There is a decent chance this could outperform the GM200, so it’d make sense for nVidia to be price competitive in that case.

      • Prestige Worldwide
      • 6 years ago

      If you read the article you would know that it is a Quadro. This has been known for a while in the rumour mill. Nobody is expecting a GeForce to launch any time soon.

        • the
        • 6 years ago

        I do. Mainly because if they beefed up the double-precision hardware like they did from GK104 to GK110, then it would be an even larger die. The GK110 still outruns the GM204 in double-precision work, which is important in the Quadro target market.

    • Ninjitsu
    • 6 years ago

    Oooh I had made a pretty cool prediction about this last April (and saved the link):
    [quote<] When Maxwell originally launched in Feb, I had done some rough calculations and came to the conclusion that GM210* could end up with 3824 stream processors on the same process node [/quote<]

    [url<]https://techreport.com/news/26300/rumor-points-to-bigger-maxwell-gpus-with-integrated-arm-cores?post=814702#814702[/url<]

    Though I massively underestimated power for GM204 (but was close on shader count and SM):

    [quote<] Let's see. Nvidia can place 5 SMMs for each SMX, and still maintain half the TDP, at 28nm. So if the die size is increased 25%, GM204 would have 20 SMMs instead of 8 Kepler SMXs on GK104. This means 2560 (128x20) vs 1536 stream processors at a TDP of 97.5W, all else being equal. This gives us 26.2564... SP/W. [/quote<]

    GM204 shipped with 2048 stream processors and 16 SMMs, but at a 165W TDP.

    [url<]https://techreport.com/news/26300/rumor-points-to-bigger-maxwell-gpus-with-integrated-arm-cores?post=814725[/url<]
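
    (The arithmetic in the quoted prediction, and how the shipping GM204 compares on its official TDP, works out like this; a small sketch using only the numbers above:)

    ```python
    # Reproducing the arithmetic from the quoted prediction, then comparing it
    # against the GM204 that actually shipped (GTX 980, official 165W TDP).

    sp_per_smm = 128
    predicted_sps = sp_per_smm * 20            # 20 SMMs -> 2560 stream processors
    predicted_tdp = 97.5                       # W, the "half of GK104" assumption in the quote
    print(predicted_sps, predicted_sps / predicted_tdp)   # 2560, ~26.26 SP/W

    shipped_sps, shipped_tdp = 2048, 165.0     # GTX 980 as released
    print(shipped_sps / shipped_tdp)           # ~12.4 SP/W
    ```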

      • peaceandflowers
      • 6 years ago

      It has a 180 W TDP, actually – according to its own reference BIOS, and also real world tests. Not sure why it’s claimed to be 165 W… Some wanky definition of TDP, maybe.

    • Kretschmer
    • 6 years ago

    I really hope that 2015 isn’t a year where nVidia has the performance/efficiency/driver advantage and AMD has the sane adaptive sync technology. Maxwell looks like a slam dunk so far, but $600 1080P TN panels (with vendor lock-in) have really turned me off to the platform.

    If I have to choose between performance/quiet with dying monitor tech and 300W under-performers that work with cheap Freesync IPS displays, I might go back to gaming on my SNES until this all blows over.

      • renz496
      • 6 years ago

      G-Sync will lock people to nvidia, no doubt. But I’m also not going for an adaptive-sync monitor unless it’s confirmed to be adopted by companies other than AMD.

      • Krogoth
      • 6 years ago

      Don’t fret over the G-Sync/FreeSync nonsense.

      Just wait until the VESA group decides on something. I doubt G-Sync will get the nod until it drops the custom hardware requirement.

        • JustAnEngineer
        • 6 years ago

        [quote="Krogoth"<] Wait for VESA... [/quote<]

        Here you go, dated May 12, 2014:
        [url<]http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/[/url<]

        • Andrew Lauritzen
        • 6 years ago

        I actually think a platform vendor (Microsoft, Apple, Google) needs to step in here and force standardization. To do adaptive sync *right* is going to require OS support as well; right now we only get it in exclusive full-screen mode, which has its own issues going forward. OSes are already peripherally involved with this via the 48Hz refresh stuff (for lower power and less jittery video), which is related to the VESA spec additions. Arguably panel self-refresh is related as well.

        Hopefully someone does step up as there’s really no motivation for hardware vendors to standardize among themselves.

          • Krogoth
          • 6 years ago

          That’s the crux of the problem.

          A tiny but vocal minority cares about variable syncing. The masses and businesses don’t care, so there’s no incentive for the big players to do anything about it outside of Nvidia/AMD’s little pissing contest.

      • mesyn191
      • 6 years ago

      That 300w ‘under performer’ is also rumored to be quite a bit faster than the 980 or 970 and will probably end up fairly close to ‘big Maxwell’ performance while also probably selling for less. That it will give you adaptive sync with cheap-ish monitors, which could also work with nV cards assuming nV bothers to update their drivers, is hardly a bad thing.

        • Voldenuit
        • 6 years ago

        Here’s hoping the 380X puts price pressure on 970/980, but there’s no way I’m sticking a 300W card in my case (microATX).

          • mesyn191
          • 6 years ago

          You’ll be fine. It’s if you have an SFF or ITX case that high-power GPUs become untenable.

            • JustAnEngineer
            • 6 years ago

            The size of the case is only relevant if it lets you mount more fans and have more vents and you actually use those. A giant case may take an extra minute to reach steady-state temperature, but once you’re at steady conditions, only the amount of air flowing and the temperature rise of the air matter. A micro-ATX case with the same air flow and temperature rise as a giant E-ATX case will perform the same.
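
            (That relationship is easy to put numbers on: at steady state, the heat removed equals the airflow times the heat capacity of air times the temperature rise, regardless of case volume. A rough sketch with standard air properties; the wattage and allowed rise are just example figures:)

            ```python
            # Steady state: P = (air mass flow) x (specific heat of air) x (temperature rise),
            # so the required airflow Q = P / (rho * cp * dT) is independent of case size.

            RHO_AIR = 1.2       # kg/m^3, roughly, at room conditions
            CP_AIR = 1005.0     # J/(kg*K)
            M3S_TO_CFM = 2118.88

            def required_cfm(heat_watts, delta_t_c):
                q_m3s = heat_watts / (RHO_AIR * CP_AIR * delta_t_c)
                return q_m3s * M3S_TO_CFM

            # Example: 400W of system heat with a 10C allowed air temperature rise
            print(f"~{required_cfm(400, 10):.0f} CFM")   # ~70 CFM, regardless of case size
            ```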

            • mesyn191
            • 6 years ago

            Sure but many smaller (SFF, iTX) cases have less airflow capability due to less room for fans and small or even nearly non-existent vents.

            So you can’t count on the airflow being the same for those small (SFF, iTX) cases vs. bigger mATX or even E-ATX cases.

            • JustAnEngineer
            • 6 years ago

            You also can’t count on it being any less.

            A well-designed mini-ITX case (e.g.: [url=http://www.silverstonetek.com/product.php?pid=536&area=en<]Sugo SG13[/url<]) can dissipate 500+ watts.

            • mesyn191
            • 6 years ago

            The vast majority of iTX/SFF cases aren’t built like that though so yes unfortunately you can count on it being less in the common case which is what matters when making general statements.

            Outliers do not rule out the general case.

        • rahulahl
        • 6 years ago

        Yea, but at 300W, it’s probably gonna be more expensive to run.
        The lower cost won’t really be that low if you are spending extra on your electricity bill.
        In the US it probably does not matter, but in Australia we pay more than 8 times as much per kilowatt-hour as the US, I think.
        Plus, the excess heat can’t be good for your components.

          • mesyn191
          • 6 years ago

          [url<]http://shrinkthatfootprint.com/wp-content/uploads/2013/09/electriccost1.gif[/url<]

          The average cost of electricity per kWh in Australia appears to be a bit more than double the US, but far from 8x.

          The total energy cost will still be low enough not to matter much unless you're actually using 300W almost all the time on the GPU, which we know won't happen. 300W is the peak power usage, which will only happen under some workloads, and people don't let their games run 24/7. The only people for whom 24/7 peak power usage numbers are relevant are those who do distributed GPGPU computing like F@H or MilkyWay all the time. The common case will be quite a bit lower than that.

          If you compare peak power usage of the 'energy sipping' high-end GPUs like the 970/980, you'll find they still pull significantly north of 200W under peak load. So you're not going to save much power, or even necessarily reduce the total amount of heat being dumped into your case, room, or home, vs. a 300W GPU. It's only a 70-80W difference.

          If you're really worried about that sort of thing, just turn off a light bulb or two in the house. Or set the AC temp a degree higher. Or set the central heating temp a degree lower. Those latter two can really save you a fair amount of power, actually.
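
          (To put that 70-80W delta in dollar terms, a quick sketch; the hours per day and electricity rates here are illustrative assumptions, not figures from the thread:)

          ```python
          # Annual cost of an extra ~75W of GPU draw while gaming, under assumed usage.

          extra_watts = 75
          hours_per_day = 2                                         # assumed gaming time
          kwh_per_year = extra_watts * hours_per_day * 365 / 1000   # ~55 kWh

          for label, rate in [("US at ~$0.12/kWh", 0.12), ("AU at ~$0.25/kWh", 0.25)]:
              print(f"{label}: about ${kwh_per_year * rate:.0f} per year")   # ~$7 vs ~$14
          ```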

            • rahulahl
            • 6 years ago

            My bad.
            Someone told me it’s 4 cents a kilowatt-hour in the US. They must have been mistaken.

            • mesyn191
            • 6 years ago

            It’s alright. Wouldn’t surprise me if it was true several decades ago.

            • JustAnEngineer
            • 6 years ago

            [url<]http://www.eia.gov/electricity/data/browser/#/topic/7?agg=2,0,1&geo=g&freq=M[/url<]

            The average residential price for electricity is 12½¢ per kWh.

            [url<]http://www.eia.gov/electricity/data/browser/#/topic/7?agg=0,1&geo=vvvvvvvvvvvvo&endsec=vg&linechart=ELEC.PRICE.TX-ALL.M~ELEC.PRICE.TX-RES.M~ELEC.PRICE.TX-COM.M~ELEC.PRICE.TX-IND.M&columnchart=ELEC.PRICE.TX-ALL.M~ELEC.PRICE.TX-RES.M~ELEC.PRICE.TX-COM.M~ELEC.PRICE.TX-IND.M&map=ELEC.PRICE.US-ALL.M&freq=M&start=200101&end=201410&ctype=linechart&ltype=pin&rtype=s&pin=&rse=0&maptype=0[/url<]

            Hawaii tops the states at 36½¢. Washington is the cheapest at less than 9¢.

            • VincentHanna
            • 6 years ago

            Washington also happens to be 90%+ clean energy because of all of our hydro-electric.

      • 3SR3010R
      • 6 years ago

      AMD’s Free Sync is just as much a vendor lock-in as G-Sync is because neither can optimally run with the other vendor’s GPU.

      If you have a Free Sync monitor then Nvidia is at a disadvantage because it can’t use the adaptive sync technology built in.

      If you have a G-Sync monitor then AMD is at a disadvantage because it can’t use the adaptive sync technology built in.

        • VincentHanna
        • 6 years ago

        True, but one utilizes free technology built into the DisplayPort standard.

        Which means that, if your monitor has DisplayPort 1.2a and supports G-Sync, you shouldn’t be locked to either, correct?

        [url<]http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/[/url<]

        • mesyn191
        • 6 years ago

        It’s nV’s choice not to use FreeSync, though, since it’s really just branding for a feature built into the VESA specification. There is some GPU-specific driver work to be done, but that would’ve been true no matter what, and nV can easily afford to do it.

        AMD has no ability to use G-Sync at all because nV won’t let them. It’s a huge difference.

      • AdamDZ
      • 6 years ago

      Not everyone cares. I’ve owned only Nvidia cards for like a decade. I’m not interested in anything that AMD has so vendor lock-in is not an issue for me. It’s not like I’m buying $5000 worth of lenses for a DSLR. It’s just a monitor and the prices will go down.

      • ronch
      • 6 years ago

      I’m actually into old PC games right now and have my PS2 hooked up. Really, new doesn’t mean good especially in gaming. Countless great games were made in the last two decades that are still worth one’s time.

    • crabjokeman
    • 6 years ago

    Where the heck are the low-end Maxwell cards that can easily be passively cooled?

      • Deanjo
      • 6 years ago

      [url<]http://www.zotac.com/products/graphics-cards/geforce-700-series/gtx-750/product/gtx-750/detail/geforce-gtx-750-zone-edition.html[/url<]
      [url<]http://www.palit.biz/palit/vgapro.php?id=2377[/url<]
      [url<]http://www.asus.com/Graphics_Cards/GTX750DCSL2GD5/overview/[/url<]

      • Mad_Dane
      • 6 years ago

      The low-end Maxwell cards are being sold as 970 and 980 cards; look up Ngreedia’s naming history and you will see that the XX4 chip used to be in budget cards.

      Edit: The 750 Ti is a Maxwell card, low power and all.

      [url<]http://www.guru3d.com/articles-pages/palit-geforce-gtx-750-ti-kalmx-review,1.html[/url<]

        • MathMan
        • 6 years ago

        Is this the same ‘ngreedia’ (what are you, 12yo?) that priced the GTX 970 way below the R9 290X and forced AMD to adjust all its prices?

        It’s not Nvidia’s fault that their mid-level silicon beats the pants off AMD’s high end…

          • Mad_Dane
          • 6 years ago

          So you think a 12 year old has the technical knowledge to judge hardware over many generations? GTFO

            • MathMan
            • 6 years ago

            A 12 year old would use names like ngreedia.

            It’s beyond me why an adult would stoop down to that level.

            It’s not funny. It’s not clever. It’s just plain childish.

            • VincentHanna
            • 6 years ago

            If you don’t think so, you must have been a very slow child.

            when I was 12, I was more than capable of parroting what I had heard on some website I liked and pretending that I came up with it.

            • Waco
            • 6 years ago

            You clearly don’t understand that “ngreedia” makes you sound like a moron…

          • Tirk
          • 6 years ago

          You mean the same R9 290X that forced ngreedia to drop all its prices when it came out?

          It’s funny how, when you actually use the 970 and 980, they’re nowhere near the low TDP they were stated as having. And people wonder why there are no passively cooled versions? It’s pretty obvious when you look at the real power usage of these chips.

          At this point the R9 290X is almost a year old. It took almost half a year for Nvidia to respond with the 970 and 980. Best estimates are AMD’s next GPUs will come in around March, which is the same time frame Nvidia took to release their new GPUs. But don’t tell that to the $1000 Titan users who somehow think that’s a bargain price.

            • MathMan
            • 6 years ago

            Or when the GTX 680 undercut the 7970.

            My point is: they’re both companies who try to make money, and they’ll do whatever can be done to optimize their profits.

            Sometimes that means setting high prices, sometimes that means undercutting your competitor.

            It just so happens that Nvidia manages to convince people that they’re worth being paid more for a similar product. You can call them greedy if you wish, but if AMD didn’t have such a shitty brand (as in: not being able to command higher prices for the same product), they’d do just the same.

            • f0d
            • 6 years ago

            its the same with cpu’s
            when they had the performance lead amd was charging $500-$1k for their top cpu’s but when they lost the advantage they dumped their prices to what they have now

            if amd could they would definitely charge a premium for their products – the only reason they dump their prices is because to compete they have to (to be fair nvidia also dump their prices to compete also – its just how the market works)

            the 7970 and 290x were at a pretty high price when they were released, and iirc the price of a 7970 was much higher than amd’s set rrp for a few months after release

            amd are just as “greedy” as nvidia (ngreedia lol) i never understood why they get such a good guy image when they are both exactly the same (as in they both want your money and will take as much as they can)

            i never really could get why people stick up for a certain brand – its just a freaking videocard/cpu, why put any emotional investment into a piece of silicon? just get whats good and suits your needs at the time of purchase
            i will switch brands at the drop of a hat depending on price/performance/needs at the time of purchase
            amd/nvidia/intel make great products depending on your need and when you buy, why limit yourself to only one option?

            geez that rant went on for longer than i intended

            tldr?
            amd/nvidia/intel are all just as greedy as each other and will do whatever they can to get your money, look out for yourself and buy whatever is right (whatever the brand) for your price/performance/needs when you buy

            • renz496
            • 6 years ago

            Another person who believes Nvidia didn’t have any response to the 290X until the 980/970 came out. Nvidia’s response to the 290X was the 780 Ti. Period. The 980 did not bring much of a performance improvement, but I think part of that is that AMD let Nvidia get away with it.

        • Ninjitsu
        • 6 years ago

        S.T.A.L.K.E.R.! Hi, how are you.

          • Prestige Worldwide
          • 6 years ago

          S.T.A.L.K.E.R. posts on disqus as “GTX MILKING SERIES”

          [url<]http://videocardz.com/54263/nvidia-geforce-gtx-960-specifications-and-performance-leaked#comment-1791495502[/url<]

          [quote<]GTX Milking Series • 3 days ago
          GTX 128bit_Potato in the house ! 1.2 GHz clock speed ! and, a disgrace of a card... Nice job Ngreedia, cows will love it.[/quote<]

          • Meadows
          • 6 years ago

          It was only a matter of time.

      • Prestige Worldwide
      • 6 years ago

      My ASUS Strix GTX 970 is passive if the GPU is under 60 C.

      But not low-end.

      • Mad_Dane
      • 6 years ago

      It’s crazy the number of Nvidia fanboys around; I guess you enjoy the free bottle of lube with your Titan cards.

        • Krogoth
        • 6 years ago

        Yeah, the Nvidia fans (namely shills) are a strange crowd. They don’t like any form of legitimate criticisms.

      • HisDivineOrder
      • 6 years ago

      Jimmy Hoffa and Elvis are watching over these cards as we speak. Refined, Hoffa is reading a copy of the final book in the Game of Thrones series, having just completed the first book in the new Harry Potter 7-book series, while Elvis ripped open one of these precious passively cooled GPUs and is using it to play Half-Life 3. His Steam profile reveals he just completed Half-Life 2: Episode 3. Then he played some Left4Dead 3, Dota 3, Team Fortress 3, and Ricochet 2. He finished Warcraft Adventures and Starcraft: Ghost in the last couple of weeks. He’s got the Aliens Colonial Marines 2 beta coming up.

      In the background, they’re listening to a brand new collaborative album by Tupac and Biggie. They’re sipping New Coke and living the good life.

      Atlantis is nice this time of year.

    • bfar
    • 6 years ago

    The real question is when do we get the “every man’s” GM200? The GTX 780 was the Kepler to get, and so it will be for Maxwell. Hopefully AMD will add a competitive edge too.

      • Krogoth
      • 6 years ago

      Nah, the GTX 780 was meh overall. It was somewhat faster than the 680/770 it replaced, but sold at a higher price point until the 290X price drops and the 780 Ti launch.

      The 670 was the best bang-for-the-buck Kepler. Near-680 performance without killing the wallet.

        • drfish
        • 6 years ago

        Depends on [url=http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/14<]what you play[/url<]. The 780 was exactly the card I was waiting for at the time and I bought it basically day one. Still happy with it.

          • Krogoth
          • 6 years ago

          If you weren’t on the 6xx/7xxx bandwagon and wanted a high-end GPU, the 780 did make some sense once the price drops hit after the 780 Ti launch and the following 290X price cuts. The 780 only seemed good at launch because it was going against the Titan, which was overpriced for gaming usage (though a steal for the GPGPU crowd).

          The 670 was a better deal overall, since it held its place for so long and there was little incentive to go beyond it with the Kepler refreshes, outside of trying to do 4-megapixel gaming with tons of AA/AF and no compromises.

    • Deanjo
    • 6 years ago

    [quote<]The Titan Black sells for nearly $1300 right now.[/quote<]

    That’s because NewEgg is once again overpriced and soaking it to their blind faithful.

    [url<]http://www.ncix.com/detail/asus-geforce-gtx-titan-black-09-94411.htm[/url<]

    $1,199.99 Canadian (even cheaper if you are an NCIX premium member: $1,147.13). That’s cheaper than what I got my first Titans for. You can even get the superclocked version cheaper at NCIX:

    [url<]http://www.ncix.com/detail/evga-geforce-gtx-titan-black-9d-94656.htm[/url<]

    • Deanjo
    • 6 years ago

    *waits for release, then BUY BUY BUY!!!*

    • Captain Ned
    • 6 years ago

    And I had ordered a 980 about 30 minutes before seeing this post. Thankfully I was able to cancel it through Amazon Prime, as the instant this hits I expect 970/980 prices to drop.

      • geekl33tgamer
      • 6 years ago

      I wouldn’t count on it. If previous Titan pricing is anything to go by, this will be in a different league entirely.

      You could buy 2 GTX 780’s for less than the cost of a single Titan Black this time last year.

      • renz496
      • 6 years ago

      If Nvidia comes out with the Titan first, then 980/970 prices won’t be affected at all. In fact, any price change affecting the ‘true’ GeForce lineup isn’t going to affect an Nvidia premium product like the Titan. I still remember when Nvidia made a round of price cuts on the 780 and 770, the original Titan’s price remained the same.

      • Krogoth
      • 6 years ago

      Not going to happen, you’ll have to wait until AMD’s counterattack for a price drop to happen on 970/980.

        • the
        • 6 years ago

        On that note, I’d suspect that the R9 380X will launch before nVidia’s GM200 chip. nVidia basically wants to see where the R9 380X lands before finalizing things like clock speed and price to be competitive (or to be able to pull off another halo product like a Titan).

          • Pwnstar
          • 6 years ago

          The rumors say the 380x won’t drop until April, while this is supposed to come out next month.

          • MathMan
          • 6 years ago

          What an incredibly comfortable position to be in: they’re killing it with the GTX 970. They obviously have a successor waiting in the wings. The competitor has something, but it’s going to be very expensive to produce (HBM and all that). They can just wait for AMD’s move and react accordingly: there’s no pressure whatsoever to move first.

          Even if the AMD one is a bit faster (one hopes it will be with HBM), they can do damage by pricing it low enough, with still high margins due to better perf/W and cheap memory.

    • geekl33tgamer
    • 6 years ago

    Titan II released just in-time to counteract (and probably destroy) the AMD 3xx series? Check…

      • Walkintarget
      • 6 years ago

      Bang bang Maxwell’s silver hammer came down on AMDs head ….

      *Beatles reference for you young’uns around here*

        • geekl33tgamer
        • 6 years ago

        Safe to say that went over my head (90’s child). 😉

          • geekl33tgamer
          • 6 years ago

          Down voted because of my age? Bah, u grumpy old men!

            • BlackDove
            • 6 years ago

            Downvoted for saying something that’s true to a bunch of AMD fanboys, more likely.

            • geekl33tgamer
            • 6 years ago

            Wait, what? Go back to your green side of the fence…

            …I was talking about the music reference. :-/

            Edit: Typo

          • auxy
          • 6 years ago

          Huh, really? I always thought you were older than me.

        • Deanjo
        • 6 years ago

        What do bugs have to do with it? 😛

          • Khali
          • 6 years ago

          The sad part is there is probably a 16 year old geek sitting there asking that very question in his head as he reads your comment Deanjo.

        • Ninjitsu
        • 6 years ago

        Hahaha I was humming this just as I read your comment.

        • l33t-g4m3r
        • 6 years ago

        Not a fan of pop / boy bands, so I wouldn’t know. Not that I haven’t heard a few songs on the radio, but I don’t own any of their albums.

          • Ninjitsu
          • 6 years ago

          You make The Beatles sound like Backstreet Boys.

            • l33t-g4m3r
            • 6 years ago

            Sure. I don’t see the difference. The Beatles were just the first ones to pull it off. Their songs might have been better, but a pop band is a pop band. Regardless, the Beatles are more for people of my mom’s generation, which she wasn’t a diehard fan of either.

            I don’t like most oldies anyway, since generally speaking the quality is lacking and sounds unrefined. I do like some classic rock, but I can’t sit through an entire album from those bands either.

            Dunno. I have my preferences, but anymore I’m pretty indifferent about music, and would rather play games or watch movies. I listen to the radio while driving, and occasionally at work, and that’s about it. Meh.

            • sweatshopking
            • 6 years ago

            Yeah, I agree with you here.

            • Pwnstar
            • 6 years ago

            Backstreet’s back, oh yeah!

            • VincentHanna
            • 6 years ago

            you’re right, its completely unfair to compare “i want it that way” to “the yellow submarine.”

            unfair to the backstreet boys…

            • Terra_Nocuus
            • 6 years ago

            [url<]https://www.youtube.com/watch?v=7t8xwpW8gJQ[/url<]

      • Platedslicer
      • 6 years ago

      Regardless of whether it destroys AMD’s offerings, I can comfortably predict it will be destroying plenty of wallets.
