GeForce GTX 660 Ti rumored for August release

If you want a decent graphics card based on Nvidia’s latest Kepler GPU microarchitecture, you’re going to have to spend at least $400 on a GeForce GTX 670. While it’s true that the $99 GeForce GT 640 also features a Kepler-based GPU, that card is woefully underpowered and should really cost $60-70. The GeForce 600 series could have a new $300 model as early as next month, though. This story at Sweclockers claims the GeForce GTX 660 Ti is due to arrive in August.

Pictures of a purported GeForce GTX 660 leaked in May, so a late-summer arrival seems plausible. Adding a “Ti” moniker would only make sense, too. Nvidia reanimated its Titanium designation for last year’s mid-range GeForce GTX 560 Ti.

The GeForce GTX 560 Ti started at $250, but the 660 version is expected to debut at $300. The higher price would put the new GeForce directly opposite the Radeon HD 7870 GHz Edition, which received an official price cut late last week. I wouldn’t be surprised to see the new Titanium model arrive alongside a lower-priced GeForce GTX 660. The Radeon HD 7850 costs around $250, and Nvidia doesn’t really have an answer for it.

Comments closed
    • Chrispy_
    • 7 years ago

    GK106 please.
    GK107 is a 1/4 GK104 for 1/4 the price.
    GK106 needs to be 1/2 a GK104 for 1/2 the price, and it [s<]needs to get here yesterday[/s<] is too late because I bought a 6950 for the equivalent of $150

    • FormCode
    • 7 years ago

    My HD6790 still runs strong but I have an upgrade itch. Though at $300, I might just leave that itch unscratched.

      • Airmantharp
      • 7 years ago

      If you’re at 1080p, hold out as long as you can! Nvidia and AMD should both have upgrades for you by the end of the year, so unless there’s something you want to play and can’t right now, you’re better off waiting :).

    • Malphas
    • 7 years ago

    Uhm, to the guys bashing Duck: it’s been common knowledge for ages now that the GTX 660 is going to be a binned 680 (same as the 670), so everything he’s said has been correct (other than his evaluation of its worth, which is inherently opinion rather than fact). Other tech sites besides this one have been covering leaks about the 660’s impending release for a while now; it’s nothing new.

      • Airmantharp
      • 7 years ago

      That it is GK104 hasn’t been disputed; it’s his assessment of its worth that IS the issue. This is not going to be an HD5830/GTX465 debacle part. GK104 is less than half the size of the GF100 used in the GTX465.

        • Airmantharp
        • 7 years ago

        Gosh, all these down-votes and I didn’t even mention AMD. I thought my comment was neutral, straightforward and a realistic view of the situation. Anyone care to detail what they disagree with?

    • PrincipalSkinner
    • 7 years ago

    Was Nvidia ever this slow in bringing out mid-range cards? I’ve been waiting for this for so long that I eventually stopped caring.

      • HisDivineOrder
      • 7 years ago

      No, they’re not usually this slow. Their reluctance to do anything with Kepler in the mid-range is due mostly to the fact that there are so many 570 and 560 Ti 448 parts floating around in the channel that they don’t want to kill their margins on those cards. Usually what drives these cards out is a competitor showing up with a part that blows a segment wide open, forcing a response.

      AMD showed up with parts that are only marginally faster than what nVidia already had in the channel. Hell, nVidia took a few looks, laughed, and renamed their 560 Ti replacement part into becoming the 580 replacement part because of how badly AMD did in bringing the performance this round. They could get away with that because of AMD’s fumbling on performance and pricing.

      The only real consequence of this renaming was that nVidia continues to be without a mid-range product where the 680 should have slotted in. They never needed to wake the sleeping dragon (i.e., the real 680, aka the GK110) because AMD didn’t have anything that pushed beyond the GK104 part’s performance. Even now, if you allow that AMD’s hair-dryer cooler is sufficient, they’re only beating nVidia by a tiny percentage in a few games with their highest-of-the-high-end single card.

      nVidia will eventually get around to binning some of those GPUs that don’t make the grade into lower-cost cards, but for the time being they’d rather sell you their part from last year for only moderately less than they were selling it for last year. If AMD had done its job and shown up with a part with real performance, then they couldn’t have milked us like this, but AMD started the year out milking the gamers, and nVidia saw no reason not to join in.

      Trouble is, I think they don’t realize how little milk the cow they’re abusing really has left…

        • ultima_trev
        • 7 years ago

        Let’s be real here. nVidia hasn’t awakened the GK110 “dragon” because it’s not ready for prime time, i.e. it’s way too power hungry for the consumer market, and the gaming performance advantage over GK104 is far less than the increase in power draw… the consequence of making a heavy-duty compute/double-precision card. It would probably be far worse than the GTX 480 was.

        If AMD had chosen to focus strictly on game performance with Tahiti like nV did with GK104, I’d wager AMD’s cards would be the better performing ones by a fair bit. Heck, even with all the compute crap holding it down, Tahiti is still able to exceed GK104 clock-for-clock in most benches.

          • Airmantharp
          • 7 years ago

          That’s the fun part really- Nvidia doesn’t need to release GK110. GK104 is quite fast, and Fermi fills the GPGPU slot nicely. They can take their time.

          It’s actually comedic how little gain AMD made with GCN on the gaming front, and they should be quite glad that they had so much lead time this generation.

          In reality, if AMD wanted to beat Nvidia this generation, they should have made an even bigger high-end part, and then made the smaller parts without the GPGPU additions.

            • BobbinThreadbare
            • 7 years ago

            “Doesn’t need too”

            If it was really that fast they could sell a $1000 video card that blows away everything else.

            Why wouldn’t they release it?

            • Airmantharp
            • 7 years ago

            It’s obvious speculation on my part, but they may have a glut of Fermi parts if they kept production higher than planned to account for TSMC’s 28nm issues.

            Given that GK104 can compete effectively against anything AMD can produce in the gaming space and GF110 can do the same in the compute/industrial rendering space, it makes little sense for them to cut into their own profits by releasing GK100/110 today.

            Essentially, like I said above, they could do it but don’t need to, they have their bases covered due to AMD’s gaming performance stumble with GCN.

            Now don’t get me wrong- AMD needed to do GCN. They need an answer to Tesla, and they need people developing for their compute units in order to support everything from dedicated compute products to the GPU cores in their APUs. They might have been better off, though, making a larger high-end GCN GPU and stripping the compute logic from their mid- and low-end products, reserving that space for logic that’s suited to gaming. Nvidia has been doing this quite successfully for a while now, and with the combined efficiency gain of Little-Kepler and the decrease in gaming efficiency of GCN, they’ve put themselves behind for this generation.

            • moose17145
            • 7 years ago

            Jesus… really? I mean really? All I read in almost this entire thread was “AMD can do no right and NVidia can do no wrong”.

            AMD released a competitive part for both gaming and general computing… they did it with a single chip, and they beat NVidia to the table by … what? … 6 months was it? And what did people do? “oh well Kepler is gonna beat it hard when it comes out… just you wait and see… AMD already lost this round…”. Even when NVidia had no answer to the 7900’s for 6 months, all we heard was how bad kepler was gonna rip the Radeons in half and how they were gonna be released for almost half the cost because they are supposed to be mid range parts that are so fast that they can tear AMDs high end apart. The entire time while the 7900’s were spanking NVidia, for some reason beyond my understanding, people still said AMD already lost this round because “oh well… uhmmm… you see their performance wasn’t as good as what NVidia was expecting so they see no need to release anything yet… but just you wait! The spanking is coming!!!”

            The reality was what ultima_trev already said. Their stuff wasn’t ready yet and they had no answer to the 7k series, just like big kepler isn’t ready yet. And then when NVidia finally did release their cards? Were they released at the 300 dollar price point and just spanking the pants off the radeons at the 500 dollar price point like I saw so many people claiming would happen? No. Were they consuming half the power of the Radeons? No. Were they even beating them in performance? Honestly, I would say no. Their current chips BARELY beat the red team at raw gaming performance, and they had to almost completely sacrifice their compute capability to do it.

            I will tell you what I see. I see the 680 for what it is… a high-end part. It even draws power like a high-end part. If this is how power hungry NVidia’s MID RANGE parts are supposed to be… then what on earth is it gonna take to run their truly high-end stuff? It performs like a high-end part, it consumes power like a high-end part, and it requires beefy @$$ cooling like a high-end part. Well golly gee, I don’t know about the rest of you… but I just think this might be NVidia’s high-end part… Every characteristic of the 680, except its compute capability, shows that this is NVidia’s high-end gaming card.

            Some of you really need to take off the green tinted glasses.

      • BestJinjo
      • 7 years ago

      GTX470/480 came out March 26, 2010
      GTX460 came out July 12, 2010 (so that’s 3.5 months)

      GTX680 came out March 22, 2012
      GTX670 came out May 10, 2012

      It’s a slower roll-out. What makes this even less exciting is that the price drops on the HD7850/7870 and 7950 have made waiting for the GTX660 series basically pointless imo.
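As a sanity check, the roll-out gaps cited above can be computed directly. Note that the GTX 670 actually launched May 10, 2012, and the 660 Ti date below is an assumed early-August placeholder taken from the rumor, not a confirmed launch date:

```python
# Flagship-to-midrange launch gaps for Fermi vs. Kepler.
# Dates are as commonly reported; the 660 Ti date is an assumption
# based on the rumored August window, not a fact.
from datetime import date

fermi_gap = (date(2010, 7, 12) - date(2010, 3, 26)).days   # GTX 480 -> GTX 460
kepler_gap = (date(2012, 8, 1) - date(2012, 3, 22)).days   # GTX 680 -> rumored 660 Ti

print(f"Fermi:  {fermi_gap} days (~3.5 months)")   # 108 days
print(f"Kepler: {kepler_gap} days (~4.5 months)")  # 132 days
```

Even with the charitable August 1 assumption, the Kepler gap comes out roughly a month longer than Fermi’s.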

      • Airmantharp
      • 7 years ago

      They had a mid-range card, but because it’s as fast as AMD’s high-end, they badged it for high-end use and sold it at those prices.

    • tviceman
    • 7 years ago

    I see it coming in at $320. A 192-bit bus @ 6GHz would give it a 20% drop in memory bandwidth from the gtx680 and gtx670, and cutting down from 7 SMX units to 6 will give it a 15% drop in computational power, assuming its core clock speeds remain the same as what the gtx670 runs at. So it should still end up about 5% faster (on average) than a gtx580 and about 10-12% faster than an hd7870.
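That arithmetic is easy to check in a few lines. One wrinkle: a 192-bit bus at the same 6GHz effective clock is actually a 25% bandwidth cut versus the 256-bit GTX 670/680, a bit more than the 20% quoted, while the 7-to-6 SMX cut matches the ~15% figure. A quick sketch, using the rumored specs:

```python
# Back-of-the-envelope check of the rumored GTX 660 Ti cut-down.
# The 660 Ti numbers are rumored specs, not confirmed.

def bandwidth_gbs(bus_bits, effective_clock_ghz):
    """Memory bandwidth in GB/s: (bus width in bits / 8 bytes) * effective data rate."""
    return bus_bits / 8 * effective_clock_ghz

gtx670_bw = bandwidth_gbs(256, 6.0)     # 192 GB/s (published GTX 670/680 config)
gtx660ti_bw = bandwidth_gbs(192, 6.0)   # 144 GB/s (rumored 192-bit bus)
bw_drop = 1 - gtx660ti_bw / gtx670_bw   # 0.25 -> 25% cut, not 20%

smx_drop = 1 - 6 / 7                    # ~14.3%, roughly the 15% compute cut cited

print(f"bandwidth: {gtx660ti_bw:.0f} GB/s ({bw_drop:.0%} below GTX 670)")
print(f"compute:   {smx_drop:.1%} fewer shader resources at equal clocks")
```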

    GK106 can’t be too far off. The latest rumors say it will have 5 SMX’s, which contradicts earlier rumors saying it had 4…

      • BestJinjo
      • 7 years ago

      The $320 price and performance you described would put it squarely against an HD7950. That’s game over for GTX660Ti then since 7950 has at least 30% overclocking headroom and 3GB of VRAM. GTX660Ti has to cost way less than $300 unless it’s literally just a downclocked GTX670 with fully intact 256-bit 2GB memory and 1344 SPs.

      I would guess $199 for GTX660 and $249 for GTX660Ti to go against HD7850 and HD7870. NV could drop GTX670 and 680 by $50 at that time too.

    • jrr
    • 7 years ago

    I’ve been waiting for something between the $100 and $400 cards, but like the others here I’ll continue to wait for the magical $200 part.

    I’m only recently getting games that won’t go all-out at 1920×1200 on my old 8800gts512.

      • D@ Br@b($)!
      • 7 years ago

      Which games are those?

      I still game on my 4890 Black, which will, just barely, play Crysis & Crysis Warhead maxed out, no anti-aliasing though, because it doesn’t seem to make much difference.
      On a 1080p TV. The computer is an A8N-SLI Premium with a 4800+, and, since Warhead, 4 GB of G.Skill 2522.

      Edit: just checked, anti-aliasing does improve visuals: with 8x, 27 fps average; without, 33 fps average.

      • Meadows
      • 7 years ago

      You must be playing some crappy old games then.

        • jrr
        • 7 years ago

        TF2 and CS:GO will run all-out. Skyrim and Diablo 3 run in 1920×1200 with a few settings dialed down. Dota 2 is surprisingly taxing, requiring a little more dialing-down but still running in 1920×1200.

        Four-year-old Yorkfield quad 2.66 still chews through just about anything that’s reasonably multithreaded, so next two steps are video and an SSD.

        • moose17145
        • 7 years ago

        [sarcasm]Yes… as opposed to all those totally awesome new DRM infested games that have been coming out recently… [/sarcasm]

        Yes, there have been some good games released recently… but honestly most of those feel like the exception rather than the rule to me anymore. IMHO most of those older games were 1,000x better than most of the new games being released today. But maybe I value gameplay over pretty graphics. It’s just a shame that you can usually only choose between a game with mediocre graphics and awesome gameplay, or a game with amazing graphics but garbage gameplay… again, there are exceptions… but they are just that… exceptions to the rule.

        Also many people are like me and can’t justify spending 50 dollars on a new game… let alone the 60+ dollars many new games are being released for today. As such lots of people are often a couple years behind the times.

        [url<]http://xkcd.com/606/[/url<]

      • brucethemoose
      • 7 years ago

      What’s wrong with the 7850?

        • BobbinThreadbare
        • 7 years ago

        It’s about $30 too expensive.

          • BestJinjo
          • 7 years ago

          $210 for MSI TwinFrozr HD7850 on Newegg:
          [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814127663[/url<]

        • jrr
        • 7 years ago

        I suppose I could give amd/ati another try – last time I wasn’t too keen on their drivers

          • Airmantharp
          • 7 years ago

          You’ll be happy to know that single-card drivers are essentially on par between companies- hell, Intel’s drivers have made great strides recently.

          You can’t go wrong with a mid-range AMD card, at least not until Nvidia has a good competing product in that range.

    • Duck
    • 7 years ago

    Too expensive, high end binned junk like the Radeon 5830 🙁

    Where is the real performance mainstream part for ~200 USD?

      • tviceman
      • 7 years ago

      This is a well-informed, thought-out opinion. For… you know, not having any information whatsoever at hand to form an opinion on.

        • lycium
        • 7 years ago

        pot calling the kettle black

          • Washer
          • 7 years ago

          His only needed fact was Duck’s comment. Duck makes a claim with zero evidence. It’s pointless at best, purposely misleading at worst. Giving him +s just strokes his ego and encourages him to do it more in the future.

      • Washer
      • 7 years ago

      You have performance numbers already? How’s the power consumption? Temperatures with stock cooling? You know so much already, please share!

        • Duck
        • 7 years ago

        OK, if you insist…

        Performance is obviously pretty good but power consumption disappointingly high. It’s easier to overlook the drawbacks when you get the performance to go with it, such as from a GTX680 or the arguably better-value GTX670. But when pairing a large and hot GPU with performance characteristics that are relatively middling, it’s much harder to be enthusiastic about it. Especially at such a high price point.

        Temperatures with stock cooling are within operational requirements. It’s nothing to write home about. An aftermarket cooler will need to be fitted if the goal is supremely quiet operation.

          • Washer
          • 7 years ago

          Really? You’re going to try to pass off what you think will be the case as absolute fact?

            • Chrispy_
            • 7 years ago

            The link, in Swedish, claims that the 660Ti will just be a harvested GK104, so everything Duck is inferring is correct, since we have already seen the performance and power effects of a harvested GK104 with the GTX670. Extrapolation of existing data is not a crime, and this is all speculation anyway, so there’s no need to get huffy about it.

            • Visigoth
            • 7 years ago

            Then he should have said: “I think”, “I speculate”, “I assume”, or something of this nature.

            • Airmantharp
            • 7 years ago

            ‘Large’ and ‘Power-hungry’ are not things associated with GK104; it’s mid-range silicon to begin with. Lower the performance, lower the power draw and heat output.

            • Duck
            • 7 years ago

            Yes, exactly. I just couldn’t bring myself to spell it out like that. I mean, come on. This is a rumor article after all. We shouldn’t all have to attach a lengthy disclaimer about how the comments are our own views and are not necessarily the views of TR, or even factual, blah blah.

            I also went from +4 earlier down to -1 now :'(

            • Washer
            • 7 years ago

            No problem with extrapolation but I didn’t see that in Duck’s first post or reply. I saw statements with nothing backing them up. Your speech changes when discussing what you believe or assume compared to when discussing facts or knowledge.

      • Flying Fox
      • 7 years ago

      If one rumour source is to be believed, then it will be the 660 without the Ti moniker taking up the 200-250 spot. And that is thought to be using the GK106 instead of a castrated GK104.

        • xeridea
        • 7 years ago

        It boggles my mind why Nvidia randomly associates model numbers with chips. You would think that GK104 would be a low-to-mid-range part and that higher numbers would mean higher performance; instead it’s backwards, and higher is lesser, except for GK110.

        I would lean towards castrated GK104 since their yields for 680 and 670 have been so bad.

          • Turd-Monkey
          • 7 years ago

          The numbers are in reverse order because Nvidia typically targets the high end first, so they end up with the “lower” numbers in each generation:

          GeForce 500 series: GF110 (GTX 580), GF114 (GTX 560)
          GeForce 400 series: GF100 (GTX 480), GF104 (GTX 460)
          GeForce 200 series: GT200 (GTX 280 and 260), GT215 (GT 240)
          GeForce 9 series: G92 (9800 GTX), G94 (9600 GT)
          GeForce 8 series: G80 (8800), G84 (8600)

          The mid-to-low-range chips can end up with crazy chip numbers, but that’s a consequence of first-come, first-served naming, making OEMs happy, and generational crossover/rebranding.

          Obviously the 600 series has been kind of an anomaly. The GK100 is MIA, and the GK104 “mid range” targeted part has ended up as the high end part. Looking at the 400 to 500 series transition, we can see why the “Super Kepler” GK110 chip is named the way it is, but it doesn’t give us any info about what happened to the GK100 or why the GK104 is the high end chip of the 600 series.

          Maybe the GK100 was abandoned, maybe the GK104 performed better than expected? Maybe nvidia’s high end target is no longer gaming, but GPU compute? Maybe the GK104 was planned to span 660/670/680 like the GT200 spanned 260/270/280?

            • xeridea
            • 7 years ago

            My point is that it is extremely confusing as to which chips are which. Naming them based only on the order they are released is just silly; it makes the chip numbers not actually mean anything. If they don’t mean anything, why not just call them GK#1, GK#2, GK#3…? Why not give them all names, like AMD does for its different chips, so you can associate names with model ranges instead of random numbers? Numbers imply meaning where there is none, and that is confusing.

            • d0g_p00p
            • 7 years ago

            “maybe the GK104 performed better than expected? Maybe nvidia’s high end target is no longer gaming, but GPU compute? Maybe the GK104 was planned to span 660/670/680 like the GT200 spanned 260/270/280?”

            You pretty much answered yourself. I think the GTX 680 was supposed to be the GTX 670, but nVidia saw how it was doing against the HD 7970 and re-branded it the GTX 680. As for the GK100, it’s a real product with specs, but yes, you are correct, its target is the compute market. I could be completely wrong, but that is how I remember it.

            • Airmantharp
            • 7 years ago

            The GTX680 was supposed to be the GTX660; it resembles the GTX560 and the GF114 it is based on in every way except production process and upgraded cores.

            GK100/110 resemble GF100/110 in the same way. To think that Nvidia planned to pit GK104 against AMD’s largest GCN product is misguided. Rumors support the idea that Nvidia decided to release its mid-range GK104 silicon as a high-end SKU, the GTX680, due to its excellent gaming performance and AMD’s middling gains with GCN.

            As to whether or not GK100/110 are real today, well, we know GK110 will be ready this fall in a Tesla card. GK100 may very well have been ready as well but Nvidia literally has no reason to make it, especially if the yields aren’t conducive, which is supported by TSMC’s issues with 28nm.

        • Malphas
        • 7 years ago

        That’s complete BS, I reckon. I also think a GTX 660 (without the Ti suffix) will be released first and will be the card this article is talking about.

    • Flying Fox
    • 7 years ago

    Bring it on! I want something between $200-300, which is what I have been waiting for. Tolerating low settings in D3 with my GT 240 now…

      • xeridea
      • 7 years ago

      Perhaps you should stop waiting and get another card. Pretty much any card you can buy will play D3 fine, even $70-$80 cards, though I would say at least get a 7770 on sale; it will totally cream D3. It will play OK on a 6570; a 6850 or 7770 won’t even break a sweat. If you are an Nvidia fan, any card in the same price range should work.

      • Airmantharp
      • 7 years ago

      The GT240 is actually a pretty beefy card; I’m surprised you need to run it on low. The passively cooled GT430 (the GDDR3 counterpart to the GDDR5 GT240) was able to run BF:BC2 at 1080p at better than low settings quite well.

      With AMD’s price drops in effect you should be able to pick up something nice without Nvidia branding, which shouldn’t matter nearly as much in the mid-range where you’re less likely to run two or more cards :).

    • Meadows
    • 7 years ago

    I’m left untouched by this announcement, I’d rather spring for a model costing the equivalent of $200 with a name like the GTS 650.

    That is, as long as it’s slightly more powerful than the GTS 560 (instead of simply expanding the feature set).

      • willmore
      • 7 years ago

      I’m looking for similar performance but much lower power consumption. Maybe it’s just the heat this summer.
