Nvidia has been filling out its Kepler lineup these past few months. The first members of the GeForce 600 series were all high-end graphics cards with formidable price tags, but recently, we've seen the company dabble in both the very low end and the sweet spot around $200.
Screaming-fast flagship cards are interesting, of course. But not everybody can afford them. Not everybody plays the kinds of games at the kinds of resolutions that truly justify a $300 or $400 GPU, either. For many enthusiasts, getting solid performance at a reasonable price is more important than making friends green with envy.
Today's launch addresses the last great gap in Nvidia's 600-series lineup. The GeForce GTX 650 Ti brings us full-fledged Kepler goodness at prices ranging from $149 to $180 or so, bridging the gap between the GeForce GTX 650 and GeForce GTX 660. Nvidia tells us this is the last card it plans to introduce this year. We're not surprised, since the company now has pretty much all its bases covered.
We're going to be testing one of the fastest variants of the GTX 650 Ti today: a Zotac card with 2GB of memory and clock speeds substantially above reference. This bad boy will have to spar with the latest sub-$200 offerings from AMD, and we'll be comparing it to Nvidia's old GeForce GTX 560, for good measure. The results should be interesting, to say the least.
Introducing the GeForce GTX 650 Ti
Before we look at the amped-up Zotac card, we should probably introduce the vanilla GeForce GTX 650 Ti. Here's a picture of the reference card in all its bland, black-clad glory:
Based on the name and the card's stubby circuit board (which measures just 5.65"), one might think this is merely a higher-clocked version of the GeForce GTX 650. Not so fast, folks! Nvidia's naming scheme has gotten rather confusing with this generation, so let's clarify.
The vanilla GeForce GTX 650 is based on the same GK107 graphics processor as the $90 GeForce GT 640. The GK107 is a pretty hobbled chip that's substantially smaller and cheaper to produce than the rest of the Kepler family. Nvidia's new GeForce GTX 650 Ti, on the other hand, features a larger GPU: the GK106, which you can also find inside the $230 GeForce GTX 660. Incidentally, those are the only two products to feature that chip. Nvidia's more upscale GeForce GTX 660 Ti graphics card is based on a different piece of silicon, the GK104, which is even larger and powers all other high-end Kepler offerings up to the GeForce GTX 690.
Confused? Not too much, I hope. Here's a quick overview of how the GK104, GK106, and GK107 stack up:
The GK106 is very much the middle child of the Kepler family. It's more fleshed-out than its smaller sibling, but it lacks some of the trappings that the eldest enjoys—like more ALUs, more texture units, a wider path to memory... and, we expect, not having to wear hand-me-downs.
Where the GeForce GTX 660 uses the full GK106, the new GeForce GTX 650 Ti uses a slightly scaled-back version of the same chip. Nvidia has disabled one of the five SMX engines, leaving 768 ALUs and 64 texels/clock of texture filtering capability. One ROP cluster and one memory controller were also excised, so the card can churn out only 16 pixels per clock, and its path to memory is just 128 bits wide.
Interestingly, Nvidia has two ways of retrofitting a GK106 chip for the GTX 650 Ti. It can disable one of the SMX engines from the two full-sized GPCs, or it can disable the third GPC altogether. Since that third GPC is half-sized and contains only one SMX engine, the end result is pretty much the same. Nvidia tells us there are no performance discrepancies stemming from the two different approaches.
Obviously, having this flexibility means Nvidia can repurpose GK106 chips that didn't make the cut for the GeForce GTX 660. Flawed chips can be adapted, so long as no more than one SMX engine, one ROP cluster, and one memory controller are faulty. The same goes for chips that are fully functional but can't quite hit high enough clock speeds. As you can see below, the GTX 650 Ti has lower base and memory speeds than the GTX 660, and it also lacks GPU Boost, so the card doesn't venture beyond the base clock regardless of the available thermal headroom. (SLI multi-GPU capabilities aren't on the menu, either.)
| Card | Base clock (MHz) | Boost clock (MHz) | Peak pixel fill (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak shader arithmetic (tflops) | Peak rasterization rate (Gtris/s) | Memory transfer rate | Memory bandwidth (GB/s) | Price |
|---|---|---|---|---|---|---|---|---|---|
| GTX 650 | 1058 | N/A | 8 | 34/34 | 0.8 | 1.1 | 5.0 GT/s | 80 | $109.99 |
| GTX 650 Ti | 925 | N/A | 15 | 59/59 | 1.4 | 2.1 | 5.4 GT/s | 86 | $149.99 |
| GTX 660 | 980 | 1033 | 25 | 83/83 | 2.0 | 3.1 | 6.0 GT/s | 144 | $229.99 |
| GTX 660 Ti | 915 | 980 | 24 | 110/110 | 2.6 | 3.9 | 6.0 GT/s | 144 | $299.99 |
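If you're curious where those peak figures come from, they fall straight out of the unit counts and clock speeds quoted above. Here's a back-of-the-envelope sketch (the 2 flops/clock per ALU for fused multiply-add and the bits-to-bytes conversion are standard assumptions, not numbers from Nvidia's spec sheet):

```python
# Rough peak-throughput math for the GTX 650 Ti, using the unit
# counts (768 ALUs, 64 texels/clock, 16 pixels/clock, 128-bit bus)
# and the clocks from the table above.

ALUS = 768             # shader ALUs (four active SMX engines x 192)
TEX_PER_CLK = 64       # texels filtered per clock
PIX_PER_CLK = 16       # pixels output per clock
BUS_WIDTH_BITS = 128   # memory interface width
CORE_CLK_GHZ = 0.925   # base clock; no GPU Boost on this card
MEM_RATE_GTS = 5.4     # memory transfer rate

shader_tflops = ALUS * 2 * CORE_CLK_GHZ / 1000  # 2 flops/clock (FMA)
texel_rate = TEX_PER_CLK * CORE_CLK_GHZ         # Gtexels/s
pixel_rate = PIX_PER_CLK * CORE_CLK_GHZ         # Gpixels/s
bandwidth = BUS_WIDTH_BITS / 8 * MEM_RATE_GTS   # GB/s

print(f"Shader arithmetic: {shader_tflops:.2f} tflops")  # ~1.42
print(f"Texture filtering: {texel_rate:.1f} Gtexels/s")  # ~59.2
print(f"Pixel fill:        {pixel_rate:.1f} Gpixels/s")  # ~14.8
print(f"Memory bandwidth:  {bandwidth:.1f} GB/s")        # ~86.4
```

Rounded off, those results line up with the 1.4 tflops, 59 Gtexels/s, 15 Gpixels/s, and 86 GB/s in the table.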
In short, we invite you to think of this latest arrival as a cut-down GTX 660, because that's essentially what it is.
The GeForce GTX 650 Ti is going to be available in several flavors. Offerings based on Nvidia's reference design are supposed to be priced at $149 with one gigabyte of GDDR5 memory. Reference cards have a 110W power envelope, a single six-pin PCIe power connector, and a not-quite-single-slot design with a slightly protruding cooler. (See the image above.) Nvidia's partners are also rolling out variants with 2GB frame buffers. Those will cost a little more, and they may have an edge over their 1GB counterparts when handling higher resolutions, larger textures, and higher levels of antialiasing. However, Nvidia points out there probably won't be much of a difference between 1GB and 2GB variants at the GTX 650 Ti's target gaming resolution of 1920x1080.
Of course, there will be cards with higher-than-reference clock speeds and larger frame buffers—like the one we're going to be benchmarking today.
One last thing. Some of the GTX 650 Ti cards you'll see out there will come with a free license key for Ubisoft's Assassin's Creed III. The game isn't coming out until Halloween, but when it does, it's no doubt going to carry the same $59.99 price tag as any self-respecting triple-A title from a big publisher. Getting it for free with a $149 card sounds like a pretty sweet deal. Not all of Nvidia's partners are participating, however, so you'll want to double-check before making your purchase.