Nvidia’s GeForce GTX 660 Ti graphics card reviewed

Nvidia’s GK104 graphics chip has been all over the place since its initial release back in March. The chip made its debut aboard the GeForce GTX 680, a decidedly high-end card priced at $499. About six weeks later, the GeForce GTX 670 arrived, with a slightly scaled down version of the GK104 for $399. Today, onboard the GeForce GTX 660 Ti, this same graphics chip steps down to the much more accessible price point of $299—while again giving up a relatively small amount of performance. If you’ve been holding out on upgrading to the latest generation of graphics cards, the GTX 660 Ti may finally overcome your resolve.

Here’s one way to think about it. In de-tuning the GK104 in order to make this more affordable version, Nvidia has taken the GeForce GTX 680 and shaved off two Xbox 360 consoles worth of graphics processing power. Not to worry, though: the GTX 660 Ti still has 14 or so Xboxes left.

The nitty gritty

To be more specific, for the GTX 670, one of the GK104’s eight SMX units was disabled, reducing the shader ALU count from 1536 to 1344 and cutting texture filtering power from 128 to 112 texels per clock cycle. Clock speeds were dialed back a bit, too. Trimming those resources had its impact on overall performance, but it was relatively minimal. Our experience with the GTX 670 caused us to declare that we “fail to see the point of spending more on a GeForce GTX 680.”

The GTX 660 Ti retains the SMX lobotomy and adds a couple more tricks to cut costs and rein in performance. First, the GK104’s path to memory has been reduced in width from 256 bits to 192. Strangely, though, the card still has 2GB of GDDR5 memory, not the odd number one might expect. To make such a configuration possible, four of the GK104’s memory controllers have been configured to run in 16-bit mode, while the other four remain in their native 32-bit mode. Thus, memory bandwidth has been trimmed by 25%, but total memory capacity is still a nice, round number. Second, one of the GK104’s four ROP partitions has been disabled, reducing the chip’s pixel fill rate and antialiasing power by a quarter.

Those calibrations were apparently sufficient in Nvidia’s view to keep this card separate from its elder siblings. The GTX 660 Ti runs at the exact same 915MHz base and 980MHz boost clocks as the GTX 670. Crunch all the numbers, and here’s how the cards’ key specs end up looking.

                     Base    Boost   Peak ROP   Texture filtering   Peak     Memory     Memory      Price
                     clock   clock   rate       int8/fp16           shader   transfer   bandwidth
                     (MHz)   (MHz)   (Gpix/s)   (Gtex/s)            tflops   rate       (GB/s)
GeForce GTX 660 Ti   915     980     24         110/110             2.6      6.0 GT/s   144         $299
GeForce GTX 670      915     980     31         110/110             2.6      6.0 GT/s   192         $399
GeForce GTX 680      1006    1058    34         135/135             3.3      6.0 GT/s   192         $499
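Those peak figures fall out of straightforward arithmetic on the clocks and unit counts. As a sanity check, here's a short sketch; the unit counts follow from the description above (seven of eight SMX units, each with 192 ALUs and 16 texture units, and three of four eight-pixel ROP partitions), while the two-FLOPS-per-ALU-per-clock convention is our assumption about how the tflops number was derived.

```python
# Sanity-check the peak-rate arithmetic for the GTX 660 Ti. Unit counts are
# inferred from the article's description of the cut-down GK104; the
# two-FLOPS-per-ALU convention (fused multiply-add) is our assumption.
def peak_specs(boost_mhz, alus, tex_units, rops, mem_gts, bus_bits):
    ghz = boost_mhz / 1000.0
    return {
        "rop_gpix_s": rops * ghz,                  # pixels/clock x clock
        "filter_gtex_s": tex_units * ghz,          # texels/clock x clock
        "shader_tflops": alus * 2 * ghz / 1000.0,  # FMA counts as 2 FLOPS
        "mem_gb_s": mem_gts * bus_bits / 8.0,      # transfer rate x bus width
    }

gtx660ti = peak_specs(980, 1344, 112, 24, 6.0, 192)
# Rounds to the table's 24 Gpix/s, 110 Gtex/s, 2.6 tflops, and 144 GB/s.
```

The same formulas reproduce the GTX 670 and 680 rows when fed their clocks and full (or seven-SMX) unit counts.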

The GTX 660 Ti gives up a little bit of ROP throughput and some memory bandwidth versus the GTX 670, but it still offers a whole lotta Kepler GPU power for a hundred bucks less. We suspect that for a great many folks, this class of graphics card will be more than sufficient.

Nvidia is hoping to entice those who currently own older cards in roughly the same class, such as the GeForce GTX 260 or 470, to upgrade to the new hotness. To sweeten the pot, GeForce GTX 660 Ti cards will come with a really, really attractive incentive: a coupon good for a free copy of Borderlands 2. Of course, you’ll have to wait for the game’s mid-September release date in order to play, but Borderlands 2 is easily the most anticipated title of 2012 within the dank confines of Damage Labs. I will probably disappear for a week or so in September, going vault hunting. Since this game is slated to cost 60 bucks, I’d consider the bundled copy a very nice bonus, to say the least.

Several options

The GTX 660 Ti is slated to be available at online retailers starting today, and indications point to a pretty wide release. We already have three different examples of retail cards on hand for testing.

We’ll start with PNY’s offering, since it kind of sets the bar. This card is based on Nvidia’s reference design (the same one used for the GTX 670), runs at the GTX 660 Ti’s stock speeds, and lists for $299.99. No real wrinkles there.

If you’re contemplating installing one in your PC, here are the basics. Like the GTX 670 before it, this card is 9.5″ in length, but the board itself is only 6.8″. The extra length is just… male enhancement. The board’s max power rating is 150W, and it requires a pair of six-pin aux power inputs. The display output array is visible above; it mirrors that of other GK104-based cards.

PNY’s pitch for this product is straightforward enough. Although higher-clocked variants of the GTX 660 Ti will be a bit faster, this card ought to be adequate for most gamers, and the price is right. Also, Nvidia’s reference coolers are often quite decent. While all of those things are true, we’re a little dubious about this particular reference cooler; it didn’t impress us aboard the GTX 670. We’ll have to see how it fares here.

Next up is MSI’s GeForce GTX 660 Ti Power Edition. This card has tweaked base and boost clocks of 1019 and 1097MHz, although its memory operates at a stock 6 GT/s.

You might have guessed that little else about this puppy is stock. MSI has customized the PCB design, adding an additional power phase, to yield a 5+2 config (versus 4+2 in the reference cards). Then there’s the slick custom cooler, with twin fans and quad heatpipes, whose aluminum fins stretch the 9.5″ length of the board. MSI rates this card for 190W max power, and its design should facilitate overclocking. If you like to live dangerously, the firm claims the Power Edition can achieve “triple overvoltage” via its excellent Afterburner overclocking utility.

MSI plans to charge a premium of just 10 bucks for these extras. You should see the Power Edition at online retailers for $309.99.

Finally, Zotac’s hopped-up variant of the GTX 660 Ti is pictured above next to its GTX 670 big brother.

I’m sorry, but it is so cute.

The circuit board is under 7″ long, and the card measures 7.5″ to the pointed tip of its cooling shroud. Unlike its similarly styled siblings, the Zotac GTX 660 Ti AMP! Edition occupies only two expansion slots—and it is totally primed to slip into a compact Mini-ITX enclosure of some sort.

Believe it or not, this little dynamo is also the fastest of these three GTX 660 Ti cards. The 1033MHz base and 1111MHz boost clocks and 190W TDP are similar to the MSI’s, but Zotac adds much faster GDDR5 memory, with a 6.6 GT/s transfer rate. Not surprisingly, then, the price tag is a little steeper at $329.99.

                        Base    Boost   Peak ROP   Texture filtering   Peak     Memory     Memory      Price
                        clock   clock   rate       int8/fp16           shader   transfer   bandwidth
                        (MHz)   (MHz)   (Gpix/s)   (Gtex/s)            tflops   rate       (GB/s)
PNY GTX 660 Ti          915     980     24         110/110             2.6      6.0 GT/s   144         $299
MSI GTX 660 Ti OC       1020    1098    26         123/123             3.0      6.0 GT/s   144         $309
Zotac GTX 660 Ti AMP!   1033    1111    27         124/124             3.0      6.6 GT/s   159         $329

All told, the differences between these three products aren’t earth-shattering, but it’s worth noting that the two hot-clocked models actually eclipse the stock GTX 670 in terms of peak theoretical texture filtering and shader FLOPS rates.

The competition gets a boost

We’ve been comparing the new GeForce against its siblings in Nvidia’s lineup, but of course its natural competition is in the Radeon camp. Apparently, the prospect of the Radeon HD 7950 competing against all of these hot-clocked GTX 660 Ti cards was making AMD a little bit uneasy.

Last week, AMD sent us a new BIOS to flash to our Radeon HD 7950 reference cards. This BIOS introduces a new feature, first seen in the Radeon HD 7970 GHz Edition, known simply as “boost.” Boost is a fairly straightforward modification to AMD’s PowerTune dynamic voltage and frequency scaling algorithm. Before, PowerTune could only reduce clock speeds in response to cases of unusually high GPU demand and power use. With boost, PowerTune can now raise clock speeds (in 4MHz increments) when the thermal headroom allows. Boost also promises more accurate power estimates that enable higher clock frequencies and fuller residency at those speeds.
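The mechanism can be pictured as a simple control loop. Below is a toy model using the 7950's new 850MHz base and 925MHz boost clocks; only the 4MHz step size comes from AMD's description, while the power-limit figure is a hypothetical placeholder, and real PowerTune works from on-die, activity-based power estimates rather than a single measured number.

```python
# Toy model of PowerTune with boost. Real hardware estimates power on-die;
# the 200W budget here is hypothetical. Only the 4MHz step and the 850/925MHz
# clock range come from the article.
BASE_MHZ, BOOST_MHZ, STEP_MHZ = 850, 925, 4
POWER_LIMIT_W = 200  # hypothetical board power budget

def next_clock(clock_mhz, estimated_power_w):
    if estimated_power_w > POWER_LIMIT_W:
        return max(BASE_MHZ, clock_mhz - STEP_MHZ)  # over budget: step down
    return min(BOOST_MHZ, clock_mhz + STEP_MHZ)     # headroom: step up 4MHz
```

The pre-boost behavior corresponds to the first branch alone: PowerTune could throttle below base but never climb above it.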

The flash process was quick and painless. With the boost BIOS, our original Radeon HD 7950 review cards gained a chunk of new performance. The base clock is now 850MHz, up from 800MHz before, and the new boost peak clock is 925MHz. (Memory frequencies are unchanged.)

AMD tells us these clock speeds with boost constitute a new baseline specification for the Radeon HD 7950. The card’s list price will remain steady at $349, and board makers should start offering boost-enabled cards at online retailers starting today. Sapphire, HIS, and PowerColor are expected to be first to market.

Current owners of Radeon HD 7950 cards will want to pay attention to this next bit. AMD says all of the Radeon HD 7950 chips shipped to date should be able to run at the clock speeds enabled by the new BIOS. That means a simple BIOS flash offers the prospect of a nice little bump in performance for free. In fact, AMD told us it doesn’t mind folks distributing the new, boost-enhanced 7950 BIOS, although we understand this BIOS may not work on cards that vary from the reference design. Eventually, the firm expects board makers to offer their own custom BIOSes, and those may find their way online, as well.

This fact is more notable because the 7950 reference card includes dual BIOSes, with a second, write-protected copy accessible via a switch just behind the CrossFire connectors. In other words, 7950 owners are pretty much free to attempt an upgrade to the boost BIOS without fear of bricking their cards. If the flash fails, just flip to the backup BIOS and recover.

Even more Radeons

We shouldn’t let this focus on the 7950’s enhancements distract from the GTX 660 Ti’s closest competitor, the Radeon HD 7870 GHz Edition.

No, this isn’t the same card pictured on the prior page. This is MSI’s Radeon HD 7870 Hawk, with a very similar custom cooler (though with five heatpipes). The 7870 Hawk is clocked at 1100MHz, 100MHz above the 7870’s stock frequency. MSI asks $319.99 for this card.

Before AMD threw us the boost curve ball, we’d already selected a hot-clocked Radeon HD 7950 to use in this review, MSI’s R7950 OC Edition. This card lacks a boost BIOS but ratchets up the base clock to 880MHz. The upshot, as you’ll see, is performance very similar to the 7950 reference card with the boost BIOS. The R7950 OC Edition has been selling online for a while now at $349.99.

                          Base    Boost   Peak ROP   Texture filtering   Peak     Memory     Memory      Price
                          clock   clock   rate       int8/fp16           shader   transfer   bandwidth
                          (MHz)   (MHz)   (Gpix/s)   (Gtex/s)            tflops   rate       (GB/s)
Radeon HD 7870 GHz        1000    -       32         80/40               2.6      4.8 GT/s   154         $299
MSI R7870 Hawk            1100    -       35         88/44               2.8      4.8 GT/s   154         $319
Radeon HD 7950            800     -       26         90/45               2.9      5.0 GT/s   240         $349
MSI R7950 OC              880     -       28         99/49               3.2      5.0 GT/s   240         $349
Radeon HD 7950 w/Boost    850     925     30         104/52              3.3      5.0 GT/s   240         $349

Both the GeForce and Radeon camps now have the space between $300 and $350 firmly covered with options. Fortunately, so do we. We’ve tested a range of current and older graphics cards in this class. Let’s have a look at how they all compare.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor          Core i7-3820
Motherboard        Gigabyte X79-UD3
Chipset            Intel X79 Express
Memory size        16GB (4 DIMMs)
Memory type        Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz
Memory timings     9-9-11-24 1T
Chipset drivers    INF update 9.3.0.1019, Rapid Storage Technology Enterprise 3.0.0.3020
Audio              Integrated X79/ALC898 with Realtek 6.0.1.6526 drivers
Hard drive         Corsair F240 240GB SATA
Power supply       Corsair AX850
OS                 Windows 7 Ultimate x64 Edition, Service Pack 1, DirectX 11 June 2010 Update

                              Driver                GPU base      Memory        Memory
                              revision              clock (MHz)   clock (MHz)   size (MB)
Galaxy GeForce GTX 470 GC     GeForce 305.37 beta   625           837           1280
Asus GeForce GTX 560 Ti TOP   GeForce 305.37 beta   900           1050          1024
PNY GeForce GTX 660 Ti        GeForce 305.37 beta   915           1502          2048
Zotac GTX 660 Ti AMP!         GeForce 305.37 beta   1033          1652          2048
Zotac GTX 670 AMP!            GeForce 305.37 beta   1098          1652          2048
MSI R7870 Hawk                Catalyst 12.7 beta    1100          1200          2048
MSI R7950 OC                  Catalyst 12.7 beta    880           1250          3072
Radeon HD 7950 w/Boost        Catalyst 12.7 beta    850           1250          3072

Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing either a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Skyrim at 2560×1440 with the Ultra quality presets, 4X MSAA, and FXAA enabled.

  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise level was measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Battlefield 3

We tested Battlefield 3 with all of its DX11 goodness cranked up, including the “Ultra” quality settings with both 4X MSAA and the high-quality version of the post-process FXAA. Our test was conducted in the “Kaffarov” level, for 60 seconds starting at the first checkpoint.


Frame time (ms)   FPS rate
8.3               120
16.7              60
20                50
25                40
33.3              30
50                20

We’ve gathered a tremendous amount of data during our testing. To make it more manageable, we’re trying these new switchable plots. You can click on the buttons to see the GTX 660 Ti compared to various types of cards.

These are plots of the time required to render every frame in a single, representative test run. Lower frame times are preferable, and in some cases, you can see the undesirable spikes representing long-latency frames quite clearly in the plots above—particularly on the older GeForce cards.

If you’re confused by our use of frame times, let me direct you to my article, Inside the second: A new look at game benchmarking, for an introduction to our latency-focused approach to graphics testing. Traditional frame rate averages don’t offer enough resolution to capture the occasional stops and stutters that happen when your PC is just a bit too slow to run a game smoothly. We’ve found that concentrating on frame latencies allows us to evaluate performance more accurately. For reference, the table on the right offers some translations from notable frame time thresholds to FPS rates.
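The conversion behind that table is just the reciprocal, FPS = 1000 / frame time in milliseconds, with the table rounding to whole numbers. A two-line sketch:

```python
# Convert a per-frame render time (ms) to the equivalent steady FPS rate.
def ms_to_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

for ms in (8.3, 16.7, 20, 25, 33.3, 50):
    print(f"{ms:>5} ms -> {ms_to_fps(ms):5.1f} FPS")
```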

We won’t leave out the FPS average, of course, since it’s very familiar.

However, switching to the 99th percentile frame time—probably our best single-number performance summary—illustrates just how poorly the older GeForce cards fare in this test scenario. The time required to render that last ~1% of frames on the GTX 560 Ti and GTX 470 is pretty unfortunate.

A look at the broader latency curve further illuminates the problem. Frame times on the older GeForces stay happily below 40 milliseconds most of the time, but roughly three percent of the frames involve a much longer wait, spiking to 60 milliseconds or more, where you’d really notice the slowdown.

We’ve seen this problem in certain levels of BF3 a number of times before on Fermi-class GeForces. Fortunately, the newer Kepler cards appear to have overcome it. In fact, Zotac’s GTX 660 Ti AMP! hangs with the boost-enhanced Radeon HD 7950.


Our final metric may be my favorite, because it’s about avoiding those long-latency frames that disrupt gameplay. The question is: how much time does each card spend working on really long-latency frames? To answer that, we add up all of the time the cards spend rendering beyond a per-frame threshold.

We start by setting that threshold at 50 ms, which corresponds to 20 FPS, because we think the illusion of motion begins to break down for most folks somewhere around that point. (Movies run at 24 FPS, for instance.) We can then ratchet the threshold down if we want to be even pickier about performance.
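Given a list of per-frame render times, both latency metrics are easy to sketch. One ambiguity worth flagging: the text doesn't spell out whether "time beyond the threshold" counts whole offending frames or only the excess past the threshold; the version below sums the excess, which is our reading.

```python
# Sketches of the two latency metrics, given per-frame render times in ms.
def percentile_99(frame_times_ms):
    # The frame time that 99% of frames come in under.
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

def time_beyond(frame_times_ms, threshold_ms=50.0):
    # Total rendering time accumulated past the threshold on each frame.
    # Assumption: only the excess beyond the threshold counts.
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
```

Ratcheting the threshold down to 33.3 or 16.7 ms is just a matter of passing a different `threshold_ms`.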

As you can see, the two older GeForces are the only cards that produce really worrisome slowdowns. In fact, even at the 33-ms threshold, all of the newer cards are golden. That means they all maintain a frame rate of at least 30 FPS. The 16.7-ms threshold is the toughest test—equivalent to 60 FPS—and there, the GTX 660 Ti cards outperform all of the Radeons we tested, though the PNY card’s margin of victory over the 7950 is slim.

Max Payne 3

Max Payne 3 is a new addition to our test suite, and we should note a couple of things about it. As you’ll notice in the settings image above, we tested with FXAA enabled and multisampling disabled. That’s not the most intensive possible setting for this game, and as you’ll soon see, Max 3 runs quite quickly on all of the cards we’ve tested. We wanted to test with MSAA, but it turns out multisampling simply doesn’t work well in this game. Quite a few edges are left jagged. Even the trick of combining MSAA with FXAA isn’t effective here. Enabling both disables FXAA, somehow. We couldn’t see the point of stressing the GPUs arbitrarily while lowering image quality, so we simply tested with the highest quality setting, which in this case was FXAA.

Also, please note that this test session wasn’t as precisely repeatable as most of our others. We had to shoot and dodge differently each time through, so there was some natural variation from one run to the next, although we kept to the same basic area and path.


Although Max Payne 3 is a very good looking game with huge textures and some nice tessellated objects, it runs quite well on a range of graphics cards with only FXAA-style antialiasing enabled. We cranked things up to Korean 27″ IPS monitor resolution, and performance remained strong even on the older GeForce cards. The FPS averages and 99th percentile frame time results come close to mirroring one another, which suggests we don’t have any major issues with slowdowns.

A look at the broader latency curve confirms it. Even the GTX 470 churns out every single frame in less than 30 ms.


As you might expect, then, there’s not much to see in our measure of “badness.” The 16.7-ms threshold results are somewhat helpful, though. They tell us that all of the new cards will maintain a near-constant 60 FPS.

DiRT Showdown

We’ve added the latest entry in the DiRT series to our test suite at the suggestion of AMD, who has been working with Codemasters for years on optimizations for Eyefinity and DirectX 11. Although Showdown is based on the same game engine as its predecessors, it adds an advanced lighting path that uses DirectCompute to allow fully dynamic lighting. In addition, the game has an optional global illumination feature that approximates the diffusion of light off of surfaces in the scene. We enabled the new lighting path, but global illumination is a little too intensive for at least some of these cards.

This is a fantastic game, by the way. My pulse was pounding at the end of each 90-second test run.


Uh oh. Just a look at the raw frame time plots bodes poorly for the GeForces, all of ’em.

The frequent spikes to ~40 milliseconds on the GeForce GPUs take their toll on the FPS average, but their impact is more readily apparent in the 99th percentile frame times, which are nearly double those of the Radeons.

Frame latencies are higher across the board on the GeForces versus the Radeons, but you can see how the “tail” spikes upward on the GeForce cards somewhere in the last 3-8% of frames. Those frames, we know from the plots, are distributed throughout each test run, not confined to a single spot on the track.


The results from our 50-ms threshold offer a bit of a corrective for us. Although the Radeons are clearly faster overall, even the older GeForces don’t waste much time on really long-latency frames. That jibes with my seat-of-the-pants impression that Showdown was quite playable on all of the cards, even if it wasn’t as smooth on the GeForces as on the Radeons.

The Elder Scrolls V: Skyrim

Our test run for Skyrim was a lap around the town of Whiterun, starting up high at the castle entrance, descending the stairs into the main part of town, and then doing a figure-eight around the main drag.

We set the game to its “Ultra” presets with 4X multisampled antialiasing. We then layered on FXAA post-process anti-aliasing, as well. We also had the high-res texture pack installed, of course.



Ok, two things to note in these results. First, although the legacy GeForce cards churn out FPS averages near the 60 FPS mark, they’re not actually great performers here, as their 99th percentile frame times and latency curves suggest. Some frames just take a while, especially on the GTX 560 Ti, which we suspect is bumping up against a memory size limitation. You can feel the slowdowns when playing.

Second, although the FPS averages suggest some separation between the slowest and fastest current-gen cards, the differences aren’t likely to matter, at least on a 60Hz display. The 99th percentile frame times for those cards are all within a millisecond of one another, and none of them spend much time working on frames longer than 16.7 milliseconds. For Skyrim at these settings, these cards are functionally equivalent.

Batman: Arkham City

We did a little Batman-style free running through the rooftops of Gotham for this one.


The plots show intermittent spikes for all of the cards, which is what we’ve come to expect from this test scenario. We cover a lot of ground in this test, and the game engine has to stream in detail for new areas as we go.

Unfortunately, those spikes are more frequent and severe on the Radeon HD 7870. Although it looks competitive with the GTX 660 Ti in terms of its FPS average, the 7870 drops into last place in our latency-focused 99th percentile frame time metric.


Crysis 2

Our cavalcade of punishing but pretty DirectX 11 games continues with Crysis 2, which we patched with both the DX11 and high-res texture updates.

Notice that we left object image quality at “extreme” rather than “ultra,” in order to avoid the insane over-tessellation of flat surfaces that somehow found its way into the DX11 patch. We tested 60 seconds of gameplay in the level pictured above, where we gunned down several bad guys, making our way across a skywalk to another rooftop.


Doh. The Radeons produce some pretty spiky plots, while the GeForces avoid that problem. The R7870, in particular, suffers here from occasional slowdowns.

Those slowdowns again are somewhat masked by the FPS average, but they act as a drag on the 99th percentile frame time, which allows the new GeForces to sneak into contention against the pricier Radeon HD 7950 cards.


Once more, we’re reminded not to make too much of the spikes we saw in the plots. The Radeon HD 7870 barely registers any time wasted beyond 50 milliseconds. It is slower than the GTX 660 Ti in a meaningful way here, but not disastrously so.

Power consumption

AMD’s ZeroCore power feature gives the Radeons an advantage when the display goes into power-save mode, as you can see. The Radeons drop into a low-power state, spin down their fans, and shave 15W or so off of total system power draw. Without this feature, the GeForces can’t match them.

When running a game—Skyrim, in this case—the GTX 660 Ti cards draw quite a bit less power than anything else in the field, including the competing Radeons. The Kepler architecture has proven to be very power efficient, and that goodness extends to its latest derivative.

Noise levels and GPU temperatures

With their fans spun down in ZeroCore mode, the Radeons approach our system and test environment’s noise floor. Then again, the Zotac GTX 660 Ti card is right there with them, though its fans are spinning.

To me, the biggest story of the results above is the effectiveness of MSI’s custom dual-fan coolers, which capture the top three spots for lowest noise levels under load. MSI’s GTX 660 Ti card is the quietest of the bunch, which makes sense given its more modest power draw; it has less heat to dissipate than the 7870 or 7950.

Beyond that, the Zotac GTX 660 Ti AMP! card deserves some praise for combining the lowest noise levels at idle with a mid-pack performance under load and a modest peak temperature of 67° C. I’d have preferred fan tuning that’s biased a little more toward quiet than cool, but that stubby little cooler does look to be pretty effective.

Meanwhile, PNY’s decision to use the stock Nvidia cooler looks unfortunate, since it combines high temperatures with relatively high noise levels, despite the GTX 660 Ti’s modest power draw (and thus modest heat generation). This cooler still isn’t terribly loud, unlike the boosted reference 7950’s, but the custom coolers simply outperform it.

Conclusions

So, we’ve combed through a huge amount of info. What does it tell us? We can get a sense of things with one of our famous price-performance scatter plots.

Our overall performance numbers come from five of the six games we tested. (We’ve omitted DiRT Showdown because the vast gulf in brand-based performance there skews the results pretty wildly, even though we’re using a geometric mean. Clearly, that game is an outlier of sorts.) Our primary metric is the 99th percentile frame time, which we’ve converted into FPS for this plot, to make it easy to read. As usual, the better positions on this plot will be closer to the top left corner, where low prices meet high performance.


The two GTX 660 Ti cards we ran through our entire test suite are situated very nicely on our scatter plot, with performance to match the two Radeon HD 7950 cards but at lower prices. (Had we been able to run the MSI GTX 660 Ti Power Edition card through our suite, it likely would have placed right between the other two cards, also in a decent spot.) The Radeons are more competitively positioned in the scatter plot based on FPS averages, but we’ve seen why those numbers tend to be a less reliable gauge of gameplay quality. Since the GTX 660 Ti cards also drew substantially less power under load and had noise levels comparable to the best Radeons, we can probably count this one as a win for the GeForce camp.

MSI GTX 660 Ti Power Edition

Zotac GTX 660 Ti AMP! Edition

August 2012

To be specific, we’re bestowing Editor’s Choice distinction upon two of the GeForce GTX 660 Ti cards, the Zotac GTX 660 Ti AMP! Edition and the MSI GTX 660 Ti Power Edition. The PNY card with Nvidia’s reference cooler doesn’t make the cut. We’d happily pay $10 more for MSI’s superior twin-fan offering. The Zotac AMP! card’s solid cooler, squat profile, and 6.6 GT/s memory clock justify the $30 premium for it, as well, in our view. The Zotac card may be the most powerful video card by cubic volume anywhere, yet it’s a perfectly acceptable option to drop into a big tower case.

With that said, we can’t escape the creeping feeling that the performance differences here aren’t terribly meaningful. Most of these cards land within $50 of one another in price and within a few frames per second in overall performance. Even the Radeon HD 7870, which was the slowest of the current cards we tested, didn’t struggle much in our tests—and we were trying to stress these cards using the latest DirectX 11 games. Yes, we could have tested at somewhat more intensive graphical settings, but you would be hard-pressed to notice the visual differences versus the settings we chose. Half the time, we tested at the very high resolution of 2560×1440, too. At 1920×1080, the 7870 would barely have struggled at all.

This current state of GPU potency and parity is very good news indeed for the PC gamer. If you have a 1080p display, take your pick of any of these new cards and enjoy. They are all a worthy upgrade over older cards like the GeForce GTX 470. If you prefer to go the Radeon route, there are some advantages there, including the lower noise and power draw in power-save due to ZeroCore mode and the apparent superiority of the GCN architecture in GPU computing tasks.

If you are looking at a Radeon HD 7950, you may want to watch for an offering with the boost BIOS. We can see now why AMD chose to inject that product with a little more oomph. Without it, the 7950 wouldn’t keep pace with the lower-priced competition. Then again, MSI’s R7950 OC Edition already offers the same basic performance through a mildly tweaked static clock speed.

You can follow my madcap scatter-plotting exploits on Twitter.

Comments closed
    • Bensam123
    • 7 years ago

    This is going to be a bit late, but I really think the graphs on this page should have a 7970 and a 670 (680 possibly) on it as two reference points for stepping up to the next level as well as the 7770 stepping down a price point.

    I think it would make for three very interesting clusters. You can’t accurately compare the two without flipping back and forth between reviews and that’s not the same as comparing apples to apples as the X/Y axis are different.

    Thanks for the 460 and 560 BTW, they’re helpful points. Maybe making them grey or something for ‘depreciated’.

    …best fit line~

    • Chrispy_
    • 7 years ago

    So the 660Ti is a half-decent stopgap which makes GK104 even more affordable, I would need to think twice before recommending a 7870 to someone.

    The elephant in the room is the fact that Nvidia has no competition for the 7850, 7770, or 7750. Although on an enthusiast site like this, $250-500 is a popular price point, we all know from hardware surveys, OEMs, sales figures and online resellers that 90% of the entire graphics card market exists below $250, with the sweet spot being $150-$200. Nvidia has nothing - The rapidly aging 560Ti has been poor value for a while, the 550Ti was always an overpriced oddball that never really performed enough to make it worth the generation gap, and the GT640 is a joke; even GDDR5 and a price cut might not be enough to atone for its sins. The way I see it, Nvidia has been binning GK104 since before release, and the delayed launch of the 670 and now the 660Ti is their best effort to distract us from the fact that they have NOTHING worth buying this generation in the most important price bracket of them all.

      • clone
      • 7 years ago

      HD 7950’s start at $309.99 now; unless the 660 Ti drops in price, it won’t find a niche. HD 7870 and HD 7850 prices also need to drop, but that has been due for quite a while, so I wouldn’t be surprised to see the HD 7870 around $200 and the HD 7850 under $200, making life tough for the 660 Ti.

    • kamikaziechameleon
    • 7 years ago

    At this stage I’m curious how CrossFire and SLI are faring with regard to frame time latency. It seems they’ve ironed that out on the single-GPU cards.

    • `oz
    • 7 years ago

    You list the MSI 660 Ti as one of your Editor’s Choice picks, but the card is missing from all the benchmarks. What gives?

    • Bensam123
    • 7 years ago

    “Here’s one way to think about it. In de-tuning the GK104 in order to make this more affordable version, Nvidia has taken the GeForce GTX 680 and shaved off two Xbox 360 consoles worth of graphics processing power. Not to worry, though: the GTX 660 Ti still has 14 or so Xboxes left.”

    ROFL… Using an Xbox 360 as a unit of measurement to show how awesome new tech is, or how antiquated old tech is. Pure genius. XD

    “We’ve omitted DiRT Showdown because the vast gulf in brand-based performance there skews the results pretty wildly, even though we’re using a geometric mean. Clearly, that game is an outlier of sorts.”

    Perhaps the benchmark should be removed then? If you aren’t actually using it to find a conclusion, it’s sorta pointless to use in the first place.

    Curiously, instead of using a monitor with a high resolution, have you guys tried using one with a higher refresh rate? I think people are more likely to purchase a 120Hz monitor than a high-resolution one… As a gamer, I am. I think it’s more relevant to the benchmarks used as well, and may deliver interesting results as far as performance goes, seeing as video cards have been tuned for 60Hz for like the last decade.

    • rrr
    • 7 years ago

    Solid, but not mind-blowingly impressive. Performance is OK for the price, and so is power consumption.

    • willmore
    • 7 years ago

    Too bad I already pre-ordered Borderlands 2 on Steam, or throwing that in would have been a huge plus for me. Not enough to move me from my <$200 price point, but it really would have gotten my attention.

    • LaChupacabra
    • 7 years ago

    Thanks for adding the 33 and 16 ms buttons!

    =)

    • Star Brood
    • 7 years ago

    A GeForce GTX 650 or Radeon HD 7790 at the $199 price point, and one of these two guys has a sale (whoever gets there first).

    Of course I am probably one in tens of thousands who would buy one/two.

      • rogue426
      • 7 years ago

      Why do the green/red camps not see this? It’s so freakin’ obvious to all of us. Even at the $200-250 price point in the new architectures.

    • Duck
    • 7 years ago

    [quote]four of the GK104's memory controllers have been configured to run in 16-bit mode, while the other four remain in their native 32-bit mode. Thus, memory bandwidth has been trimmed by 25%, but total memory capacity is still a nice, round number. Second, one of the GK104's four ROP partitions has been disabled, reducing the chip's pixel fill rate and antialiasing power by a quarter.[/quote]

    Wait, what? Comparing to the coverage at Anandtech, the GK104 has a total of 4 memory controllers, not 8. One of them has been disabled, along with some L2 cache and 8 ROPs, to create the 660 Ti. With 3 out of 4 memory controllers left, that gives 75% of 256 bits, which is 192. Two of the extra VRAM packages get interleaved on one of the memory controllers to reach the 2GB of VRAM. All makes sense. But TR's explanation I do not really understand. Assuming Anandtech is accurate, I think TR might be wrong.
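    Either way, the bus-width arithmetic is easy to check. A quick sketch (assuming, per the Anandtech description, four 64-bit memory controllers on GK104, one of them disabled):

```python
# Sketch of the 660 Ti memory-bus arithmetic discussed above.
# Assumption (per the Anandtech description cited): GK104 has four
# 64-bit memory controllers, and one is disabled on the 660 Ti.
FULL_CONTROLLERS = 4
CONTROLLER_WIDTH = 64          # bits per controller
active = FULL_CONTROLLERS - 1  # one controller disabled

bus_width = active * CONTROLLER_WIDTH
print(bus_width)  # 192 bits, i.e. 75% of the full 256-bit bus

# The 2GB capacity on a 192-bit bus comes from interleaving extra VRAM
# chips on one of the remaining controllers, so capacity stays round.
print(bus_width / (FULL_CONTROLLERS * CONTROLLER_WIDTH))  # 0.75
```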

    • ALiLPinkMonster
    • 7 years ago

    I was gonna build an Econobox, but seeing how well Kepler has scaled down into each price point, I may have to wait for the 650 and see how it stacks up against the 7770. It’s a win-win too, because by then I may have enough in my savings to swing a Sweet Spot 🙂

    … or I could just buy 15 more Xboxes.

      • HisDivineOrder
      • 7 years ago

      They still haven’t released a GeForce 660 (non-Ti). That’ll be $199 if they continue their pricing scheme, and it’ll probably come in three or four months. So… November, December? Then three or four months later you’ll get your 650 Ti. Then begins the long wait for the 650 non-Ti.

        • Flying Fox
        • 7 years ago

        The last rumour I read was that the non-Ti 660 will slot into the $249 spot.

        • Lazier_Said
        • 7 years ago

        Rumor has it that the standard 660 will be here in about three weeks.

      • kalelovil
      • 7 years ago

      Rumour has it that the 650 will be GK107-based; essentially a 640 with GDDR5.
      If so, the 7770 should offer higher performance.

    • halbhh2
    • 7 years ago

    Another quality review. They are all about the same, except that if you game like 4 hours/day, you want the lower power draw of the 660, unless you love to play Dirt. But for the great majority it will be either the luck of finding something on sale and/or sticking with your brand loyalty.

    For me though, as $100 is really my limit, on religious grounds practically, the choice is an HD 7750 or wait.

    • plonk420
    • 7 years ago

    AWESOME review! This confirms my suspicions about how crappy my 560(/560 Ti) is with Frostbite 2, in spite of its 256-bit memory interface (as opposed to being smoother on my 5870, which I thought was dying and which the 560 was replacing).

    • DPete27
    • 7 years ago

    Can’t we get a couple examples of previous-gen Radeons on the conclusion graphs here? I hate having to flip through 10 different reviews to make comparisons to previous-gen GPUs.

    Maybe you could combine “high end” and “mid-range” graphics cards into two groups, using results you’ve taken since starting the “inside-the-second” stuff? Anandtech and Tom’s Hardware both have benchmark graphs where the reader can easily compare MANY graphics cards to one another, but they only care about average FPS. Benchmark graphs are useful so that all the work you’ve done doesn’t get buried beneath an increasingly large number of reviews.

    • superjawes
    • 7 years ago

    Speaking of specific (and biased) benchmarks, I don’t see a lot of comments about how these cards come with Borderlands 2 codes. I’d be very interested to see how this card performs in that game, and that’s a pretty nice addition for anyone picking up a 660.

      • Guts_not_Dead
      • 7 years ago

      All these cards can play Borderlands 2 maxed out at XHD resolution.

    • Essence
    • 7 years ago

    You’re always going to have games that favor AMD or Nvidia; I think people should be left to decide that. It could be that AMD tries harder on TWIMTBP titles than Nvidia does on AMD-favored games.

    Most other reviews also used aftermarket 660 Ti cards against the 7950 (some with the added BIOS) and “Sapphire” 7870 cards, which tend to be the slowest of all the AIBs (apart from the TOXIC). (An XFX 7870, HIS 7870, etc. would beat a Sapphire 7950.) That’s how bad Sapphire have become.

    OT

    overall a good review as always

    • Chrispy_
    • 7 years ago

    Scott, one word:

    [b]Buttons![/b]

    No going back now; all your GPU reviews need buttons 😉

    Since 120Hz and 3D are slowly gaining popularity and support, could you perhaps add an 8.3 ms button? The higher the average framerate, the more jarring a momentary framerate drop becomes.

    • south side sammy
    • 7 years ago

    I looked around the web. TigerDirect has them. Even a 3GB model, if that wasn’t a misprint… ??

      • superjawes
      • 7 years ago

      [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814162119]Like this?[/url]

      Looks real. And Newegg has several brands in stock... for now...

        • south side sammy
        • 7 years ago

        they didn’t earlier when I looked.

          • superjawes
          • 7 years ago

          I think they literally released a few hours ago, so that’s not surprising to me. I was just pointing out that they were available on Newegg and that Nvidia cards seem to have a habit of selling out.

            • south side sammy
            • 7 years ago

            Yeah, it seems like Newegg lags a little behind the competition when cards are released… anywhere from a few hours to a few days, from what I’ve seen already. But the sold-out thing… it’s good business not to overstock. Better to spend money where needed, and as much but no more than necessary; every business needs to watch how it spends money. And just because you or I might rave about the new card doesn’t mean the majority of people who go to the Egg will.

    • jaychen
    • 7 years ago

    PNY’s GTX 660 Ti also comes with an active 16-foot HDMI cable…
    That should help balance the overall value against the non-reference design cards…

    • steelcity_ballin
    • 7 years ago

    I don’t often log in to comment on these but I realized today just how much more I enjoy your plots and graphs along with your objective reviews in this arena (among others). Thanks for contributing to the better side of all things tech once again.

    This card is so tempting. My 560 Ti is still pulling its weight and I only game at 1920, so I really don’t need it, but man, I want it.

    • Dposcorp
    • 7 years ago

    Out of my price range for the amount of time I have to spare for gaming, but a great review nonetheless. +++ for buttons and “Xboxes worth of performance.” 🙂

    For the record, I agree that any specific game or benchmark that for some reason is totally skewed one way or another, when everything else isn’t, should be eliminated from the total comparison; however, the results always need to be shown so each reader can factor them in for themselves. In other words, I agree 100% with the way it was done here.

    To me, this is no different than stuff that has happened in the past, like:

    1) Certain CPU/GPU vendors working behind the scenes to optimize their products for the most popular benchmarks/programs; once that was discovered, those results needed to be tossed out.

    2) GPU maker “A” comes out with a DX10 card, and vendor “B” comes out with a DX 10.1 card.

    If vendor “B” told me to include a lot of DX10.1 games and benchmarks, I would be hesitant to do that since it seems like cheating.

      • superjawes
      • 7 years ago

      I would be interested in seeing those special results as well, to at least know if a card is specifically suited to the games that I will be playing, either through a roundup article or an “Other Thoughts” page after Conclusions.

      • dpaus
      • 7 years ago

      Equal time for opposing viewpoints: now that Scott has the cool graph buttons working, how about letting us use them to see the price/performance graph ‘raw’ as well as ‘seasonally adjusted’??

      • l33t-g4m3r
      • 7 years ago

      [quote]If vendor "B" told me to include a lot of DX10.1 games and benchmarks, I would be hesitant to do that since it seems like cheating.[/quote]

      Not when DX10.1 had additional features that were separate from DX10. You're comparing apples to oranges to find out if you like oranges better than apples. That's like saying a DX11-to-DX9 comparison is cheating. It's not cheating, but you can't call it a direct comparison either.

      edit: A more accurate analogy would be DX9 SM 3.0 vs. 2.0. You could kinda run HDR in 2.0 if games specifically coded for it, but 3.0 was generally more capable of doing fancy shaders. DX10.1 was faster and more efficient, but that was by design and not a cheat. The only way you could describe it as a cheat would be if somebody was trying to directly compare DX10.1 vs. DX10, which would be deception on the reviewer's part.

    • Hattig
    • 7 years ago

    This looks like a good card, and it’s arrived just in time to compete with AMD’s 7950 and 7870 cards which have barely appeared on the market.

    The real thing I want is for the cards to drop in price, but it looks like 28nm is still not at the stage that this can be done easily.

    As for the Dirt benches, it’s good to see that AMD has been getting some AMD-specific enhancements into games finally. Nvidia has been doing it for ages, what with the infinitely tessellated objects in some games.

    • brucethemoose
    • 7 years ago

    So, I guess this just shows modern games aren’t really limited by memory bandwidth in any meaningful way, even at 2560×1440. The 660 Ti’s meager bandwidth (144.2 GB/s) compared to the 7950’s (240 GB/s) didn’t seem to make much difference.

    Also, where’s our Civ V benchmark?

      • BestJinjo
      • 7 years ago

      You can’t really compare memory bandwidth between AMD and NV. It doesn’t work that way, since the way each architecture uses memory bandwidth along with ROPs is different. Compare the 660 Ti vs. the 670, and you’ll see that the 660 Ti took a huge hit at 2560×1440 in all the games tested in this review (MP3, Skyrim, Batman AC). This is expected, given that GK104 is memory bandwidth limited.

      Coincidentally, the 660 Ti lost every 2560×1440 AA benchmark in this review against the 7950. So while the theoretical gap works out to about 67% more bandwidth (240 vs. 144 GB/s), the 7950/670 do much better at 2560×1440 with AA. At 2560×1440, the 660 Ti is closer to the 7870, not the 7950/670. For high resolution, you’ll definitely be better off with an overclocked 7950, or spending a bit more for a 670.
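      The paper math behind those numbers, for anyone who wants to check it (a quick sketch using the figures quoted in this thread):

```python
# Back-of-the-envelope check of the bandwidth figures quoted in this
# thread: reference HD 7950 vs. GTX 660 Ti.
bw_7950 = 240.0   # GB/s, as quoted above
bw_660ti = 144.2  # GB/s, as quoted above

advantage = bw_7950 / bw_660ti - 1.0
print(f"{advantage:.0%} more theoretical bandwidth")  # ~66% more
```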

        • brucethemoose
        • 7 years ago

        So… the 660 TI is basically the perfect 1080p card (if not overkill), while people running higher resolutions should spend a little bit more for the 7950.

          • Chrispy_
          • 7 years ago

          You don’t need to spend $300 to get a great 1080p card, and the 660Ti isn’t the best choice for resolutions above that.

          A few dollars more gets you a 7950, which has an extra GB of RAM for a bit of future-proofing headroom and no hocus-pocus 192-bit bandwidth handicap, will run better than the 660 Ti at 2560 resolutions, and (if it matters to you) will run OpenCL stuff a helluva lot faster than Kepler; I for one appreciate Photoshop and OpenCL V-Ray rendering on my 6950 at work.

          • clone
          • 7 years ago

          Given that HD 7950’s are selling for as low as $309.99 vs. the 660 Ti’s $299.99… I’m not sure I could agree with that statement even if someone had a gun to my head and I had no way to raise another $10… I’d be lying if I said I agreed, and I’d find a way to get the other $10.

            • brucethemoose
            • 7 years ago

            Didn’t realize 7950s were selling for that. The 660 TI needs to come down to ~$270 to make it more interesting, otherwise the 7950 makes this card pretty irrelevant.

            • clone
            • 7 years ago

            A $40 difference isn’t significant given the additional RAM and bandwidth that come with the HD 7950. The 660 Ti, while very capable, should be $225 to make it compelling… but that still depends on AMD, because if they drop the HD 7850/70 to sub-$200 they’ll own the “sweet spot” and be more than enough for HD gaming.

            Nvidia did an excellent job back in the day when they priced the GTX 460 under $200, which stole the thunder of everything above it, but they don’t have the full lineup this time around that AMD does, which gives AMD the luxury of dictating the market… if they want to.

            • clone
            • 7 years ago

            AMD agreed with me; the HD 7870 is now $219 and the HD 7850 is $180.99.

      • clone
      • 7 years ago

      Xbit labs has it in their review.

    • redwood36
    • 7 years ago

    How stable do you guys think the Zotac 660 Ti would be in the long run? My previous card is an XFX 285, and it needs to be constantly monitored (I have to control the fan speed directly); more than likely this was caused by some irresponsible factory-set clock speed. Anyone have any clue as to whether I’d run into this problem on the Zotac?

    Also, when are we gonna see a Witcher 2 benchmark? I asked about this a while back.
    Finally, awesome review as always.

    • can-a-tuna
    • 7 years ago

    You omitted Dirt Showdown from the graphs because Radeons were just good at it? That’s mighty impressive objective journalism again.

      • Damage
      • 7 years ago

      I omitted the Showdown results from our overall index for several related reasons. First, because AMD told us themselves that they worked directly with CodeMasters to implement a new lighting path in this game engine. That lighting path happens to work very poorly on GPUs produced by AMD’s competitor–so poorly, in fact, that the GeForce results for that game are *half* the speed of the Radeons in the 99th percentile frame times. That’s true despite the fact that these Radeons and GeForces perform comparably in every other scenario tested. Also, the size of the performance gap in Showdown skews the overall results sufficiently that it offers a very different picture than we see in the other five games.
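      To make the skew concrete, here is a toy sketch (illustrative numbers only, not our actual frame-time data) of how a single lopsided title drags down even a geometric mean:

```python
from math import prod

# Toy example: five games where two cards tie, plus one game where
# card B runs at half the frame rate. Illustrative numbers only.
card_a = [50, 50, 50, 50, 50, 50]
card_b = [50, 50, 50, 50, 50, 25]

def geomean(values):
    return prod(values) ** (1 / len(values))

print(geomean(card_a))  # 50.0
print(geomean(card_b))  # ~44.5: one outlier opens an 11% overall gap
```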

      Thus, baking in the Showdown results to our overall index didn’t seem fair.

      There is precedent here, too. We omitted HAWX 2 results from the overall index when it skewed strongly toward Nvidia.

      We did explain the logic behind our decision, and we published the Showdown data for the world to see. If you prefer a different solution, feel free to create your own index that includes that data.

      You are free to disagree with us.

      I will caution you, though. You have shown, over a long stretch of time, that you can be relied upon to offer a bitter complaint about us slighting the Radeon side no matter what we do. Your tone is constantly accusatory and rude, and it seems as if the only outcome you would accept is the conclusion that AMD totally dominates the GPU landscape. That’s simply not congruent with the current reality at all. Also, your hit-and-run tactics are growing wearisome. I may ask the other moderators to consider banning you if you can’t participate constructively in any fashion.

        • chuckula
        • 7 years ago

        Ah snap.. can-a-tuna just got served in a sandwich by the Damage Opener!

        • Arag0n
        • 7 years ago

        To be fair, I think that vendor-specific optimizations should be considered. Those optimizations show the real capabilities of the GPUs. Given no optimizations, it’s likely Showdown would have performed on AMD as it does on Nvidia GPUs. We have seen for years Nvidia boosting performance through driver optimization and working closely with game developers too, and we may argue that it still happens.

        In summary, I don’t care if it’s because a vendor works closely with game developers or because the card itself is better, as long as I get better performance. I would say that a fair thing to do is to include games that are skewed toward both vendors in order to balance the final result. It’s also an important point to know how much faster a card is when optimizations make a difference.

        • novv
        • 7 years ago

        The point here is to be objective: say that if someone really likes the Dirt game series they should buy an AMD card for the best experience, or that if you really like Batman then you should definitely go for Nvidia, and so on. It’s not hard to note that at the end of the article, and it’s what I want to know. It’s already very nice that current cards can easily play any game at 1080p, but there are surely some advantages for AMD owners in some games and for Nvidia owners in others. Now, about the new card from Nvidia: I really like it, but the memory interface is not what I expected (maybe less memory, but 256-bit would have been a better choice).

        • xeridea
        • 7 years ago

        I somewhat agree. Though I would like to add that, besides optimizations, the game would run much better on AMD anyway, because Kepler is really bad at GPGPU, something else that was omitted from the review. There are other games that are going to show similar, though less severe, results now and in the future, so they should be noted. Also, vendor optimizations or not, it doesn’t matter that much; if it runs better, what does that matter to the consumer? I would say to include it, but have it count only half as much or something, rather than totally disregarding it, and do the same for both AMD and Nvidia when there are games like this. It did still get a full page for the game, for those interested, so good job there.

        Love the beyond X ms switcher that was added. Keep up the good work.

          • BestJinjo
          • 7 years ago

          That’s right. When consumers want to play a certain game, they don’t care how the game was coded, only whether it will run fast on their graphics card, whether that’s an HD4000, GTX 690, or HD 7850. Sniper Elite V2, Sleeping Dogs, and Dirt Showdown run poorly due to the GK104’s lack of DirectCompute performance. It’s a deficiency in the architecture, much in the same way NV cards run faster with MSAA in deferred game engines (i.e., Frostbite 2.0 / BF3). Both sides have their advantages and disadvantages, and a reviewer’s job is to point them out.

        • Bensam123
        • 7 years ago

        To be a bit of a devil’s advocate: Nvidia spends tons of money getting games optimized for their cards, and AMD tries to clean up after that. You know all those ‘The way it’s meant to be played’ logos we see all the time? I’d say they’re on the majority of games I play. Unless that is just a logo with relatively little to no actual meaning (which it’s not, from what I’ve read), there is usually a bias.

        So, with the size of Nvidia’s ‘optimization’ budget for working with developers, how fair are benchmarks to begin with? Nvidia may actually have a trend of most developers playing favorites with their cards, while AMD plays with a handicap. Is it actually possible to make an Nvidia card function on the same playing field (without optimizations) as an AMD card, so we can see unbiased statistics? And vice versa.

        Although this would ruin the generalizeability of the review because games ARE optimized, it would most definitely be something interesting worth looking in on.

        (I just want to say that in my post below I opted to have the benchmark completely removed, as it wasn’t used in the final conclusion and was therefore sort of pointless to have in the review in the first place.)

          • RickyTick
          • 7 years ago

          Even though I think it’s misspelled, a +1 for use of the word generalizability. I’m still struggling to pronounce it properly. 🙂

            • Bensam123
            • 7 years ago

            Psychology: Applies to motherfucking everything.

      • Damage
      • 7 years ago

      can-a-tuna’s greatest hits:

      On a multi-GPU round-up, justification for excluding skewed results: “You should disqualify nvidia’s HAWX results: [url]http://www.hardwarecanucks.com/news/video/amd-nvidia-spat-msaa-cheating-hawx-2/[/url]”

      On the GT 640: “The only interesting aspect of this card is that will it blend.”

      On the 7970 GHz review, in which we also used a hot-clocked 7950 Black Edition and then concluded the 7970 GHz is “unambiguously a better value than the stock-clocked GeForce GTX 680.”: “Is Techreport actually become more nvidia biased site than Tom’s? GTX 680 Amp! and GTX 670 Amp! Really?”

      On the 7870 GHz edition review: “Again, very nvidia friendly game titles chosen (I guess all of them are TWIMTBP) but new Radeons still managed to take clean victory. How about adding: Metro 2033, Dirt 3, Deus Ex, Aliens vs Predator,…”

      In the 7950 review comments, a bold new name for Nvidia’s Kepler: “If Krapler is better (when it ever gets out), then AMD sure as hell will drop those prices.”

      On the GTX 560 Ti review: “Scott again with his biased reviews. HD6970 slower than GTX560, right. I’m wondering why AMD even bothers to send you any samples.”

      On our Skyrim article, he demands a recount and suggests something more sinister: “How about running the tests again with with Catalyst 11.11a drivers. Seems that always when AMD gets their performance drivers for certain game ready you just before do these kind of reviews.”

      On “Inside the second,” where we proposed ideas that could revolutionize game benchmarking: <crickets>

      On the GTX 590 review, a personal attack on me: “He is similar to Tom’s ‘Tom’. I’ve been bitchin’ about that but in the end, it doesn’t help so why even bother. They have chosen their path. AMD must be substantially better to break nvidia’s force field. Being just a little better is not enough.”

      On the 6950 review, in response to a resident Nvidiot: “Won’t you just die already?”

        • Dposcorp
        • 7 years ago

        Stop picking on him.
        He is one of the top contributors in the comments section, and his comments are always well thought out, polite, and present arguments for and against, with him never taking one side over another.

        Of course, when a member like him is posting from the Fortress of TinFoil-itude, while using a ouija board to contact Lee Harvey asking him if the second and third shooters were also the guys dressed as astronauts in the New Mexico Desert to simulate the moon landing, so that they could then take off the space suits and quickly hide the aliens, well, every once in a while I expect a bad post.

        And in honor of him, I call BS on this last review.
        How come the Nvidia cards got “14 or so Xboxes worth of performance left,” but the AMD cards got “12 Dreamcasts worth of performance”? How is that apples to oranges? Also, please admit that the AMD cards were all benched on BeOS.

        I found this in the same folder with the AMD results.

        [url]http://www.guidebookgallery.org/pics/gui/interface/dialogs/aboutgui/beos5.png[/url]

        P.S. Don’t ban me bro 🙂

          • derFunkenstein
          • 7 years ago

          You are a beautiful man.

        • flip-mode
        • 7 years ago

        I suppose if you’re not going to ban him, this is the next best response.

        • Meadows
        • 7 years ago

        When did you ever review the 6590? Neither the GPU review archives nor an internet search helped me at all.

          • Meadows
          • 7 years ago

          Good morning, [i]minus-man[/i]; you might want to show me the review if you so disagree.

            • Chrispy_
            • 7 years ago

            You can have another minus for getting needlessly self-righteous over what was so obviously a 6950 typo. What were you hoping to achieve, exactly?

            • superjawes
            • 7 years ago

            You can also add that the typo has been corrected now.

            • Meadows
            • 7 years ago

            I don’t follow Radeon development and the 6590 sounded like a likely model number. It was in earnest.

            • Krogoth
            • 7 years ago

            He’s the archetype of this.

            [url]http://redwing.hutman.net/~mreed/warriorshtm/grammarian.htm[/url]

            • DeadOfKnight
            • 7 years ago

            Where do you find all this crap?

            • dmjifn
            • 7 years ago

            I’m afraid it’s forbidden knowledge. With it, you might become Grokoth, ascendant to – and possibly beyond – Krogoth. No, it’s better if this lore remain buried. For otherwise it would surely mean the end of our species’ sanity, if not our essential humanity.

        • Essence
        • 7 years ago

        Most neutral reviewers get attacked by both camps (AMD & Nvidia). You’re the first I’m seeing attacked by one side only (AMD). The Nvidia fan base must be awfully pleased every time, not having to be at your throat lol

        Edit: well, you’re not the first, to be honest; there have been a couple more

      • chrissodey
      • 7 years ago

      I think XFX hired can-a-tuna as a consultant. He/she must have been the one to convince them to get out of the Nvidia business… I, for one, will never stop buying Nvidia cards. Their cards could be, and have been, slower, but I buy them for their driver and developer support.

        • Meadows
        • 7 years ago

        Same. I buy NVidia because of their drivers.

          • Guts_not_Dead
          • 7 years ago

            Of course, the same old rant about an issue that doesn’t exist. I’ve had my 6970 for over a year now, and I don’t think I’ve had a single crash in any of the games I’ve played with this card.

            • Meadows
            • 7 years ago

            I’m not talking about just crashes, Joe Average.

            • DeadOfKnight
            • 7 years ago

            The issue exists if you’re dual booting.

            • swaaye
            • 7 years ago

            Rage comes to mind. Maybe you missed those 3 months, or didn’t play the game.

            The 38×0 series doesn’t work correctly with Skyrim.

            Catalyst 12.4-5 had funky problems with 4xxx cards and a few games like Skyrim and TOR. Actually, I haven’t checked to see if they fixed this, considering they’ve backburnered those cards.

            Of course NVIDIA has bugs too, but that multi-month Rage event was ridiculous.

            • kc77
            • 7 years ago

            Rage is the WORST example. Carmack himself finally admitted that the game was rushed and not ready. Games typically go through anywhere from a minimum of 3 compatibility tests to as many as 6 or more for AAA titles. Why? To prevent exactly the problems that existed in Rage.

            They didn’t have to rush to make it for the Christmas season; if the game was good, it would have sold well afterwards and beyond.

            In this day and age there is absolutely NO reason to release a game with glaring graphical anomalies. There are far fewer graphics companies now than there were 10 years ago. There’s no S3 ViRGE or Voodoo Banshee to worry about in addition to ATI Rage 128s and Riva TNTs. We’ve got just two graphics manufacturers that need to be addressed, and still we are seeing problems that we shouldn’t be seeing.

            Sorry, that game is a classic case of producers trumping programmers and testers, nothing more.

            • Darkmage
            • 7 years ago

            Just a question: who on earth is still using a 4xxx series card? That’s a 2008 product, if my quick Google skills haven’t failed me.

            I am curious about Nvidia driver support, actually. I’ve been buying Radeons since the X850 or so. Once they went agile with their unified driver package, ATI’s drivers have been very good to me. And when they weren’t, it was fixed within a month.

            In any case, with the advent of simple triple-screen gaming on the Nvidia platform, I’m seriously considering switching over to Nvidia for my next upgrade. So I ask: given an equal level of stability/support/bug fixes from both companies, what are the strengths of Nvidia’s drivers?

      • Guts_not_Dead
      • 7 years ago

      I dare you to find a review that’s this well written and includes all these useful bits of information anywhere else on the web.
      I don’t even look at the average FPS charts anymore, as they’re not even half as representative of the real-world performance of those cards as the 99th-percentile frame times and the time-spent-beyond-16.7/30 ms charts.

      The Dirt Showdown results were not included in the final calculations because those results are a unique case of a game heavily optimized for the Radeons, and they do not show the true picture of the relative performance of those products.

      Lastly, can we get an article from techreport about how bad the texture quality is in Battlefield 3, especially the distant textures and shiny objects?

      • kc77
      • 7 years ago

      Unfortunately I would have to disagree with you. I can’t stand biased reviews and I will point them out regardless of the manufacturer. However, lately I would say for at least 7 months TR has been far more on point then the rest…. especially with the latest video card releases. I would even go as far to say their video card reviews lately have been the best out of the bunch.

      Feel free to take a look around; there are only two sites that tested the new Boost feature in the same review, Anand and TR. Out of those, only TR went into detail about the fact that other 7950s could obtain the new BIOS. Anand? They mention it… sort of. They spent most of the review complaining about free performance.

      With TR putting the Boost cards right in there with the other AMP cards it showed that it could give AMD the same treatment as Nvidia within a review. It was fair and honest.

      It’s OK to disagree. I’ve done it TONS of times. However, the facts aren’t with you on this one.

      • travbrad
      • 7 years ago

      They omitted the HAWX results back in the 6870/6850 review too, when the Geforce cards were waaaay faster in that game due to a similar developer relationship advantage.

      So I guess Tech Report is biased against AMD and Nvidia at the same time.

    • sircharles32
    • 7 years ago

    The important question (to me), is how is this going to affect the Radeon HD 7850 price?
    That’s all I really care about.

    • Madman
    • 7 years ago

    Performance expressed in Xboxes is EPIC! Love it!

    • Krogoth
    • 7 years ago

    660 Ti = Nvidia’s direct answer to 7870, coming from sub-par GK104 yields.

    They both trade blows with each other and go for the same price point.

    The marginal differences come from the differences in architecture and how the software utilizes it.

    In any event, any of the cards in the testing suite can effortlessly handle 2Megapixel gaming, even the aging 470 although it does struggle a bit when you throw AA/AF at it under the newest titles.

      • Duck
      • 7 years ago

      I’m not so sure.

      First, it’s nvidia’s 570 replacement. Plus I thought its most direct competitor from AMD was meant to be the 7950 (based on performance anyway). The 7870 is lower performance, no?

      Second, if it’s binned from broken GPUs, I would expect it to have fewer shaders and maybe slightly lower clocks compared to the GTX670. That’s what makes up the bulk of the die size, whereas the memory controller is really small in comparison. It seems more like they cut down the card just to hit a lower price point. But IMO it should have just come with 1.5GB of VRAM onboard, and be even more competitive on price.

        • Krogoth
        • 7 years ago

        7950 pulls itself ahead of the 660Ti and 7870.

        660Ti is a little slower or faster than 7870, depending on the game in question.

        Some of the reviews are using factory OC’ed 660 Tis, which some readers are overlooking.

          • Duck
          • 7 years ago

          Factory overclocked 660 Ti beats the factory overclocked 7870 by a massive amount in BF3, about 20% faster in average FPS. In Crysis 2, the 7870 is clearly trailing. Skyrim is too close to call (0.3 ms covers the 99th-percentile frame times for most of the cards). Portal 2 or another Source engine game is missing :(. Don’t really care about the other games like DiRT.

            • Krogoth
            • 7 years ago

            calling 5-15% “massive”? That’s quite the overstatement.

            • Duck
            • 7 years ago

            Huh?! I said “…a massive amount in BF3. About 20% faster…”.

        • willmore
        • 7 years ago

        I guess, if the memory controllers are faulty and not just intentionally disabled, we’d see four versions of the card, each with different banks not populated. I noticed some empty pads on the back of one of the 660 Ti cards. But that could just be because it’s a reference design which has all four channels laid out.

        I agree that the memory controllers aren’t huge in terms of silicon area, but don’t forget that the drive transistors are also part of the memory channel and they are fairly large–in relation to the other transistors, that is.

          • Duck
          • 7 years ago

          The larger the transistors are, the less likely they are to be broken (I think I just made that up by the way, but it sounds right).

            • willmore
            • 7 years ago

            The more area something occupies, the more chance that area will include a flaw. Too much of a flaw and the transistor won’t perform to spec, i.e. fail at speed/temp. Now, yeah, larger area averages out whatever flaw may be there: a large transistor with a little flaw may still work, while a little one will be completely corrupted.

            So, yes and no.
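            The “yes and no” above can be made concrete with a toy Poisson defect-density model (illustrative numbers only, not real process data): the chance that a silicon region of area A contains at least one defect grows with A.

```python
import math

# Toy Poisson yield model: with defect density D (defects per mm^2),
# P(region of area A contains >= 1 defect) = 1 - exp(-D * A).
# D = 0.5 defects/mm^2 is an arbitrary illustrative value.
def p_has_defect(area_mm2, density_per_mm2=0.5):
    return 1.0 - math.exp(-density_per_mm2 * area_mm2)

print(p_has_defect(0.01))  # small device: ~0.5% chance of catching a flaw
print(p_has_defect(0.10))  # 10x larger drive transistor: ~4.9% chance
```

            A bigger device is more likely to contain a flaw, but a large transistor can often tolerate a small one, so area alone doesn’t decide whether the part survives binning.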

      • Meadows
      • 7 years ago

      Sorry it took this long, but I finally see a pattern. Allow me to demonstrate my [u]Krogoth Generator[/u]:

      Brand Model = X’s DIRECT ANSWER to Y <may add random rationalisation according to taste>. Blah blah TRADE BLOWS blah blah PRICE POINT. Yip yap yadda not impressed, not at all. Blah blah blah EFFORTLESSLY HANDLE <X> MEGAPIXEL GAMING blah blah yadda blahdy WHEN YOU THROW AA/AF AT IT yappity yap.

      See how authentic it is? From now on, I’ll pay close attention to your comments on GPU reviews, because I’m certain you don’t even read them. I’ve seen the above template from you on at least two previous occasions by now.

        • Krogoth
        • 7 years ago

        Still agitated?

        • Meadows
        • 7 years ago

        [url]https://techreport.com/discussions.x/22890?post=635802#635802[/url] (you only used the generator starting halfway in this one)

        This is better: [url]https://techreport.com/discussions.x/23150?post=646638#646638[/url]

          • DeadOfKnight
          • 7 years ago

          Actually he seemed fairly impressed in both of those comments; that demolishes your Krogoth generator.

            • Meadows
            • 7 years ago

            Crap, now I have to patch the software.

        • rogue426
        • 7 years ago

        Are the 2 of you married?

          • Meadows
          • 7 years ago

          Good god, I hope not.

    • l33t-g4m3r
    • 7 years ago

    What I gathered from this review is that the 7870 is now pointless, and the 7950 needs a price drop. I’ve been leaning toward a 7950, but now it’s a toss up for price. The batman benchmark here is totally useless since dx11 is turned off, so I don’t know how the 660 actually performs in that game. Feels like a repeat of TR’s Metro 2033 benchmark using medium (dx9) with tessellation. These goofy settings are in no way representative of how people are actually going to play the game with that card.

    One thing I’d like to know more about is fp16. How many games use it, is it even a limiting factor, and what would amd’s performance be compared to nvidia’s cards.

    Edit: Oh hey look! I was right. The 7950 did get a price drop.

      • Krogoth
      • 7 years ago

      What are you smoking? 660 Ti is 7870’s direct competitor. They have similar performance and are going for the same price point ($299).

        • l33t-g4m3r
        • 7 years ago

        Wrong.
        [quote]the GTX 660 Ti is a solid and relatively consistent 10-15% faster than the 7870, while the 7950 is anywhere between a bit faster to a bit slower depending on what benchmarks you favor.[/quote]

        [quote]AMD has already bracketed the GTX 660 Ti by positioning the 7870 below it and the 7950 above it, putting them in a good position to fend off NVIDIA.[/quote]

        -Anand

          • Krogoth
          • 7 years ago

          Check the numbers and figures again.

          It is a close race.

          Before this, Nvidia had nothing in the $299 range that could stand against the 7870 (the 570 was slower and hotter). The 660 Ti addresses that.

          • brucethemoose
          • 7 years ago

          If you leave your cards at stock, ya.

          But the 7950 has gobs of OC headroom. For those of us who overclock their GPUs, the 7950 sits squarely in GTX 670 OC territory, above the 660 TI.

          Pitcairn simply doesn’t have the OC headroom Tahiti has, and neither does the 660 TI.

            • BestJinjo
            • 7 years ago

            With overclocking there is no contest in terms of performance: the 7950 wins hands down, though it’s not as power efficient as the 660 Ti. The 7950 comes very close to a GTX 680 at only 1.03 GHz, and many can overclock to 1.15-1.20 GHz:

            [url]http://www.techpowerup.com/reviews/Sapphire/HD_7950_Flex/31.html[/url]

      • clone
      • 7 years ago

      you’re talking about a card that’ll run any game at 1920x1080 maxed out pretty easily.

      GTX 660 just got reviewed this wee…. no tod…. ok hours ago.

      I’d love to see HD 7850’s & 70’s for $125 – $225 which would signal it’s time for AMD to drop all of the old HD 6xxx inventory they’ve been pushing so aggressively in that price range.

      on the other side of things, Nvidia now has 3 cards that perform notably close while priced radically differently. personally I don’t believe the market would miss the GTX 680 very much, since it was largely unavailable until 6 weeks ago & offered negligible headroom when overclocked. a “respin” of the GTX 670 called “GHz+” would seem possible, while improving yields at the same time.

      • BestJinjo
      • 7 years ago

      What I gathered from your post is you use 1 review to decide how fast cards perform. How about 10+ reviews?
      [url<]http://www.3dcenter.org/artikel/launch-analyse-nvidia-geforce-gtx-660-ti/launch-analyse-nvidia-geforce-gtx-660-ti-seite-2[/url<] What I also gathered is you don't like overclocking your cards? HD7950 @ 1.1-1.2ghz > OCed 660Ti. Not saying that 660Ti isn't a good card but 7950 doesn't need a price drop since across a variety of reviews it's neck and neck at 800mhz but plenty of 7950s already come preoverclocked with 880-900mhz clocks for $320 at which point they are 5% faster than a stock 660Ti and have additional 30-40% overclocking headroom, 3GB of VRAM for SKYRIM ENB mods and ability to handle MSAA should you want to crank it to 8AA. Both cards are good, with Borderlands 2 making the 660Ti a good choice for the non-overclocker / BF3 / Guild Wars 2 player. However, someone who wants to play a wider variety of games and likes overclocking will be better off with a Gigabyte Windforce 3x/ MSI TwinFrozr 3 7950 @ 1.1-1.2ghz as those cards > HD7970/GTX680. That's a bargain that 660Ti can't touch since it is crippled with 24 ROP / 192-bit bus, which means it wont really have a shot beat a 680, not with MSAA.

        • clone
        • 7 years ago

        everyone is comparing early launch prices as if they matter. when AMD launched the HD 7770 at a silly price, everyone condemned it because the price was silly; now that things have settled down, instead of the $150 launch price the cards sell for less than $110, making the discussion at the time pointless.

        the GTX 660 is in much the same situation: it’s launching high, and it looks weak against the HD 7950, but where will it fit once its price point gets adjusted? talk has been that it’s to fight with the HD 7870 and HD 7850, where depending on price it looks really nice.

        the most concerning issue I have with the review is the inconsistency in the results from one game to another. have the results ever been so mixed? it’s brutal, almost to the point where specific games need specific cards. if video games hadn’t stalled in development for the past 5 years for the sake of the consoles currently on the market, I wonder if the situation wouldn’t be drastically worse, with games unplayable depending on brand.

    • cegras
    • 7 years ago

    I noticed that the GTX 660 seems to have a lot more spread in the frame time versus the 7870, although I’m not sure the wiggling is noticeable. Could you quantify this with a standard deviation (sigma) or variance about the average?

      • Damage
      • 7 years ago

      Pretty sure our current methods are better at capturing the problem (high-latency frames) than the metrics you suggest. Variance on the low side of the mean is no bad thing in a real-time graphics system.

        • cegras
        • 7 years ago

        Right, the variance in, say, BF3 with the 660 Ti and the 7870 is well within the unnoticeable regime, but just for interest I wonder what the average of the 99th-percentile frame times plus a variance / standard deviation would look like. For example, 99% of frames can be under 30 ms, but what if the average was 25 +/- 5? Is that better than a card with 20 +/- 10? 27 +/- 3?
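        For what it’s worth, the two summaries being compared here can coexist. A quick sketch with entirely hypothetical frame-time samples shows two “cards” sharing the same 99th-percentile frame time while differing wildly in spread:

```python
import statistics

# Report the 99th-percentile frame time (simple nearest-rank cut)
# alongside the mean and standard deviation of the samples.
def frame_stats(frame_times_ms):
    xs = sorted(frame_times_ms)
    idx = min(len(xs) - 1, (99 * len(xs)) // 100)  # integer math, no float index
    return xs[idx], statistics.mean(xs), statistics.stdev(xs)

# Two hypothetical cards: same 99th percentile, very different consistency.
card_a = [25.0] * 98 + [28.0, 30.0]          # tight cluster around 25 ms
card_b = [15.0, 25.0] * 49 + [29.0, 30.0]    # constantly see-sawing

print(frame_stats(card_a))  # p99 = 30.0, small standard deviation
print(frame_stats(card_b))  # p99 = 30.0, much larger standard deviation
```

        Both percentile and variance summarize the distribution, but the percentile directly bounds the worst frames a player sees, which is why it is arguably the more useful single number.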

    • Ryhadar
    • 7 years ago

    The performance of the recent new GPUs has been phenomenal, but it’s hard for me to get excited about them when they’re all $300+.

    If I were looking for a new card today I’d probably grab one of the following:
    – 7850
    – 560 Ti
    – 6950

    • Arclight
    • 7 years ago

    I expected more, but meh if it brings a price war that will be nice. But dang, look at the 7870 Hawk

      • Ryhadar
      • 7 years ago

      I hope so. But the fact of the matter is nVidia STILL doesn’t have a complete line-up of cards to fill out their 600 series.

      Even if nVidia starts to sell more 660 Ti’s than 7950s or 7870s, AMD still has the rest of the 7 series below the $300 mark, where nVidia has no contemporary competitor; losses at the high end can easily be recouped at the lower end.

      Also, newegg has no stock of 660 Ti’s at the moment. That doesn’t bode well for any price drops in the coming weeks if nVidia’s partners can’t keep the stock of cards up.

        • ultima_trev
        • 7 years ago

        Except that the bulk of low end sales will be Intel’s HD 4xxx GPUs, which are good enough that it’s not worth the price premium for anything in the Radeon HD 66xx, HD 67xx or HD 77xx series.

          • Washer
          • 7 years ago

          I think you’re confused. The HD7770 costs roughly $130. Intel’s HD4000 IGP on the desktop is only available in Core i5 3570K ($220~) and up.

          Take a look at these benchmark results, pay attention to the settings.

          HD4000: [url]https://techreport.com/articles.x/22835/17[/url]
          HD7770 (and cheaper cards): [url]https://techreport.com/articles.x/22473/7[/url]

          Skyrim shows us a miserable, barely playable experience for the HD4000 versus 1080p 8xAA 8xAF High settings on a HD7770. Go to the BF3 results to see an even greater difference. That’s the difference between a nice experience (1080p medium in BF3 is still nice) and unplayable.

          You don’t even need to spend the $130 for a HD7770. Buy a HD6850 used for cheap and get a similar performance level.

    • Forge
    • 7 years ago

    Looks nice. The green team has a suitable competitor at $300, at least.

    Now I want AMD to respond with something that justifies them staying around. Failing that, we’re going to need to start ramping up a new discrete GPU OEM. Intel and Nvidia have goals too different to really compete with each other.

      • BestJinjo
      • 7 years ago

      HD7970 costs less and HD7950 overclocks better and can handle MSAA rather well. Thus AMD does have 2 competing cards priced around the 660Ti. Perhaps you forgot that 7950 has 30-40% overclocking headroom? You are on an enthusiast board 🙂

    • superjawes
    • 7 years ago

    OMG. BUTTONS!

    Wow, Scott, that’s just a little thing that makes the review awesome. Well done.

      • MadManOriginal
      • 7 years ago

      Yup, and they actually work on touchscreen devices unlike ‘mouse rollover’ comparisons.

    • kamikaziechameleon
    • 7 years ago

    Would have liked to see a 680 or a 7970 for some real perspective. Beyond that, I’m seeing cards like the 680 coming with substantial overclocks or extra-large memory buffers. I was wondering how a 4GB 680 would fare against an overclocked 2GB card and a stock-clocked card.

      • MadManOriginal
      • 7 years ago

      4GB GTX 670 review: [url]http://www.xbitlabs.com/articles/graphics/display/evga-geforce-gtx-670-4gb.html[/url]

      Only matters for a very few games in multi-monitor setups.

        • kamikaziechameleon
        • 7 years ago

        Thanks. Jumping to the conclusion: sounds like it’s sorta worthless. You’d be better off with SLI or CrossFire.

    • derFunkenstein
    • 7 years ago

    Holy crap I love pressing buttons. This layout is just great.

    Also, this is still more video card than I want or need. I’m only running 1080p, so a “650” is apparently where I’m looking.

      • Flying Fox
      • 7 years ago

      660 without the Ti at 200-250 should be good for you. I am sort of waiting and seeing on that one too. However, with the game coupon the Ti is almost like 250.

        • derFunkenstein
        • 7 years ago

        Yeah, it’s definitely something to consider for lots of folks. If I was interested in Borderlands, I’d be able to talk myself into it much easier for sure.

    • rechicero
    • 7 years ago

    I’d say the open bench setup is unfair to the PNY offering. I’d really want to see the noise and temp graphs with a standard case (one front fan, one fan in the back)… And yes, I know different cases would mean different results, but a standard case would be a good reference. An open bench is a best-case scenario for these multiple-fan coolers… and virtually nobody will use these cards in those conditions.

    Maybe it doesn’t matter, but I’d like to see at least one review on that, for reference. Maybe with a Xonar: a card sitting just in front of the fans, without an exhaust system to create a positive flow to the space between cards. I’d like to know how much of the heated air would get recirculated in those conditions.

    • chuckula
    • 7 years ago

    If Nvidia can keep the supplies moving it looks like a winner.

    • sweatshopking
    • 7 years ago

    price drop by amd is now necessary

    first

      • MadManOriginal
      • 7 years ago

      first to actually read the article :p CHEATER!

      A little bit of a shame about the price versus past x60 cards. I guess GTX 460-like pricing was too much to ask given yield and supply constraints and NV using the same GK104 chip.

      Switchable graphs are awesome btw.

        • sweatshopking
        • 7 years ago

        well, your first part is right. i didn’t read it first.

        as for the price, yeah, it is higher, but it’s also faster. idk. they’d like to see the 620 sell for 200 bones. i won’t buy any in the near (3-4 years) future, so to me, it’s irrelevant.

          • derFunkenstein
          • 7 years ago

          Didn’t the 560Ti launch at $300? I think this is a reasonable price given the performance. It’s just far more performance than most people will even appreciate, let alone need.

            • Krogoth
            • 7 years ago

            Correct, the 560Ti 384 did launch at $299 price point.

            • sweatshopking
            • 7 years ago

            well then! sounds like it’s consistent!

          • Srsly_Bro
          • 7 years ago

          Troll first, read never! +1

      • cegras
      • 7 years ago

      Many 7870s can be found under 300

      [url]http://pcpartpicker.com/parts/video-card/#c=82,110[/url]

      • south side sammy
      • 7 years ago

      somehow, after reading other reviews, I can’t see this card being priced at $300. But I might be one of those few people who don’t think FPS is the only way to judge a card’s performance.

      • Krogoth
      • 7 years ago

      Not really.

        The 660 Ti doesn’t change anything, unless Nvidia decides to throw it out at $249. But that would cannibalize their remaining 560 Ti and 570 sales, which should be going on discount as we speak.

        • derFunkenstein
        • 7 years ago

        570Ti?
