Spectacular dual-GPU graphics cards are all well and good if you’re trying to entertain your friends before taking them to the golf course in your Porsche SUV, then heading out to a five-star Thai restaurant where waiters massage your feet as you eat. The rest of us tend to shop in a more reasonable price range—you know, because silky smooth frame rates at ridiculous resolutions aren’t necessarily worth missing rent.
According to Nvidia, most e-tail graphics card sales are made at the $200 price point. That’s your GeForce GTX 460 1GB or GTX 560 Ti, your Radeon HD 6850 or 6950. These GPUs typically handle 1080p resolutions with ease, letting gamers sprinkle on eye candy and antialiasing as they see fit. No wonder cards of this caliber are often recommended in our system guides and deal-of-the-week posts. They’re as fast as most folks are gonna need.
Some insist on seeking out cheaper alternatives, and Nvidia’s latest creation caters to those penny pinchers. With a $149 suggested e-tail price, the GeForce GTX 550 Ti shouldn’t put much of a dent in your credit card statement. Heck, it’s affordable enough to fit into our el-cheapo Econobox build from the latest TR System Guide. Nevertheless, this newcomer should purportedly drive games happily at 1680×1050 with a smack of antialiasing, a bit like you might drive your Honda Civic to your one-bedroom apartment. It’s the kind of “good enough” one might want to upgrade from… eventually.
The GeForce GTX 550 Ti, pictured above in souped-up MSI and Zotac variants, is powered by a new GF116 graphics processor. “New” is kind of a strong word, though. Nvidia has been replacing its GeForce 400-series GPUs with refreshed versions over the past several months, and this latest revamped GPU follows the same pattern. The GF116 is architecturally the same as the GF106 chip found inside the $129 GeForce GTS 450 that debuted last September, but this time around, all of the chip’s key units are enabled.
We’ve seen this pattern before with practically all of Nvidia’s graphics processors derived from the Fermi architecture. Nvidia took the earlier GPU, retained the same basic architecture and unit counts, and did extensive design work to better adapt it to TSMC’s 40-nm fabrication process. The resulting GPU could tolerate higher clock speeds at lower voltages without the need to disable major portions of the chip. Thus, the GeForce GTX 460 gave way to the much faster GeForce GTX 560 Ti, to cite one example.
In the same vein, the GeForce GTX 550 Ti comes along behind the GTS 450. The GTS 450 is staying put at around $129 for now, while the GTX 550 Ti will occupy a middle ground between the GTS 450 and the GTX 460 1GB. (If you’re wondering about the GeForce GTX 460 768MB and 460 SE, Nvidia says those parts are reaching end-of-life status, so they might not stick around for too long.)
The GeForce GTX 550 Ti’s GF116 GPU has one graphics processing cluster with four streaming multiprocessors, each packing 48 ALUs (a.k.a. “CUDA cores” in Nvidia parlance) and a texture block capable of filtering eight texels per clock. That means 192 ALUs and 32 texels/clock in total. The GF116 rounds this off with three ROP partitions that can push a total of 24 pixels/clock, plus three 64-bit memory controllers that make up a combined 192-bit memory interface.
Those who read our GeForce GTS 450 review may recall that the GTS 450’s GF106 GPU technically has the same capabilities. However, Nvidia disabled a ROP partition and the third memory controller, restricting that card to 16 pixels/clock of output and an aggregate 128-bit memory interface. The GTX 550 Ti has no such handicaps.
Two other notable attributes differentiate the GTX 550 Ti from its slower, less gifted sibling. First, base clock speeds have gone up quite a bit. Where the GTS 450 ran at 783MHz with a 1566MHz shader speed and a 900MHz GDDR5 memory clock (for an effective 3600 MT/s), the GTX 550 Ti’s base spec calls for 900MHz core, 1800MHz shader, and 4104 MT/s memory speeds. Retail cards are even quicker, as we’ll see in a minute.
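If you like your specs back-of-the-envelope style, here's a quick tally of the theoretical peaks those unit counts and clocks imply for the base-spec GTX 550 Ti (a sketch, using only the figures quoted above):

```python
# Back-of-the-envelope theoretical peaks for the GTX 550 Ti's base spec,
# using the unit counts and clocks quoted above.
rops = 24             # pixels/clock across three ROP partitions
texels_per_clk = 32   # four SMs, each filtering eight texels/clock
alus = 192            # "CUDA cores"
core_mhz = 900
shader_mhz = 1800
mem_mts = 4104        # effective GDDR5 transfer rate, MT/s
bus_bits = 192        # three 64-bit memory controllers

pixel_fill = rops * core_mhz / 1000            # Gpixels/s
texel_rate = texels_per_clk * core_mhz / 1000  # Gtexels/s
gflops = alus * 2 * shader_mhz / 1000          # a fused multiply-add counts as two flops
bandwidth_gbs = mem_mts * bus_bits / 8 / 1000  # GB/s

# Works out to 21.6 Gpixels/s, 28.8 Gtexels/s, ~691 GFLOPS, and ~98.5 GB/s.
```

Those figures line up with the theoretical-speeds chart later in this review.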
Second, Nvidia has incorporated some special sauce that allows the GTX 550 Ti to feature an even 1GB of RAM despite its lopsided memory interface. Normally, you’d want each of a GPU’s memory controllers to have the same amount of memory at its disposal. That’s why cards with 192-bit memory interfaces are often seen carrying 768MB of RAM, or 256MB per memory controller. The GeForce GTX 550 Ti arranges its memory differently.
There are still six chips—two per memory controller. However, four of those are 128MB chips, while the remaining two have 256MB of capacity. In other words, two of the memory controllers shoulder 256MB each, and the third controller has 512MB all to itself. Nvidia says the GTX 550 Ti is its first ever graphics card to ship with such a mixed memory configuration. Supporting it required the implementation of custom logic in both the drivers and the GF116 GPU.
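In sketch form, here's how the chips divvy up among the GF116's three controllers (chip counts and densities as described above):

```python
# The GTX 550 Ti's mixed memory layout: six GDDR5 chips, two per 64-bit
# controller, but with two different chip densities (sizes in MB).
controllers = [(128, 128), (128, 128), (256, 256)]

per_controller_mb = [sum(pair) for pair in controllers]  # [256, 256, 512]
total_mb = sum(per_controller_mb)                        # 1024 MB, i.e. an even 1GB

# A uniform build with 128MB chips would yield 3 x 256MB = 768MB, the
# capacity traditionally seen on 192-bit cards.
```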
Now, given the need to balance bandwidth between the GF116’s three controllers, it would appear unlikely the GTX 550 Ti is making full use of the extra 256MB on its third controller—even with drivers attempting to take advantage. This mixed memory config might therefore not be so much a technical feat as a marketing ploy, whereby the extra RAM’s chief purpose could be to let Nvidia slap a nice “1GB” on the box so that the card matches up well against competing Radeons in the minds of less informed consumers. “768MB” does look a little unsightly these days, and as the company keenly pointed out while briefing us, outfitting the GTX 550 Ti with 1.5GB of RAM (512MB per controller) wouldn’t have been cost-effective.
We’ll see how the GTX 550 Ti scales as we crank up the resolution soon. First, though, let’s have a closer look at a couple of cards.
Owing to the prevalence of video cards with higher-than-normal clock speeds in the Nvidia camp, neither of the GeForce GTX 550 Ti variants that found their way into our labs follows Nvidia’s base specification.
The lower-clocked of the two is MSI’s GeForce GTX 550 Ti Cyclone II, named after its rather flamboyant cooler. MSI runs this card’s GPU and memory 50MHz quicker than Nvidia’s prescribed base speeds, resulting in a 950MHz GPU clock, 1900MHz shader clock, and 4300 MT/s memory transfer rate. In spite of those fairly substantial increases and the fancy cooler, the Cyclone II carries a $154.99 price tag—only five bucks above Nvidia’s suggested price.
Zotac’s GeForce GTX 550 Ti AMP! Edition takes things to the next level with core, shader, and memory speeds of 1000MHz, 2000MHz, and 4400 MT/s, respectively. The cooler provided here isn’t as extravagant as the MSI design, but it’s a custom one nonetheless. (Nvidia’s stock GTX 550 Ti cooler resembles the rather bland one strapped to the GTS 450.) The Zotac card also costs $154.99.
Update: We originally wrote that the Zotac GeForce GTX 550 Ti AMP! Edition would retail for $169. That information, which we received from the manufacturer, turned out to be incorrect, as listings at Newegg and other e-tailers show. We’ve updated this review accordingly.
A little competish’
Now, what’s cooking on AMD’s side of the fence?
Ladies and gentlemen, give a big hand to Gigabyte’s Radeon HD 5770 Super Overclock. The Super Overclock label is admittedly a bit of a misnomer, since all this card has to show for it is a 50MHz GPU speed hike to 900MHz. Still, with a $139.99 asking price at Newegg, this card could prove to be a compelling alternative to the GeForce GTX 550 Ti—if it can keep up, that is.
Another alternative is the vanilla Radeon HD 6850, which AMD says is now available for as little as $149.99 after mail-in rebates. Newegg indeed sells one such card for $169.99 with a $20 MIR, but other variants will set you back at least $175 (or $160 after rebate). Depending on whether you mind waiting weeks for a rebate check that may just end up behind a dumpster somewhere, the 6850 could turn out to be a better choice than either Nvidia’s newcomer or Gigabyte’s riced Radeon HD 5770.
Alongside these mail-in rebates, AMD is fighting back against Nvidia’s onslaught with publicly available Catalyst 11.4 preview drivers, which promise meaty performance increases in a number of titles for owners of Radeon HD 6800- and 6900-series cards who play at high resolutions with antialiasing. AMD talks of increases of as much as 70% in Civilization V, 49% in Call of Duty: Black Ops, and 33% in Left 4 Dead 2, to name a few. We’ll be using this driver to benchmark our Radeons over the next few pages.
| Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering (Gtexels/s) | Peak bilinear FP16 filtering (Gtexels/s) | Peak shader arithmetic (GFLOPS) | Peak rasterization rate (Mtris/s) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- |
| GeForce GTS 450 | 12.5 | 25.1 | 25.1 | 601 | 783 | 57.7 |
| GeForce GTS 450 AMP! | 14.0 | 28.0 | 28.0 | 672 | 875 | 64.0 |
| GeForce GTX 550 Ti | 21.6 | 28.8 | 28.8 | 691 | 900 | 98.5 |
| GeForce GTX 550 Ti Cyclone | 22.8 | 30.4 | 30.4 | 730 | 950 | 103 |
| GeForce GTX 550 Ti AMP! | 24.0 | 32.0 | 32.0 | 768 | 1000 | 106 |
| GeForce GTX 460 768MB | 16.2 | 37.8 | 37.8 | 907 | 1350 | 86.4 |
| GeForce GTX 460 1GB | 21.6 | 37.8 | 37.8 | 907 | 1350 | 115 |
| GeForce GTX 560 Ti | 26.3 | 52.6 | 52.6 | 1263 | 1644 | 128 |
| Radeon HD 5770 | 13.6 | 34.0 | 17.0 | 1360 | 850 | 76.8 |
| Radeon HD 5770 SOC | 14.4 | 36.0 | 18.0 | 1440 | 900 | 76.8 |
| Radeon HD 6850 | 24.8 | 37.2 | 18.6 | 1488 | 775 | 128 |
| Radeon HD 6870 | 28.8 | 50.4 | 25.2 | 2016 | 900 | 134 |
| Radeon HD 6950 | 25.6 | 70.4 | 35.2 | 2253 | 1600 | 160 |
This wouldn’t be a TR graphics review without a geeky chart showing all of the key contestants’ raw theoretical speeds. As you can see above, the GTX 550 Ti’s high clock rates and still-healthy ROP count allow it to keep up with the GeForce GTX 460 1GB’s peak pixel fill rate. Otherwise, it sits more or less between the GTX 460 and the old GeForce GTS 450 on the theoretical scale—pretty much where you’d expect.
In the AMD camp, the Radeon HD 5770 looks somewhat outmatched overall, especially since AMD’s peak theoretical numbers tend to be less representative of real-world performance than Nvidia’s. The Radeon HD 6850 looks to be in a better position to give the GeForce GTX 550 Ti a whupping, though. Let’s see how that potential translates into real-world frame rates.
Our testing methods
To keep things even, we tested the Radeons with AMD’s Catalyst 11.4 preview driver and the GeForces with a fresh beta driver Nvidia provided last week. We also configured our Radeons’ Catalyst Control Panel like so, leaving optional AMD optimizations for tessellation and texture filtering disabled.
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.
Our test system was configured as follows:
| Component | Details |
| --- | --- |
| Processor | Intel Core i5-750 |
| North bridge | Intel P55 Express |
| Memory size | 4GB (2 DIMMs) |
| Memory type | Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz |
| Memory timings | 9-9-9-24 1T |
| Chipset drivers | INF update 188.8.131.525, Rapid Storage Technology 10.1.0.1008 |
| Audio | Integrated, with Realtek R2.57 drivers |
| Graphics | Gigabyte Radeon HD 5770 Super OC 1GB with Catalyst 11.4 preview drivers |
| | XFX Radeon HD 6850 1GB with Catalyst 11.4 preview drivers |
| | Zotac GeForce GTS 450 1GB AMP! Edition with GeForce 267.59 beta drivers |
| | MSI GeForce GTX 550 Ti Cyclone II 1GB with GeForce 267.59 beta drivers |
| | Zotac GeForce GTX 550 Ti AMP! Edition 1GB with GeForce 267.59 beta drivers |
| | Zotac GeForce GTX 460 1GB with GeForce 267.59 beta drivers |
| Hard drive | Samsung SpinPoint F1 HD103UJ 1TB SATA |
| Power supply | Corsair HX750W 750W |
| OS | Windows 7 Ultimate x64 Edition, Service Pack 1 |
Thanks to Intel, Kingston, Samsung, MSI, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
We used the following test applications:
- Aliens vs. Predator benchmark
- Sid Meier’s Civilization V
- Just Cause 2
- Metro 2033
- Fraps 3.3.2
- GPU-Z 0.5.1
Some further notes on our methods:
- Many of our performance tests are scripted and repeatable, but for Bulletstorm, we used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence.
- We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1920×1200 resolution with 4X AA and 16X anisotropic filtering.
- We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card. You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
- We used GPU-Z to log GPU temperatures during our load testing.
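For the curious, the median-of-runs reporting described above boils down to something like this. The frame rates below are made-up placeholders, not measured results:

```python
# Report the median of repeated benchmark runs; the median shrugs off a
# single outlier pass better than the mean does.
from statistics import median

avg_fps_runs = [58.1, 60.3, 59.7, 57.9, 60.0]  # five hypothetical Fraps passes
low_fps_runs = [41.0, 44.2, 43.5, 42.8, 43.1]

reported_avg = median(avg_fps_runs)  # 59.7
reported_low = median(low_fps_runs)  # 43.1
```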
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Bulletstorm
I’ve made no secret of my appreciation for Bulletstorm‘s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. This shooter was tested at 1680×1050 with 2X antialiasing, medium post-processing quality, and other detail settings cranked up. Since this game has no built-in benchmarking mode, I played through the first 90 seconds of the “Hideout” echo five times per card, reporting the median of average and low frame rates obtained.
The Radeons are off to an impressive start, leaving even the GeForce GTX 460 1GB in the dust. Do note, however, that the Radeon HD 5770’s minimum frame rate is no higher than the GTX 550 Ti’s.
As Scott pointed out in his review of the Radeon HD 6990 earlier this month, AMD’s recent driver releases have improved multisampled antialiasing performance, especially in games that use deferred shading—a list that includes Bulletstorm among other Unreal Engine 3-based titles. So, the numbers above aren’t entirely unexpected (or inexplicable).
Sid Meier’s Civilization V
Civ V has several interesting tests, including a built-in compute shader benchmark that measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.
The GeForce GTX 550 Ti cards do exceedingly well here. We’re not quite sure why the GTX 550 Ti is faster than the GTX 460, but we suspect the GTX 550 Ti’s high clock speeds and ROP throughput are playing a role. As we noted earlier, its configuration enables pixel fill rates equal to or greater than the GeForce GTX 460 1GB’s.
In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.
At 1680×1050 with 4X antialiasing, even the GeForce GTS 450 barely breaks a sweat rendering these scenes, at least on average.
Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.
Frame rates are quite a bit lower here, although they’re no real cause for concern in a real-time strategy game lacking in fast camera movements. Competitively speaking, Nvidia has the upper hand, with MSI’s GeForce GTX 550 Ti Cyclone II matching the Radeon HD 6850.
Just Cause 2
Although it’s not the newest kid on the block, JC2 is a good example of a relatively resource-intensive game with flashy DirectX 10 visuals. It doesn’t hurt that the game has a huge, open world and addictively fun gameplay, either.
This title supports a couple of visual effects generated by Nvidia’s CUDA GPU-computing API, but we’ve left them disabled for our testing. The CUDA effects are only used sparingly in the game, anyhow, and we’d like to keep things even between the different GPU brands.
We tested performance with JC2‘s built-in benchmark, using the “Dark Tower” sequence.
No question about it, ponying up for a Radeon HD 6850 pays dividends in Just Cause 2. The GF106- and GF116-based GeForces are palpably slower, as is the 5770.
Metro 2033
Sometimes, and especially with low-end GPUs like the GeForce GTX 550 Ti, treating yourself to a decent amount of fancy shader effects without killing frame rates means having to sacrifice antialiasing. I ran Metro 2033‘s built-in benchmark using the “High” graphical preset with 16X anisotropic filtering and no antialiasing. PhysX effects were left disabled to ensure a fair fight between all of our contestants.
The Radeon HD 6850 pulls ahead of both GTX 550 Ti cards again, although Nvidia’s own GeForce GTX 460 1GB ends up at the top of the podium. The souped-up Radeon HD 5770 does reasonably well, keeping up with the slower of the two GTX 550 Ti variants.
Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.
For these tests, we turned up all of the image quality options to their maximums, along with 2X antialiasing and 16X anisotropic filtering.
This test gives us a chance to see how our contestants fare as we turn up the resolution. At these settings, the GTX 550 Ti cards seem to handle resolution scaling slightly better than the Radeon HD 5770, whose 1GB of RAM is hooked up to a conventional memory configuration. We should of course point out that Nvidia’s older GeForce GTX 460 768MB also fared well at higher resolutions in this test, so the 550 Ti’s extra RAM might not even come into play here.
That said, hitting smooth frame rates at 1680×1050 with the 550 Ti cards involves disabling either antialiasing or some of the eye candy. (Disabling AA seemed to have a more pronounced effect when we were tinkering with the settings.) The Radeon HD 6850, meanwhile, maintains an average of just over 30 FPS even at 1080p with antialiasing enabled.
Power consumption
At idle, the two new GeForces sip power. Heck, the GTX 550 Ti Cyclone actually draws less than the slower (albeit slightly souped-up) GeForce GTS 450 AMP! Edition. Neither Radeon is anywhere near as modest with its power utilization.
Put these puppies under a load, and the picture is reversed, with the faster of the two new GeForces pulling an extra 20W over the Radeon HD 6850. That 1GHz core clock speed comes at a cost, apparently.
Noise levels and GPU temperatures
The two 550 Ti variants, and especially the faster Zotac one, are surprisingly quiet under load. The Radeon HD 6850 isn’t quite so lucky. Not only is it a good 4 dB louder at peak, but I’d also say the pitch of its fan is more bothersome even at idle.
On the temperatures front, it’s clear which cards are tuned to stay cool and which are optimized for low noise levels. MSI’s Cyclone II cooler proves its effectiveness in the slower GTX 550 Ti, while the Radeon HD 6850 sits in the middle of the pack.
Conclusions
Take our probably inadequate sample size and mash the performance numbers together with pricing data into a big, potentially misleading scatter plot, and this is what we get:
For the record, the performance data used to fashion the plot above were averaged from all of our 1680×1050 game benchmarks, minus Civilization V‘s leader and compute-shader tests. Prices were pulled from Newegg and Amazon for the particular cards we tested. You’ll likely find slightly cheaper variants of some of the offerings above, though, like that PowerColor Radeon HD 6850 we mentioned on page two.
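In case the recipe isn't obvious, the plot's points can be sketched like so. The per-game frame rates below are illustrative placeholders rather than our actual data; the prices are the ones quoted earlier in this review:

```python
# Average each card's FPS across the 1680x1050 game tests, then pair the
# result with its street price to get one (price, performance) point per card.
fps_by_game = {
    "GTX 550 Ti Cyclone II": [52.0, 41.0, 38.0, 35.0],  # placeholder numbers
    "Radeon HD 6850":        [60.0, 45.0, 47.0, 40.0],
}
price = {"GTX 550 Ti Cyclone II": 154.99, "Radeon HD 6850": 169.99}

points = {card: (price[card], sum(runs) / len(runs))
          for card, runs in fps_by_game.items()}
# Each point is (price, mean FPS); frames per dollar is simply y / x.
```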
This scatter plot tells us a few things. First, the GeForce GTX 550 Ti sits, as intended, between the GeForce GTS 450 and the GTX 460 1GB on the performance front. It’s perhaps a little closer to the former than the latter, but it’s in the middle nonetheless.
At the same time, the current pricing landscape seems to favor slightly more upscale cards like the GeForce GTX 460 1GB and Radeon HD 6850, which deliver substantially higher frame rates pretty much across the board for not a whole lot of extra money. You’re looking at a paltry $15-25 step up from the GeForce GTX 550 Ti Cyclone to those faster cards… and that’s before the mail-in rebates that can bring the 6850 to $149.99 and the GTX 460 1GB to $159.99.
The GTX 550 Ti’s saving grace could be its lower noise levels with the coolers we tested. Considering the fact that the Radeon HD 6850 doesn’t actually have higher power draw under load, though, I would assume it’s just as easy to cool quietly with the right third-party solution (the card we tested had a stock AMD cooler). The large fan on that $169.99 PowerColor Radeon HD 6850 looks promising in that regard, although we’ll reserve judgment until we’ve gotten a chance to hear it in action.
Ultimately, the GeForce GTX 550 Ti is a tough product to recommend. Were it offered as a replacement to the GeForce GTS 450 at $129, it’d be a no-brainer for cash-strapped gamers. Who knows—perhaps future price cuts will take it there. Right now, however, those users would be better off setting aside an extra sawbuck or two and moving up the food chain.