Nvidia’s GeForce GTX 550 Ti graphics processor

Spectacular dual-GPU graphics cards are all well and good if you’re trying to entertain your friends before taking them to the golf course in your Porsche SUV, then heading out to a five-star Thai restaurant where waiters massage your feet as you eat. The rest of us tend to shop in a more reasonable price range—you know, because silky smooth frame rates at ridiculous resolutions aren’t necessarily worth missing rent.

According to Nvidia, most e-tail graphics card sales are made at the $200 price point. That’s your GeForce GTX 460 1GB or GTX 560 Ti, your Radeon HD 6850 or 6950. These GPUs typically handle 1080p resolutions with ease, letting gamers sprinkle on eye candy and antialiasing as they see fit. No wonder cards of this caliber are often recommended in our system guides and deal-of-the-week posts. They’re as fast as most folks are gonna need.

Some insist on seeking out cheaper alternatives, and Nvidia’s latest creation caters to those penny pinchers. With a $149 suggested e-tail price, the GeForce GTX 550 Ti shouldn’t put much of a dent in your credit card statement. Heck, it’s affordable enough to fit into our el-cheapo Econobox build from the latest TR System Guide. Nevertheless, this newcomer should purportedly drive games happily at 1680×1050 with a smack of antialiasing, a bit like you might drive your Honda Civic to your one-bedroom apartment. It’s the kind of “good enough” one might want to upgrade from… eventually.

The GeForce GTX 550 Ti, pictured above in souped-up MSI and Zotac variants, is powered by a new GF116 graphics processor. “New” is kind of a strong word, though. Nvidia has been replacing its GeForce 400-series GPUs with refreshed versions over the past several months, and this latest revamped GPU follows the same pattern. The GF116 is architecturally the same as the GF106 chip found inside the $129 GeForce GTS 450 that debuted last September, but this time around, all of the chip’s key units are enabled.

We’ve seen this pattern before with practically all of Nvidia’s graphics processors derived from the Fermi architecture. Nvidia took the earlier GPU, retained the same basic architecture and unit counts, and did extensive design work to better adapt it to TSMC’s 40-nm fabrication process. The resulting GPU could tolerate higher clock speeds at lower voltages without the need to disable major portions of the chip. Thus, the GeForce GTX 460 gave way to the much faster GeForce GTX 560 Ti, to cite one example.

In the same vein, the GeForce GTX 550 Ti comes along behind the GTS 450. The GTS 450 is staying put at around $129 for now, while the GTX 550 Ti will occupy a middle ground between the GTS 450 and the GTX 460 1GB. (If you’re wondering about the GeForce GTX 460 768MB and 460 SE, Nvidia says those parts are reaching end-of-life status, so they might not stick around for too long.)

The GeForce GTX 550 Ti’s GF116 GPU has one graphics processing cluster with four streaming multiprocessors, each packing 48 ALUs (a.k.a. “CUDA cores” in Nvidia parlance) and a texture block capable of filtering eight texels per clock. That means 192 ALUs and 32 texels/clock in total. The GF116 rounds this off with three ROP partitions that can push a total of 24 pixels/clock, plus three 64-bit memory controllers that make up a combined 192-bit memory interface.

Those who read our GeForce GTS 450 review may recall that the GTS 450’s GF106 GPU technically has the same capabilities. However, Nvidia disabled a ROP partition and the third memory controller, restricting that card to 16 pixels/clock of output and an aggregate 128-bit memory interface. The GTX 550 Ti has no such handicaps.

Two other notable attributes differentiate the GTX 550 Ti from its slower, less gifted sibling. First, base clock speeds have gone up quite a bit. Where the GTS 450 ran at 783MHz with a 1566MHz shader speed and a 900MHz GDDR5 memory clock (for an effective 3600 MT/s), the GTX 550 Ti’s base spec calls for 900MHz core, 1800MHz shader, and 4104 MT/s memory speeds. Retail cards are even quicker, as we’ll see in a minute.
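Those unit counts and clocks translate into theoretical peaks with simple arithmetic. Here's a back-of-the-envelope sketch using only the figures quoted above:

```python
# Back-of-the-envelope peak rates for the GTX 550 Ti's base specification,
# using only the unit counts and clocks quoted above.

ALUS       = 192     # "CUDA cores": 4 SMs x 48
TEXELS_CLK = 32      # texels filtered per clock
PIXELS_CLK = 24      # 3 ROP partitions x 8 pixels/clock
BUS_BITS   = 192     # three 64-bit memory controllers

core_mhz   = 900
shader_mhz = 1800
mem_mts    = 4104    # effective GDDR5 transfer rate, MT/s

pixel_fill = PIXELS_CLK * core_mhz / 1000       # Gpixels/s
texel_rate = TEXELS_CLK * core_mhz / 1000       # Gtexels/s
gflops     = ALUS * shader_mhz * 2 / 1000       # 2 flops per fused multiply-add
bandwidth  = BUS_BITS / 8 * mem_mts / 1000      # GB/s

print(pixel_fill, texel_rate, gflops, bandwidth)
# -> 21.6 28.8 691.2 98.496
```

Those results line up with the theoretical-speeds chart later in this review: 21.6 Gpixels/s, 28.8 Gtexels/s, roughly 691 GFLOPS, and about 98.5 GB/s of memory bandwidth.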

Second, Nvidia has incorporated some special sauce that allows the GTX 550 Ti to feature an even 1GB of RAM despite its lopsided memory interface. Normally, you’d want each of a GPU’s memory controllers to have the same amount of memory at its disposal. That’s why cards with 192-bit memory interfaces are often seen carrying 768MB of RAM, or 256MB per memory controller. The GeForce GTX 550 Ti arranges its memory differently.

There are still six chips—two per memory controller. However, four of those are 128MB chips, while the remaining two have 256MB of capacity. In other words, two of the memory controllers shoulder 256MB each, and the third controller has 512MB all to itself. Nvidia says the GTX 550 Ti is its first ever graphics card to ship with such a mixed memory configuration. Supporting it required the implementation of custom logic in both the drivers and the GF116 GPU.
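The arithmetic behind that arrangement is easy to sketch. The chip counts and densities below are as described above; the pairing of chips onto controllers is illustrative:

```python
# Illustrative sketch of the GTX 550 Ti's mixed memory arrangement:
# six GDDR5 chips, two per 64-bit controller, with mismatched densities.

chips_mb = [128, 128, 128, 128, 256, 256]    # four small chips, two large ones

# Pair the chips off onto the GF116's three memory controllers.
controllers = [chips_mb[0] + chips_mb[1],    # 256MB
               chips_mb[2] + chips_mb[3],    # 256MB
               chips_mb[4] + chips_mb[5]]    # 512MB

assert controllers == [256, 256, 512]
assert sum(controllers) == 1024              # the even "1GB" on the box

# A uniform 192-bit configuration would instead need 256MB per controller
# (768MB total) or 512MB per controller (1.5GB total).
```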

Now, given the need to balance bandwidth between the GF116’s three controllers, it would appear unlikely the GTX 550 Ti is making full use of the extra 256MB on its third controller—even with drivers attempting to take advantage. This mixed memory config might therefore not be so much a technical feat as a marketing ploy, whereby the extra RAM’s chief purpose could be to let Nvidia slap a nice “1GB” on the box so that the card matches up well against competing Radeons in the minds of less informed consumers. “768MB” does look a little unsightly these days, and as the company keenly pointed out while briefing us, outfitting the GTX 550 Ti with 1.5GB of RAM (512MB per controller) wouldn’t have been cost-effective.

We’ll see how the GTX 550 Ti scales as we crank up the resolution soon. First, though, let’s have a closer look at a couple of cards.

The cards

Higher-than-normal clock speeds are prevalent in the Nvidia camp, and neither of the GeForce GTX 550 Ti variants that found their way into our labs follows Nvidia’s base specification.

The lowest-clocked of the two is MSI’s GeForce GTX 550 Ti Cyclone II, named after its rather flamboyant cooler. MSI runs this card’s GPU and memory 50MHz quicker than Nvidia’s prescribed base speeds, resulting in a 950MHz GPU clock, 1900MHz shader clock, and 4300 MT/s memory transfer rate. In spite of those fairly substantial increases and the fancy cooler, the Cyclone II carries a $154.99 price tag—only five bucks above Nvidia’s suggested price.

Zotac’s GeForce GTX 550 Ti AMP! Edition takes things to the next level with core, shader, and memory speeds of 1000MHz, 2000MHz, and 4400 MT/s, respectively. The cooler provided here isn’t as extravagant as the MSI design, but it’s a custom one nonetheless. (Nvidia’s stock GTX 550 Ti cooler resembles the rather bland one strapped to the GTS 450.) The Zotac card also costs $154.99.

Update: We originally wrote that the Zotac GeForce GTX 550 Ti AMP! Edition would retail for $169. That information, which we received from the manufacturer, turned out to be incorrect, as listings at Newegg and other e-tailers show. We’ve updated this review accordingly.

A little competish’

Now, what’s cooking on AMD’s side of the fence?

Ladies and gentlemen, give a big hand to Gigabyte’s Radeon HD 5770 Super Overclock. The Super Overclock label is admittedly a bit of a misnomer, since all this card has to show for it is a 50MHz GPU speed hike to 900MHz. Still, with a $139.99 asking price at Newegg, this card could prove to be a compelling alternative to the GeForce GTX 550 Ti—if it can keep up, that is.

Another alternative is the vanilla Radeon HD 6850, which AMD says is now available for as little as $149.99 after mail-in rebates. Newegg indeed sells one such card for $169.99 with a $20 MIR, but other variants will set you back at least $175 (or $160 after rebate). Depending on whether you mind waiting weeks for a rebate check that may just end up behind a dumpster somewhere, the 6850 could turn out to be a better choice than either Nvidia’s newcomer or Gigabyte’s riced Radeon HD 5770.

Alongside these mail-in rebates, AMD is fighting back against Nvidia’s onslaught with publicly available Catalyst 11.4 preview drivers, which promise meaty performance increases in a number of titles for owners of Radeon HD 6800- and 6900-series cards who play at high resolutions with antialiasing. AMD talks of increases of as much as 70% in Civilization V, 49% in Call of Duty: Black Ops, and 33% in Left 4 Dead 2, to name a few. We’ll be using this driver to benchmark our Radeons over the next few pages.

                             Peak pixel    Peak bilinear    Peak bilinear    Peak shader     Peak             Memory
                             fill rate     integer texel    FP16 texel       arithmetic      rasterization    bandwidth
                             (Gpixels/s)   filtering rate   filtering rate   rate (GFLOPS)   rate (Mtris/s)   (GB/s)
                                           (Gtexels/s)      (Gtexels/s)

GeForce GTS 450              12.5          25.1             25.1             601             783              57.7
GeForce GTS 450 AMP!         14.0          28.0             28.0             672             875              64.0
GeForce GTX 550 Ti           21.6          28.8             28.8             691             900              98.5
GeForce GTX 550 Ti Cyclone   22.8          30.4             30.4             730             950              103
GeForce GTX 550 Ti AMP!      24.0          32.0             32.0             768             1000             106
GeForce GTX 460 768MB        16.2          37.8             37.8             907             1350             86.4
GeForce GTX 460 1GB          21.6          37.8             37.8             907             1350             115
GeForce GTX 560 Ti           26.3          52.6             52.6             1263            1644             128
Radeon HD 5770               13.6          34.0             17.0             1360            850              76.8
Radeon HD 5770 SOC           14.4          36.0             18.0             1440            900              76.8
Radeon HD 6850               24.8          37.2             18.6             1488            775              128
Radeon HD 6870               28.8          50.4             25.2             2016            900              134
Radeon HD 6950               25.6          70.4             35.2             2253            1600             160

This wouldn’t be a TR graphics review without a geeky chart showing all of the key contestants’ raw theoretical speeds. As you can see above, the GTX 550 Ti’s high clock rates and still-healthy ROP count allow it to keep up with the GeForce GTX 460 1GB’s peak pixel fill rate. Otherwise, it sits more or less between the GTX 460 and the old GeForce GTS 450 on the theoretical scale—pretty much where you’d expect.

In the AMD camp, the Radeon HD 5770 looks somewhat outmatched overall, especially since AMD’s peak theoretical numbers tend to be less representative of real-world performance than Nvidia’s. The Radeon HD 6850 looks to be in a better position to give the GeForce GTX 550 Ti a whupping, though. Let’s see how that potential translates into real-world frame rates.

Our testing methods

To keep things even, we tested the Radeons with AMD’s Catalyst 11.4 preview driver and the GeForces with a fresh beta driver Nvidia provided last week. We also configured our Radeons’ Catalyst Control Panel to leave optional AMD optimizations for tessellation and texture filtering disabled.

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.
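The aggregation itself is trivial; here's a minimal sketch, with made-up FPS numbers standing in for real runs:

```python
# Minimal sketch of how repeated benchmark runs are aggregated:
# report the median so a single outlier pass can't skew the number.
# The FPS values here are made up for illustration.

from statistics import median

def report(fps_runs):
    """fps_runs: the average FPS recorded in each repeat of one test."""
    return median(fps_runs)

print(report([58.1, 61.4, 57.9]))   # -> 58.1
```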

Our test system was configured as follows:

Processor          Intel Core i5-750
Motherboard        MSI P55-GD65
North bridge       Intel P55 Express
South bridge
Memory size        4GB (2 DIMMs)
Memory type        Kingston HyperX KHX2133C9AD3X2K2/4GX
                   DDR3 SDRAM at 1333MHz
Memory timings     9-9-9-24 1T
Chipset drivers    INF update
                   Rapid Storage Technology
Audio              Integrated ALC889
                   with Realtek R2.57 drivers
Graphics           Gigabyte Radeon HD 5770 Super OC 1GB
                   with Catalyst 11.4 preview drivers
                   XFX Radeon HD 6850 1GB
                   with Catalyst 11.4 preview drivers
                   Zotac GeForce GTS 450 1GB AMP! Edition
                   with GeForce 267.59 beta drivers
                   MSI GeForce GTX 550 Ti Cyclone II 1GB
                   with GeForce 267.59 beta drivers
                   Zotac GeForce GTX 550 Ti AMP! Edition 1GB
                   with GeForce 267.59 beta drivers
                   Zotac GeForce GTX 460 1GB
                   with GeForce 267.59 beta drivers
Hard drive         Samsung SpinPoint F1 HD103UJ 1TB SATA
Power supply       Corsair HX750W 750W
OS                 Windows 7 Ultimate x64 Edition
                   Service Pack 1

Thanks to Intel, Kingston, Samsung, MSI, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for Bulletstorm, we used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence.

  • We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1920×1200 resolution with 4X AA and 16X anisotropic filtering.

  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm
I’ve made no secret of my appreciation for Bulletstorm‘s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. This shooter was tested at 1680×1050 with 2X antialiasing, medium post-processing quality, and other detail settings cranked up. Since this game has no built-in benchmarking mode, I played through the first 90 seconds of the “Hideout” echo five times per card, reporting the median of average and low frame rates obtained.

The Radeons are off to an impressive start, leaving even the GeForce GTX 460 1GB in the dust. Do note, however, that the Radeon HD 5770’s minimum frame rate is no higher than the GTX 550 Ti’s.

As Scott pointed out in his review of the Radeon HD 6990 earlier this month, AMD’s recent driver releases have improved multisampled antialiasing performance, especially in games that use deferred shading—a list that includes Bulletstorm among other Unreal Engine 3-based titles. So, the numbers above aren’t entirely unexpected (or inexplicable).

Civilization V
Civ V has several interesting tests, including a built-in compute shader benchmark that measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.

The GeForce GTX 550 Ti cards do exceedingly well here. We’re not quite sure why the GTX 550 Ti is faster than the GTX 460, but we suspect the GTX 550 Ti’s higher clock speeds and ROP rates are playing a role. As we noted earlier, its configuration enables pixel fill rates equivalent to or greater than those of the GeForce GTX 460 1GB.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

At 1680×1050 with 4X antialiasing, even the GeForce GTS 450 barely breaks a sweat rendering these scenes, at least on average.

Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

Frame rates are quite a bit lower here, although they’re no real cause for concern in a real-time strategy game lacking in fast camera movements. Competitively speaking, Nvidia has the upper hand, with MSI’s GeForce GTX 550 Ti Cyclone II matching the Radeon HD 6850.

Just Cause 2

Although it’s not the newest kid on the block, JC2 is a good example of a relatively resource-intensive game with flashy DirectX 10 visuals. It doesn’t hurt that the game has a huge, open world and addictively fun gameplay, either.

This title supports a couple of visual effects generated by Nvidia’s CUDA GPU-computing API, but we’ve left them disabled for our testing. The CUDA effects are only used sparingly in the game, anyhow, and we’d like to keep things even between the different GPU brands.

We tested performance with JC2‘s built-in benchmark, using the “Dark Tower” sequence.

No question about it, ponying up for a Radeon HD 6850 pays dividends in Just Cause 2. The GF106- and GF116-based GeForces are palpably slower, as is the 5770.

Metro 2033

Sometimes, and especially with low-end GPUs like the GeForce GTX 550 Ti, treating yourself to a decent amount of fancy shader effects without killing frame rates means having to sacrifice antialiasing. I ran Metro 2033‘s built-in benchmark using the “High” graphical preset with 16X anisotropic filtering and no antialiasing. PhysX effects were left disabled to ensure a fair fight between all of our contestants.

The Radeon HD 6850 pulls ahead of both GTX 550 Ti cards again, although Nvidia’s own GeForce GTX 460 1GB ends up at the top of the podium. The souped-up Radeon HD 5770 does reasonably well, keeping up with the slower of the two GTX 550 Ti variants.

Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to their maximums, along with 2X antialiasing and 16X anisotropic filtering.

This test gives us a chance to see how our contestants fare as we turn up the resolution. At these settings, the GTX 550 Ti cards seem to handle resolution scaling slightly better than the Radeon HD 5770, whose 1GB of RAM is hooked up to a conventional memory configuration. We should of course point out that Nvidia’s older GeForce GTX 460 768MB also fared well at higher resolutions in this test, so the 550 Ti’s extra RAM might not even come into play here.

That said, hitting smooth frame rates at 1680×1050 with the 550 Ti cards involves disabling either antialiasing or some of the eye candy. (Disabling AA seemed to have a more pronounced effect when we were tinkering with the settings.) The Radeon HD 6850, meanwhile, maintains an average of just over 30 FPS even at 1080p with antialiasing enabled.

Power consumption

At idle, the two new GeForces sip power. Heck, the GTX 550 Ti Cyclone actually draws less than the slower (albeit slightly souped-up) GeForce GTS 450 AMP! Edition. Neither Radeon is anywhere near as modest with its power utilization.

Put these puppies under a load, and the picture is reversed, with the faster of the two GeForces pulling an extra 20W over the Radeon HD 6850. That 1GHz core clock speed comes at a cost, apparently.

Noise levels and GPU temperatures

The two 550 Ti variants, and especially the faster Zotac one, are surprisingly quiet under load. The Radeon HD 6850 isn’t quite so lucky. Not only is it a good 4 dB louder at peak, but I’d also say the pitch of its fan is more bothersome even at idle.

On the temperatures front, it’s clear which cards are tuned to stay cool and which are optimized for low noise levels. MSI’s Cyclone II cooler proves its effectiveness in the slower GTX 550 Ti, while the Radeon HD 6850 sits in the middle of the pack.

Conclusions
Take our probably inadequate sample size and mash the performance numbers together with pricing data into a big, potentially misleading scatter plot, and this is what we get:

For the record, the performance data used to fashion the plot above were averaged from all of our 1680×1050 game benchmarks, minus Civilization V‘s leader and compute-shader tests. Prices were pulled from Newegg and Amazon for the particular cards we tested. You’ll likely find slightly cheaper variants of some of the offerings above, though, like that PowerColor Radeon HD 6850 we mentioned on page two.
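To be explicit about how each point on the plot is derived, here's a minimal sketch; the frame rates below are invented placeholders, not our measured results:

```python
# How each point on the price/performance scatter plot is derived.
# The frame rates below are invented placeholders, not measured results.

def plot_point(price_usd, fps_by_game):
    # Straight average across the 1680x1050 game tests, as described above.
    perf = sum(fps_by_game) / len(fps_by_game)
    return price_usd, perf

price, perf = plot_point(154.99, [52.0, 31.0, 44.0, 38.0])
print(price, perf)   # -> 154.99 41.25
```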

This scatter plot tells us a few things. First, the GeForce GTX 550 Ti sits, as intended, between the GeForce GTS 450 and the GTX 460 1GB on the performance front. It’s perhaps a little closer to the former than the latter, but it’s in the middle nonetheless.

At the same time, the current pricing landscape seems to favor slightly more upscale cards like the GeForce GTX 460 1GB and Radeon HD 6850, which deliver substantially higher frame rates pretty much across the board for not a whole lot of extra money. You’re looking at a paltry $15-25 step up from the GeForce GTX 550 Ti Cyclone to those faster cards… and that’s before the mail-in rebates that can bring the 6850 to $149.99 and the GTX 460 1GB to $159.99.

The GTX 550 Ti’s saving grace could be its lower noise levels with the coolers we tested. Given that the Radeon HD 6850 doesn’t actually draw more power under load, though, I would assume it’s just as easy to cool quietly with the right third-party solution (the card we tested had a stock AMD cooler). The large fan on that $169.99 PowerColor Radeon HD 6850 looks promising in that regard, although we’ll reserve judgment until we’ve gotten a chance to hear it in action.

Ultimately, the GeForce GTX 550 Ti is a tough product to recommend. Were it offered as a replacement to the GeForce GTS 450 at $129, it’d be a no-brainer for cash-strapped gamers. Who knows—perhaps future price cuts will take it there. Right now, however, those users would be better off setting aside an extra sawbuck or two and moving up the food chain.

Comments closed

    • Arclight
    • 12 years ago

    This card sucks and i wonder why nobody is saying this, it’s priced too high. Who would choose this over the GTX 460 or HD 6850? Last time i checked in my country they were about the same price. This card is a flop just like the 430. Look at the 1GHz OCed results, slower than 6850 and with higher power consumption……

    • swaaye
    • 12 years ago

    Hey I killed my Radeon 8500 by punching a screw thru a cap while trying to attach a big copper 1U heatsink. 😉 Stupid caps were very close to the GPU on that card and I didn’t realize what was happening while I tightened the fan down.

    • Shinare
    • 12 years ago

    Awww, the Ti in the name reminds me of my first video card I ever modded, a GF4 Ti 4200 which I OC’d to 4800 speed and then later added a better fan and broke the card at the same time. heh

    • OneArmedScissor
    • 12 years ago

    The 4870 was the high end at its release and the first card to use GDDR5, which was in low supply and not cheap at the time. The 4850 was the same chip, but was $160-200 right out of the gate.

    The reason the 4870 dropped so much was that GDDR5 went down to near the same price as GDDR3. Then it was bumped down by the 4890, which was much less than $300 at release, surely because of the GDDR5 price coming down.

    Nvidia were the losers there. They were selling a monstrous chip that was supposed to be the bleeding edge for as low as $150.

    I think that’s why they’re hesitant to put any significant pressure on AMD below about $250. Look at what they did with the GTX 560. It’s actually more expensive than the 460s were at their release about 9 months ago.

    They undoubtedly took a good, hard look at the GTX 550 and said, “Hey, we can make this better…but they can make theirs even better yet!” and didn’t want to find out.

    • swaaye
    • 12 years ago

    I think the 5770 and GTX 550 need to get to $100. Then they’d be a worthwhile choice over the 6850 and 460. That would bring in considerably better value.

    • flip-mode
    • 12 years ago

    There is likely some validity to that.

    • swaaye
    • 12 years ago

    I think perceptions have been twisted a bit by the 4870/4850 era. From what I’ve read, the pricing of those two were the result of an intense price war between NV and ATI that year. I think the 4870 was more of a $300+ card but the price war brought it down much faster than what it realistically should have been at. That year was apparently pretty bad for the financials of both companies. But of course that was also around the time the economy imploded too.

    That’s what I’ve read anyway…

    • Incubus
    • 12 years ago

    are you gei

    • Suspenders
    • 12 years ago

    Interesting review. I have to agree with most of you, nothing earth shattering here. Thanks for the review!

    One request I’d have for future reviews is to include Shogun 2: Total War in the game test mix. It’s a proper PC exclusive (unlike most of the rest of the test games), and from the looks of it should be able to tax these cards fairly well. It’s supposed to get a DirectX 11 update soon through a patch. Also, I’ll be basing my future graphics card purchases on how well it will run Shogun 🙂

    • flip-mode
    • 12 years ago

    That is indeed extremely lame. I’ve always been disappointed in the pricing of the 5770.

    • Voldenuit
    • 12 years ago

    [quote=”OneArmedScissor”<]I think the stand out here is the test system used. Check out those power figures! Thank you very much for going with something that just makes sense and represents what most people would see in real life.[/quote<]

    Yeah, too many sites test budget cards with Core i7 EEs and X58 mobos and end up posting silly power draw figures (surprise surprise). Good to see TR is being sensible with testbeds for budget and midrange systems.

    I’m glad I got my Sapphire Vapor-X 5770 1 GB when I did - $109 after MIR. Performance may be a bit lower than the 550 Ti’s, but it’s drastically cheaper than the MSRP of the Nvidia card - though I expect street prices to drop to match its performance soon. The Vapor-X card is dead silent, though its measly 10 MHz FOC was a bit insulting.

    • swaaye
    • 12 years ago

    They range from $120 to $140. The 900 MHz Juniper chips (like the one compared in this review) are up at the high end of that range. I am ignoring rebates because who knows if you will ever receive that money. Also, I’m sure you can expect to see GTX 550 cards with rebates soon enough here.

    • OneArmedScissor
    • 12 years ago

    The really lame thing is that the 5770s have only just recently started to sell for what the 4870s were going for. That was quite some time back, and everything from that point down has pretty much just been floundering around. The 6850 is about the only thing that hadn’t been seen before under $200, but it’s riding that line.

    I figured we’d have a retail “6770” by now that at least bumps up into 4890 territory, but it looks like the 6700 tags are OEM only now. Considering that range’s absence, it’s as if AMD were waiting to see what Nvidia would do, but this isn’t terribly likely to spur them into action.

    Then again, maybe Nvidia is happy with that? They created a monster last time they started a price war with AMD and they’ve been paying for it ever since.

    • bittermann
    • 12 years ago

    How would it destroy the 5770? The performance certainly doesn’t point that it destroys it?

    I don’t understand the point of this card unless you drop the card by $20 to not compete with the 460’s.

    • UberGerbil
    • 12 years ago

    Power isn’t a factor. All of the power is delivered through the pins on the smaller connector section that is common to all PCIe “sizes” — so a PCIe x1 card can draw exactly as much power as a PCIe x16 card. The larger section is purely data (and ground) pins; there is no additional power provided no matter how many data lanes the slot might offer. There are about a half-dozen “reserved” pins, however, and those could be left without conductors.

    • Anvil
    • 12 years ago

    Doh, guess I missed that. This card would certainly make sense in that context for Nvidia anyway. For consumers, maybe not that much.

    • flip-mode
    • 12 years ago

    It sounds like I’m not the only one that is disappointed that this didn’t beat the 5770 while costing less at the same time. And this thing just barely beats the 5770 – it’s not beating it by any really respectable measure, it’s more like a TKO than a KO. I’d have loved to see a KO with a UC (knock out with [s<]upper[/s<] under cut!) 😆 The 5770 is nearing 18 months old and it’s gone pretty well uncontested in its price range the whole entire time. We’re 18 months on and Nvidia’s been schooled by the 5770 the whole time. It’s not quite as bad as how AMD got dominated by the 8800GT, but it’s still a schooling.

    • OneArmedScissor
    • 12 years ago

    They’re too inconsistent. They’ll test 40 boards/CPUs in a row that idle at 30-40w with their standard HTPC test setup, which makes up the majority of their articles. That’s all fine and dandy.

    But then when they do a graphics card review, a wildly different and completely illogical setup will appear out of thin air. They tested the 5870 with a Pentium D, and the 6800s with an Athlon II X3. Whaaaa?!? The power figures tend to look inflated compared to their others, as well, and that’s the only time they use a different PSU, which I don’t understand.

    Their CPU comparison test setup is downright bizarre. I can’t imagine there are many people out there running a Velociraptor and a 9400GT on computers with better integrated graphics lol.

    • sweatshopking
    • 12 years ago

    same price? a 5770 is 110$

    • esterhasz
    • 12 years ago

    The 5770 is really quite the bargain. It’s also the best selling DX11 card by an important margin and developers will make sure that games work at least decently on that card for many months to come…

    • Bauxite
    • 12 years ago

    Is it me, or are an awful lot of pins missing on that pci-e connector? Seems like more than just the extra ground pins would be, even though this is a lower power part.

    • swaaye
    • 12 years ago

    How about SilentPCReview. They don’t cover absolutely every graphics card but they do test them with a power and noise conscious mentality.

    • derFunkenstein
    • 12 years ago

    The article says, in fact, that the 768MB card is reaching EOL. But until such a time, I’d buy one over this thing.

    • spiked_mistborn
    • 12 years ago

    It’s nice to see how this card stacks up against the rest of the field. I was considering waiting for this card to be released before buying a video card, but I ended up getting a 5770 since Newegg has a Sapphire available for $109 after rebate and I already have another 5770 to use for CF. This makes me feel like I made the right choice with the 5770 since my very limited budget for computer hardware generally goes towards baby supplies. It’s nice to see that the 5770 is still a good option even after being on the market since October 2009!

    • OneArmedScissor
    • 12 years ago

    I think the stand out here is the test system used. Check out those power figures! Thank you very much for going with something that just makes sense and represents what most people would see in real life. I am not sure I have seen any other site do this with current hardware in the last few years.

    Can I bribe you to throw something like one of the very affordable 300-400w Seasonic or Antec Earthwatts PSUs in there next round? That would make it pretty much perfect.

    • swaaye
    • 12 years ago

    It’s not too bad. It’s usually faster than a 5770. The prices are so close that I’m not going to whine about that aspect. The 24px/32tx reminds me of 8800GTX but this has less memory bandwidth and fewer controllers so I wonder if it’s less efficient. The mixed memory configuration can’t be doing it any favors. Obviously it has considerably more shader throughput though.

    NV and AMD have most market segments so saturated these days that they are fighting themselves when they launch anything new. Options at every $20 increment it seems, and “last generation” cards like the 460 are the same price. Basically this is yet another NV rebrand, with some supposed core improvements mentioned to make it feel somewhat fresh and keep us from bitching too much.

    • Cyril
    • 12 years ago

    We originally wrote that the Zotac GeForce GTX 550 Ti AMP! Edition would retail for $169. That information, which we received from the manufacturer, turned out to be incorrect, as listings at Newegg (http://www.newegg.com/Product/Product.aspx?Item=N82E16814500194&nm_mc=AFC-Techreport&cm_mmc=AFC-Techreport-_-NA-_-NA-_-NA) and other e-tailers show. We’ve updated this review accordingly. Our conclusion hasn’t changed, though.

    • jthh
    • 12 years ago

    Man, if I didn’t read TR I would have assumed the 550 would be a clear performance upgrade over a 450 or 460. I am glad I read the review(s). Thanks for your work!

    • HisDivineShadow
    • 12 years ago

    This card is destined to have its price lowered in the very near future. In the meantime, it will simply serve to fill OEM boxes as an “upgrade” when the OEM doesn’t have a contract with AMD to go red/green.

    I imagine the GPUs in this are also being used, with slightly different wiring, in laptops as the mid-high end, so it’s probably not a total loss for nVidia either way. I think anyone getting a card for a modern machine with a display at 1080p or higher will probably be skipping this and going for a 460 while they last, or the 6850. If I had been nVidia, I probably wouldn’t have released this card as is. I would have dropped the price on it to let it destroy the 5770.

    • jensend
    • 12 years ago

    Trying to soup up a card like this by putting the extra 256MB on it and overclocking it seems a little pointless. I’d bet prices fall to ~$130 fairly soon.

    There’s one way I could see this being an interesting and worthwhile card- if standard clocked or slightly underclocked versions come out with lower voltages, this could finally be nV’s answer to the 57xx for the low power/silent pc crowd. I bet most of the 20W difference under load between it and the tested 450 could be erased without all that much of a performance hit if the board were tuned for low power.

    • Anvil
    • 12 years ago

    Maybe they’re phasing out the 768MB?

    Anyways, from the info we had about this card I always had the feeling that it was likely going to be a bit of a stinker. I guess my instincts were right in this case.

    • revparadigm
    • 12 years ago

    Exactly. That is what I bought. This card at that price is going to bomb.

    • Corrado
    • 12 years ago

    But you can get a GTX460 768mb for $130 and a 1GB for $150 after rebate. Makes this seem kind of redundant.

    • revparadigm
    • 12 years ago

    Personally I think Nvidia never should have come out with the GTS 450, and should have had this as their “top of the line budget card” for a hundred bucks. This really doesn’t hit anybody’s sweet spot, imho. I bought an EVGA GTX 460 768MB (192-bit bus) for almost the same amount as this lists for. I’d be considered a classic budgeteer atm :) …and this doesn’t even get a “sort of nice” from me.

    • dpaus
    • 12 years ago

    Perhaps it’s there and I just missed it, but… Does the 550 Ti support more than 2 simultaneous displays?

    • tay
    • 12 years ago

    I think $130 would be fine too. I bet it’s at $130 in two months.

    • derFunkenstein
    • 12 years ago

    I agree, this card should be $110-120 max. For $150-160 you can do so much better.

    • flip-mode
    • 12 years ago

    Price / Performance is sucksmuch. For the asking price it is a very, very lame offering.

    • codedivine
    • 12 years ago

    Wow a GPU review from Cyril! Good review. Thanks!
    Question: Does 550 Ti support double-precision? 450 did so I am assuming this does also?
