Nvidia seems to be all about plugging holes in its product line lately. A couple of months back, we saw the arrival of the GeForce GTX 550 Ti, a GPU aimed at the previously unoccupied $149 price point. That part ended up sandwiched between the cheaper GeForce GTS 450 and the more upscale GTX 460 1GB, providing a decent compromise for shoppers too indecisive to choose between two starkly different options.
Today, in the form of the new GeForce GTX 560, Nvidia is extending another olive branch to shoppers who are fond of middle grounds. Priced at $199, this card is meant to sit north of the GTX 460 1GB and south of the quicker GTX 560 Ti. Although its name might suggest otherwise, the newcomer doesn’t actually replace any existing products—at least, that’s what Nvidia is telling us. The GTX 460 is staying put for now despite its age.
You could say Nvidia is really just splitting hairs at this point. According to the latest listings on Newegg, you can get a GTX 560 Ti for as little as $232.99, or $212.99 after a mail-in rebate. As we’re about to see, some of the freshly introduced GTX 560 offerings are marked up slightly above Nvidia’s suggested $199 e-tail price, so we’re likely to see at least some overlap between standard and Ti versions. Yeah, I don’t think that’s going to help indecisive folks any.
Now, how exactly does the GTX 560 differ from the 560 Ti? You know, beside dropping those extra two letters and carrying a slightly-lower-but-maybe-not-always price tag. Well, I’m glad you asked!
The GeForce GTX 560 and 560 Ti are both based on Nvidia’s GF114 graphics processor, but the newcomer has one of that chip’s eight shader multiprocessors disabled. Since each SM has 48 stream processors, eight texture units, and one polymorph engine, that means the GTX 560 has 336 SPs, 56 texture units, and seven polymorph engines. Other key attributes, like memory size, memory interface width, and ROPs, are unchanged from the full-fat GF114 found in the GeForce GTX 560 Ti.
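The unit counts are simple enough to sanity-check in a few lines of Python. This is just a back-of-envelope sketch using the per-SM figures quoted above, nothing more:

```python
# GF114 has 8 shader multiprocessors (SMs); the GTX 560 ships with one disabled.
SMS_TOTAL, SMS_DISABLED = 8, 1
SPS_PER_SM = 48        # stream processors per SM
TMUS_PER_SM = 8        # texture units per SM
POLYMORPH_PER_SM = 1   # polymorph engines per SM

active_sms = SMS_TOTAL - SMS_DISABLED
print(active_sms * SPS_PER_SM)        # stream processors -> 336
print(active_sms * TMUS_PER_SM)       # texture units -> 56
print(active_sms * POLYMORPH_PER_SM)  # polymorph engines -> 7
```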
Graphics architecture buffs might notice that the GTX 560’s configuration resembles that of the original GeForce GTX 460 1GB, which features the older GF104 GPU. The GF104 and GF114 are similar architecturally, but the latter has been further optimized for TSMC’s 40-nm fab process.
That all sounds simple enough… until we take a look at clock speeds. Here’s what Nvidia told us when we asked about the standard, reference clock rates for the GTX 560:
Because the chip is a drop-in replacement for existing GF104/GF114 designs on the market, our partners are offering a wild range of GTX 560 varieties. There is no one “official” speed. The right word to use is probably “slowest”.
The slowest clock is 810MHz core/4004MHz memory, but most of those boards will be shipped to OEMs. At the etail level the vast majority of boards will ship between 850-950MHz.
(The e-tail cards Nvidia is talking about have memory speeds ranging from 4.0 to 4.4 GT/s, by the way.)
Funnily enough, then, the GTX 560s sold at online retailers will actually be clocked higher than many GTX 560 Ti variants. The slowest versions of that product run at 822MHz with their memory pushing bits at around 4 GT/s, which is slightly south of e-tail GTX 560 territory. Here are the implications of those discrepancies when it comes to peak theoretical performance numbers:
| Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering (Gtexels/s) | Peak FP16 filtering (Gtexels/s) | Peak shader arithmetic (GFLOPS) | Peak rasterization rate (Mtris/s) | Memory bandwidth (GB/s) |
|---|---|---|---|---|---|---|
| GeForce GTX 460 1GB | 21.6 | 37.8 | 37.8 | 907 | 1350 | 115 |
| GeForce GTX 560 Twin Frozr II | 27.8 | 48.7 | 48.7 | 1169 | 1740 | 131 |
| GeForce GTX 560 TOP | 29.6 | 51.8 | 51.8 | 1243 | 1850 | 134 |
| GeForce GTX 560 Ti | 26.3 | 52.6 | 52.6 | 1263 | 1644 | 128 |
| GeForce GTX 560 Ti AMP! | 30.4 | 60.8 | 60.8 | 1459 | 1900 | 141 |
| Radeon HD 6850 | 24.8 | 37.2 | 18.6 | 1488 | 775 | 128 |
| Radeon HD 6870 | 28.8 | 50.4 | 25.2 | 2016 | 900 | 134 |
| Radeon HD 6870 TOP | 29.3 | 51.2 | 25.6 | 2016 | 915 | 134 |
| Radeon HD 6950 | 25.6 | 70.4 | 35.2 | 2253 | 1600 | 160 |
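Those peak numbers fall out of simple arithmetic. Here's a rough Python sketch for the GF114-based cards using the usual back-of-envelope formulas (not an official Nvidia calculation); the ROP, texture unit, and stream-processor counts are the ones quoted earlier:

```python
# Peak theoretical rates for a GF114 card, given its core clock (MHz)
# and memory transfer rate (GT/s). Defaults match the GTX 560's config.
def gf114_peaks(core_mhz, mem_gtps, rops=32, tmus=56, sps=336,
                raster_tris_per_clk=2, bus_bits=256):
    hot_clock = 2 * core_mhz  # shaders run at twice the core clock
    return {
        "pixel_fill_gpix": rops * core_mhz / 1000,
        "texel_filter_gtex": tmus * core_mhz / 1000,
        "shader_gflops": sps * 2 * hot_clock / 1000,  # 2 FLOPs/SP/clock
        "raster_mtris": raster_tris_per_clk * core_mhz,
        "mem_bw_gbps": mem_gtps * bus_bits / 8,
    }

# MSI's Twin Frozr II: 870MHz core, 4.08 GT/s memory.
# These round to the table's 27.8, 48.7, 1169, 1740, and 131 figures.
print(gf114_peaks(870, 4.08))
```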
As you can see, the GTX 560 cards have higher peak pixel fill rates, higher rasterization rates, and more memory bandwidth than their stock-clocked Ti cousin. It takes a higher-clocked version of the GTX 560 Ti—in our example, Zotac’s $267.99 GTX 560 Ti AMP! Edition, which runs at 950MHz with its memory at 4.4 GT/s—to pull ahead of GTX 560s. I suppose it’s no surprise that a much more expensive product would be faster, but that doesn’t make matters any less complicated in the $200-$250 range, where GTX 560 and GTX 560 Ti cards look set to coexist.
In a moment, we’ll look at how the GTX 560s compare to their Ti-infused brethren in some real-world games. First, though, let’s take a closer look at the two GTX 560 variants we’ll be testing.
Asus and MSI were kind enough to each send us their version of Nvidia’s new graphics card.
MSI’s offering, dubbed the N560GTX Twin Frozr II, is the slower and more affordable of the two. The card runs at 870MHz with a memory speed of 1020MHz (or 4.08 GT/s), and it has a suggested asking price of $209.99. MSI outfits the Frozr with a dual-fan cooler, “Military Class II” components, and a 6+1 power-phase design. The circuit board also features “anti-warpage support.” Translation: MSI engineered this puppy with durability and overclocking in mind.
Priced at $219.99, Asus’ GTX 560 DirectCU II TOP is slightly pricier than its MSI counterpart, but it justifies the premium with higher clock speeds: 925MHz for the GPU and 1050MHz (or 4.2 GT/s) for the memory.
Asus also includes a dual-fan cooler (branded DirectCU II because of its copper base), and it touts “best in class PCB Design and Components with Super Alloy Power Design that delivers 15% Performance boost, 2.5X longer lifespan, 35C cooler operation for the [voltage regulator modules].” As with the MSI card, provisions to prevent the card from warping are on the menu. There are even LEDs on the circuit board that change color when you plug in the two required six-pin PCI Express power connectors—somewhat superfluous but neat nonetheless.
Nvidia’s GeForce GTX 560 isn’t launching in a vacuum. In fact, right out of the gate, it will have its mettle tested by AMD’s Radeon HD 6870, which also retails around the $200 mark. Asus sent us its HD 6870 DirectCU to represent the Radeon team against the latest GeForce:
This Radeon currently retails for $199.99 ($179.99 after a mail-in rebate) at Newegg, so it has the virtue of being cheaper than both of our contenders from the GTX 560 clan. Don’t let the lower price fool you, though. Asus has tuned and tweaked this design, too, raising the GPU clock speed from 900MHz to 915MHz and installing a custom DirectCU cooler with heat pipes and a copper base. The company claims the card offers a longer life span and considerably quieter operation over the reference 6870 thanks to the DirectCU cooler and choice electrical components.
Our testing methods
We conducted testing using the latest drivers from Nvidia and AMD. Nvidia provided us with 275.20 drivers for its GeForce cards, and we used AMD’s freshly released Catalyst 11.5a hotfix drivers with the Radeons. When benchmarking the Radeons, we left optional AMD optimizations for tessellation and texture filtering disabled.
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.
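For the curious, "median" here means the middle value of the sorted runs, which shrugs off a single outlier better than a plain average would. A quick illustration with made-up frame rates:

```python
from statistics import median

# Three passes of the same benchmark; one run hit a background hiccup.
runs_fps = [57.8, 58.3, 51.2]

print(median(runs_fps))  # -> 57.8, the reported result
```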
Our test system was configured as follows:
| Component | Details |
|---|---|
| Processor | Intel Core i5-750 |
| North bridge | Intel P55 Express |
| Memory size | 4GB (2 DIMMs) |
| Memory type | Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz |
| Memory timings | 9-9-9-24 1T |
| Chipset drivers | INF update 220.127.116.115, Rapid Storage Technology 10.1.0.1008 |
| Audio | Realtek R2.57 drivers |
| Graphics | XFX Radeon HD 6850 1GB with Catalyst 8.85.6 RC1 drivers |
| | Asus Radeon HD 6870 TOP with Catalyst 8.85.6 RC1 drivers |
| | XFX Radeon HD 6950 1GB with Catalyst 8.85.6 RC1 drivers |
| | Zotac GeForce GTX 460 1GB with GeForce 275.20 beta drivers |
| | MSI GeForce GTX 560 Twin Frozr II with GeForce 275.20 beta drivers |
| | Asus GeForce GTX 560 TOP with GeForce 275.20 beta drivers |
| | Zotac GeForce GTX 560 Ti AMP! with GeForce 275.20 beta drivers |
| Hard drive | Samsung SpinPoint F1 HD103UJ 1TB SATA |
| Power supply | Corsair HX750W 750W |
| OS | Windows 7 Ultimate x64 Edition, Service Pack 1 |
Thanks to Intel, Kingston, Samsung, MSI, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
We used the following test applications:
- Aliens vs. Predator benchmark
- Bulletstorm
- Crysis 2
- Sid Meier’s Civilization V
- Just Cause 2
- Metro 2033
- Fraps 3.3.2
- GPU-Z 0.5.1
Some further notes on our methods:
Many of our performance tests are scripted and repeatable, but for Bulletstorm and Crysis 2, we used the Fraps utility to record frame rates while playing a 90-second sequence from each game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. To counteract any variability, we raised our sample size, testing each Fraps sequence five times per video card. We’ve also included second-by-second frame rate results from Fraps for those games; in that case, you’re seeing the results from a single, representative pass through the test sequence.
We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.
The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1920×1080 resolution with 4X AA and 16X anisotropic filtering.
We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.
You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. Still, these results should give a reasonably good picture of comparative fan noise.
We used GPU-Z to log GPU temperatures during our load testing on all cards but the Radeon HD 6790, which wasn’t supported by GPU-Z’s temperature probing component. With that card, we ran the load test in a window and jotted down the temperature reported by AMD’s Catalyst Control Center.
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
I’ve made no secret of my appreciation for Bulletstorm’s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. This shooter was tested at 1920×1080 with 4X antialiasing and detail settings cranked up to “high.”
Since this game has no built-in benchmarking mode, I played through the first 90 seconds of the “Hideout” echo five times per card, reporting the median of average and low frame rates obtained.
We’ve seen Radeons pull ahead in this game a few times before, so we’re not surprised to see the GeForce GTX 560s trailing the slightly cheaper Radeon HD 6870. That said, 58 FPS is awfully close to the 60Hz refresh rate of most current monitors, and the GTX 560 TOP actually pulled off slightly higher minimum frame rates than the 6870.
Note that our amped-up GeForce GTX 560 Ti is quite a bit faster than our quickest GTX 560, even though some of the cards’ peak theoretical numbers are in the same neighborhood. Meanwhile, the stock GeForce GTX 460 1GB drags its feet, clearly outclassed by Nvidia’s new hotness.
Crytek’s new cross-platform shooter has garnered criticism for omitting features like DirectX 11 shader effects and being less PC-focused than the original Crysis and Far Cry. Nevertheless, this title still makes a high-end gaming PC sweat—and it looks gorgeous.
Since Crysis 2 is another one of those games without a built-in benchmark, I resorted to 90-second Fraps sessions, just as with Bulletstorm. I ran and gunned my way through the game’s version of Battery Park five times per card, sticking to the same path through the level to avoid drastic differences between samples. The game was set to run at a 1920×1080 resolution with the “Extreme” detail preset.
In Crysis 2, our cards produce results opposite those we saw in Bulletstorm. The GeForces have a sizeable advantage across the board, and from a seat-of-the-pants perspective, they definitely feel quicker.
Civ V has several interesting tests, including a built-in compute shader benchmark that measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.
In this non-gameplay test, the GTX 560 cards also pull ahead of the Radeon HD 6870.
In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.
Load up one of Civilization V‘s Leader scenes, and the tables turn, with the Radeon HD 6870 and 6950 now taking the lead. This distinction is somewhat academic, though, because even the GTX 460 1GB churns out a smoother-than-butter 84 FPS.
Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.
Busy scenes like this one are where you’re likely to spend most of your time when playing Civ V. Here, the GeForces have a genuine advantage. That advantage isn’t just academic, either, since the difference between 40 and 51 FPS should be palpable.
Just Cause 2
Although it’s not the newest kid on the block, JC2 is a good example of a relatively resource-intensive game with flashy DirectX 10 visuals. It doesn’t hurt that the game has a huge, open world and addictively fun gameplay, either.
This title supports a couple of visual effects generated by Nvidia’s CUDA GPU-computing API, but we’ve left them disabled for our testing. The CUDA effects are only used sparingly in the game, and we’d like to keep things even between the different GPU brands.
We tested performance with JC2‘s built-in benchmark, using the “Dark Tower” sequence.
The GTX 560 cards score another win in Just Cause 2. They squeeze ahead of the Radeon HD 6870 and, in the case of the GTX 560 TOP, nip at the Radeon HD 6950’s heels.
We ran Metro 2033‘s built-in benchmark using the “High” graphical preset with 4X antialiasing and 16X anisotropic filtering. PhysX effects were left disabled to ensure a fair fight between all of our contestants.
Uh, so, yeah… I think we can call this one a tie.
Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.
For these tests, we turned up all of the image quality options to their maximums, along with 2X antialiasing and 16X anisotropic filtering.
Our AvP results are pretty much a toss-up, as well. The Radeon HD 6870 ends up smack in the middle of the two GTX 560s.
The two GeForce GTX 560 cards use no more energy than the stock-clocked GeForce GTX 460 1GB at idle, but they’re quite a bit more power-hungry under load. That’s not exactly a big surprise, considering their higher clock speeds and clearly superior performance. The Radeon HD 6870 is the opposite, drawing roughly the same number of watts under load but chugging an extra 10W at idle.
Noise levels and GPU temperatures
At idle, you’re more or less guaranteed a quiet experience—unless you opt for a GeForce GTX 460 1GB like ours, which has a stubby, rather noisy cooler.
Under load, the Radeon HD 6870 TOP’s low power consumption and fancy cooler pay dividends, enabling the lowest noise levels of the pack. The GeForce GTX 560 TOP isn’t far behind, though, and my ears didn’t detect much of a difference between the two. They did hear a light hum with the GTX 560 Frozr II that wasn’t present on either of the Asus cards.
Performance-per-dollar scatter plots have become a staple of our graphics card reviews. We create these plots by averaging frame rates obtained at the maximum resolutions in all of the games we run (except for Civilization V, where we discard the DirectCompute and Leader data) and factoring in prices from Newegg.
As always, we should warn that, while our scatter plots are interesting and potentially useful curiosities, they don’t show the whole picture. Our average performance numbers are derived from a limited sample of games, so they’re by no means absolutes. Even if they were, noise and power consumption are also important factors when choosing a graphics card, and they’re not accounted for here.
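The mechanics behind the plot are straightforward: average each card's frame rates across the games, then pair that average with the card's street price. Here's a minimal sketch in Python; the frame rates below are placeholders for illustration, not our actual benchmark data:

```python
# Price/performance summary: average FPS across games vs. Newegg price.
# FPS lists are hypothetical stand-ins, one entry per game tested.
cards = {
    "Radeon HD 6870 TOP":             (199.99, [58, 41, 40, 47, 30]),
    "GeForce GTX 560 Twin Frozr II":  (209.99, [55, 47, 44, 50, 30]),
    "GeForce GTX 560 TOP":            (219.99, [57, 49, 46, 52, 31]),
}

for name, (price, fps) in cards.items():
    avg = sum(fps) / len(fps)  # one point on the scatter: (price, avg)
    print(f"{name}: {avg:.1f} FPS average at ${price:.2f}")
```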
Providing a definite recommendation is just a little bit difficult when you’ve got a nearly linear progression between the cards, as we see from the 6870 TOP through to the GTX 560 Ti AMP! Edition. But we can certainly try.
Based on our tests, it’s pretty clear that the GTX 560 has a raw performance edge over the competition from AMD overall. However, that performance advantage isn’t huge, and the Asus Radeon HD 6870 TOP has the distinction of being uncannily quiet for a $200 graphics card. Oh, sure, it draws a little more power at idle, but that doesn’t seem to affect noise levels. As icing on the cake, the Radeon can drive as many as six displays.
In the other corner, the GeForce GTX 560 TOP is faster, nearly as quiet, and costs only $20 more. It has company, too. MSI’s GeForce GTX 560 Frozr II is almost as quick, only a little bit louder, and $10 cheaper. While neither card has hexa-display support, both let you enjoy Nvidia-specific perks like PhysX and GeForce 3D Vision. Some users like those extras, while others couldn’t care less. You can make up your own mind on that front.
In the end, I think it’s safe to say Nvidia’s new $200(-ish) graphics card is a decent option for users who can afford something just a little bit nicer than a GeForce GTX 460 1GB. I’m not thrilled about the tight grouping of GTX 560 and entry-level GTX 560 Ti cards in the $200-250 range, but I wouldn’t be surprised to see at least some GTX 560 variants fall well below $200—maybe not straight away, but certainly after a few weeks.
Of course, I also wouldn’t waste too much time worrying about $10-20 or frame-rate differences that can be counted on the fingers of one hand. You don’t want to be splitting hairs, now, do you?