The GeForce GTX 600-series lineup hasn't been sitting still since its introduction, either. Nvidia has long given its partners wide latitude in setting clock speeds, and the resulting cards in this generation are much more attractive than the stock-clocked versions. We've lined up several of them to face off against the 7970 GHz Edition and friends, including a pair of ringers from Zotac.
If the Radeon HD 7970 GHz Edition wants to own the title of the fastest single-GPU graphics card, it'll have to go through Zotac's GeForce GTX 680 AMP! Edition. At $549.99, the GTX 680 AMP! costs a bit more than the newest Radeon, but what's 50 bucks in this rarefied air? You will also have to accept the potential clearance issues created by the heatpipes protruding from the top of Zotac's custom cooler and the fact that this thing eats up three expansion slots in your PC. In return, the GTX 680 AMP! is a pretty substantial upgrade over the stock GTX 680.
Believe it or not, Zotac's GeForce GTX 670 AMP! is also an upgrade over the stock GeForce GTX 680. Yes, the GK104 graphics processor in the GTX 670 has had a mini-lobotomy—one of its eight SMX units disabled—but Zotac more than makes up for it with aggressive core and memory clocks. Have a look at the numbers.
|Card|Base clock (MHz)|Boost clock (MHz)|Peak pixel fill (Gpixels/s)|Peak bilinear filtering int8/fp16 (Gtexels/s)|Peak shader arithmetic (TFLOPS)|Memory transfer rate|Memory bandwidth (GB/s)|Price|
|---|---|---|---|---|---|---|---|---|
|Zotac GTX 670 AMP!|1098|1176|38|132/132|3.2|6.6 GT/s|211|$449|
|GeForce GTX 680|1006|1058|34|135/135|3.3|6 GT/s|192|$499|
|Zotac GTX 680 AMP!|1111|1176|38|151/151|3.6|6.6 GT/s|211|$549|
Assuming the GPU typically operates at its Boost clock speed (and that seems to be a solid assumption with GK104 cards), Zotac's GTX 670 AMP! nearly matches the stock GTX 680 in texture filtering and shader flops. And since GTX 670 silicon isn't hobbled at all in terms of memory interface width or ROP count, the GTX 670 AMP! matches its bigger brother, the GTX 680 AMP!, in pixel fill rate (which corresponds to multisampled antialiasing power) and memory throughput, surpassing the stock GTX 680 on both counts. On paper, at least, I'd expect the GTX 670 AMP! to outperform a stock GTX 680, since memory bandwidth may be the GK104's most notable performance constraint.
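Those peak figures all fall out of a simple units-times-clock calculation. As a sanity check, here's a short Python sketch; the unit counts (shader ALUs, texture units, ROPs) are the published GK104 configurations for these cards, not numbers from the table above:

```python
# Back-of-the-envelope peak rates for GK104, assuming the GPU sits at its
# Boost clock. One SMX = 192 shader ALUs and 16 texture units; the GTX 670
# has 7 SMX units enabled, the GTX 680 all 8. Both keep the full 32-ROP
# back end.

def peak_rates(boost_mhz, alus, tmus, rops):
    clock_hz = boost_mhz * 1e6
    return {
        "pixel_fill_gpix": rops * clock_hz / 1e9,      # Gpixels/s
        "filter_gtex": tmus * clock_hz / 1e9,          # Gtexels/s (int8 bilinear)
        "shader_tflops": alus * 2 * clock_hz / 1e12,   # 2 flops per ALU per clock (FMA)
    }

# Zotac GTX 670 AMP!: 7 SMX -> 1344 ALUs, 112 TMUs, 32 ROPs, 1176 MHz Boost
gtx670_amp = peak_rates(1176, 1344, 112, 32)
# Stock GTX 680: 8 SMX -> 1536 ALUs, 128 TMUs, 32 ROPs, 1058 MHz Boost
gtx680 = peak_rates(1058, 1536, 128, 32)
```

Plugging in the numbers shows the GTX 670 AMP!'s higher clock nearly erasing the one-SMX deficit in filtering and flops while pulling ahead in pixel fill.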
Along those lines, notice that the fastest cards above have "only" 211 GB/s of memory throughput, while the 7970 GHz Edition is rated for 288 GB/s. That's a consequence of the fact that Nvidia's GK104 is punching above its weight class. This middleweight Kepler only has a 256-bit memory interface. The Tahiti chip driving the Radeon HD 7900-series cards is larger and sports a 384-bit memory interface. By all rights, AMD ought to be able to win this contest outright. The fact that folks are buying up GTX 680 cards for 500 bucks or more is vaguely amazing, given the class of hardware involved. But, as we'll see, the performance is there to justify the prices.
Before we dive into the test results, I should mention a couple of things. You will notice on the following pages that we tested games at very high resolutions and quality levels in order to stress these graphics cards appropriately. We think that's the right call given the task at hand, but we should remind you that a good PC gaming experience doesn't require a $450+ video card. We've hand-picked especially graphically intensive games for our testing; not every game is like that.
For instance, we wanted to include Diablo III in our test suite, but we found that, on this class of graphics card, it runs at a constant 100 FPS with its very highest image quality settings at our monitor's peak 2560x1600 resolution. Diablo III is a pretty good-looking game, too, but it's just no challenge.
Along the same lines, we have tested practically everything at a resolution of 2560x1600. We realize that 1080p displays are the current standard for most folks and that they're much more widely used than four-megapixel monsters. Here's the thing, though: if you're going to fork over the cash for a $500 video card, you'll want a high-res display to pair with it, perhaps one of those amazingly priced Korean 27" monitors. Otherwise, the video card will probably be overkill. In fact, as we were selecting settings for game testing, we more often found ourselves asking whether we ought to be using a six-megapixel array of three monitors to properly stress these cards. Also, in the odd case where we did think 1920x1080 might be appropriate, we found that the beta Nvidia drivers we were using didn't expose that resolution as an option in most games.