Not to be outdone, Nvidia has responded to the R7 260X launch with some price cuts, which it says are permanent.
As of Monday, the GeForce GTX 650 Ti Boost 2GB is available for $149.99, or $129.99 after a mail-in rebate. The 1GB version of the same card is down to $129.99, or $109.99 after a mail-in rebate. Those are $20 cuts over last week's prices, and they put the 650 Ti Boost cards smack dab in R7 260X territory.
That's a big deal. Our review of the GeForce GTX 650 Ti Boost 2GB showed that card substantially outperforming a retail 7790 clocked not far below the R7 260X. The GTX 650 Ti Boost 1GB should be slower than its 2GB sibling, since its memory runs at 5 GT/s instead of 6 GT/s, but it should still be a tantalizing solution at $129.
For reference, here's how these cards all stack up on paper. (Don't worry; we'll get to the game benchmarks in a minute.)
| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering, int8 (Gtexels/s) | Peak bilinear filtering, fp16 (Gtexels/s) | Peak shader arithmetic (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
|---|---|---|---|---|---|---|
| Radeon HD 7790 | 16 | 56 | 28 | 1.8 | 2.0 | 96 |
| Radeon HD 7790 (Asus) | 17 | 60 | 30 | 1.9 | 2.2 | 102 |
| Radeon R7 260X | 18 | 62 | 31 | 2.0 | 2.2 | 104 |
| Radeon HD 7850 | 28 | 55 | 28 | 1.8 | 1.7 | 154 |
| GeForce GTX 650 Ti | 15 | 59 | 59 | 1.4 | 1.9 | 86 |
| GeForce GTX 650 Ti 2GB (Zotac) | 15 | 60 | 60 | 1.4 | 1.9 | 86 |
| GeForce GTX 650 Ti Boost 1GB | 25 | 66 | 66 | 1.6 | 2.1 | 120 |
| GeForce GTX 650 Ti Boost 2GB | 25 | 66 | 66 | 1.6 | 2.1 | 144 |
| GeForce GTX 650 Ti Boost 2GB (Asus) | 26 | 69 | 69 | 1.7 | 2.2 | 144 |
The Radeon R7 260X has higher peak shader performance and a higher peak rasterization rate than both versions of the GeForce GTX 650 Ti Boost. However, 650 Ti Boost cards have higher peak pixel fill rates and texture filtering rates. They also have higher peak memory bandwidth, thanks to their wider, 192-bit memory interfaces. The R7 260X may fare better in highly shader- or tessellation-intensive titles, but it looks like the GeForces could have the edge otherwise.
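The memory bandwidth figures in the table follow directly from each card's memory transfer rate and bus width: transfers per second times bytes per transfer. Here's a quick, illustrative sanity check (the function and its name are ours, not from the review):

```python
# Rough sanity check of the on-paper memory bandwidth numbers above.
# Peak bandwidth (GB/s) = transfer rate (GT/s) * bus width (bits) / 8.

def mem_bandwidth_gbps(transfer_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from transfer rate and bus width."""
    return transfer_rate_gtps * (bus_width_bits / 8)

# GTX 650 Ti Boost 2GB: 192-bit bus at 6 GT/s -> matches the table's 144 GB/s
print(mem_bandwidth_gbps(6.0, 192))   # 144.0
# The 1GB Boost's memory at 5 GT/s on the same 192-bit bus -> 120 GB/s
print(mem_bandwidth_gbps(5.0, 192))   # 120.0
# Radeon R7 260X: 128-bit bus at 6.5 GT/s -> 104 GB/s
print(mem_bandwidth_gbps(6.5, 128))   # 104.0
```

The same arithmetic explains why the GeForces lead here despite the R7 260X's faster memory chips: the 192-bit interface simply moves half again as many bytes per transfer as a 128-bit one.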
Our testing methods
We tested using our tried-and-true "inside the second" methods. Since we don't have FCAT equipment up here at TR North, we used Fraps to generate all our performance numbers.
Fraps gives us information about things happening at the start of the rendering pipeline—not, as FCAT does, at the end of the pipeline, when frames reach the display. Having both sets of numbers would be better, but the Fraps data is largely sufficient for the kind of testing we're doing here. We don't expect there to be much of a discrepancy between Fraps and FCAT numbers on single-GPU, single-monitor configurations like these.
In any event, our "inside the second" Fraps numbers are far more informative than the raw frames-per-second data produced by more conventional benchmarking techniques. Such data can cover up problems like latency spikes and micro-stuttering, which have a real, palpable impact on gameplay.
For more information about Fraps, FCAT, and our inside-the-second methodology, be sure to read Scott's articles on the subject: Inside the second: A new look at game benchmarking and Inside the second with Nvidia's frame capture tools.
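To make the "inside the second" idea concrete, here's a minimal sketch of the kind of analysis a Fraps frametimes log enables: instead of a single FPS average, we can look at worst-case frame times and at how much time is spent beyond a badness threshold. The column layout and the 50-ms threshold are our assumptions for illustration; the review's actual processing pipeline may differ.

```python
# Minimal sketch of frame-time analysis from a Fraps-style frametimes CSV,
# where each row holds a frame number and a cumulative timestamp in ms.
import csv

def analyze_frametimes(path: str, threshold_ms: float = 50.0):
    """Return (average FPS, 99th-percentile frame time in ms,
    total ms spent beyond threshold_ms)."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(r[1]) for r in rows[1:]]          # skip the header row
    # Convert cumulative timestamps into per-frame intervals.
    frametimes = [b - a for a, b in zip(stamps, stamps[1:])]
    frametimes.sort()
    avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
    p99 = frametimes[int(0.99 * len(frametimes)) - 1]
    # Time spent past the threshold captures spikes an average would hide.
    beyond = sum(t - threshold_ms for t in frametimes if t > threshold_ms)
    return avg_fps, p99, beyond
```

A run with a healthy average FPS but a high 99th-percentile frame time, or lots of time beyond 50 ms, is exactly the latency-spike scenario that raw FPS numbers paper over.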
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we reported the median results. Our test systems were configured like so:
| | |
|---|---|
| Processor | Intel Core i7-3770K |
| North bridge | Intel Z77 Express |
| Memory size | 4GB (2 DIMMs) |
| Memory type | AMD Memory DDR3 SDRAM at 1600MHz |
| Chipset drivers | INF update 188.8.131.521, Rapid Storage Technology 11.6 |
| Audio | Integrated Via audio with 6.0.01.10800 drivers |
| Hard drive | Crucial m4 256GB |
| Power supply | Corsair HX750W 750W |
| OS | Windows 8 Professional x64 Edition |
| Graphics card | Driver revision | Base GPU clock (MHz) | Memory clock (MHz) | Memory size (MB) |
|---|---|---|---|---|
| Asus Radeon HD 7790 | Catalyst 13.11 beta V1 | 1075 | 1600 | 1024 |
| AMD Radeon R7 260X | Catalyst 13.11 beta V1 | 1100 | 1625 | 2048 |
| XFX Radeon HD 7850 1GB | Catalyst 13.11 beta V1 | 860 | 1200 | 1024 |
| Zotac GeForce GTX 650 Ti 2GB (simulated) | GeForce 331.40 beta | 941 | 1350 | 2048 |
| EVGA GeForce GTX 650 Ti Boost 1GB | GeForce 331.40 beta | 980 | 1502 | 1024 |
| Asus GeForce GTX 650 Ti Boost 2GB | GeForce 331.40 beta | 1020 | 1502 | 2048 |
Thanks to AMD, Corsair, and Crucial for helping to outfit our test rig. AMD, Asus, Nvidia, XFX, and Zotac have our gratitude, as well, for supplying the various graphics cards we tested.
Image quality settings for the graphics cards were left at the control panel defaults, except on the Radeon cards, where surface format optimizations were disabled and the tessellation mode was set to "use application settings." Vertical refresh sync (vsync) was disabled for all tests.
We used the following test applications:
Some further notes on our methods:
We used the Fraps utility to record frame rates while playing 60- or 90-second sequences from the game. Although capturing frame rates while playing isn't precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We've included frame-by-frame results from Fraps for each game, and in those plots, you're seeing the results from a single, representative pass through the test sequence.
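Picking a single representative pass from repeated runs can be done by reporting the median result, which discards outlier runs in either direction. The article doesn't spell out its exact selection procedure, so this little helper is purely illustrative:

```python
# Illustrative sketch: report the median result from repeated benchmark runs.
# Taking the median (rather than the mean) keeps one anomalous run from
# skewing the reported number.

def median_run(run_avg_fps: list[float]) -> float:
    """Return the median average-FPS result across an odd number of runs."""
    ordered = sorted(run_avg_fps)
    return ordered[len(ordered) // 2]

# Five hypothetical passes through the same test sequence:
print(median_run([61.2, 59.8, 60.5, 62.0, 60.1]))  # 60.5
```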
We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.
The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Crysis 3 at the same quality settings used for our performance testing.
We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8" from the test system at a height even with the top of the video card.
You can think of these noise level measurements much like our system power consumption tests, since we measured the noise output of the entire system. In the real world, of course, noise levels will vary with the acoustic properties of the PC enclosure, whether the enclosure provides enough cooling to keep a card off its highest fan speeds, the enclosure's placement in the room, and a whole range of other variables. Still, these results should give a reasonably good picture of comparative fan noise.
We used GPU-Z to log GPU temperatures during our load testing.
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.