The GeForce GTX 770
The GeForce GTX 770 debuted a few weeks ago at $400. This higher-end offering is basically a juiced-up version of the GTX 680 for about $20 less. Today, we take our first look at how it performs.
Unlike the other members of the GeForce 700 series, the GTX 770 has a fully functional GPU. The GK104 chip is the same as what's found in the GTX 760, but all four GPCs are intact, and so are the SMX units that lie within them. The same configuration was used in the GeForce GTX 680. This time around, however, the clock speeds have been turned up.
The GeForce GTX 770 has base and Boost clocks of 1046MHz and 1085MHz, respectively. Those are modest increases over the 1006/1058MHz clocks of the old GTX 680, and Nvidia's new GPU Boost 2.0 algorithm deserves some of the credit. The clock-boosting tech is the same as that employed by the GTX 760 and other members of the 700 series.
On the memory front, the GeForce GTX 770 offers a 7 GT/s transfer rate, up 1 GT/s from the GTX 680. The memory interface is still 256 bits wide, so bandwidth has risen by a substantial 17%. Standard cards are available with 2GB of GDDR5 memory, and some vendors are offering 4GB variants for around $450. Card makers have also concocted hot-clocked models with Boost frequencies up to 1202MHz and memory transfer rates as high as 7.2 GT/s.
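The bandwidth arithmetic behind those numbers is straightforward: bus width in bits times transfer rate, divided by eight bits per byte. A quick sketch:

```python
# Back-of-the-envelope GDDR5 bandwidth: bus width (bits) x transfer
# rate (GT/s) / 8 bits per byte = bandwidth in GB/s.
def memory_bandwidth_gbps(bus_width_bits, transfer_rate_gts):
    return bus_width_bits * transfer_rate_gts / 8

gtx_680 = memory_bandwidth_gbps(256, 6.0)  # 192.0 GB/s
gtx_770 = memory_bandwidth_gbps(256, 7.0)  # 224.0 GB/s
increase = (gtx_770 - gtx_680) / gtx_680 * 100
print(f"GTX 680: {gtx_680:.0f} GB/s, GTX 770: {gtx_770:.0f} GB/s "
      f"(+{increase:.1f}%)")
```

Run the numbers and the "substantial 17%" claim checks out: 224 GB/s versus 192 GB/s is a 16.7% gain.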
To accommodate its higher clock speeds, the GeForce GTX 770 also has a higher thermal envelope. The 230W TDP is up 35W from the GTX 680's, and the onboard power connectors have changed to help supply the additional power. Instead of the GTX 680's dual six-pin PCIe connectors, the GTX 770 has one six-pin and one eight-pin connector.
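The connector change makes sense in light of the PCIe spec's power limits, which allot 75W to the x16 slot, 75W to a six-pin connector, and 150W to an eight-pin connector. A quick tally (TDP figures from the text above):

```python
# PCIe power budget per the spec: x16 slot 75W, 6-pin aux 75W,
# 8-pin aux 150W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

gtx_680_budget = SLOT_W + 2 * SIX_PIN_W             # dual 6-pin: 225W
gtx_770_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 6+8-pin:   300W

print(f"GTX 680 budget: {gtx_680_budget}W for a 195W TDP")
print(f"GTX 770 budget: {gtx_770_budget}W for a 230W TDP")
```

The GTX 680's 225W ceiling would leave precious little headroom for a 230W card, while the six-plus-eight-pin arrangement gives the GTX 770 a comfortable 70W of slack.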
The card we've tested is an Nvidia reference model that uses the same swanky cooler as the GeForce Titan. This cooler is beautifully crafted from magnesium and aluminum, and it's whisper quiet. Unfortunately, the heatsink doesn't seem to be available on GeForce GTX 770s out in the wild. Not one of the cards listed at Amazon or Newegg features the Titan cooler. Instead, they're all equipped with custom solutions that quite likely aren't as nice. Keep that in mind when looking at the noise and temperature results later in this review.
Other things of note
Nvidia has a handful of software bonuses for its recent GeForce cards. The first is GeForce Experience, which automates driver updates and game setting optimizations. Automating driver updates is fairly straightforward. Optimizing in-game detail settings based on the user's hardware is a little more involved, though all the work is done on Nvidia's end.
Games typically make their own settings recommendations based on system hardware. Those defaults tend to be fairly conservative, and they don't always recognize new graphics cards. GeForce Experience is more aggressive, and it knows all about the latest GeForce models. It's also capable of modifying game config files directly, making the optimization process a one-click affair for end users.
GeForce Experience's optimization intelligence is based on profiling work conducted by Nvidia. The firm uses human testers to find demanding sections of games and benchmark the performance impact of various graphical settings. The performance impact of individual settings is weighted against their visual impact. Minimum frame rates are also defined based on the nature of the gameplay. All this information is fed into a software simulator that performs loads of iterative testing to determine the ideal settings for various hardware configurations.
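The process described above amounts to a constrained optimization: maximize visual quality without dropping below a minimum frame rate. A hypothetical greedy sketch of that idea follows; the setting names, frame-rate costs, and visual scores are illustrative assumptions, not Nvidia's actual profiling data.

```python
# Illustrative settings table: each option maps to a list of
# (level, fps_cost, visual_score) tuples, cheapest level first.
# Values are made up for the sake of the example.
SETTINGS = {
    "shadows":           [("low", 5, 1), ("medium", 10, 2), ("high", 18, 3)],
    "antialiasing":      [("off", 0, 0), ("fxaa", 4, 1), ("4x msaa", 15, 2)],
    "ambient_occlusion": [("off", 0, 0), ("on", 8, 2)],
}

def optimize(base_fps, min_fps):
    """Greedily upgrade whichever setting buys the most visual quality
    per frame of cost, while the minimum frame-rate target still holds."""
    choice = {name: 0 for name in SETTINGS}      # start at cheapest levels
    budget = base_fps - min_fps                  # frames we may spend
    spent = sum(SETTINGS[n][0][1] for n in choice)
    while True:
        upgrades = []
        for name, idx in choice.items():
            if idx + 1 < len(SETTINGS[name]):
                cur, nxt = SETTINGS[name][idx], SETTINGS[name][idx + 1]
                cost, gain = nxt[1] - cur[1], nxt[2] - cur[2]
                if spent + cost <= budget:
                    upgrades.append((gain / max(cost, 1), cost, name))
        if not upgrades:
            break                                # nothing affordable left
        _, cost, name = max(upgrades)            # best quality-per-frame
        choice[name] += 1
        spent += cost
    return {n: SETTINGS[n][i][0] for n, i in choice.items()}
```

Nvidia's real pipeline is far more elaborate, of course, with human profiling and iterative simulation behind the numbers, but the basic trade-off (quality gained versus frames spent, subject to a frame-rate floor) is the same.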
For newbies who don't know the difference between ambient occlusion and antialiasing, GeForce Experience takes the guesswork out of graphics tweaking—and explains how the various settings affect image quality. The settings recommendations aren't just for the uninitiated, either. They can also be used as a starting point from which seasoned enthusiasts can proceed with further fiddling. The list of profiled games is already quite extensive.
On Kepler-based graphics cards, GeForce Experience will also serve as the server software for Nvidia's Shield gaming handheld. Only a handful of games presently support streaming to the device, which is due to be released June 27.
Shield streaming relies on the H.264 encoding block incorporated in Kepler GPUs. Next month, that block will also be used by Nvidia's ShadowPlay software. This application promises to record gaming sessions with much less of a performance penalty than existing game capture software. In fact, the performance overhead is so minimal that Nvidia expects gamers to have the feature enabled at all times. The always-on "shadow" mode allows users to allocate a chunk of system storage to recording the last few minutes of gameplay, ensuring there's always evidence of epic feats. Let's hope there's an option for SSD users to point ShadowPlay to mechanical storage. There is an option for manual recording, and ShadowPlay may eventually support live broadcast via streaming services.
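The always-on mode described above is, in essence, a ring buffer: the newest chunk of encoded video displaces the oldest once the allocated space fills up. A minimal sketch, assuming one-second chunks (the class and chunk granularity are illustrative, not ShadowPlay's actual design):

```python
from collections import deque

class ShadowBuffer:
    """Keep only the last N seconds of encoded video in a fixed-size
    ring buffer, discarding the oldest chunk as each new one arrives."""
    def __init__(self, max_seconds, chunk_seconds=1):
        self.chunks = deque(maxlen=max_seconds // chunk_seconds)

    def push(self, encoded_chunk):
        # A deque with maxlen silently drops the oldest entry when full.
        self.chunks.append(encoded_chunk)

    def save(self):
        # On demand (say, a hotkey press), flush the buffered chunks.
        return list(self.chunks)

buf = ShadowBuffer(max_seconds=300)     # keep the last five minutes
for second in range(600):               # ten minutes of gameplay
    buf.push(f"chunk-{second}")
assert buf.save()[0] == "chunk-300"     # only the last 300 seconds remain
```

The appeal of the scheme is that storage use stays constant no matter how long the session runs, which is exactly what an always-on recorder needs.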