By that standard, NVIDIA's GeForce 6 series graphics processors have had plenty of success. Since the introduction of the GeForce 6800 Ultra last April, NVIDIA has rolled out a top-to-bottom rework of its entire GPU lineup, with the exception of the integrated graphics core in its nForce chipsets (and we spotted early versions of the new C51G chipset at Computex, complete with a GeForce 6-class integrated graphics core). The scope of the product refresh is impressive, but the real news is the potency of those products. In nearly every segment, from uber-high-end setups down to cheapo $59 graphics cards, NVIDIA has contended closely with ATI for market leadership. And in some cases, like the $199 sweet spot for gamers' graphics cards, the GeForce 6 series has been the uncontested leader. Cards based on the NV4x series of chips have packed a wallop in terms of pixel processing power, enabling new sorts of real-time effects that game developers have only just begun to employ.
Now, NVIDIA is back with a new high-end chip, known by the code-name G70, that promises significantly more power than the GeForce 6800 Ultra. To add to the intrigue, the G70 chip is closely related to the RSX graphics processor that NVIDIA is developing for the PlayStation 3. Can NVIDIA duplicate the success of the NV40 series by following it up with a worthy successor? Can the GeForce 7800 GTX really deliver on NVIDIA's claims of up to twice the shader power of the GeForce 6800 Ultra? And if you run a pair of GeForce 7800 GTX cards together in an SLI config, will the fabric of time and space warp? Keep reading for some answers.
Inside the G70 GPU
Despite NVIDIA's protestations to the contrary, the G70 architecture is clearly derived from the NV40. Not that there's anything wrong with that. The NV40 and G70 both have a full implementation of Shader Model 3.0, the programming model for Windows graphics that looks like it will be the standard until Microsoft's Longhorn OS arrives. Shader Model 3.0 packs plenty of capability and precision for real-time graphics use in the next year or two, at least, and ATI's upcoming R520 chip is rumored to support SM3.0, as well.
In order to understand how the G70 differs from and improves upon the NV40 architecture, let's take a look at a simplified block diagram of the G70 design.
Not only does the G70 have more vertex units, but those units can complete more operations per clock than the NV40's. NVIDIA says it has tweaked the vertex shader units so that they can now process MADD (multiply-add) operations in a single clock cycle. These tweaked vertex units are purportedly up to 30% faster in scalar math ops, as well. A worked sketch of why MADD throughput matters for vertex work follows below.
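To see why single-cycle MADD throughput matters so much here, consider a rough sketch in plain C (not actual vertex shader code) of the heart of a vertex shader: transforming a vertex by a 4x4 matrix. The function name and layout below are purely illustrative.

```c
/* A MADD (multiply-add) computes d = a * b + c as one operation.
   Vertex transforms are built almost entirely out of MADDs: each
   component of the output vertex is a four-element dot product,
   which maps to one MUL followed by three dependent MADDs. */
void transform_vertex(const float m[4][4], const float v[4], float out[4])
{
    for (int i = 0; i < 4; i++) {
        float acc = m[i][0] * v[0];     /* MUL  */
        acc = m[i][1] * v[1] + acc;     /* MADD */
        acc = m[i][2] * v[2] + acc;     /* MADD */
        acc = m[i][3] * v[3] + acc;     /* MADD */
        out[i] = acc;
    }
}
```

If a vertex unit can retire one of those MADDs per clock instead of splitting it into a separate multiply and add, the most common operation in this sort of code effectively doubles in throughput.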
Across the middle of the block diagram are the G70's pixel shader units, arranged in groups of four. This grouping of pixel shaders into "quads" is not a change from the NV40; this diagram is simply more accurate than the ones NVIDIA supplied to us at the time of our GeForce 6800 Ultra review. The shader units in each quad share some on-chip resources, including an L2 texture cache. We have known about the grouping of shader pipes into quads for some time now (ATI does it, too), but the really interesting thing in the diagram above is the 2x2 arrangement of pixel shader units, with two of them drawn in line behind two others. This arrangement isn't really a faithful representation of the flow of data inside the chip; the G70's "quad" pipelines actually operate on four fragments concurrently and in parallel.
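A conceptual sketch in C may make that last distinction clearer. The type and function names below are hypothetical, and the loop is only an analogy for the hardware's behavior, not a description of it.

```c
#include <stddef.h>

/* Hypothetical fragment type and shading routine, for illustration only. */
typedef struct { float r, g, b, a; } fragment_t;
static void shade_fragment(fragment_t *f) { f->a = 1.0f; }

/* Fragments are handed to the shader core in groups of four, one group per
   quad of pixel shader units. In this C sketch the four "lanes" are a plain
   loop; in the chip, all four units in a quad work on their fragments at the
   same time, rather than passing data down a 2x2 chain as the diagram might
   suggest. */
static void shade_in_quads(fragment_t *fragments, size_t count)
{
    for (size_t i = 0; i + 4 <= count; i += 4) {   /* one quad-sized group */
        for (int lane = 0; lane < 4; lane++) {     /* four units, in parallel
                                                      in hardware            */
            shade_fragment(&fragments[i + lane]);
        }
    }
}
```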