NVIDIA’s GeForce 7600 GT and 7900 series GPUs

THE STRETCH FROM late 2005 into early ’06 has been a remarkably busy time in the world of PC graphics hardware. Most of it has been ATI’s doing. Their CrossFire dual-GPU technology was tardy, but it showed up to class just before the end of September. Shortly thereafter, ATI finally delivered its much-delayed Radeon X1800 GPU as part of a top-to-bottom lineup of new graphics chips, all among the first GPUs to be built on a 90nm fab process. Among other things, the Radeon X1800 XT put the company back on top in the high-end performance sweeps, however briefly. The welts had barely faded from the whupping NVIDIA laid on the Radeon X1800 XT with its GeForce 7800 GTX 512 when ATI countered with its new high-end GPU (right on schedule this time), the Radeon X1900. Not only was the Radeon X1900 XT the new fastest card on the block, but it got some help on the multi-GPU front last week in the form of a new dual 16-lane PCI Express chipset, the CrossFire Xpress 3200.

Phew.

Now it’s time for another round of new stuff, and the competition is tighter than ever. Today, NVIDIA is unveiling two GPUs powering three different flavors of video cards. Say hello to the GeForce 7600 and 7900 series, new mid-range and high-end graphics cards ranging from $199 to, well, very expensive. These introductions mark NVIDIA’s transition in earnest to 90nm fab process technology, a place where ATI has been for a number of months.

Speaking of ATI, the folks there couldn’t stand by and watch new GeForces arrive without doing something, so they’ve decided to unveil a new $249 Radeon card and rework their product lineup to include a range of more affordable options. We’ve tested the Radeon X1800 GTO alongside the new NVIDIAs.

Oh, yes. And we have the juicy details on the King Kong of PC graphics, quad SLI, where not one, not two, but four GeForce 7900 GPUs can pull together to squash all competition and raid your Swiss bank account.

The G71 graphics processor
Code-named G71, NVIDIA’s new high-end GPU is the successor to the G70 graphics processor that powers the GeForce 7800 lineup. The G71 is very similar to the G70 in terms of basic specs, but it’s produced using a new, finer 90nm fabrication process that squeezes a similar number of transistors into less space. Like its precursor, the G71 has six pixel shader “quads,” for a total of 24 pixel shader units, plus eight vertex shaders and 16 raster operators (ROPs).

The G71 is more than just a die shrink, however. Get this: the transistor count estimate is actually down from the G70’s 302 million to 278 million for the G71. Why so? NVIDIA says it has replumbed the G71’s internal pipelines throughout the chip, making them shorter, because those longer pipes and extra transistors aren’t needed to help the G71 achieve acceptable clock speeds—the faster-switching transistors of the 90nm process will suffice for that purpose. How’s that for confidence?

Shorter pipelines typically make for higher clock-for-clock performance, and we may see some of that from G71, but this isn’t a radical change. I wouldn’t expect anything revolutionary on that front.

A couple other alterations to G71 will have a more tangible impact. NVIDIA’s design team has modified the SLI logic so the GPU can pass sample data for the SLI antialiasing mode over an SLI link, instead of via PCI Express like on current chips. This arrangement ought to bring higher performance with SLI antialiasing, perhaps to the point where it becomes a truly useful load-balancing method for multi-GPU setups.

Also, the G71 includes two built-in dual-link TMDS transmitters, so GeForce 7900 cards can power a pair of high-def digital displays without the need for an external TMDS transmitter. ATI’s Radeon X1800 and X1900 cards have this feature already, and the G71 does well to follow suit.

Beyond those tweaks, we couldn’t shake loose much more info on changes in the G71 compared to the G70. NVIDIA confirms it hasn’t changed the amount of register space on the GPU, and although there have been some modifications to buffer sizes and the like, the company prefers not to get into those gory details.


The G71 GPU

The final product of this die shrink and handful of revisions is a GPU that fits into 196 mm2, way down from the G70, which I measured (somewhat shakily) at 333 mm2. If things go as they should and the 90nm process doesn’t lead to excessive leakage, the G71 should run at lower voltages, consume less power, and produce less heat than the G70.

Those of you who follow these things closely will recognize that this combination of elements could present an intriguing challenge to ATI’s high-end part—a possibility that NVIDIA is crowing about every chance it gets. At 110nm, the G70 was pretty competitive with ATI’s R580 GPU (a.k.a. the Radeon X1900) in terms of die size and power consumption, with only slightly lower overall performance. The changes put the G71 at just under two thirds the size of the R580; although it’s also manufactured on a 90nm fab process, the R580’s die size is roughly 315 mm2. If the G71 performs well enough, it could end up offering performance equivalent to the R580’s in a fraction of the physical and thermal footprint.
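Since we’re quoting die sizes, the math behind that “two thirds” claim is quick to check. A back-of-the-envelope sketch using only the figures above:

```python
# Die-size comparison from the numbers in the text (areas in mm^2).
g70_mm2, g71_mm2, r580_mm2 = 333, 196, 315
g71_transistors_m = 278  # millions

size_vs_r580 = g71_mm2 / r580_mm2            # ~0.62 -> "just under two thirds"
size_vs_g70 = g71_mm2 / g70_mm2              # ~0.59 of the G70's area
density = g71_transistors_m / g71_mm2        # ~1.42M transistors per mm^2

print(size_vs_r580, size_vs_g70, density)
```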

 

Meet the 7900s
NVIDIA has given the G71 a pair of initial assignments. The flashier of the two is in the GeForce 7900 GTX, its new top-of-the-line video card. On the 7900 GTX, the bulk of the G71 GPU will flip bits at 650MHz, while the vertex processors will be clocked a few ticks higher at 700MHz. (Yes, like the G70, the G71 has multiple internal clock domains.) The GPU will share a card with 512MB of GDDR3 memory running at 800MHz, or effectively 1.6GHz in the highly caffeinated world of DDR memory chips. NVIDIA estimates that this combo will require about 120W worth of cooling capacity.
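For the curious, those clocks translate into theoretical peaks by simple multiplication. Here’s a quick sketch; the numbers are clock-times-width products, not measured performance:

```python
# Theoretical peaks for the GeForce 7900 GTX from the specs quoted above.

def peak_rate_m_per_s(core_mhz, units_per_clock):
    # One operation per unit per clock -> millions per second.
    return core_mhz * units_per_clock

def peak_bandwidth_gbs(mem_mhz, bus_bits):
    # GDDR3 transfers twice per clock; bus width in bits -> bytes.
    return mem_mhz * 2 * (bus_bits / 8) / 1000  # GB/s

gtx_texels = peak_rate_m_per_s(650, 24)  # 24 texture units -> 15,600 Mtexels/s
gtx_pixels = peak_rate_m_per_s(650, 16)  # 16 ROPs -> 10,400 Mpixels/s
gtx_bw = peak_bandwidth_gbs(800, 256)    # 51.2 GB/s

print(gtx_texels, gtx_pixels, gtx_bw)
```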


The GeForce 7900 GTX

Cosmetically, the GeForce 7900 GTX is a dead ringer for its little brother, the GeForce 7800 GTX 512, adorned with the same dual-slot cooler with wicked heatpipe fingers extending into its fins. Thanks to an additional hundred megahertz of core clock speed and just a tick less total memory bandwidth, though, the 7900 GTX should generally get the better part of any sibling rivalry. Of course, such bragging rights rarely come cheap, and their actual price tag can fluctuate wildly depending on the situation. Perhaps that’s why NVIDIA quotes a price range of between $499 and $649 for the 7900 GTX. I suppose they’re also hedging their bets after ordering up too few copies of the GeForce 7800 GTX 512 and watching prices shoot into the ionosphere. The hope is, they say, for this new GTX’s street prices to fall much closer to the $499 part of that range than the $649 one. We’ve done some digging, and I believe 7900 GTX cards should start at $499 and range up to $599, with “overclocked in the box” clock speeds as high as 700MHz core and 900MHz memory.

There’s nothing like playing some F.E.A.R. on a pair of GeForce 7900 GTXs after a long weekend of yachting, but most of us will probably be more personally interested in the G71’s other assignment, the GeForce 7900 GT. This puppy spins its core clock at 450MHz, with the vertex shaders at 470MHz. This more modest GPU config stores its pixels in 256MB of GDDR3 memory clocked at 660MHz. That 200MHz drop-off in core clock speed from the GTX to the GT is precipitous enough that NVIDIA has deemed it unnecessary to deactivate any of the G71’s functional units for the sake of product segmentation; the 7900 GT keeps all of the G71’s 24 pixel shaders, eight vertex shaders, and 16 ROPs intact.


The GeForce 7900 GT.

Look at that itty bitty cooler! Now that’s just showing off. The 7900 GT’s thermal design power is only 80 watts, so NVIDIA stuck a cooler smaller than a deck of cards on its reference design and called it a day. With a better cooler, the 7900 GT could have some serious overclocking headroom.

In fact, more than one of NVIDIA’s board partners will offer multiple versions of the 7900 GT, with some versions having better coolers strapped to the side, higher clock speeds, and higher prices. Perhaps that’s why NVIDIA told us the 7900 GT’s expected price range is between $249 and $399. You’ve gotta use FP16 to get that kind of dynamic range (har har, graphics geek humor). Based on what we’ve heard from folks in the know, I would not expect to see 7900 GTs selling for much under $299 right out of the gate. Instead, I’d look for 7900 GT cards to begin life between $299 and $349 at online stores, with “overclocked in the box” variants selling in the higher part of that range. The more expensive 7900 GTs will boast core clock speeds as high as 560MHz and memory clocks as high as 625MHz. If I’m wrong about that, we’ll find out soon, because both the 7900 GT and GTX are slated to be available today at online vendors.

 

A new mid-range entry
It’s nice to park the Benz in the garage and kick back with a game of Quake 4 on the ol’ 7900 GT, but no member of the 7900 series is really poised for mass market success. Instead, that role has fallen to NVIDIA’s other new 90nm graphics processor, the G73. In order to appeal to the Honda Accord set, the G73 design team fired up the world’s smallest chainsaw and cut a G71 in half. They’d probably tell you it’s more complicated than that, what with their engineering degrees and fancy chip design tools, but the G73’s specs tell the story: five vertex shaders, three pixel shader “quads” for a total of 12 units, and eight ROPs to convert fragments into true pixels. Even the memory interface is halved, from 256 bits to 128.

The G73’s DNA matches that of the G71, right down to the bits about improved SLI antialiasing and shorter pipelines than previous 7-series GPUs. The G73 shares pretty much all of the same non-3D-graphics logic with the G71, including NVIDIA’s PureVideo video processing engine. However, this lower-end part comes with one dual-link TMDS transmitter and a second single-link one.


The G73 GPU

Cheaper chips make for cheaper cards, and the G73 looks promising on that front; its 178 million transistors reside in a space of only 125 mm2. Contrast that, if you will, to ATI’s mid-range graphics processor, the RV530, also known as the Radeon X1600 GPU. Also a 90nm part, the RV530 weighs in at about 132 mm2. ATI has given the RV530 a healthy 12 pixel shader units, much like the G73, but the RV530 can only lay down four textures per clock and can only write out four pixels per clock to the frame buffer. Each of the G73’s 12 pixel shader units can also act as texture units, and the G73’s eight ROPs can write out as many as eight pixels in each clock cycle. In the right configuration, the G73 might put the X1600 to shame.

That configuration may have just arrived in the form of the GeForce 7600 GT, the first G73-based video card and the long-awaited replacement for the immensely popular GeForce 6600 GT. With a 560MHz core clock and a matching 560MHz vertex clock, the 7600 GT will possess some serious fill rate muscle: 6.72 billion texels per second, just a touch under the 6.8 gigatexels/s of the GeForce 6800 Ultra and well above the anemic 2.36 gigatexels/s of the Radeon X1600 XT. With GeForce 7-class pixel shader units, the 7600 GT should outstrip the 6800 Ultra in terms of pure processing power, as well. Memory bandwidth limitations could hold the 7600 GT back somewhat, though. The card’s 256MB worth of GDDR3 RAM chips will run at 700MHz, yielding 22.4GB/s. That’s well below the level of a 6800 Ultra, but it’s a tick more than the Radeon X1600 XT and an exact match for the Radeon X800.
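Those fill rate and bandwidth figures fall out of simple multiplication. A quick sketch for the 7600 GT:

```python
# The GeForce 7600 GT's theoretical peaks, from the clocks quoted above.
core_mhz = 560
textures_per_clock = 12  # each pixel shader unit doubles as a texture unit
rops = 8
mem_mhz, bus_bits = 700, 128

texel_rate = core_mhz * textures_per_clock        # 6,720 Mtexels/s = 6.72 Gtexels/s
pixel_rate = core_mhz * rops                      # 4,480 Mpixels/s
bandwidth = mem_mhz * 2 * (bus_bits // 8) / 1000  # GDDR3: two transfers per clock -> 22.4 GB/s

print(texel_rate, pixel_rate, bandwidth)
```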

However you cut it, the 7600 GT is still an awful lot of video card for $199 to $229, its expected price range. (Some brands may hawk 128MB versions for $179, as well, for those who prefer not to have enough memory for their fancy GPUs.)


The GeForce 7600 GT

As you can see, the 7600 GT shares a cooler with its bigger sibling, the 7900 GT. This card doesn’t require the help of an auxiliary six-pin power connector, however, and has a somewhat lower TDP of 60 to 70 watts. Like the 7900 cards, NVIDIA expects these babies to be selling at online retailers starting today.

Before we move on, I should mention here that ATI would almost certainly take issue with my direct comparison of the GeForce 7600 GT with the Radeon X1600 XT, despite their apparent similarities in chip size and shader power. The Radeon X1600 XT has made an impressive odyssey in the few short months since its introduction, from a brand-new contender in the mid-range graphics segment listing for $249 to a card with much humbler aspirations selling for as little as $149. The price drops have certainly been welcome, but they’ve also made the Radeon X1600 XT a moving target for reviewers like us who have been trying to put our fingers on the X1600 XT’s most direct competitors. Given what we’ve seen of its performance to date, the X1600 XT would certainly seem to belong at this lower price point, but ATI’s initial aims for the XT were much grander, obviously. Fortunately, ATI has finally moved to plug the gap in the middle of its Radeon lineup caused by the incredible shrinking X1600 XT, as we’re about to explain.

 

ATI counters with a more affordable X1800
Not content to sit back and let NVIDIA hog the spotlight, ATI has decided to launch its own new video card today, as well. The Radeon X1800 GTO will sell for about $249, and ATI bills it as a rival to NVIDIA’s G73. That’s not quite accurate from a list price standpoint, and I doubt the street prices will be exactly equal, either.

That said, the X1800 GTO is based on higher-priced hardware: an ATI R520 GPU with a 256-bit memory interface. In fact, the GTO shares the same board design used for the Radeon X1800 XL and XT. For this application, the R520 has had its wings clipped a little bit. All eight of its vertex shaders remain, but one of its pixel shader “quads” has been deactivated, leaving 12 pixel shader units intact. Accordingly, ATI has disabled four of the GPU’s 16 texture address units and four of its 16 render back ends. The maximum number of threads possible in its dispatch processor drops from 512 to 384, as well. This scaled-back R520 will run at 500MHz on the Radeon X1800 GTO, and it will make use of 256MB of GDDR3 memory running at the exact same speed.


The Radeon X1800 GTO

The GTO’s appearance with these specifications sets up a heckuva battle with the 7600 GT. ATI was obviously aiming for the 7600 GT when they cooked up the GTO, but they may not have fully anticipated the 7600 GT’s potency: the GTO’s peak multitextured fill rate is 6 gigatexels/s, while the 7600 GT’s is 6.72 gigatexels/s. What it gives up in texturing capacity, though, the GTO makes up in pixel fill rate (6 gigapixels/s to the 7600 GT’s 4.48 gigapixels/s) and memory bandwidth (32GB/s to the 7600 GT’s 22.4GB/s). The question, of course, ultimately comes down to price and performance, all other things being approximately equal. We’ll have to see how the GTO-GT battle shapes up in our tests.

In order to make the Radeon X1800 GTO more competitive in the mid-range of the graphics market, ATI will be releasing a driver in the coming days that will allow “connectorless” CrossFire multi-GPU configurations with the GTO. Such configs will not require a CrossFire “master” card or an external dongle; they will, instead, transfer data between the graphics cards via PCI Express. Connectorless CrossFire systems are subject to the bandwidth limitations of the host system’s PCI Express connection, but the GTO might do fairly well in combination with ATI’s new dual 16-lane CrossFire Xpress 3200 chipset. Of course, because it’s an X1800, the GTO could also be used in conjunction with the Radeon X1800 CrossFire Edition, but that’s a more expensive card that will have to disable half of its 512MB of memory when running in tandem with the GTO.
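As a rough guide to those bandwidth limitations: first-generation PCI Express signals at 2.5Gbps per lane per direction, and 8b/10b encoding leaves 250MB/s of payload per lane. A quick sketch of per-slot bandwidth:

```python
# Per-slot PCI Express bandwidth, assuming first-generation PCIe signaling:
# 2.5Gbps per lane per direction, with 8b/10b encoding leaving 2Gbps
# (250MB/s) of payload bandwidth per lane.
def pcie1_bandwidth_gbs(lanes):
    return lanes * 250 / 1000  # GB/s per direction

dual_x8 = pcie1_bandwidth_gbs(8)    # 2.0 GB/s per slot on dual x8 chipsets
dual_x16 = pcie1_bandwidth_gbs(16)  # 4.0 GB/s per slot on the CrossFire Xpress 3200

print(dual_x8, dual_x16)
```

Connectorless CrossFire has to squeeze its inter-card traffic into that budget alongside everything else, which is why the dual 16-lane chipset should suit the GTO better.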

ATI expects Radeon X1800 GTO cards from its board partners to be on store shelves by the end of March, possibly sooner. There will not, however, be any “built by ATI” versions of the GTO. This is a partner-only product.

The X1800 GTO is notable for another reason, too: it’s a new Radeon X1800. We were given the distinct impression at this past CES that the Radeon X1900 would be replacing the X1800 series. ATI even showed us a slide with its new product lineup featuring the X1300, X1600, and X1900 series, with no mention of the X1800 at all. Unless “EOL” stands for “alive and well” in Etruscan, the X1800 series was on the way out. Now, not only does the GTO reanimate the Radeon X1800 line, but it’s part of a larger repositioning of Radeon X1800 cards at lower prices. Here’s a look, straight from ATI, at the new mix of Radeons.


The new Radeon product mix. Source: ATI.

This remixed product lineup brings a number of changes, starting with lower expected selling prices up top. The Radeon X1900 XTX is down $100 from its launch, and the XT is down $80. Radeon X1800 cards have also been rejiggered to better fit into their new places below the X1900s, with the XT dropping down to 256MB and the XL jumping up to 512MB. (All of this makes sense if you have a marketing degree, I’m quite sure.) The GTO rounds out the bunch at $249.

This apparent change of plans from ATI is a welcome one, because it plugs what we’ve called “a gaping hole” in its product line—in a place where PC enthusiasts have recently found the most value for their money in a graphics card. We will, of course, have to see whether street prices wind up reflecting ATI’s newly stated wishes. We have seen eleventh-hour price cuts from ATI in the past that didn’t really materialize in the market.

We’re talking about essentially three new products in the Radeon X1800 series, despite the “available” notes in ATI’s table above. As I write, the Radeon X1800 XL 512MB is nowhere to be found in our price search engine, and the Radeon X1800 XT 256MB shows up as only one card available from one vendor. Also, prices on the incumbent cards haven’t yet fallen to these levels. You’ll pay between $469 and $499 for a Radeon X1800 CrossFire card today. As for the X1900 series, the going rate on the XTX is about $589, and the XT weighs in at about $510. ATI’s plan for its new lineup looks like it could offer excellent value for the money, but that lineup simply hasn’t arrived yet.

 

When too much is not enough, now there’s quad SLI
NVIDIA first demonstrated a four-GPU SLI system at this year’s CES with GeForce 7800 GPUs, but the first quad-SLI rigs available to the public will be based on the 7900 series.


The quad SLI setup shown at CES 2006

We were left with a number of questions after seeing quad SLI in action at CES, most of which concern exactly how it all worked. Now we have more details.

As is obvious from the pictures, quad SLI relies on a pair of PCI Express graphics cards, each of which comprises two circuit boards. On each circuit board is a GPU and its associated frame buffer memory. On the card’s main circuit board is a custom 48-lane PCI Express switch chip that connects to each of the two GPUs (16 lanes each) and to the PCI-E slot (the final 16 lanes). The card also has an SLI link that connects its two GPUs to one another. Each GPU has a total of two SLI interconnect interfaces on it, though, so that’s not the whole story. A quad SLI subsystem will also make use of two card-to-card SLI bridge connections, one linking PCB 1 from card A to PCB 1 of card B, and the other connecting PCB 2 from card A to PCB 2 of card B. Once all of the connections are made, each GPU will talk to two other GPUs in a quad-SLI configuration, resulting in a ring topology that looks something like this:


A logical block diagram of a quad-SLI setup. Source: NVIDIA.

Whoa.
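One way to convince yourself the wiring really does form a ring is a toy model like this one. The GPU labels are mine, not NVIDIA’s:

```python
# A toy model of the quad-SLI wiring: each GPU has two SLI links, one to the
# other GPU on its card and one across a bridge to the matching PCB on the
# other card. "A1" means card A, PCB 1, and so on.
links = [
    ("A1", "A2"),  # on-card link, card A
    ("B1", "B2"),  # on-card link, card B
    ("A1", "B1"),  # bridge between the two PCB 1s
    ("A2", "B2"),  # bridge between the two PCB 2s
]

# Count links per GPU; every GPU should talk to exactly two neighbors,
# which is what makes the topology a ring.
degree = {}
for a, b in links:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

print(degree)  # all four GPUs have degree 2
```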

I wasn’t aware of the presence of dual SLI links on NVIDIA’s GPUs before now. Turns out those links share pins with the chip’s interfaces for external TMDS transmitters. That’s pretty much a non-issue for this application, particularly given the G71’s incorporation of twin dual-link TMDS transmitters.

If you’re familiar with dual-GPU SLI, you already know about split-frame rendering, where the screen is subdivided into two sections, each of which is rendered by one GPU. You also know about alternate-frame rendering, where even frames are rendered by one GPU and odd frames by the other. (Alternate frame rendering is the preferred method where possible, because it offers the best performance scaling and raises geometry throughput as well as pixel throughput.) The third and final general method of SLI load balancing is SLI antialiasing mode, where the two GPUs render the scene using different sample points at a sub-pixel level and the two sets of results are combined.

The quad SLI physical topology goes along with a clutch of new load-balancing modes for splitting up the rendering work. Those are: four-way split-frame rendering, four-way alternate-frame rendering, a combination of alternate-frame and split-frame rendering, and SLI antialiasing. They work about as one might expect them to work, for the most part.



The four quad-SLI load-balancing modes illustrated. Source: NVIDIA.

If you can unravel that SLI 16X AA diagram, let me know. As I understand it, what’s basically happening is that each GPU is rendering the scene with 4X multisampling, with each GPU grabbing samples at an offset from the others. Two pairs of GPUs combine their images after transmitting sample data over the SLI links between them, and then the two resulting images are combined after that. Presto: you have 16X AA. There’s also a 32X SLI AA mode for quad SLI where each GPU renders the scene using NVIDIA’s 8xS antialiasing method.
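Here’s my understanding of the 16X mode in sketch form. The sample positions and per-GPU offsets below are purely illustrative, not NVIDIA’s actual patterns:

```python
# Sketch of SLI 16X AA: each GPU renders with the same four sub-pixel
# sample positions, nudged by a small per-GPU offset, and the four 4X
# results are averaged together. Positions/offsets here are made up.
base = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]
offsets = [(0.0, 0.0), (0.0625, 0.0), (0.0, 0.0625), (0.0625, 0.0625)]  # one per GPU

# Union of the four offset sample sets -> distinct sub-pixel positions.
samples = {(x + dx, y + dy) for dx, dy in offsets for x, y in base}
print(len(samples))  # 16 distinct positions across the four GPUs
```

The final pixel color is then just the average of all sixteen samples, which is why the combining steps along the SLI links matter so much for performance.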

I have some reservations about the image quality likely to result from all of this effort. NVIDIA’s antialiasing hardware has built-in limitations on sample positions that may blunt the impact of grabbing that many samples. Current dual-GPU SLI AA modes suffer from this problem. We’ll have to see whether the quad SLI modes fare any better.

Four-way alternate-frame rendering has the potential to introduce some additional latency into the graphics subsystem as four frames are buffered before being sent to the screen, which would be a Very Bad Thing for an extreme gaming rig. However, NVIDIA argues latency won’t be an issue so long as frame rates are high enough, because frames will be pushed out to screen before too many milliseconds have passed. If that doesn’t work, there’s always the option of split-frame or partial split-frame modes to keep potential delays in check.
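NVIDIA’s argument is easy to put numbers on. Assuming a worst case of four frames in flight:

```python
# Worst-case display latency added by buffering frames in four-way AFR.
def buffered_latency_ms(fps, frames_buffered=4):
    return frames_buffered * 1000.0 / fps

print(round(buffered_latency_ms(30), 1))   # 133.3 ms at 30 fps -- very noticeable
print(round(buffered_latency_ms(120), 1))  # 33.3 ms at 120 fps -- much harder to feel
```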

So if latency doesn’t kill it, how well will quad SLI scale? We hope to get our hands on a quad-SLI system soon in order to test it, but NVIDIA claims the graphics subsystem isn’t the bottleneck: it’s the CPU. Quad SLI will require more PCI Express traffic in order to maintain texture coherency when techniques like render to texture are in use, but dual 16-lane PCI-E chipsets have practically been looking for an app like this to justify their existence. Otherwise, the presence of two SLI links per GPU means that the SLI scheme ought to have sufficient bandwidth to work well enough. The main problem may be that driving four GPUs will incur additional overhead on CPUs that are already having trouble keeping up with dual-GPU graphics subsystems.

The first quad SLI cards will have their GPUs clocked at 500MHz, and each GPU will get 512MB of 600MHz memory. Clock speeds for these puppies are lower than for the 7900 GTX because we’re looking at two GPUs and their memory chips sandwiched close together on two circuit boards, creating power and heat concerns. Dialing the clock speed back a little bit helps greatly on that front.

A fully configured quad-SLI rig will be a powerful thing indeed. If you like to count it this way, a quad-SLI graphics subsystem will have 96 pixel shader pipes, 32 vertex pipes, 64 ROPs, and 2GB of RAM onboard. More notably, the setup will have a total of 48 gigatexels/s of fill rate and 153.6GB/s of memory bandwidth. Such a system will probably require a power supply between 800 and 1000W.

Yep, a kilowatt of power. Not a typo.
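For the record, the totals above are just the per-GPU numbers multiplied by four. A quick sanity check at the quad-SLI clocks:

```python
# Aggregate quad-SLI resources at 500MHz core and 600MHz (1.2GHz effective)
# GDDR3, four G71s deep.
gpus = 4
pixel_shaders = gpus * 24   # 96
vertex_shaders = gpus * 8   # 32
rops = gpus * 16            # 64
texel_rate = gpus * 24 * 500 / 1000             # 48.0 Gtexels/s
bandwidth = gpus * 600 * 2 * (256 / 8) / 1000   # 153.6 GB/s

print(pixel_shaders, vertex_shaders, rops, texel_rate, bandwidth)
```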

How much will the privilege of spinning your power meter at this rate cost you? Probably a whole truckload of cash, because quad SLI will initially be available only through PC system builders like Voodoo and Alienware, who like to charge about the price of a Mazda for a well-equipped PC. If you do happen to have a Mazda burning a hole in your pocket, though, you should be able to order one of these things today. Quad-SLI graphics cards for us PC DIY types will be sold as separate products at a later date.

 
The evil that lurks in front of my keyboard
We’ve had only a short period of time with the new GeForce cards and even less with the Radeon X1800 GTO, so we had to impose some serious limitations on our testing and recycle some results from older articles in order to bring you performance info today. You can see the configurations we used for testing below, and you’ll note that we used newer drivers from ATI and NVIDIA on these newer cards—a necessity in order to get them to work. We simply didn’t have time to retest older cards with these same drivers, though.

Now, I am aware that in reporting these test results, I am committing a grievous wrong. Have a look, for instance, at where this ranks on the Internet Scale of Mortal Evil. Starting at the top and moving down:

  • Hitler
  • Clubbing innocent baby seals
  • Testing video cards with mismatched drivers
  • DRM

Not good company to keep, I know. I’m sure we will manage to test most of these cards against one another with like drivers among brands in a future article. Sadly, we couldn’t do it this time around.

When we do conduct a new set of tests, we will probably change some other things about how we tested, as well. Bonehead that I am, I decided for some reason to start testing Quake 4 in its Ultra Quality mode that doesn’t use texture compression. This mode doesn’t get you much in terms of image quality, but it can make life difficult for video cards with less than 512MB RAM onboard. Awesome.

Also, the Radeon CrossFire platform we used for testing was based on ATI’s Radeon Xpress 200 CrossFire Edition chipset, a very long name that stands for eight PCI-E lanes per slot. The new CrossFire Xpress 3200 has now arrived onboard the Asus A8R32-MVP motherboard, and using it should boost CrossFire frame rates by a few percentage points. Something to keep in mind.

We also had to restrict our multi-GPU testing of today’s new products to the GeForce 7900 GTX. We’ll test additional SLI configs and perhaps even connectorless X1800 GTO CrossFire at a later date.


We tested a few cards for this one

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Common to both systems
Processor: Athlon 64 X2 4800+ 2.4GHz
System bus: 1GHz HyperTransport
Memory size: 2GB (2 DIMMs)
Memory type: Crucial PC3200 DDR SDRAM at 400MHz
Memory timings: CAS 2.5, tRCD 3, tRP 3, tRAS 8
Hard drive: Maxtor DiamondMax 10 250GB SATA 150
OS: Windows XP Professional (32-bit)
OS updates: Service Pack 2, DirectX 9.0c SDK update (December 2005)

NVIDIA test system
Motherboard: Asus A8N32-SLI Deluxe (BIOS revision 0806)
North bridge: nForce4 SLI X16
Chipset drivers: SMBus driver 4.50
Audio: integrated nForce4/ALC850 with Realtek 5.10.0.5900 drivers
Graphics, with ForceWare 81.98 drivers:
GeForce 6800 GS 256MB PCI-E (single and dual)
XFX GeForce 7800 GT 256MB PCI-E (single and dual)
MSI GeForce 7800 GTX 256MB PCI-E (single and dual)
GeForce 7800 GTX 512 512MB PCI-E (single and dual)
Graphics, with ForceWare 84.11 drivers:
GeForce 7600 GT 256MB PCI-E
GeForce 7900 GT 256MB PCI-E
GeForce 7900 GTX 512MB PCI-E (single and dual)

ATI test system
Motherboard: ATI RD480 CrossFire reference board (BIOS revision 080012)
North bridge: Radeon Xpress 200P CrossFire Edition
South bridge: SB450
Chipset drivers: SMBus driver 5.10.1000.5
Audio: integrated SB450/ALC880 with Realtek 5.10.00.5188 drivers
Graphics, with Catalyst 8.203-3-060104a-029367E drivers:
Radeon X1800 XL 256MB PCI-E
Radeon X1800 XT 512MB PCI-E
Radeon X1800 CrossFire + Radeon X1800 XT 512MB PCI-E
Radeon X1900 XTX 512MB PCI-E
Radeon X1900 CrossFire 512MB PCI-E
Radeon X1900 CrossFire + Radeon X1900 XTX 512MB PCI-E
All-in-Wonder X1900 256MB PCI-E
Graphics, with Catalyst 8.223-060207a3-31101C drivers:
Radeon X1800 GTO 256MB PCI-E

Thanks to Crucial for providing us with memory for our testing. 2GB of RAM seems to be the new standard for most folks, and Crucial hooked us up with some of its 1GB DIMMs for testing. Although these particular modules are rated for CAS 3 at 400MHz, they ran perfectly for us at 2.5-3-3-8 with 2.85V.

All of our test systems were powered by OCZ PowerStream 520W power supply units. The PowerStream was one of our Editor’s Choice winners in our last PSU round-up.

Unless otherwise specified, the image quality settings for both ATI and NVIDIA graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Pixel-pushing power
We’ve already talked about some of these numbers in our discussions of the individual cards, but let’s put them all together and have a look at them in context. Pixel and texel (textured pixel) fill rates aren’t as singularly important as they used to be, but they’re still very relevant to overall performance in many applications. Texel fill rate, in particular, still matters quite a bit.

(Columns: core clock in MHz, pixels per clock, peak pixel fill rate in Mpixels/s, textures per clock, peak texel fill rate in Mtexels/s, effective memory clock in MHz, memory bus width in bits, and peak memory bandwidth in GB/s.)
Radeon X1600 XT 590 4 2360 4 2360 1380 128 22.1
GeForce 6800  325 8 2600 12 3900 700 256 22.4
GeForce 6600 GT 500 4 2000 8 4000 1000 128 16.0
Radeon X800 400 12 4800 12 4800 700 256 22.4
GeForce 6800 GS 425 8 3400 12 5100 1000 256 32.0
GeForce 6800 GT 350 16 5600 16 5600 1000 256 32.0
Radeon X800 XL 400 16 6400 16 6400 980 256 31.4
Radeon X1800 GTO 500 12 6000 12 6000 1000 256 32.0
GeForce 7600 GT 560 8 4480 12 6720 1400 128 22.4
GeForce 6800 Ultra 425 16 6800 16 6800 1100 256 35.2
GeForce 7800 GT 400 16 6400 20 8000 1000 256 32.0
All-In-Wonder X1900 500 16 8000 16 8000 960 256 30.7
Radeon X1800 XL 500 16 8000 16 8000 1000 256 32.0
Radeon X850 XT 520 16 8320 16 8320 1120 256 35.8
Radeon X850 XT PE 540 16 8640 16 8640 1180 256 37.8
XFX GeForce 7800 GT 450 16 7200 20 9000 1050 256 33.6
Radeon X1800 XT 625 16 10000 16 10000 1500 256 48.0
Radeon X1900 XT 625 16 10000 16 10000 1450 256 46.4
GeForce 7800 GTX 430 16 6880 24 10320 1200 256 38.4
Radeon X1900 XTX 650 16 10400 16 10400 1550 256 49.6
GeForce 7900 GT 450 16 7200 24 10800 1320 256 42.2
GeForce 7800 GTX 512 550 16 8800 24 13200 1700 256 54.4
GeForce 7900 GTX 650 16 10400 24 15600 1600 256 51.2

The GeForces have a clear edge in peak multitextured fill rate here, with the 7900 GT and GTX coming in ahead of the Radeon X1900 XTX in the theoretical numbers and in the synthetic texel fill rate test. Despite their similar die sizes, the Radeon X1600 XT has much less texturing capacity than the GeForce 7600 GT. Even the Radeon X1800 GTO has less.

Notice how, in the pixel fill rate test, the GeForce 7900 GTX comes in behind the 7800 GTX 512. I’d bet that’s caused by the 7900 GTX’s lower memory bandwidth.

 

Quake 4
We tested Quake 4 using our own custom-recorded timedemo. The game was running at its “Ultra” quality settings with 4X antialiasing enabled.

OpenGL games have long been NVIDIA’s domain, and that hasn’t changed yet. The GeForce 7900 GTX sets the standard for both single and dual-GPU configurations here. The GeForce 7600 GT outruns the Radeon X1800 GTO, too, but by an increasingly slim margin as the screen resolution increases.

 

Half-Life 2: Lost Coast
This new expansion level for Half-Life 2 makes use of high-dynamic-range lighting and some nice pixel shader effects to create an impressive-looking waterfront. We tested with HDR lighting enabled on all cards.

Newer video drivers seem to work against the GeForce 7900 GTX at lower resolutions, as it falls behind the GeForce 7800 GTX 512. At higher resolutions, the 7900 GT manages to outperform the Radeon X1900 XTX, but not by much. The absolute fastest config here, however, is the Radeon X1900 CrossFire rig.

Jump down the results a bit, and you’ll find the GeForce 7900 GT a few ticks faster than the 7800 GTX, pretty much as one would expect. Name a possible competitor somewhere near its price point—the Radeon X1800 XT, Radeon X1800 XL, or the AIW X1900—and it has them all beaten.

The Radeon X1800 GTO and the GeForce 7600 GT remain locked in a very tight contest, but the 7600 GT retains a small lead here.

 

F.E.A.R.
We tested the next few games using FRAPS and playing through a portion of the game manually. For these games, we played through five 60-second gaming sessions per config and captured average and low frame rates for each. The average frames per second number is the mean of the average frame rates from all five sessions. We also chose to report the median of the low frame rates from all five sessions, in order to rule out outliers. We found that these methods gave us reasonably consistent results.
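The aggregation described above, mean of the per-session averages and median of the per-session lows, can be sketched as follows. The frame rate figures in this snippet are made up for illustration, not actual benchmark results:

```python
# Sketch of the FRAPS aggregation described above: five 60-second
# sessions per config; report the mean of the average frame rates
# and the median of the low frame rates to rule out outliers.
from statistics import mean, median

sessions = [  # (average fps, low fps) per 60-second run; illustrative only
    (88.2, 41), (90.5, 44), (87.9, 12), (89.3, 43), (91.0, 45),
]

avg_fps = mean(s[0] for s in sessions)
low_fps = median(s[1] for s in sessions)  # median discards the 12fps outlier

print(round(avg_fps, 1), low_fps)  # 89.4 43
```

Using the median rather than the minimum of the lows is what keeps a single hitch, like the 12fps session here, from dominating the reported number.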

F.E.A.R.’s graphics quality options were all set to maximum for our testing. Computer performance was set to medium.

Based on the top three scores, I’d wager that we’re running into a frame rate cap in F.E.A.R. at about 90 frames per second. The Radeon X1900 CrossFire system, the 7800 GTX 512 SLI setup, and the 7900 GTX SLI all look to be hitting the same wall. Among the single cards, though, the Radeons are cleaning up—at every price point along the way. Even the Radeon X1600 XT nearly catches the GeForce 7600 GT.

Battlefield 2
We’re testing BF2 at an insanely high resolution because it runs really well on just about any of these cards at lower resolutions. Also, BF2 has a built-in frame rate cap of 100 FPS. We didn’t want to turn off the cap, but we did want to see some differences in performance between the cards.

ATI comes out looking good in BF2, as well, with the Radeon X1900 XTX edging out the GeForce 7900 GTX in the single-card sweeps. The X1800 GTO plays a smoother game of BF2 than the GeForce 7600 GT, too.

Guild Wars
Like the two above, we played this game manually and recorded frame rates with FRAPS. In this case, we’re playing an online game, so frame rates were subject to some influence from an uncontrollable outside factor. Regardless, I think the numbers below reflect performance pretty well.

Anything we tested, except for the Radeon X1600 XT, will run Guild Wars at a fairly acceptable frame rate for this type of game at this mega-high res.

 

3DMark06

Wow, this one is close. The GeForce 7900 GTX is ahead by a nose at 1600×1200, but once we get up to 2048×1536, the Radeon X1900 XT comes out on top. The GTO and 7600 GT are similarly closely matched, with a sliver of an advantage to the 7600 GT throughout. As for the GeForce 7900 GT, it’s bracketed by the Radeon X1800 XT and the AIW X1900.

 

3DMark06 – Synthetic tests

These different GPU architectures from ATI and NVIDIA respond somewhat differently to 3DMark’s synthetic tests. Perhaps the only constants here are that the GeForce 7900 GTX is always the fastest single card and the Radeon X1900 XT trails by a whisker-thin margin.

 

Power consumption
We measured total system power consumption at the wall socket using a watt meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The idle measurements were taken at the Windows desktop with AMD’s Cool’n’Quiet CPU clock throttling function enabled. The cards were tested under load running Half-Life 2: Lost Coast at 1600×1200 resolution with 16X anisotropic filtering and HDR lighting enabled. We turned off Cool’n’Quiet for testing under load.

All of the graphics cards named below except for the two CrossFire setups were tested for power consumption on the Asus A8N32-SLI mobo. We were forced by the driver lockout to use the ATI CrossFire reference mobo instead for the CrossFire cards, so power consumption for the CrossFire systems will vary due to the difference in motherboards. Also, please note that the Radeon X1900 XT shown here is actually the CrossFire master card, so its power consumption is probably slightly higher than a non-CrossFire X1900 XT that lacks the additional chips needed for image compositing.

I believe that our Radeon X1800 XL and XT cards are not wholly representative of the idle power consumption of retail cards, either. Our cards—including the GTO—are early review units that lack the idle clock throttling we’ve observed in retail Radeon X1800 cards. Our Radeon X1900 review samples, however, do settle down to somewhat lower clock speeds at idle in order to save power and cut down on heat. Unfortunately, our AIW X1900 does not.

Idle power consumption for the new GeForces comes out looking relatively good, even if the comparison is muddied by the clock throttling question on retail Radeons. The 7600 GT draws less power at idle than Sapphire’s retail X1600 XT, even though the GT is a much stronger performer. All of NVIDIA’s 90nm GPUs draw less power than the older GPUs they supplant.

When running Half-Life 2: Lost Coast, the idle clock issues evaporate, and we have a very clear sense of relative system-level power draw under load. Here, the 90nm GeForce cards really come into their own. All of them draw less power than other products in their classes, but the high-end 7900 GTX may be the most remarkable; it pulls less power than a Radeon X1800 XT or an AIW X1900, let alone the higher-end Radeon X1900s. There’s a 74W chasm in system power use between the 7900 GTX and the X1900 XTX on the same motherboard. Among the mid-range cards, the 7600 GT’s advantage over the X1800 GTO is 20W—not as dramatic, particularly given the differences in memory interfaces and chip size at work here, but still noteworthy.

 
Conclusions
The performance race between NVIDIA and ATI is very tight overall, especially at the very high end, when the GeForce 7900 GTX squares off against the Radeon X1900 XTX—close enough that I couldn’t declare a clear overall winner. Both cards are incredible performers, and neither of them has shown any great weaknesses in our tests. The 7900 GTX does seem to scale a little bit better in a dual-GPU configuration with SLI than the X1900 XTX does with CrossFire, but we’ll have to test CrossFire performance with ATI’s new dual 16-lane chipset before drawing any definitive conclusions.

The value propositions for these products will depend quite a bit on how relative street prices shake out in the coming weeks and months. That’s especially true for the GeForce 7900 GT. ATI has positioned the Radeon X1800 XL 512MB and the X1800 XT 256MB against it, and we haven’t tested either of those configurations yet. Based on what we do know about the performance of the Radeon X1800 XL 256MB and the Radeon X1800 XT 512MB, I’d say the GeForce 7900 GT is in pretty good shape regardless. At about $299, a non-“factory overclocked” model should offer more card for the money than the incumbent GeForce 7800 GT, until now a favorite of ours at this price point.

If you want to talk about real value, though, move a step or two down to the GeForce 7600 GT and the Radeon X1800 GTO. The GeForce 7600 GT, in particular, looks to be a great performer for the price and the new king of the sweet spot in PC graphics, much like the GeForce 6600 GT was before it. The Radeon X1800 GTO offers higher-grade hardware for another fifty bucks, but the 7600 GT achieves equivalent or better performance overall, raising the question of why the GTO should command a higher price. There is, however, one intriguing point that I’d like to note: the 7600 GT came out ahead in our timedemo tests and in 3DMark, but the X1800 GTO was faster in games we measured with FRAPS. I wish we’d had time to test with a broader range of games in order to establish whether this is a trend. We’ll have to keep our eyes on that question.

The efficiency issue
NVIDIA has made much of the fact that they have a more efficient GPU architecture than ATI right now, and it’s true that NVIDIA’s GeForce 7-series desktop GPUs generally achieve higher performance per watt and more performance per die area than ATI’s current desktop graphics processors. That’s undeniable. Whether and how much this fact matters to you is something you’ll have to decide.

Obviously, someone working to build a super-quiet gaming rig or the like will want to take these things into account. GeForce 7600 GT and 7900 series cards will consume less power and throw off less heat inside your PC than their Radeon counterparts. ATI has addressed this problem to some degree by using a dual-slot cooler on its high-end cards that funnels most of the hot air directly out of the back of the case, but at the end of the day, there are few true substitutes for a cooler-running chip.

NVIDIA’s smaller chips might also make for less expensive products from NVIDIA and its partners. I would be surprised if the GeForce 7600 GT doesn’t make the same migration over its lifetime that the GeForce 6600 GT did, from $199 down to $149 and below. With its much larger die and 256-bit memory interface, the Radeon X1800 GTO isn’t likely to make the same transition. ATI will have to replace it with something else, and the Radeon X1600 XT certainly isn’t up to the task.

ATI disputes the importance of arcane issues like GPU die size, and at a pretty basic level, they’re right to do so. Most folks just want to buy the best graphics card for the money. But ATI wasn’t talking down the importance of die size during the Radeon X800 era when people were asking them why they chose to limit their GPU’s precision to 24 bits per color channel rather than 32; they were talking up efficient architectures and best use of die area quite eloquently.

The Radeon X1000 series’ die sizes are relatively large, of course, because ATI went and built a new architecture with more precision and other new features. Some of those are pretty cool, like the ability to combine high-dynamic range rendering with multisampled antialiasing, or finer granularity for pixel shaders with dynamic branching. From a tech standpoint, the fact that they’ve decoupled so many stages of the traditional graphics pipeline from one another is pretty slick, too.

Yet I wonder whether they have struck the right balance of on-chip resources for today’s applications or those coming in the near future. The Radeon X1600 XT, for instance, has three times as many pixel shader processors as texture address units and render back ends, as does the Radeon X1900. The Radeon X1800 series, by contrast, has a 1:1 ratio between these units—including the Radeon X1800 GTO. This leads to the odd situation where the Radeon X1800 GTO technically has less pixel shader power than the Radeon X1600 XT. (Both have 12 pixel shader units, but the X1600 XT is clocked higher.) Neither GPU seems to have an optimal ratio of pixel shader power to pixel and texel fill rate for today’s games. Judging by what we’ve seen, the best mix is probably somewhere in between, at a 2:1 or 3:2 ratio. (NVIDIA is at 3:2 for their architecture, but that’s a different animal, so comparisons aren’t entirely apt.) ATI seems to be casting about a little bit, trying to find the right mix.
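The per-clock comparison in the paragraph above works out as follows. Unit counts and clocks come from the article; "shader throughput" here is just units times clock, ignoring per-unit architectural differences:

```python
# Rough arithmetic behind the X1600 XT vs. X1800 GTO comparison above.
# Unit counts and clock speeds are from the article; throughput is a
# naive units-times-clock figure, not a real shader benchmark.

cards = {
    #  name:             (pixel shaders, texture units, core MHz)
    "Radeon X1600 XT":  (12, 4, 590),
    "Radeon X1800 GTO": (12, 12, 500),
}

for name, (shaders, tex_units, mhz) in cards.items():
    ratio = shaders / tex_units
    throughput = shaders * mhz  # million shader-unit cycles per second
    print(f"{name}: {ratio:.0f}:1 shader:texture ratio, "
          f"{throughput} M shader cycles/s")

# The X1600 XT's 12 shaders at 590MHz (7080) edge out the GTO's
# 12 at 500MHz (6000), despite the GTO's higher price.
```

By this crude measure, the cheaper X1600 XT really does carry more raw shader capacity than the X1800 GTO, which is the oddity the text points out.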

I’m curious to see where they’ll land, and whether new GPU revisions or upcoming games will make the Radeon X1000 series GPUs look relatively more efficient over time. Whatever happens, they seem unlikely to catch NVIDIA on the efficiency front in this generation of GPUs. How much that matters, it’s tough to say—especially in graphics, where the next gen is always just around the corner. 

Comments closed
    • GeForce6200
    • 13 years ago

    The amp is 13

    • GeForce6200
    • 13 years ago

    Curious. Would a 420watt ps work for a 7600gt for a while. My system specs are.
    MSI Neo-4f Motherboard
    AMD Athlon3000(939)
    2X512 pc2700
    Gigabyte GeForce 6200
    and a HD, with two DVD drives

    Should I wait till DX 10 comes out. My 6200 works great right now. Plays BF2 on high.(no aa)

      • absinthexl
      • 13 years ago

      Check the amperage output on the +12v rails. Wattage doesn’t mean much of anything anymore.

        • GeForce6200
        • 13 years ago

        I think it is 13 amp. Not sure. It is just a simple raidmax. Think I should get 550 Antec?

    • DrDillyBar
    • 13 years ago

    Sweet. /me loves options.

      • DrDillyBar
      • 13 years ago

      Kudos on the cards pic

    • Fighterpilot
    • 13 years ago

    As a follow up on the whole “X” labelling thing….
    Forceware,Detonator and Catalyst are all cool names for drivers and such…imagine how much more fun it would be to read the latest TR report that said”We can confirm that the new ATI BallCrusher series cards are faster then the current NVidia Buttkickers” LOL
    Or perhaps “Nvidias latest Predator card proved superior to the ATI Obliterator in BF3 etc….sure beats the crap outta all those stupid and confusing Xs and strings of meaningless numbers.

    Number75#….I feel your pain.I had an FX5600S as my last Nvidia card and it was…ahem…less than stellar.

    • Vera
    • 13 years ago

    hooray! Time to get rid of this Ti4800SE <sucks

      • BoBzeBuilder
      • 13 years ago

      its better than my fx5600 < sucks so bad; its funny. lol

        • ludi
        • 13 years ago

        The FX5600 “Ultra” (whatever /[

    • rwolf
    • 13 years ago

    It will be interesting to see if ATI will use GDDR4 and Virtual memory in the near future since support for those features is in the new memory controller.

    • Bensam123
    • 13 years ago

    You know 32x AA is very nice but what I’m more worried about is the optmizations people have seem to forgot about that have been in video cards since the Radeon 8500s. You know where anything 10ft infront of you in game is blurry and looks like graphics from four years ago?

    I would trade ANY amount of AA to get that crap turned off. Games would look so much nicer if ‘optmizations’ like that were optional or just didn’t exist.

    Then again these ‘next gen’ cards wouldn’t be spitting out these kinds of framerates would they?

    • Fighterpilot
    • 13 years ago

    Wow lucky u slipped in with the quick Ninja edit then….I was just about to press the post button lol
    Also Im not so pedantic that you need to go to decimal places on the FPS numbers,rounding off is near enough..and furthermore I think that “crushed” sounds way more fun than “It outperforms by 10.7fps”
    Obliterated is another goody…I gotta remember that one for the Conroe release day tests results lol

      • Convert
      • 13 years ago

      Yes, I was looking at the xt and not xtx numbers for the average fps in fear. At least it is easy to admit you read the wrong numbers off (especially when all that was missing was the letter x), unlike admitting when being a fanboy has screwed your judgment. I decided I might as well throw in the actual calculated number too since you fanboys live and die over fractions of a single fps.

        • Fighterpilot
        • 13 years ago

        Well I have to agree the whole “X” thing is way outta hand…so no bad there. I mean “X marks the spot” was a goody and even “The X Files” was tolerable as a program name but the mega overuse of it recently is getting lamer by the minute.
        Is it supposed to make us get all giddy with connotations of mystery and supernatural powers/performance or something?
        Also Im really just an ATI/Intel fan .. not a fanboy as I dont post abusive or disparaging comments about AMD/Nvidia products,on the contrary a quick look at my post history will reveal I often recommend both to forum users looking for opinions and I think that both companies make good products.

    • crose
    • 13 years ago

    Interesting review but with performance not being that big of a difference it would have been more interesting to take a closer look at HD playback capability (H.264) and PureVideo vs. Avivo. Is Avivo’s transcoding out of beta now?

    • zqw
    • 13 years ago

    Then to find out that the driver has an *[http://www.tomshardware.com/2006/03/09/ati_and_nvidias_same_day_mega_launch_mayhem/page17.html<]§ Is there any more info on that anywhere? (Tom's didn't explain it.)

      • Crackhead Johny
      • 13 years ago

      Are you sure the explanation wasn’t between some of the ads?

        • zqw
        • 13 years ago

        I didn’t want to read it, but I did text search for overclock and clock on all the pages. Also nothing in the forums.

    • rgreen83
    • 13 years ago

    Never heard anyone complain about there being too many types of cars for sale!

    • kfc
    • 13 years ago

    7900GT w/512MB would get me interested.

    256MB + DX9 when 512MB and DX10 will be soon needed just isnt worth it to me. That is unless I didnt care about UT2K7, Crysis and other games are shortly around the corner.

    This is not the time to upgrade boys!

      • Flying Fox
      • 13 years ago

      It depends on where you come from. If you are on a GF4 Ti4200 then it is prime time to upgrade. I still don’t see how 512MB is a “must” yet, may be next year.

    • Convert
    • 13 years ago

    Nice cards and nice review. The 7600/7900 series looks pretty solid, most of all though I am happy to see these cards at retail at low price points.

    • Fighterpilot
    • 13 years ago

    This is the long awaited 1900 series beater and it still gets crushed in F.E.A.R and BF2?

      • Lazier_Said
      • 13 years ago

      66 to 63 is crushed?

        • Convert
        • 13 years ago

        You will have to excuse him, he is a ATI fanboy. 3fps in the world of a fanboy is 300. That is, if his brand is winning by 3fps otherwise if it is nvidia>ati then it’s more like 0.

          • Fighterpilot
          • 13 years ago

          If you had actually read the results I mentioned you might note that it makes me an ATI Factboy 😉

            • Convert
            • 13 years ago

            Of course I read the results. There is a 10.7fps difference in fear, too.

            So by you, the fanboy, 3.2fps and 10.7fps is “crushed”. I guess in games where nvidia is +10.7fps ahead must mean ati is “obliterated”?

            You never actually claimed any real numbers anyways. When you are a fanboy you would rather make wide sweeping remarks like “crushed” instead of saying 3fps. Which isn’t “crushed” and what is actually in question here. So you *[

    • totoro
    • 13 years ago

    Is the 7800GT to 7900GT upgrade really worth it?
    I could sell my old (like 2 months) card and get really close to a 7900GT price.

      • kfc
      • 13 years ago

      If you can sell the 7800gt and only lose $50 or so i’d say go for it. Any more and its not worth it.

      The only reason to sell the 7800 now is that some sucker might actually buy it for a pretty penny. Only downside is the next gen cards will be a larger upgrade than a 7900gt. DX10 as well as a host of other features.

        • totoro
        • 13 years ago

        Thanks, that’s about what I figured.
        I think I’ll hang onto it for a while, then.

      • Forge
      • 13 years ago

      Just think of the 7900GT as a 7800GTX, it should make your thoughts simpler.

    • Hattig
    • 13 years ago

    Nice.

    I can afford the lower end ones there. If I had an up to date system, anyway.

    It would be nice to see some price/performance graphs though. There are so many options now my head spins.

    • Zenith
    • 13 years ago

    “g[

      • totoro
      • 13 years ago

      Chuck Norris cannot be rendered by any machine.
      The roundhouse kicks would pull it apart from within.

        • ludi
        • 13 years ago

        I hope Our Man Chuck is really enjoying his fifteen minutes, because by the time this is over, the mere mention of any fragment of his name will be enough to induce epillepsy.

          • eitje
          • 13 years ago

          it’s been more than 15 minutes. 😛
          in fact, i’m kind of sad that this meme has spread all the way to TR. 🙁

            • totoro
            • 13 years ago

            The worst was seeing it in Guild Wars.
            Funny, but in a why are they here? sort of way.

            • absinthexl
            • 13 years ago

            Cyril must be banging his head against the keyboard right now, as a SomethingAwful moderator…

    • kfc
    • 13 years ago

    3DMark06 – Vertex Shader – Simple – 1280×1024

    GeForce 6800 GS SLI is the fastest card. Beats 7900GTX SLI and 1900XTX CrossFire. How?

    §[< https://techreport.com/reviews/2006q1/geforce-7600-7900/3dm-vertex-simple-1280.gif<]§ Other than that good review. So far I like TR and Guru3D's take on it. [H] sucks ass.

    • Dposcorp
    • 13 years ago

    Excellent article.
    I look forward to picking up a second used 7800GT to try out sli for the first time.

    A used 7800GT at $200 or less would rock.

    • Jigar
    • 13 years ago

    i was expecting 79 series to be 32 pipline ………. anyways card looks good….. Lets see how it performs against x1900xtx….

      • Flying Fox
      • 13 years ago

      Read the article, and you’ll find out. 😉

    • A_Pickle
    • 13 years ago

    Are we going to see a Pentium D vs. Athlon 64 X2 vs. Core Duo benchmark anytime?

    -Pikl

      • Flying Fox
      • 13 years ago

      Is the G and C in GPU vs CPU that similar to the fanboys?

        • A_Pickle
        • 13 years ago

        No… I was asking if there would be a review. There is one, extremely limited review of the Core Duo online. I was just curious. Sorry I asked.

        -Pikl

          • Flying Fox
          • 13 years ago

          y[

            • A_Pickle
            • 13 years ago

            Sometimes Mr. Wasson passes by this area…. so… I asked. Should I have done it elsewhere?

            -Pikl

            • Flying Fox
            • 13 years ago

            Do it in the forums or PM?

            • A_Pickle
            • 13 years ago

            He has a forum user account?

            …yeah… I guess that… would make sense… being that he… mm. I feel dumb.

            -Pikl

      • Buub
      • 13 years ago

      Cornrow in yer FACE d00d!

        • Delta9
        • 13 years ago

        Cornrow. Too funny.

    • Alanzilla
    • 13 years ago

    My little brother did most (if not all) of the work on the memory subsystem for the 79xx series chips. Whee!

      • Forge
      • 13 years ago

      Long time no see! Tell your little brother thanks for the good work!

    • flip-mode
    • 13 years ago

    Yum,

    Very tempted to try EVGA’s trade up program and switch from the 6800gs to the 7600gt.

      • Forge
      • 13 years ago

      If you have the receipt and box, and it’s less than three months since purchase, go for it! eVGA gives you full retail price in credit.

        • flip-mode
        • 13 years ago

        That’s cool! I’m actually wondering if it will be worth it. Retail price was 179 for my card. Scott didn’t do ANY OVERCLOCKING TESTS, and I’m worried about that small cooler and lack of suplimental power connector limiting overclocking. So the question is, will the performance of an OCd 7600gt as compared to an OCd 6800gs be worth $20 and the hassle of the exchange (and the hassle itself is definitely a component of the calculation).

        Scott, was the omission of overclocking tests due to the bug I’m hearing about, a lack of time, or no interest in overclocking reference cards. Knowing me you’ve probably aready addressed this in the article or in another comment and I missed it.

    • seraph47
    • 13 years ago

    four new video cards in one review?

    argh! info overload 😛

    • SpotTheCat
    • 13 years ago

    it would be funny to have a computer using a horsepower, 746 watts IIRC.

      • Forge
      • 13 years ago

      746W is one horsepower? Cool.

    • eitje
    • 13 years ago

    q[

      • Damage
      • 13 years ago

      Baby seals should never go clubbing. They won’t be innocent for long.

    • Forge
    • 13 years ago

    As soon as prices on eVGA’s 690/1760 ‘Superclocked’ 7900GTX gets a price, I’ll either be swapping up to a pair of those at a minor fee, or swapping out for two 7900GTXes and a minor *refund*.

    Life is good.

    • PRIME1
    • 13 years ago
    • deathBOB
    • 13 years ago

    Excellent. I love how well matched ATI and Nvidia are, and how prices are ridiculously low even at launch. Competition FTW

    • MorgZ
    • 13 years ago

    What a cracker jack of a review, great reading.

    The nVidia 7900 series looks like they have a very good product in the 7900GT which is sure to be a product which is going to make the gaming market who are looking to upgrade very interested. The 7900 series seems to address some of the problems nVidia have had in the past with keeping up with ATi cards when u add AA / AF into the mix at high resolutions. ATi probably just edge nVidia out on the performance crown but most buyers are becoming privvy to the fact that its the mid range card performance which is more important.

    Ultimately the winner of this round will be which company can give best value for money and it looks like nVidia might be able to drop their prices more than ATi.

    I will never buy a quad sli rig but i cant wait to see the benchmarks for it nonetheless!

    • madlemming
    • 13 years ago

    Quad SLI, wow.
    Needing a kilowatt of power is crazy, and booking at the specs of PCP&C’s kilowatt PSU; it looks like theres going to be a wall pretty soon. The draw for that PSU at full load is 15 amps, so unless we start plugging computers into 20 amp outlets, It’s going to be hard to get more juice.

      • Lazier_Said
      • 13 years ago

      Seeing TR’s test system pull 267 watts wall current with one 7900GTX and 375 watts with two, one would expect a system with four cards to draw in the order of 600 watts, or a smidge over 5 amps.

      At 80 to 85% efficiency the actual DC consumption would be around 500 watts.

      The necessity of either a 1000 watt power supply or a new standard in home outlet wiring seem dubious.

    • Deathright
    • 13 years ago

    Does anyone really need a Quad-Sli?

    • Chrispy_
    • 13 years ago

    I came here as soon as I saw 7900GT’s for sale at my favourite etailer.

    It’s a nice relief to see them here in the UK, in stock before I realise they’re released. It makes a change from the spate of recent paper-launching.

    • poslo
    • 13 years ago

    great review

    • Shintai
    • 13 years ago

    I just love the fact, that the insane price rises on GFX cards have stopped.

    • Vrock
    • 13 years ago

    Oooh, I want a 7900GT. But I don’t *need* one, as this overclocked 6800NU is doing just fine. Darn practicality…argh!!!

    • Ruiner
    • 13 years ago

    And this brings the number of video card types to 100….

    wtf?

      • Vrock
      • 13 years ago

      Why complain? Choice is good for the consumer. So long as you educate yourself, that is.

        • ludi
        • 13 years ago

        For performance users, video cards are NOT toothpaste, where any brand/type will do roughly the same thing with only minor variations in Degree of Pearly Polish, and people who have a very specific need will be given a specific prescription option.

        Having this many video card variations on the market is just bloomin’ confusing, and corresponding education becomes every more laborsome. I gave up following the videocard market closely not too long after the FX-series and Radeon 8xxx/9xxx boards began spawning Foxworthy-esque family trees, because the effort/reward ratio for knowing that information got too large. (I also don’t like the thought of video cards procreating because I’m fairly certain both Ruby and the Nvidia Ying-Yang Pixies would crack their own pelvic bones like the San Andreas if they attempted to carry into the third trimester.)

    • Krogoth
    • 13 years ago

    7900GT = Most factory clocked 7800GTXs for the masses! 🙂

    AS5+7900GT= T3h mad overclocks that can probably yield close to 7900GTX performance. Althought, I suspect that the memory will be the bottleneck.

      • Lord.Blue
      • 13 years ago

      You’d think so, until you saw this:
      XFX GeForce 7900GT 256MB DDR3 XXX Edition
      Memory Clock:
      1.65 GHz
      Clock rate:
      560MHz
      Memory Bus:
      256 Bit
      §[< http://www.newegg.com<]§ price: $359.99

        • mesyn191
        • 13 years ago

        OMG that is a awesome deal….

        Wish I had some money to play with rather than pay taxes. ;(

          • Lord.Blue
          • 13 years ago

          Apparently you’re not the only one who thought it was an awesome deal, they’re sold out.

    • Usacomp2k3
    • 13 years ago

    😮
