The GeForce 8800 in SLI

WHEN WE REVIEWED the GeForce 8800, I said we’d test the GPU in an SLI configuration “as soon as we can.” I will admit that I’ve dabbled in CPUs a little too much, and our look at GeForce 8800 SLI has been delayed. However, I also wondered in that same review: “who needs two of these things right now?” That’s a pretty good question given the GeForce 8800 GTX’s astounding pixel-slinging performance, and something of a de-motivator for anyone considering GeForce 8800 SLI.

But now I have seen the light. It’s wider than it is tall, modulated by a series of filters, and about 30″ from corner to corner. I’m talking, of course, about Dell’s 3007WFP LCD. We need not invent a reason for GeForce 8800 SLI since display makers have already invented a darn fine one. A four-megapixel monster like this one cries out for the fastest possible graphics subsystem to drive it, and the GeForce 8800 in SLI seems like a perfect match. We’ve had a bundle of fun testing the two together and exploring the world of uber-high-res widescreen gaming.

We’ve also dug a little deeper into GeForce 8800 antialiasing, to see how it compares to single- and multi-GPU antialiasing modes. Even the vaunted quad SLI makes an appearance to take on dual GeForce 8800 GTXs for extreme bragging rights supremacy. The power meter in Damage Labs has been spinning like a Hillary Clinton campaign staffer in the wake of the Obama announcement. Read on to see how we put all of that power to use.

G80 SLI: Powerful, yet unrefined
The GeForce 8800 is still a very new product, so running a pair of them in SLI is a funny mix of extreme, err, extremeness and as-yet-untapped potential. First, let’s talk about the extremeness. We already know that a single GeForce 8800 GTX graphics card performs more or less on par with a pair of its fastest competitor, the Radeon X1950 XTX, running in tandem. With 680 million transistors, the G80 graphics processor has a formidable appetite for power and cooling. In GTX form, the GeForce 8800 has 128 stream processors running at 1.35GHz, a 384-bit path to its 768MB of memory, is 10.5″ long, and has two six-pin PCIe power connectors on a single card, like so:


The GeForce 8800 GTX’s twin power plugs

Running two of these cards together in a single system will require a grand total of four PCIe auxiliary power connectors, more than even most high-end power supplies can handle. We were able to get our test systems going in GTX SLI using our OCZ GameXStream 700W PSUs and a pair of four-pin Molex to PCIe adapters, but doing so ate up a connection on every Molex-equipped power lead on the PSU. We even had to share one lead with our DVD drive, a less-than-optimal solution—and this was on a test system with only one hard drive and no extra accessories. Those who are serious about building a system with dual GeForce 8800 GTX cards would do better to go with something like this one-kilowatt beast from BFG Tech.


Four PCIe connectors sprout from BFG’s 1kW power supply

The BFG Tech 1kW PSU comes with four six-pin PCIe leads out of the box, along with enough rated capacity to power five or six 100W light bulbs in addition to your PC. Unfortunately, like most 1kW PSUs, this BFG Tech one doesn’t have a larger, quieter 120mm cooling fan.

The GTX’s little brother, the GeForce 8800 GTS, doesn’t require such extreme measures, since it comes with only one PCIe plug per card. These cards are no longer than a Radeon X1950 XTX or a GeForce 7900 GTX, either. Still, with 96 stream processors clocked at 1.2GHz and 640MB of memory behind a 320-bit interface, the GTS isn’t exactly warm milk—more like Diet Mountain Dew: toned down, but still extreme.


A pair of BFG Tech GeForce 8800 GTS cards

Now, let’s talk about that untapped potential. Current GeForce 8800 drivers support dual-card SLI configurations, but no more than that. Yet every GeForce 8800 card comes with a pair of SLI connectors on board.


Dual connectors promise big things in the future

A dual-card GeForce 8800 SLI rig will only use one connector per card. In theory, the additional connectors could be used in offset fashion to create daisy-chained configurations of three, four, or more cards, once proper driver support is available. The fact that the G80 uses an external TMDS and RAMDAC chip to drive displays even suggests the possibility of cards a la the GeForce 7950 GX2 with dual GPUs and one display chip or even of “headless” GPU-only cards expressly intended for use with SLI. (Of course, daisy-chained cards would in all likelihood have to be equipped with single-slot coolers in order to fit into any standard-sized motherboard.) I expect the cards and drivers will materialize over time, but they’re probably not top priorities for the green team at present. They are, after all, winning the performance sweeps quite handily, and their G80 drivers remain a work in progress.

A more glaring omission is something I simply expected to see with GeForce 8800 SLI out of the box: SLI antialiasing. Both Nvidia’s SLI and ATI’s CrossFire can use multiple GPUs to achieve higher degrees of edge antialiasing than are possible with a single GPU. These antialiasing modes function as load-balancing methods that sacrifice raw performance for improved image quality. On the GeForce 7 series, SLI AA can deliver up to 16 samples with two GPUs and up to 32 samples in quad SLI. Similarly, CrossFire’s dual-GPU Super AA modes reach up to 12 samples. Surely, I thought, with G80’s nifty 16X coverage-sampled antialiasing, we’ll see SLI CSAA modes up to 32X in dual-GPU configurations. Turns out that’s not the case, at least not yet. Nvidia’s drivers haven’t yet enabled any SLI AA modes on the GeForce 8800. Support for SLI AA on the 8800 is in the works, but it’s not here yet, and we don’t have any ETA for it at present.

Fortunately, that’s not a major problem, given the G80 GPU’s excellent native AA support. We’ll compare SLI AA and SuperAA to GeForce 8800 antialiasing in the following pages, and you’ll see what I mean.

One display to rule them all

1600×1200 is a subset of the 3007WFP’s max resolution

Ok, so I resisted jumping on the mega-display bandwagon when ATI and Nvidia first started talking up “Extreme HD gaming” and the like. I first saw the Dell UltraSharp 3007WFP at CES last year, and I thought, “Hey, that’s gorgeous, but I’d rather have a used car for that kind of money,” and went on my merry way. I didn’t really consider the likes of this monitor or Apple’s big Cinema Displays as a live option for most folks, so grabbing one for video card testing seemed like a stretch. In other words, I’m really cheap.

I had other reasons to resist, too, including the fact that LCDs have traditionally been slower and offered poorer color and contrast than the best CRTs. Even more importantly, multiscan CRTs are capable of displaying multiple resolutions more or less optimally, while LCDs have a single, fixed native resolution.

However, LCD prices continue to plummet, and I recently found myself staring at a price of just under $1200 for a 3007WFP. That isn’t exactly cheap, but then we’re talking about reviewing a graphics subsystem that costs roughly the same amount. To mate it with a lesser display would be, well, unbalanced.

And LCDs have made tremendous progress against CRTs lately. Not only have they made huge strides in color clarity and reached more-or-less acceptable response times, but they have also simply eclipsed CRTs in terms of resolution. In fact, every single common PC display resolution is a subset of the 3007WFP’s native 2560×1600 grid, so the 3007WFP can show lower resolutions in pixel-perfect 1:1 clarity as a box in the middle of the screen, if scaling is disabled.

That realization pretty much quelled any graphics-geek or display-purist objections I had to using an LCD for graphics testing, and I was at last dragged into the 21st century with the rest of the world—$1200 poorer, but with one really sweet monitor on the GPU test bench.

Make no mistake: the 3007WFP is worth every penny. It’s that good. Once you see it in person, sitting on your desk, it will force you to reevaluate your opinions on a host of issues, including the need for multiple monitors, the aspect ratio debate, the prospects for desktop computers versus laptops (’tis bright!), and whether your eight-megapixel digital photos are really sharp enough.

Moreover, the 3007WFP (or something like it) really is necessary to take advantage of two GeForce 8800 cards in SLI in the vast majority of current games. Prior to the 3007WFP, our max resolution for testing was 2048×1536 on a 21″ CRT. In our initial GeForce 8800 review, we found that the GeForce 8800 GTX could run the gorgeous and graphically intensive The Elder Scrolls IV: Oblivion at this resolution with the game’s Ultra High Quality presets, 4X AA with transparency supersampling, and high quality 16X anisotropic filtering at very acceptable frame rates. The GTX hit an average of 54 FPS and a low of 41 FPS, while the GTS averaged 37 FPS and bottomed out at 27 FPS. The GTX even achieved a very playable 45 FPS with 16X CSAA added to the mix. Who needs a second GPU when a single graphics card can crank out visuals like that at playable frame rates? Going for 8800 SLI doesn’t make sense in the context of a more conventional display, at least not with today’s game titles.

Width   Height  Mpixels
640     480     0.3
720     480     0.3
1024    768     0.8
1280    720     0.9
1280    960     1.2
1280    1024    1.3
1400    1050    1.5
1680    1050    1.8
1600    1200    1.9
1920    1080    2.1
1920    1200    2.3
1920    1440    2.8
2048    1536    3.1
2560    1600    4.1

The 3007WFP’s 2560×1600 resolution, though, raises the ante substantially. The table above shows a number of common PC and TV display resolutions along with their pixel counts. As you can see, four megapixels is a class unto itself—well above the three megapixels of our previous max, 2048×1536, or the more common two-megapixel modes like 1600×1200 or 1920×1080. The screen’s 16:10 (or 8:5, if you’re picky) aspect ratio also mirrors the shape of things to come, with the growing popularity of wide-aspect displays in everything from laptops to desktops to HDTVs. In fact, our recent poll suggested 45% of our readers already use a wide-aspect 16:9 or 16:10 primary display with their PCs. Given that fact, I had hoped to conduct testing for this article in a trio of 16:10 resolutions: 1680×1050, 1920×1200, and 2560×1600. That would have given us resolutions of 1.8, 2.3, and 4.1 megapixels, all with the same aspect ratio. I have my doubts about whether the shape of the viewport will impact performance in any noticeable way, but I wanted to give it a shot. However, I ran into problems with both games and video drivers supporting wide-aspect resolutions consistently, so I had to revert to another option, testing at 1600×1200, 2048×1536, and 2560×1600. That gives us a tidy look at performance scaling at two, three, and four megapixels.
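
If you’d like to sanity-check the megapixel math or extend the table with your own display modes, here’s a minimal Python sketch (the mode list is simply the one from the table above) that computes pixel counts and flags which modes fit 1:1 inside the 3007WFP’s 2560×1600 grid:

# Megapixel counts for common display modes, plus whether each mode fits
# pixel-for-pixel inside the Dell 3007WFP's native 2560x1600 grid.
NATIVE_W, NATIVE_H = 2560, 1600

MODES = [
    (640, 480), (720, 480), (1024, 768), (1280, 720), (1280, 960),
    (1280, 1024), (1400, 1050), (1680, 1050), (1600, 1200),
    (1920, 1080), (1920, 1200), (1920, 1440), (2048, 1536), (2560, 1600),
]

for w, h in MODES:
    mpixels = w * h / 1e6                      # megapixels
    fits = w <= NATIVE_W and h <= NATIVE_H     # displayable 1:1, centered?
    print(f"{w}x{h}: {mpixels:.1f} Mpixels, fits 1:1 on the 3007WFP: {fits}")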

Insects crawl out from under the multi-GPU woodwork
While we’re on the subject of expensive things that are nice to have, we should take a moment to note some of the problems you’re buying into if you go this route. I can’t tell you how many times Nvidia and ATI have extolled the virtues of running “extreme HD gaming” involving high resolutions, wide aspect ratios, and multiple GPUs in the past couple of years. Too many to count. Yet when we took the plunge and went head-on into this world, we ran into unexpectedly frequent problems.

Of course, you’re probably already aware that multi-GPU support in games is spotty, because it typically requires a profile in the video card driver or special allowances in the game code itself. On top of that, not all games scale well with multiple GPUs, because performance scaling depends on whether the game is compatible with one of the various possible load-balancing methods. Nvidia and ATI have worked to encourage game developers to make their applications compatible with the best methods, such as alternate-frame rendering, but top new games still may not work well with SLI or CrossFire. This drawback is largely accepted as a given for multi-GPU configs today.
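
For readers unfamiliar with those load-balancing schemes, here is a deliberately simplified Python sketch of the two big ones, alternate-frame rendering (AFR) and split-frame rendering (SFR). None of this is real driver code, and the class and function names are made up for illustration; the point is only that AFR hands whole frames to alternating GPUs, which scales well when successive frames are independent, while SFR splits each frame across the GPUs and duplicates some of the work.

# Toy illustration of multi-GPU load balancing (not real driver code).

class Gpu:
    def __init__(self, name):
        self.name = name

    def render(self, work):
        print(f"{self.name} renders {work}")

def render_afr(frame_ids, gpus):
    # Alternate-frame rendering: whole frames round-robin across the GPUs.
    # Scales well only when successive frames don't depend on each other.
    for i, frame in enumerate(frame_ids):
        gpus[i % len(gpus)].render(f"frame {frame}")

def render_sfr(frame_ids, gpus):
    # Split-frame rendering: each frame is divided into bands, one per GPU.
    # Geometry work gets duplicated on every GPU, so scaling is weaker.
    for frame in frame_ids:
        for band, gpu in enumerate(gpus):
            gpu.render(f"frame {frame}, band {band}")

gpus = [Gpu("GPU 0"), Gpu("GPU 1")]
render_afr(range(4), gpus)
render_sfr(range(2), gpus)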

However, we ran into a number of other problems with wide-aspect multi-GPU gaming during our testing with SLI and CrossFire, including the following:

  • Nvidia’s control panel has settings to disable the scaling up of lower resolutions to fit the 3007WFP’s full res, but this option doesn’t “stick” on the latest GeForce 8800 drivers. Getting a native 1:1 display at lower resolutions on this monitor currently isn’t possible on the GeForce 8800 as a result. Nvidia confirmed for us that this is an unresolved bug. They are planning a fix, but it’s not available yet. This one isn’t an issue on the GeForce 7 or with the corresponding display settings in ATI’s Catalyst Control Center for the Radeon X1950.

  • Nvidia’s G80 drivers also offer no option for 1680×1050 display resolutions, either for the Windows desktop or for 3D games—another confirmed bug. I’m unsure whether this problem affects monitors with native 1680×1050 resolutions like the Dell UltraSharp 2007WFP, but it’s a jarring omission, nonetheless.

  • When picking various antialiasing modes via the Nvidia control panel on the GeForce 8800 with SLI, we found that Quake 4 crashed repeatedly. We had to reboot between mode changes in order to avoid crashes.

  • With Nvidia’s latest official GeForce 7-series drivers, 93.71, we found that quad SLI’s best feature, the SLI 8X AA mode, did not work. The system appeared to be doing 2X antialiasing instead. We had to revert to the ForceWare 91.47 drivers to test quad SLI 8X AA.

  • At higher resolutions with 4X AA and 16X aniso filtering enabled, Radeon X1950 CrossFire doesn’t work properly in Oblivion. The screen appears washed out, and antialiasing is not applied. ATI confirmed this is a bug in the Catalyst 7.1 drivers. We tried dropping back to Catalyst 6.12 and 6.11, with similar results, and ATI then confirmed that this bug has been around for a while. We had to go back to Cat 6.10 in order to test in Oblivion.

  • This isn’t entirely the fault of the graphics chip makers, but in-game support for wide-aspect resolutions is still spotty. For instance, the PC version of Rainbow Six: Vegas lacks built-in wide-aspect resolution options, despite the fact that it’s a port from the HDTV-enabled Xbox 360. That may be one symptom of a larger problem, that R6: Vegas is a half-assed PC port, but the game comes with an ATI logo on the box. Similarly, Battlefield 2142 has no widescreen support, and ships with an Nvidia “The way it’s meant to be played” logo on its box. Apparently, the extreme HD gaming hype hasn’t yet seeped into these firms’ developer relations programs.

  • It’s still not possible to drive multiple monitors with SLI or CrossFire enabled. As I understand it, this is a software limitation.

  • Finally, Nvidia’s first official Windows Vista driver release will not include SLI support.

I also think Nvidia has taken a pronounced step backward with its new control panel application for a number of reasons, including the fact that they’ve buried the option to turn on SLI’s load-balancing indicators, but I will step down off of my soapbox now. The point remains that support for widescreen gaming at very high resolutions with multiple GPUs doesn’t appear to be a high priority for either Nvidia or ATI. Perhaps that’s understandable given the impending debut of Windows Vista and DirectX 10, which will require new drivers for all of their GPUs. Still, those who venture into this territory can expect to encounter problems. These issues are probably more plentiful with the GeForce 8800 because it’s still quite green. If you’re planning on laying out over $1100 on a pair of graphics cards, you might be expecting a rock-solid user experience. Based on what I’ve seen, you should expect some growing pains instead.

Now that I’ve wrecked any desire you had to build an SLI gaming rig, let’s repair that a little by looking at some performance numbers.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor Core 2 Extreme X6800 2.93GHz Core 2 Extreme X6800 2.93GHz
System bus 1066MHz (266MHz quad-pumped) 1066MHz (266MHz quad-pumped)
Motherboard Asus P5N32-SLI SE Deluxe Asus P5W DH Deluxe
BIOS revision 0305 0801
North bridge nForce4 SLI X16 Intel Edition 975X MCH
South bridge nForce4 MCP ICH7R
Chipset drivers ForceWare 6.86 INF Update 7.2.2.1007
Intel Matrix Storage Manager 5.5.0.1035
Memory size 2GB (2 DIMMs) 2GB (2 DIMMs)
Memory type Corsair TWIN2X2048-8500C5
DDR2 SDRAM
at 800MHz
Corsair TWIN2X2048-8500C5
DDR2 SDRAM
at 800MHz
CAS latency (CL) 4 4
RAS to CAS delay (tRCD) 4 4
RAS precharge (tRP) 4 4
Cycle time (tRAS) 15 15
Hard drive Maxtor DiamondMax 10 250GB SATA 150 Maxtor DiamondMax 10 250GB SATA 150
Audio Integrated nForce4/ALC850
with Realtek 5.10.0.6200 drivers
Integrated ICH7R/ALC882M
with Realtek 5.10.00.5345 drivers
Graphics GeForce 7900 GTX 512MB PCIe
with ForceWare 93.71 drivers
Radeon X1950 XTX 512MB PCIe + Radeon X1950 CrossFire
with Catalyst 7.1 drivers
Dual GeForce 7900 GTX 512MB PCIe
with ForceWare 93.71 drivers
Radeon X1950 XTX 512MB PCIe
with Catalyst 7.1 drivers
Dual GeForce 7950 GX2 1GB PCIe
with ForceWare 93.71 drivers
GeForce 8800 GTS 640MB PCIe
with ForceWare 97.92 drivers
Dual GeForce 8800 GTS 640MB PCIe
with ForceWare 97.92 drivers
GeForce 8800 GTX 768MB PCIe
with ForceWare 97.92 drivers
Dual GeForce 8800 GTX 768MB PCIe
with ForceWare 97.92 drivers
OS Windows XP Professional (32-bit)
OS updates Service Pack 2, DirectX 9.0c update (December 2006)

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to those of no-name DIMMs.

Our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1280×960 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Quantifying the insanity
Shader arithmetic is becoming ever more important as games take better advantage of GPU programmability, but basic pixel fill rates and texturing capabilities remain an important component of overall performance. Delivered performance in these categories is also very much tied to memory bandwidth, so we’ll take a look at that, as well. Here are the theoretical peak numbers for single graphics cards; optimally, these numbers would double in SLI or CrossFire, with perfect scaling.

                    Core clock  Pixels/  Peak fill rate  Textures/  Peak fill rate  Effective mem.  Memory bus    Peak memory
                    (MHz)       clock    (Mpixels/s)     clock      (Mtexels/s)     clock (MHz)     width (bits)  bandwidth (GB/s)
GeForce 7900 GTX    650         16       10400           24         15600           1600            256           51.2
Radeon X1950 XTX    650         16       10400           16         10400           2000            256           64.0
GeForce 8800 GTS    500         20       10000           24         12000           1600            320           64.0
GeForce 7950 GX2    2 * 500     32       16000           48         24000           1200            2 * 256       76.8
GeForce 8800 GTX    575         24       13800           32         18400           1800            384           86.4

Note that the GeForce 7950 GX2 as listed already includes two G71 GPUs with their associated memory subsystems. You’ve got to then double those numbers for quad SLI configs. Nevertheless, the GeForce 8800 GTX has more memory bandwidth than the GX2, so dual GTXs in SLI will have more available memory bandwidth than a quad SLI rig. Yikes.
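
The arithmetic behind the table is straightforward: fill rate is the core clock times the number of pixels (or textures) per clock, and memory bandwidth is the effective memory clock times the bus width in bytes. Here’s a quick Python sketch using the single-GPU figures from the table; double everything for an SLI pair with perfect scaling, and remember the GX2 already counts two GPUs per card.

# Theoretical peaks, reproduced from the specs in the table above.
cards = {
    # name:              (core MHz, pixels/clk, textures/clk, eff. mem MHz, bus bits)
    "GeForce 7900 GTX": (650, 16, 24, 1600, 256),
    "Radeon X1950 XTX": (650, 16, 16, 2000, 256),
    "GeForce 8800 GTS": (500, 20, 24, 1600, 320),
    "GeForce 8800 GTX": (575, 24, 32, 1800, 384),
}

for name, (core, ppc, tpc, mem, bus) in cards.items():
    pixel_fill = core * ppc            # Mpixels/s
    texel_fill = core * tpc            # Mtexels/s
    bandwidth = mem * bus / 8 / 1000   # MHz * bits -> GB/s
    print(f"{name}: {pixel_fill} Mpixels/s, {texel_fill} Mtexels/s, {bandwidth:.1f} GB/s")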

The GeForce 8800 GTS, meanwhile, doesn’t compare favorably to the GeForce 7900 GTX in terms of pixel and texel fill rates, but you might suspect that won’t be an issue when it comes time to run the latest games. Let’s see how well these single and multi-GPU configs deliver on their theoretical promise in a synthetic benchmark.

Both GeForce 8800 SLI systems come close to their theoretical peaks for multitextured fill rate here, and those are very high indeed. They’re not the highest of the bunch, though. The quad SLI rig is the fastest, and the GeForce 8800 GTS SLI setup trails the GeForce 7900 GTX in SLI. Trouble is, we’re about to go prove that doesn’t really matter.

Quake 4
In order to make sure we pushed the video cards as hard as possible, we enabled Quake 4’s multiprocessor support before testing. We used the game’s “playnettimedemo” to play back our gaming session with the game engine’s physics and game logic active.

Now we come to the boring part of our commentary, when there’s little left to do but stop and wonder. Ooh. Ahh!

Nearly all of the multi-GPU systems tested deliver playable frame rates at 2560×1600, and the single 8800 GTX’s 52.5 FPS is plenty fast, as well. For what it’s worth, the 8800 GTX in SLI clearly outpaces the quad SLI system based on dual 7950 GX2s, though both are more than fast enough. Typically, quad SLI is held back by the three-buffer limit in DirectX 9, but that’s not a problem in the OpenGL-based Quake 4. Even so, the 8800 GTXs in SLI are faster.

F.E.A.R.
We’re using F.E.A.R.’s built-in “test settings” benchmark for a quick, repeatable comparison.

F.E.A.R. is a lot to handle at 2560×1600, and the GeForce 8800 GTX in SLI handles it best, followed by the quad SLI system. If you were running a GeForce 8800 GTS SLI setup, you’d probably want to drop back to 2X AA or to a lower resolution or lower quality setting in this game. CrossFire configs have long had trouble with F.E.A.R., and the problems continue here. The Radeon X1950 XTX CrossFire system turns in a decent average frame rate, but its minimum frame rate is quite low—and exactly the same, at 14 FPS, as a single Radeon X1950 XTX.

Half-Life 2: Episode One
The Source game engine uses an integer data format for its high dynamic range rendering, which allows all of the cards here to combine HDR rendering with 4X antialiasing.

Both GeForce 8800 cards can run Half-Life 2: Episode One just fine without the assistance of a second GPU. The GTS averages over 60 frames per second with HDR lighting, 4X AA, and 16X aniso. SLI does scale up from there, but I’m not sure Alyx at 100Hz is any better than Alyx at 60Hz.

The Elder Scrolls IV: Oblivion
We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered. We set Oblivion’s graphical settings to “Ultra High Quality.” The screen resolution was set to 2560×1600, with HDR lighting enabled. 16X anisotropic filtering and 4X AA were forced on via the cards’ driver control panels. Since the G71 GPU can’t do 16-bit floating-point texture filtering and blending in combination with antialiasing, it had to sit out these tests.
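
As a concrete illustration of how the five FRAPS runs per config get condensed, here’s a tiny Python sketch. The frame-rate numbers are invented, not taken from our logs; the idea is simply that the averages are averaged, while the per-run lows are reduced to their median so that a single outlier doesn’t skew the reported low.

from statistics import mean, median

# Hypothetical per-run FRAPS results for one config: (average FPS, lowest FPS).
runs = [(54.2, 41), (52.8, 38), (55.1, 40), (53.6, 22), (54.4, 39)]

avg_fps = mean(avg for avg, low in runs)
low_fps = median(low for avg, low in runs)   # the median blunts the 22 FPS outlier

print(f"Reported average: {avg_fps:.1f} FPS, reported low: {low_fps} FPS")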

Again, the GeForce 8800 GTX doesn’t look to need any additional help in order to run a recent game, even one as good-looking as Oblivion, at 2560×1600 resolution. In order to further stress these configs, I turned up the quality levels on everything. All quality sliders in the game were completely maxed out. In the cards’ driver control panels, I enabled 8X antialiasing. For the GeForce 8800s, that means 8X coverage sampled AA, and for the Radeon X1950 XTX CrossFire, it means 8X Super AA. I also turned on transparency supersampling on the GeForce 8800s and the corresponding high-quality adaptive AA mode on the Radeons, and I set texture filtering to its highest available quality setting. The cards then performed like so:

Push hard enough, and even a GeForce 8800 GTX will start to show signs of stress. With these settings, SLI makes the difference between definitely smooth, playable frame rates and borderline ones. Honestly, though, I think the 8800 GTX feels quite playable in Oblivion at these settings, and if it’s a problem, turning down transparency supersampling gets frame rates up to more than acceptable levels. To give you an idea of the sort of visuals we’re talking about, here’s a screenshot of an outdoor area in Oblivion loaded with vegetation. I used all of the settings above, except with 16X CSAA to make edges look a little smoother. Dual 8800 GTXs in SLI handle this scene at over 30 FPS.


Oblivion at 2560×1600 with HDR lighting, 16X aniso, 16X CSAA, transparency supersampling, and HQ filtering.
Click for a full-sized version.

Rainbow Six: Vegas
This game is a new addition to our test suite, notable because it’s the first game we’ve tested based on Unreal Engine 3. This game doesn’t support 2560×1600 resolution out of the box, so we used the files available here to get it working at that res. Also, the game doesn’t have an SLI profile yet, but at Nvidia’s suggestion, I renamed the executable to “UTGame.exe” in order to get it working with SLI.

As with Oblivion, we tested with FRAPS. This time, I played through a 60-second portion of the “Border Town” map in the game’s Terrorist Hunt mode, with all of the game’s quality options cranked.

Note to self: when deciding test resolutions for lots of different configs, do not base the settings on what’s playable with the GeForce 8800 GTX in SLI. Man, playing this game on any of the other configs was painful, and it was practically impossible on the quad SLI system. Doh. Obviously, the GeForce 8800 GTX SLI system has everything else outclassed, so it’s able to run this game smoothly at 2560×1600 when nothing else can.

Ghost Recon Advanced Warfighter
We tested GRAW with FRAPS, as well. We cranked up all of the quality settings for this game, with the exception of antialiasing, since the game engine doesn’t take to AA very well.

Here, again, SLI raises frame rates just enough to assure fluid playability.

3DMark06

This will be a quick one, since I don’t have much to say about these results. They pretty much confirm what we’ve seen elsewhere.

The GeForce 8800 SLI systems seem to hit a limit in the simple vertex shader test at 307 FPS. Even though the individual 8800 GTS and GTX cards are faster than a single Radeon X1950 XTX, the CrossFire system comes out on top overall. In the complex vertex test, the 8800s in SLI reign supreme.

GeForce 8800 versus SLI AA and Super AA: the patterns
Antialiasing can get pretty complicated these days, but we can still sort through what the various GPUs are doing, by and large. The table below shows AA sampling patterns for the various modes used by each GPU or multi-GPU scheme. Samples for antialiasing routines are captured at a sub-pixel level, inside the area covered by a pixel. Basically, these patterns show where inside the pixel each of the samples is taken.

The multisampling AA routines in current GPUs rely on several different sample types, as well. The first of those is the sub-pixel color and depth (Z) information conferred by the sample point’s location in a polygon. The second is the polygon coverage information itself, and the third is the color information conferred by any textures or shader programs. Multisampling saves on bandwidth and fill rate by grabbing a smaller number of texture/shader samples per pixel than it does color/Z and coverage samples. The GPU then sorts out how to blend the texture and color information for the various sub-pixel fragments using the coverage information. For most of the GPUs involved here, the samples in the table below show only two sample types: texture/shader samples, which are green, and color/Z/coverage samples, which are pink.

The GeForce 8800’s coverage sampled AA is a further optimization of multisampling that discards a portion of the color/Z information that the GPU collects but preserves the information about which polygons cover the sample points. For instance, the 8X CSAA mode stores one texture/shader sample, four color/Z samples, and eight coverage samples. Doing so generally allows for a sufficient amount of color information, along with more accurate blending thanks to the additional coverage info. If you’re unfamiliar with CSAA, I suggest reading the section on it in my GeForce 8800 review. CSAA generally offers a very nice combination of performance and antialiasing quality for edge pixels.
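
To put rough numbers on why coverage sampling is cheap, here’s a back-of-the-envelope Python sketch of per-pixel framebuffer cost. The 8X CSAA breakdown (one texture/shader sample, four color/Z samples, eight coverage samples) comes from the description above; the sample counts I’ve assumed for the other modes, and the four bytes of color plus four bytes of Z per stored sample with roughly a bit per coverage sample, are my own simplifications rather than Nvidia’s published figures, and real hardware compresses this data besides.

# Rough, uncompressed per-pixel storage for a few G80 AA modes (illustrative only).
modes = {
    # mode:      (stored color/Z samples, coverage samples)  -- assumed counts
    "4X MSAA":   (4, 4),
    "8X CSAA":   (4, 8),
    "8xQ":       (8, 8),
    "16X CSAA":  (4, 16),
    "16xQ CSAA": (8, 16),
}

BYTES_PER_COLOR_Z = 4 + 4   # assume 32-bit color plus 32-bit Z per stored sample

for mode, (color_z, coverage) in modes.items():
    bytes_per_pixel = color_z * BYTES_PER_COLOR_Z + coverage / 8
    print(f"{mode}: ~{bytes_per_pixel:.0f} bytes per pixel before compression")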

The trick with the GeForce 8800 is figuring out where the coverage samples are located, since our usual FSAA Viewer tool doesn’t show them. Fortunately, we’ve managed to snag a tool that shows these coverage sample points, and I’ve overlaid them with our usual FSAA Viewer results in the table below for the GeForce 8800’s 8X, 8xQ, 16X, and 16xQ antialiasing modes. The smaller red dots in those patterns are the coverage sample locations.

I’ve also included results for the various multi-GPU antialiasing modes below. Many of them involve higher numbers of texture/shader samples, because they are the result of blending pixels from two different GPUs produced by their regular AA methods. Since the GeForce 8800 doesn’t yet have SLI AA support, it has to rely on its own native single-GPU AA modes.

[AA sample pattern diagrams: one column each for the GeForce 7900 GTX, GeForce 7900 GTX SLI, GeForce 7950 GX2 SLI, GeForce 8800 GTX, Radeon X1950 XTX, and Radeon X1950 XTX CrossFire, with rows for the 2X, 4X, 6X, 8X, 8xS/8xQ/8X/10X, 12X, 14X, 16X, 16xQ, and 32X modes. Green dots mark texture/shader samples, pink dots mark color/Z/coverage samples, and the smaller red dots in the GeForce 8800 patterns mark its coverage sample locations.]

So we have lots of dots. What do they mean? First and foremost, I’d say we’ve learned that above 4X, the names given to the various modes—8X, 10X, and the like—don’t always mean the same things in terms of sample types and sizes. Comparing between them is difficult.

The new information here for us is in the CSAA coverage sample patterns from the GeForce 8800, which are rather interesting. We can see immediately that the G80’s 8xQ mode is indeed simply a traditional 8X multisampling mode, where coverage information corresponds exactly to the location of color/Z samples.

The sample pattern for the CSAA 8X mode is notable, too. Four of the coverage samples correspond with the four color/Z samples, while the additional coverage samples are all located close to the pixel center, grouped around the texture/shader sample point. Nvidia seems to be betting that additional coverage information from the center of the pixel, near the texture/shader sample point, will produce the best results.

The 16X and 16xQ CSAA modes take a different approach, with no exact correspondence between color/Z sample points and coverage sample points. These two coverage sample patterns are both largely quasi-random, but they’re also different from one another. Nvidia claims these patterns were chosen intentionally to get the best results from the number of color/Z samples in each mode. In the 16X CSAA mode, four of the coverage samples sit just outside the color/Z sample points. In the 16xQ mode, that’s not the case, but one of the coverage samples does appear to correspond with the texture/shader sample location.

Obviously, the older GPUs store more texture and color/Z samples than the GeForce 8800, especially in their multi-GPU AA modes. This fact should make them relatively slower—as should the fact that the multi-GPU AA modes aren’t particularly efficient methods of load balancing. The question is: how does the GeForce 8800 stack up in terms of image quality?


GeForce 8800 versus SLI AA and Super AA: image quality
Fortunately, the image quality question is fairly easy to answer by looking at some screenshots. The images below were taken from Half-Life 2 and blown up to exactly 4X their original size to make the individual pixels easier to see. I’ve chosen to focus on this little section of the screen because it shows us high-contrast edges at three different angles, all of which are relatively difficult cases for antialiasing.
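
If you want to make the same sort of comparison with your own screenshots, the blow-up is nothing fancy. A short sketch using the Pillow imaging library (the file name and crop box below are placeholders, not the ones we used) does the trick, with nearest-neighbor scaling so every original pixel stays a crisp 4x4 block:

# Enlarge a small region of a screenshot 4x with no smoothing (requires Pillow).
from PIL import Image

shot = Image.open("hl2_screenshot.png")        # placeholder file name
region = shot.crop((900, 500, 1000, 575))      # placeholder box: (left, top, right, bottom)
blowup = region.resize((region.width * 4, region.height * 4), Image.NEAREST)
blowup.save("hl2_screenshot_4x.png")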

In order to save space in the table below, I’ve taken some liberties with grouping the higher AA modes together. As we noted on the previous page, comparing between the modes exactly is difficult, so please indulge me. The groupings aren’t meant to suggest exact equivalency between the various AA modes.

Antialiasing quality
[Enlarged Half-Life 2 screenshot comparisons, one column each for the GeForce 7900 GTX SLI, GeForce 7950 GX2 SLI, Radeon X1950 XTX CrossFire, and GeForce 8800 GTX. The rows group the modes as follows: no AA; 2X; 4X; 6X (Radeon only); 8xS / 8xS / Super AA 8X / 8X; SLI 8X / SLI 8X / Super AA 10X / 8xQ; Super AA 12X (Radeon only); SLI 16X / SLI 16X / Super AA 14X / 16X; and SLI 32X (GX2 SLI only) alongside 16xQ (GeForce 8800 GTX).]

The relevant comparisons here are in the 8X and 16X modes, roughly speaking. You may see things differently, but I happen to think the GeForce 8800’s CSAA 8X mode matches up well against the SLI AA 8X and Super AA 8X modes, perhaps even comparing favorably to them. The additional color/Z samples in the G80’s 8xQ mode don’t seem to add much, if anything.

At 16X, the G80’s CSAA still holds up well against the Radeon’s 14X SuperAA and the GeForce 7 series’ SLI 16X modes, although the G71’s SLI 16X AA does look awfully good. Again, 16xQ doesn’t look much better than 16X, despite the additional color samples. I’d say quad SLI’s 32X mode produces the smoothest gradients of all, befitting its larger sample size.

Overall, then, the GeForce 8800’s coverage sampled AA manages to fend off two or even four previous-gen GPUs working in tandem, despite storing fewer color/Z samples. That’s very impressive, and as you might imagine, it leads to very good things when performance and image quality intersect.

GeForce 8800 versus SLI AA and Super AA: performance
Here’s how the various single and multi-GPU setups scale across their respective antialiasing modes. I’ve broken the results out into three separate graphs due to the difficulty of directly comparing the GPUs’ various AA modes. All of the graphs have the same scale, though, and we can draw some conclusions based on these performance results and the image quality info from the last page.

There’s very little performance hit associated with the CSAA 8X and 16X modes of the GeForce 8800, and given their image quality, I’d say they merit direct comparison to the multi-GPU 8X and 16X modes on the older cards. That gets kind of ugly, though:

The long and the short of it is that the G80’s excellent and very efficient coverage sampled AA puts it on top, even without special SLI AA modes in Nvidia’s current drivers.

Power consumption
We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. Out of necessity, we’re using a different motherboard for the CrossFire system, but for our power and noise tests, we tested the single Radeon X1950 XTX in the same motherboard as the rest of the single-card and SLI configs. Otherwise, the system components other than the video cards were kept the same.

The idle measurements were taken at the Windows desktop with SpeedStep power management enabled. The cards were tested under load running Oblivion using the game’s Ultra High Quality settings at 2560×1600 resolution with 16X anisotropic filtering. SpeedStep was disabled for the load tests.

The GeForce 8800 isn’t exactly easy on the juice, and SLI only exacerbates the situation. Still, the 8800 GTX SLI rig manages to draw less power under load than the Radeon X1950 XTX CrossFire system, remarkably enough.

Noise levels and cooling
We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. The meter was aimed at the very center of the test systems’ motherboards, so that no airflow from the CPU or video card coolers passed directly over the meter’s microphone. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including CPU and chipset fans. We had temperature-based fan speed controls enabled on the motherboard, just as we would in a working system. We think that’s a fair method of measuring, since (to give one example) running a pair of cards in SLI may cause the motherboard’s coolers to work harder. The motherboard we used for all single-card and SLI configurations was the Asus P5N32-SLI SE Deluxe, which on our open test bench required an auxiliary chipset cooler. The Asus P5W DH Deluxe motherboard we used for CrossFire testing, however, didn’t require a chipset cooler. In all cases, we used a Zalman CNPS9500 LED to cool the CPU.

Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

We measured the coolers at idle on the Windows desktop and under load while running Oblivion at 2560×1600 with 16X aniso.

The GeForce 8800’s cooler remains very impressive for its ability to cool a 680 million transistor chip without making a racket. The picture doesn’t change too much in SLI, fortunately.

Conclusions
You really do need a four-megapixel display like the Dell 3007WFP in order to take full advantage of GeForce 8800 SLI with today’s games. Even then, a single GeForce 8800 GTX is often fast enough to drive a 2560×1600 monitor quite well without the aid of a second GPU—witness our test results in Quake 4, Half-Life 2: Episode One, and even Oblivion. Heck, even the G80’s CSAA 8X and 16X modes carry so little performance penalty that one doesn’t really need SLI for them. The big exception is Rainbow Six: Vegas, which is brutal at 2560×1600 on everything but a pair of 8800 GTXs. As the first Unreal Engine 3 game we’ve tested, it may be an indicator of things to come, but I’m not quite sure. It may also just be a lousy port from the Xbox 360. More intensive games are always coming, though, and there will likely be a reason to upgrade to a second GeForce 8800—even a second GTX—at some point in the next year or so. For now, you may want to keep a PCIe slot open and wait. That said, I’ve played through decent chunks of both Rainbow Six: Vegas and Oblivion with a pair of 8800 GTXs in SLI on the Dell 3007WFP, and it’s a glorious thing, having smooth-as-glass frame rates with incredible image quality on a massive, detailed display. If you have the means to treat yourself to such a setup, the visceral experience certainly won’t disappoint.

I wish I could say the same for the driver support, but Nvidia doesn’t yet have all of the wrinkles ironed out of GeForce 8800 SLI in concert with wide-aspect displays. I do expect most of the problems I’ve noted to be fixed eventually, but with Vista imminent, I fear those fixes may be on the backburner for longer than usual. We’ll have to see. When they come, they may be accompanied by all kinds of other new goodies for 8800 SLI, including things like CSAA 32X antialiasing and three- or four-GPU SLI daisy chaining. At that point, we’ll have to hope new games are out to harness that power properly. An eight-megapixel display would be well and good, but at this point, I’d definitely rather have better pixels than more of them.

Comments closed
    • Chrispy_
    • 13 years ago

    Great article Scott, especially the Oblivion bit, finding a single application and cranking out test settings that can really stress these things properly.

    I’m still muddling along with my old 1800XT but I can’t help feel, with my 2560×1024 desktop that 4xAA is enough. When you have THAT MUCH screen real estate, as long as you are sitting further than 6 inches away from your screen 32x AA really doesn’t seem to work for me.

    At about 35″ from left to right, I have to sit three feet away from my screens just to get both edges in view, and even though my eyesight is pretty good, I sometimes don’t even notice that I’m only running 2x instead of 4x.

    Sure, I can see a definite difference between ATI’s 4x AA and nVidia’s 32x AA, but if you had to quantify it, where no AA was zero and 32x was 10, I’d give the ATI 4x about a 7 or an 8. Maybe even a 9 when sitting 3 feet from my screen.

    Maybe your next poll should be “How much of your GPU’s AA do you use?” options being “AA gives me slideshows”, “AA makes baby Jesus cry”, and “I can’t wait for 4096x FSAA” 🙂

    • moose17145
    • 13 years ago

    So what did everyone think of that Oblivion picture? HOLY CRAP!!! I had to open that thing across all 4 of my monitors just to see all of it! And it was GORGEOUS! I can only imagine playing oblivion in a resolution like that with all the settings cranked up that high and still getting 50 or more fps. unbelievable! (sorry to all who did not enjoy Oblivion)

      • Beomagi
      • 13 years ago

      there’s a thread on techpowerup “what’s more powerful xbox360 gpu or 8800”

      I just had to post that picture 🙂

      case closed!

    • slot_one
    • 13 years ago

    oblivion looks like a friggin photo… :O

    • DrDillyBar
    • 13 years ago

    Wow. That’s pretty impressive. 30″ would be the way to go for a review on dual 8800’s.

    • Freon
    • 13 years ago

    Wow. Looking back, what you paid for that 3007FP is close to what I spent on the early 2000FP. Hopefully in 6 years the 30″ will be as cheap as the 20″ now.

    • My Johnson
    • 13 years ago

    Wow, pretty pictures…

    (oblivion shot)

    • PenGun
    • 13 years ago

    It is a whole different ball game. I’m playing all kinds of games over on my new Sony 34XS955. I have a GT6600 but my TV does games better at 720p than 1080i as you get 60 fps with a progressive scan and only 30 fps interlaced.

    It must be just sweet at those high resolutions but I am having loads of fun with my 34″ at 1280×720. Suits the card quite well actually. I will get an 8000 series card soon though for better quality.

    D2X-XL Descent on this thing is a panic, I’m getting nothing done ;).

    • Damage
    • 13 years ago

    Digg this article: http://digg.com/hardware/Two_GeForce_8800_GTX_cards_tested_in_SLI_mode

    • Bensam123
    • 13 years ago

    “That may be one symptom of a larger problem, that R6: Vegas is a half-assed PC port..”

    Bonzai! Bonzai! Bonzai!

    You should visit the official forums sometime and see all the pissed off people who will agree with you.

    It would be nice if we could see some TrippleHead2Go action in there as well…

      • SonicSilicon
      • 13 years ago

      I second a TripleHead2Go comparison!
      It would be rather interesting to see how extreme aspect ratios (3:1 ; 24:5) play a role in graphical performance.

      • Pupitmiser
      • 13 years ago

      Yeah, one thing I have to say about the triplehead2go (I have one). I have a BFG 7900 GTX and all the games you showed running at 2560×1600, which is only .1 MP more than the triplehead resolution, lagged horribly. On my system this is definately not the case. Even in Dark Messiah cranked up to Ultra Quality I get about 40-50 FPS. Given, that’s less than what I’d like, but I think the 7900 GTX deserves a little more credit. One last thing is that, at least 3840×1024, I tend to sit about 20″ away from my monitors. Anti-aliasing isn’t as much of an issue as it used to be on smaller displays when you’re squinting for more detail. If you’d like I can post some screenshots with a fraps meter.

    • albundy
    • 13 years ago

    damnn, thats some nice hi def gaming. but there is no way that i would get a 30 incher. Saved my money for a BenQ FP241W 24 incher. As long as it does 1080p, super low latency, rotates, I’m happy!

    • SpotTheCat
    • 13 years ago

    Damage – Have you tried any multi-monitor configurations with the new beast? Your 2001FP is the same width as the 30″ is tall, put it on it’s side and have a REALLY nice rig for flight sim.

      • Freon
      • 13 years ago

      Do they still make you turn off multi-monitor to turn on SLI capability?

    • Chaos-Storm
    • 13 years ago

    So far it looks like the 8800 series has the most consistant and best scaling sli out of any card i’ve seen. I think that everyone’s arguements about sli not reaching ful potential because of other bottlenecks are not valid, because those occur with a single card too. They can also always be rectified by turning up graphic settings, resolutions or playing more demanding games or getting patches that make it more graphically intensive (oblivion, for instance has many mods that make it look even better)

    The reason i’m saying this is because many people spend money on much stupider things that give much less gaming benefit that SLI such as high end memory modules, and high speed grade CPU’s, when lower grades can be easily overclocked.

    This is not to say that getting 2 8800GTX is not ridiculous overkill for most users at this point. But saying that SLI is useless, or a gimmick does not change the fact that its the only thing that will allow the latest games to be played at 2500×1600. I mean, i don’t even want SLI because of the incredible cost associated with it and all the extra heat and power requirements. But it certainly has its place, and is a much more sensible gaming option that shelling out 1000 for a CPU alone.

    • moose17145
    • 13 years ago

    [quote truncated]

    • PRIME1
    • 13 years ago

    I guess they had Crysis running at full rez at CES and SLI ran it smoothly.

    • ChangWang
    • 13 years ago

    Just my opinion, but SLI and Crossfire is a crock for what you get. Now everyone is entitled to their own opinion, but am I the only one that feels that this late in the game, with as long as SLI has been out, that SLI should give you at least a 75% performance gain 100% of the time? I mean, to spend all that cash on hardware only to have issues with your favorite title just ain’t right…

      • thecoldanddarkone
      • 13 years ago

      8800gtx almost 95% gain….

        • ChangWang
        • 13 years ago

        In EVERY title, including those not in the TR review…

          • shank15217
          • 13 years ago

          You want 75% in all games however some games are not gpu limited and after a certain framerate it becomes academic. Certainly the SLI setups get better fps the higher resolution you go but it also has a limit. There are too many factors, sorry it just doesn’t make sense what you are asking.

            • ChangWang
            • 13 years ago

            Which is why I feel it’s a waste. Thank you for proving my point.

      • SGWB
      • 13 years ago

      SLI and CrossFire were designed to win benchmarks. That’s about it. For the 0.05%(*) of PC gamers who will buy these setups it’s all about bragging rights.

      *this statistic was pulled out of thin air. It may be BS. Please use with caution.

        • ChangWang
        • 13 years ago

        Agreed! Couldn’t have said it better myself

      • swaaye
      • 13 years ago

      Hey, if one has the money to burn, more power to ’em. I’d buy SLI/Crossfire in a flash if I had the cash. That’s all it really comes down to.

      But, if I had that kind of $$, I’d buy other things first honestly. To each his own, however.

    • Shining Arcanine
    • 13 years ago

    Did anyone else notice that SLI scaled very close to the theoretical maximum in most of the games when the screen resolution and eye candy are set to their maximum?

      • thecoldanddarkone
      • 13 years ago

      8800gtx scaled VERY good.

      • continuum
      • 13 years ago

      Yes, and that’s a very good thing. Drool. Now I just need $2400 for a 30″ and two 8800GTX’s…

      But I had myself convinced I needed 3TB of harddisk space and an Adaptec 4805SAS controller more than a 30″ and two 8800GTX’s. **SIGH**

      Stop making things so difficult for me, Damage! :-p

    • Shinare
    • 13 years ago

    TR does enough GPU reviews to compile a “VGA Charts” type of thing. I’d be more willing to accept the results of such a thing coming from TR rather than Tom’s Hardware. Sadly, though, when I need to compare older generation hardware with more current low to midrange hardware, thats the place I have to go.

      • thecoldanddarkone
      • 13 years ago

      I do the same thing, it makes it so I know my old videocard is still playable on a newer game.

    • Dposcorp
    • 13 years ago

    BTW, anyone in SE Michigan wanna buy my Sony 24″ CRT? 🙂

      • thecoldanddarkone
      • 13 years ago

      I can’t go and see or even buy it, but whats the resolution?

        • Bauxite
        • 13 years ago

        Um CRT, it has none? :p

        Since its a 24″ and theres maybe one actual (computer) tube ever mass produced at that size… (HP etc use same one afaik)

        Maxes out around 2304×1440/2048×1536 depending on cables, vidcard, luck etc, the “ideal” rez is around 1920×1200 or a little less depending on preference/source/use.

          • drfish
          • 13 years ago

          I run mine at 2048×1280 (just to stick it to those snooty 24″ LCD punks) at 85Hz…

          Why are you selling yours Dposcorp? You get it from the same lady Gleek and I got ours from?

            • thecoldanddarkone
            • 13 years ago

            I would want one but PHEEAAARRR the weight, well that and I am broke (mostly the broke part). 😛

          • Lazier_Said
          • 13 years ago

          The 24″ Sony has a screen size of about 19 x 12 inches with an 0.23-0.27mm variable stripe pitch. Average that to 0.25mm.

          19 inches is 483mm. 483 / 0.25 = 1932 physical phosphor columns.

          So running 2048×1280 or 2304×1440 would not seem to do any good. 1920×1200 would be the highest useful resolution.

          • nagashi
          • 13 years ago

          Depends on the price, but yes actually. email me: mike@enertiadesign.com or mthorn@iqguys.com plz!!

    • Spotpuff
    • 13 years ago

    I thought NONE of the dell xx07 monitors can scale images correctly; they could only display images to fill the whole monitor. This is one of the main reasons I’m happy with my 2405FPW as I still play a lot of Warcraft 3 and having to stare at that image stretched is not fun.

    Nvidia/ATI were supposed to fix this in their drivers by sending black bars alongside the images but from what I’ve heard the settings don’t work, which is, well, stupid, but anyways, if you play some 4:3 games these are bad monitors to get.

    OTOH, more and more games support widescreen now, so it isn’t an issue unless you play older games.

    This review however seems to show that the monitor can display 4:3 images or ones that don’t fill the whole screen ok.

      • Damage
      • 13 years ago

      The video drivers’ scaling settings give you multiple options, they all appear to work fine on the GeForce 7 and Radeon X1950. I only encountered problems on the GeForce 8800.

      • Usacomp2k3
      • 13 years ago

      I think that was only a problem when using a 1080p source. (at least from the research I did on my 2407 that is in the mail 🙂

        • Spotpuff
        • 13 years ago

        Ah, sweet… I still enjoy my 2405; enjoy your 2407 🙂 Definitely one of the best purchases I’ve made for my PC.

          • Usacomp2k3
          • 13 years ago

          Oh, I will, don’t worry.

      • Dposcorp
      • 13 years ago

      It is dug.

      • BoBzeBuilder
      • 13 years ago

      I don’t understand this digg thing but dugg anyways.

        • indeego
        • 13 years ago

        If you want massive traffic on your site (estimates are 30-60,000 hits minimum per frontpage article) you beg for digg on your blog/article.

      • swaaye
      • 13 years ago

      DUG it is!

    • Lazier_Said
    • 13 years ago

    Re: Noise measurements.

    Is one X1950XTX really 1.3db louder than two of them in CrossFire?

      • Usacomp2k3
      • 13 years ago

      Could be the waves were partially cancelling each other out, due to the peak/troughs at a specific frequency, creating a ‘dead zone’ where the meter was located.
      /me shooting from his hip

        • Damage
        • 13 years ago

        Doubt it. The CrossFire cards were, by necessity, tested on a different motherboard, which probably accounts for the difference in results.

    • Dposcorp
    • 13 years ago

    Awesome review.
    Thanks Scott and the rest of the gang.

    Thanks for the Rez chart on page 2.

    I hate to give you more work, but my only request would be to add the following rezs to your testing: 1366 x 768 and 1920×1080.

    It seems all the larger, well priced LCDs that I can afford, from 32″ to 42″ are one of those two rezs, and I am leaning towards a 37″ 1920×1080 monitor.

    The Westy LVM-37W3 that PerfectCr talks about here is one option:

    https://techreport.com/forums/viewtopic.php?t=46027&start=0&postdays=0&postorder=asc&highlight=

    I am also looking at a couple of Sceptre models at Costco:

    http://www.costco.com/Browse/Product.aspx?Prodid=11118339&whse=BC&topnav=&browse=
    http://www.costco.com/Browse/Product.aspx?Prodid=11098479&whse=BC&topnav=&browse=

    Again, thanks for the great review.

      • Damage
      • 13 years ago

      I know it’s not exactly what you’re wanting, but 1920×1080 has a similar number of pixels to 1600×1200, as the chart in the review shows. As I said, I doubt the shape of the viewport greatly impacts performance, so you can probably use our 16×12 results as a proxy for 1920×1080 when making your decisions.

      As for 1366×768, well, you don’t need a GeForce for that–you need an S3 ViRGE. 🙂 That’s lower res than my kids’ Etch-a-Sketch.

        • Jigar
        • 13 years ago

        Darn i was going to purchase 32 inch TV (Widescreen) which supports 1368 X 768 res ;(. Should i go ahead or drop the idea..

          • Damage
          • 13 years ago

          Your call. I’m just saying you don’t need GeForce 8800 (esp. not in SLI) to drive a screen like that. It’s not very high res. May be fine for a TV, but it’s not much challenge for a high-end PC GPU.

        • Dposcorp
        • 13 years ago

        Scott, thanks.

        I guess my Math-Fu failed me that time, as your comment makes great sense.

        Looks like 1080P is it then.

        Just waiting for the Scepte 37″ 1080P set to come back in stock at Costco.

        I am waiting to buy there, as I hear their warranty policy is great, and I like that kind of stuff.

        Also, like the Westy, there is a lot of info about the Sceptres all over the web.

        41, Bauxite, I really don’t want to drop to those lame rez screens, but man, at the way the prices are dropping, I might get one as a second screen.

        • morphine
        • 13 years ago

        In the review you mentioned that nV’s drivers have problems with the 1680×1050 res not working… Is that true for SLI or does the problem exists in a single-card configuration as well?

        I’m a couple weeks away from getting my GTX and I have a 22″ 1680×1050, so you left me VERY worried there.

          • thecoldanddarkone
          • 13 years ago

          I don’t know, I have a 7900gs and it works fine for the widescreen games I play.

          • crazybus
          • 13 years ago

          I think it’s just an issue of adding custom resolutions, which has been broken for all the 9x.xx forceware drivers on my 6600GT argh!!!!

          • Satyr
          • 13 years ago

          My single 8800GTS and 1680×1050 Viewsonic VX2025wm work beautifully together, so I wouldn’t worry :).

          • d0g_p00p
          • 13 years ago

          I have a 20″ WS 1680×1050 running on SLi 7800GT’s and I have never had a problem.

          You can also check
          http://www.widescreengamingforum.com/wiki/index.php?title=Main_Page and http://www.widescreengaming.net/wiki/Main_Page and check it out.

          • Kindjal
          • 13 years ago

          I’m running the 8800GTS at 1680×1050 in WinXP without any problems. HL:2 Lost Coast can natively display 1680×1050 and it looks great with all the settings maxed out.

      • Bauxite
      • 13 years ago

      Don’t touch any “television” LCDs unless they are 1080p. Also, on many of the 768p sets you’re paying for a tuner you (probably) don’t need, which is why many are close in price.

        • Inkedsphynx
        • 13 years ago

        Read what he wrote; the Westy is 1080p.

          • Bauxite
          • 13 years ago

          Read what he wrote yourself.

    • Usacomp2k3
    • 13 years ago

    Why does HL2 run better @ 2560×1600 than 2048×1536?

      • Damage
      • 13 years ago

      Doh, switched a couple of graph labels on the bar graphs. I’ve fixed them now. Sorry.

        • Usacomp2k3
        • 13 years ago

        No problem, I was just surprised. I have seen anomalies in the past where cards have done better at 1600×1200 than 1280×1024 or something, so I thought it might have been something like that. Thanks 🙂

    • Krogoth
    • 13 years ago

    Excellent review; I especially liked the AA section. The G80 architecture rocks at AA and AF. The review is solid proof that SLI is only good for people who have huge monitors.

    My recommended solution is just get a single 8800GTS. It is powerful enough for almost anything you throw at it, while retaining a sane MSRP for its performance.

      • Thresher
      • 13 years ago

      And getting a saner price just about every day. The price on the GTS cards has fallen a good $100 since introduction. The GTX cards haven’t dropped much at all.

    • firestorm02
    • 13 years ago

    I’m getting kinda sick of seeing all the attention high-end cards get. Paying $400, $500, or even $600 per card is crazy! I would think the majority of TR readers would be more interested in in-depth midrange card reviews. It would be nice to include older cards in the tests to help determine if upgrades are worth it (6600GT vs. 7600GT vs. 8600GT kind of thing). I’m not trying to say reviews of this kind aren’t important, but rather that they should not be given the highest priority.

      • eloj
      • 13 years ago

      Seconded. Reviews should assume that the people reading may be up to three generations behind. I have a friend who wanted to upgrade from a 9600XT to a 7600GT (AGP), and finding direct comparisons was a chore.

      One big problem is comparing yesteryear’s midrange to today’s low-end, which might make sense if you’re thinking, “Okay, I can’t even play game A because it lacks Feature X, but I’m content with its performance when it works. I wonder if this new entry-level card with Feature X is about as fast; I could upgrade to that for next to nothing.”

      • davidedney123
      • 13 years ago

      Even for people who don’t buy these cards, they should be of interest: technology that debuts in these cards will be in the midrange box-shifters soon enough. TR does review the midrange cards, but it’s hard for the reviewer to be excited about a product that does the same thing a high-end card did 12 to 24 months ago, just cheaper.

        • firestorm02
        • 13 years ago

        “…product that does the same thing a high end card did 12 - 24 months ago, just cheaper.” I don’t know aboot that. Look at the current 7th-gen cards. You don’t get 7900GT performance from a 7600GT card 12 months later. What you get is stripped-down 7900GT technology. I’m not saying we should abandon high-end card reviews, but I would like to see more focus on the midrange cards for an accurate ‘apples to apples’ comparison.

          • Beomagi
          • 13 years ago

          More of a comparison between the 7600GT and the 6800 Ultra, the X1900GT and the X1800XT, the 6600GT/X700 and the 5800 Ultra/9800.

    • donkeycrock
    • 13 years ago

    Can we get a moratorium on the word “uber”?

      • firestorm02
      • 13 years ago

      EDIT: wrong reply, sorry

    • wof
    • 13 years ago

    Didn’t we conclude that SLI is a gimmick in a previous poll already?

    SLI probably needs to be done at a lower hardware level if it’s ever going to get good stability. AFR and such junk is a joke, IMO.

      • wierdo
      • 13 years ago

      Yeah, sorta, more or less… but my main beef is that they make charts harder to read when they’re mixed in with regular non-SLI solutions. I wouldn’t mind seeing SLI, but just keep ’em in a separate chart or graph, away from the single-card numbers. Is that too much to ask? 😛

      I liked this article because it didn’t mix SLI numbers with non-SLI; it was just all SLI. Good! 😛 I hope this happens more often so we can have an easier time reading single-card charts and graphs 😀

    • Jigar
    • 13 years ago

    The supremacy of the 8800GTX attracts me to jump back into PC gaming, though I don’t know how long this supremacy is going to last.

    Oh, one more thing: SLI sucks. 😉

    • marvelous
    • 13 years ago

    SLI is not practical for most users… the 600-watt power supply, the heat, the power consumption…

    It’s a turn-off… Then again, a single 8800GTX is powerful as is…

      • Anomymous Gerbil
      • 13 years ago

      That is a myth. You do not need a 600W PSU to run these cards in SLI, as evidenced by several reviews on teh intarweb.

        • marvelous
        • 13 years ago

        Oh really… What power supply would you need? A single GTX needs 30 A on the 12 V rail… I wonder what two of these will need… A 450 W unit usually carries 32 A on the 12 V rail and is the minimum you need to run a single card, so I suspect you need around 600 W…

        Not to mention it eats an extra 200 watts of power just running these things in SLI… Cost is also a major factor…

          • Anomymous Gerbil
          • 13 years ago

          You are of course free to suspect whatever you like, but please refer to reviews on the web that show that 600W simply isn’t required for SLI setups, even with high-end CPUs.

        • Ardrid
        • 13 years ago

        Your statement is way off. You may not need a 600W PSU to run ONE GTX, but you’re certainly going to need that much to run TWO (and this doesn’t even get into the discussion regarding sufficient amps on your 12 V rail).

          • Lazier_Said
          • 13 years ago

          TR measured 375 watts at the wall for this 8800GTX SLI system.

          Assuming a very generous 85% PSU efficiency, that is 320 watts DC output.

          A 20% margin would still be under 400 watts. That most power supplies which can actually deliver 400 watts are labelled as 600W is a marketing problem.

          Seasonic claims their 500W supply will deliver 34A (408W) on 12V. I would be shocked if this would fail to run TR’s test system. Their 460W supply is rated at 32A (384W) on 12V and I would be surprised if that would not handle the TR system as well.
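
          A minimal sketch of the arithmetic in this comment, using only the figures quoted above (375 W at the wall, an assumed 85% PSU efficiency, a 20% margin, and the Seasonic 12 V ratings):

              # Back-of-the-envelope check of the PSU sizing argument above.
              wall_draw_w = 375        # measured at the wall for the 8800 GTX SLI test rig
              psu_efficiency = 0.85    # generous efficiency assumption from the post
              dc_load_w = wall_draw_w * psu_efficiency   # ~319 W delivered to the system
              with_margin_w = dc_load_w * 1.20           # 20% safety margin, still under 400 W

              # 12 V rail capacities quoted for the Seasonic units (amps x 12 V)
              seasonic_500w_12v = 34 * 12   # 408 W
              seasonic_460w_12v = 32 * 12   # 384 W

              print(f"DC load ~{dc_load_w:.0f} W, with 20% margin ~{with_margin_w:.0f} W")
              print(f"Seasonic 500W 12 V rail: {seasonic_500w_12v} W")
              print(f"Seasonic 460W 12 V rail: {seasonic_460w_12v} W")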

            • Anomymous Gerbil
            • 13 years ago

            Thank you. It’s amazing what myths people continue to believe; in this case, even when they can read reviews of quad-GPU SLI setups running stably on ~450W PSUs at various sites.

            I guess it’s much more fun for some people to never let the facts get in the way of their prejudices.

            • Ardrid
            • 13 years ago

            I tell you what then, go out and run an 8800 GTX setup with your 450W PSU and tell me how it fares. You think Damage is using a 700W OCZ for shits and giggles? Obviously you must know something he doesn’t since he didn’t use your fabled 450W PSU, right? I guess HardOCP is wrong too. They’re all in conspiracy with the PSU manufacturers to propagate the myth, right?

            “So it comes down to, do you need a new power supply? I would suggest nothing less than a 500 watt PSU for our system in question and would feel a lot more comfortable with the headroom of 600 watt +.

            SLI? I would not even consider anything less than a 700 watt beast that is certified on the

          • Anomymous Gerbil
          • 13 years ago

          See above, and below.

    • dimak
    • 13 years ago

    I thought you needed two graphics cards to drive these large monitors. Anyone know if this is true?

      • haugland
      • 13 years ago

      No, but you do need dual DVI connectors, AFAIK…

        • Anomymous Gerbil
        • 13 years ago

        No, a single connector is all you need, driven by a card with a “dual link” DVI output. Many high-end cards have this capability. I am driving my 30″ with a single DVI cable from a single 7900GT.

      • Anomymous Gerbil
      • 13 years ago

      No, you just need cards with “dual link” DVI outputs.
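
      A rough sketch of why 2560×1600 wants dual-link DVI: a single DVI link tops out at a 165 MHz pixel clock, while four megapixels at 60 Hz needs well beyond that. The 2720×1646 total timing below assumes CVT reduced blanking and is my assumption, not something from the thread:

          # Rough pixel-clock estimate for 2560x1600 @ 60 Hz with CVT reduced blanking.
          h_total = 2720     # ~2560 active pixels plus reduced horizontal blanking
          v_total = 1646     # ~1600 active lines plus reduced vertical blanking
          refresh_hz = 60

          pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6   # ~269 MHz
          single_link_limit_mhz = 165                              # per-link DVI limit

          print(f"Required pixel clock: ~{pixel_clock_mhz:.0f} MHz")
          print(f"Single-link DVI limit: {single_link_limit_mhz} MHz -> dual link needed")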

    • toxent
    • 13 years ago

    Good review. I have to say that the 8800 series has really made me reevaluate my opinion of Nvidia. It seems they’ve got their heads screwed on right, and with a little competition from ATI (AMD, DAAMIT, or whatever), we should see cheaper prices on these new monsters too.

    I do have one question that puzzles me, though. With the raw power of these cards (and presumably their new ATI competitors), we have high frame rates at high resolutions. So much so, in fact, that we can now play just about…

      • Thresher
      • 13 years ago

      Agreed.

      When I turn my gaming computer on in my little office, the ambient room temperature goes up 2 degrees within a few minutes. That’s nuts.

        • Jigar
        • 13 years ago

        I would suggest you turn the heater down for a while 😉

    • redpriest
    • 13 years ago

    Oh yes, I almost forgot: try playing a Blu-ray movie on that beautiful monitor. It’s enough to make you want to cry.

    • redpriest
    • 13 years ago

    There are still a lot of bugs in the SLI driver with existing games. I had to shut off SLI just because the stuttering I was getting in an MMO I play (EverQuest II) was straight-up miserable.
