
SLI antialiasing debuts

Scott Wasson

WHEN ATI FIRST ANNOUNCED its CrossFire multi-card graphics platform to the world in late May, the clever folks on the red team had a few interesting new twists to offer in their answer to NVIDIA’s SLI. One of the more appealing features of the CrossFire platform was to be a “super antialiasing” mode, allowing two graphics cards to team up in order to produce higher quality antialiasing than available on a single card alone. We liked the idea, noting that “CrossFire rigs may provide image quality benefits even in games where fill rate and geometry throughput aren’t normally at a premium.” Since there are quite a few current games that don’t really take advantage of even a single high-end graphics card, using the extra power of a second card to improve image quality makes sense.

It seems NVIDIA liked the idea, too, since they announced shortly before the launch of the GeForce 7800 GTX that they would be bringing a similar antialiasing mode to SLI systems via a driver update. Now, in an amusing bit of one-upsmanship, NVIDIA is ready to deliver drivers that will make SLI antialiasing widely available today, while ATI’s CrossFire platform is still missing in action, apparently delayed.

We managed to snag an early copy of NVIDIA’s new drivers and put SLI antialiasing to the test. Keep reading for a look at the image quality and performance of SLI antialiasing, as well as an explanation of the methods behind the madness.

Turning on SLI antialiasing
Before we go on, let me pause to acknowledge that the following discussion will assume a fair amount of prior knowledge of current graphics technology. If you’re not up to speed on such things, you may want to read over our initial look at SLI technology and our follow-up article that explains some of SLI’s limitations. You’ll then probably want to peruse our review of the new GeForce 7800 GTX graphics card, which is currently the most powerful card capable of operating in SLI.

Now, let’s dig in.

There are two new SLI antialiasing modes, 8X SLI and 16X SLI. Both are made available on SLI systems after the installation of NVIDIA’s new graphics driver. The early driver we tested was version 77.74, but 77.76 is the revision number of the first public release of this driver. Once the driver is installed, you’ll have to apply the SLI Coolbits Registry hack in order to see SLI antialiasing as an option. To enable one of the SLI AA modes, the user must pick “SLI antialiasing” from the list of SLI rendering modes in the NVIDIA control panel, like so:

Once SLI AA is enabled, the user can then choose either 8X or 16X mode via the antialiasing slider.

If that method of enabling SLI AA seems a little convoluted to you, rest easy. NVIDIA says it has plans to integrate access to the SLI antialiasing modes more organically into the regular AA-mode slider in the control panel.


Inside the SLI antialiasing modes
In this first driver to expose the feature, SLI AA is presented among the list of SLI rendering modes because that’s literally what it is: a new SLI rendering mode, distinct from the other options, including split-frame rendering (where the screen is subdivided horizontally between the two GPUs) and alternate-frame rendering (where the GPUs trade off rendering full frames in an interleaved fashion). NVIDIA says SLI AA works in all 3D games, whether they use the Direct3D or OpenGL API. SLI AA distributes the load between two graphics cards by having each card render half of the samples necessary for the final, antialiased image and then combining the two sets.

Take, for instance, the case of the 8X SLI antialiasing mode. This mode is actually a doubled-up version of NVIDIA’s 4X multisampled antialiasing method. The diagram below helps illustrate how it works.

Inside 8X SLI AA. Source: NVIDIA.

Each card renders the scene with 4X multisampled antialiasing, but their sample positions are slightly offset, or jittered, to make them distinct from one another. The two resulting sets of samples are then combined, yielding eight coverage samples and two texture samples, which are blended to achieve the final pixel color. The result should be the equivalent of NVIDIA’s familiar “8xS” antialiasing mode, except that the sampling patterns for SLI AA are a little bit different.
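The blend NVIDIA describes can be sketched in a few lines of Python. This is an illustrative model only: the 4X sample positions and the jitter offset below are invented for the sketch, not NVIDIA’s actual patterns, and `scene` stands in for whatever color the GPU would compute at a given subpixel position.

```python
# Sketch of 8X SLI AA resolve: two jittered 4X sample sets, blended.
# Sample positions and jitter are hypothetical, not NVIDIA's real patterns.

def resolve_pixel(scene, px, py, offsets):
    """Average the scene color at each subpixel sample position."""
    samples = [scene(px + dx, py + dy) for dx, dy in offsets]
    n = len(samples)
    return tuple(sum(c[i] for c in samples) / n for i in range(3))

# Hypothetical rotated-grid 4X pattern for the first GPU...
GPU_A_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

# ...and the same pattern jittered slightly for the second GPU.
JITTER = (0.0625, -0.0625)
GPU_B_4X = [((x + JITTER[0]) % 1.0, (y + JITTER[1]) % 1.0) for x, y in GPU_A_4X]

def resolve_8x_sli(scene, px, py):
    # Each card renders its own four samples; the driver blends all eight
    # to produce the final pixel color.
    a = resolve_pixel(scene, px, py, GPU_A_4X)
    b = resolve_pixel(scene, px, py, GPU_B_4X)
    return tuple((ca + cb) / 2 for ca, cb in zip(a, b))
```

With a vertical black/white edge running through a pixel, the eight combined samples land on both sides of the edge and the resolve produces an intermediate gray, which is the whole point of the extra samples.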

The SLI 16X AA mode is essentially the same thing, except it uses a jittered and doubled version of the 8xS mode to achieve four texture samples and sixteen coverage samples. We can take a look at the resulting sample patterns by using a simple AA pattern testing application. The red dots below represent coverage samples, and the green dots represent texture samples.

  4X  6X/8X  16X
GeForce 6800/7800
SLI antialiasing
Radeon X850 XT PE


To clarify, the row labeled “GeForce 6800/7800” shows sample patterns for those cards’ regular, non-SLI AA modes. The 6X/8X column shows patterns for NVIDIA’s 8xS mode for the GeForce 6800/7800 cards, for the SLI 8X mode, and for ATI’s 6X multisampled mode on the Radeon X800 series.

The sample pattern of the SLI 8X mode is pretty much exactly what NVIDIA’s diagram told us to expect. NVIDIA says this pattern is superior to the 8xS mode, shown directly above it in the table, because the samples in SLI 8X mode are not vertically aligned.

SLI 16X’s pattern is consistent with a doubled and offset version of 8xS, as expected.

These are real, robust antialiasing modes. The SLI AA modes involve multiple texture samples, so there is an element of supersampling to them. That means the antialiasing effect will not be confined entirely to object edges, as it typically is (more or less) with pure multisampled AA. SLI AA is true full-scene antialiasing, and object interiors are modified as well.

Also, the jittering of NVIDIA’s base patterns serves to create non-grid aligned sample patterns for SLI AA. That’s good, because the human eye is very adept at pattern detection, and defeating that ability is one of the keys to good antialiasing. However, NVIDIA’s GPUs appear to have limited programmability when it comes to AA sample patterns, and that weakness creates some potential problems here. The pairs of samples in 16X SLI AA, for instance, are spaced very closely together, probably because the 8xS sample pattern leaves very little room for an offset within the boundaries of a pixel. This tight proximity of sample points will probably limit the effectiveness of 16X SLI AA somewhat, despite its very high sample size.
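A toy measurement illustrates why those tight pairs give back less than the raw sample count suggests. The patterns below are invented for illustration, not NVIDIA’s real ones; the function counts how many distinct coverage steps a pattern can produce as a horizontal edge sweeps down through a pixel, merging sample rows that sit nearly on top of one another.

```python
def distinct_edge_steps(offsets, eps=0.02):
    """Count distinct coverage levels for a horizontal edge crossing a pixel.

    Coverage changes each time the edge passes a sample's y position, so
    near-coincident y values collapse into a single gradient step.
    """
    ys = sorted(y for _, y in offsets)
    steps = 1
    for a, b in zip(ys, ys[1:]):
        if b - a > eps:
            steps += 1
    return steps

# A well-spread 16-sample pattern: 16 distinct y positions.
spread = [(i / 16, (i * 7 % 16) / 16) for i in range(16)]

# Sixteen samples in eight tight pairs, like a barely-jittered doubled pattern.
pairs = [(i / 16, (i // 2) / 8 + (0.01 if i % 2 else 0.0)) for i in range(16)]
```

The spread pattern yields 16 gradient steps on a near-horizontal edge, while the tightly paired one yields only 8, effectively behaving like an 8-sample pattern in that case.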

ATI’s antialiasing hardware is more flexible, as the funky sparse sample pattern of the Radeon X850 XT’s 6X mode hints. The ability to vary sample patterns on the fly has allowed ATI to create its temporal AA feature, and it will likely allow CrossFire’s Super AA mode the luxury of superior sample patterns.

Setting some performance expectations
Because the SLI AA modes are, in one sense, simply another means of splitting up the rendering workload between two cards, SLI AA won’t necessarily always provide a magical performance boost over existing SLI rendering modes. The SLI AA modes may or may not be more efficient than other SLI modes at divvying the work between two GPUs, depending on the nature of the application. I believe that alternate-frame rendering (AFR) will still likely be the best overall SLI mode for raw performance, because AFR splits both the vertex and pixel processing work between the GPUs in a way that benefits overall performance. Like split-frame rendering (SFR), the SLI AA modes should scale only in terms of pixel throughput.

The primary benefit of the SLI AA modes, in fact, isn’t raw performance, but the ability to use the additional fill rate of an SLI system to improve image quality. The 16X SLI AA mode, in particular, is more intensive than any single-card AA mode, and it should be more of a performance drain, as well. In games where SLI’s extra pixel-pushing power would otherwise be essentially wasted, 16X SLI AA will be a nice bonus.

With that said, there are a number of comparisons we’ll want to make when looking at the benchmark numbers in order to get a handle on SLI AA’s performance picture. Among them:

  • SLI AA modes versus single-card equivalents — Yes, there should be some overhead to SLI operation, but it will still be interesting to see how the SLI AA modes compare to their single-card equivalents. For example, how does 4X AA on a single GeForce 7800 GTX match up to SLI 8X AA on a pair of 7800 GTX cards in SLI? Or single-card 8xS mode versus SLI 16X mode? With a second card taking on the additional sampling workload, one would expect them to perform fairly similarly.
  • SLI with antialiasing versus equivalent SLI AA — SLI 8X mode, I’m looking at you. How does your performance compare with the regular 8xS AA mode running on a pair of cards in SLI? Is it more efficient to use the default SLI rendering mode (whether it be SFR or AFR, depending on the profile in the NVIDIA drivers) with 8xS AA, or is SLI 8X mode faster? NVIDIA says SLI 8X mode has the advantage of smaller memory footprint, because each card is only doing 4X AA. This 8X mode could be the sweet spot for SLI AA.
  • SLI 16X AA versus utterly, incorrigibly CPU-bound games — Some games, Half-Life 2 among them, we’ve found to be almost completely limited by CPU or system throughput rather than by the graphics card. That’s particularly true with a couple of GeForce 7800 GTX cards in SLI. Very few games need all of that power, so their performance often doesn’t change as we crank up the resolution and visual quality. We’ve even resorted to testing at ultra-high resolutions like 2048×1536 just to show performance differences between high-end graphics solutions, but those comparisons aren’t always entirely fair due to GPU architecture limitations.

    So, can 16X SLI mode tease out some contrasts between, say, a pair of 6800 Ultras and a pair of 7800 GTX cards? More importantly, can either of those configs run today’s games at playable frame rates with SLI 16X AA?

Those are some of the things we’ll want to consider. Now, let’s see how SLI AA actually looks onscreen.


Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least twice, and the results were averaged.

Our test system was configured like so:

Processor Athlon 64 FX-57 2.8GHz
System bus 1GHz HyperTransport
Motherboard Asus A8N-SLI Deluxe
BIOS revision 1011
North bridge nForce4 SLI
South bridge
Chipset drivers SMBus driver 4.45
IDE driver 5.18
Memory size 1GB (2 DIMMs)
Memory type OCZ EL PC3200 DDR SDRAM at 400MHz
CAS latency (CL) 2
RAS to CAS delay (tRCD) 2
RAS precharge (tRP) 2
Cycle time (tRAS) 5
Hard drive Maxtor DiamondMax 10 250GB SATA 150
Audio Integrated nForce4/ALC850 with NVIDIA 4.60 drivers
Graphics GeForce 6800 Ultra 256MB PCI-E with ForceWare 77.74 drivers
Dual GeForce 6800 Ultra 256MB PCI-E with ForceWare 77.74 drivers
GeForce 7800 GTX 256MB PCI-E with ForceWare 77.74 drivers
Dual GeForce 7800 GTX 256MB PCI-E with ForceWare 77.74 drivers
Radeon X850 XT Platinum Edition PCI-E with Catalyst 5.7 drivers
OS Windows XP Professional (32-bit)
OS updates Service Pack 2

Thanks to OCZ for providing us with memory for our testing. If you’re looking to tweak out your system to the max and maybe overclock it a little, OCZ’s RAM is definitely worth considering.

Unless otherwise specified, the image quality settings for both ATI and NVIDIA graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


Image output
Here are some screenshots of the various AA modes available on these cards, including SLI AA. We’ll start with the single-card stuff with relatively low sample rates, just for comparison. The images below are from 3DMark03. They are magnified to 4X their original size so that we can better see how antialiasing is modifying the color of each pixel. Our small sample scene should provide a decent mix of difficult cases for edge AA, including near-horizontal and near-vertical edges.

These are low-compression JPEG images. You can click the images to open up a lossless PNG version of them.

No AA – GeForce 7800 GTX

No AA – Radeon X850 XT PE

2X AA – GeForce 7800 GTX

2X AA – Radeon X850 XT PE

4X AA – GeForce 7800 GTX

4X AA – Radeon X850 XT PE

6X AA – Radeon X850 XT PE

8xS AA – GeForce 7800 GTX

SLI 8X AA – GeForce 7800 GTX

SLI 16X AA – GeForce 7800 GTX

Folks can make what they want of the images above, but I’ll give you my take, for what it’s worth. Obviously, once the sample size climbs above four, differences between the images become increasingly harder to discern. That’s a fact of life with higher AA modes: there are diminishing returns.

That said, I do see some real differences in certain places, such as the long, near-horizontal lower edge of the nearest plane’s wing, where a good AA routine can produce a nice, smooth gradient. There’s also an interesting near-vertical edge at the rear of the tail fin of the plane in the top right portion of the picture. In both of those places, it looks to me like 8X SLI AA produces smoother edges than NVIDIA’s usual 8xS mode. Those same two spots reveal visible differences between SLI 8X and SLI 16X, too. 16X mode produces smoother edges and finer gradients.

Of course, I also think that ATI’s 6X mode looks very nice, on par with SLI 8X mode at least. ATI’s sparse sample pattern and gamma-correct blends probably have a lot to do with that. The GeForce 7800 GTX does have the ability to do gamma-correct blends, but that feature doesn’t currently work with SLI AA. NVIDIA says it’s looking into the issue, but I’m not sure whether to expect a fix.

In any event, SLI AA does seem to provide better antialiasing than NVIDIA’s stock AA modes. SLI 8X AA looks quite a bit better than the standard 4X AA mode. SLI 16X’s advantage over 8xS is noticeable, but subtler.


Doing the math
We can tease out the differences between the stock and SLI AA modes by using a paint program to calculate the mathematical differences between pixel values in the images. Below are the results of “diff” operations comparing both SLI AA modes to their single-card counterparts. I’ve applied some gamma adjustment (a value of 3.0 in Paint Shop Pro) in order to make the differences more visible.
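For anyone who wants to reproduce this sort of comparison, the operation is easy to approximate with NumPy: take the absolute per-pixel difference between two same-sized screenshots, then brighten it with a gamma curve. The gamma value of 3.0 mirrors the Paint Shop Pro adjustment used here; the array inputs are placeholders for your own decoded images.

```python
import numpy as np

def diff_image(img_a, img_b, gamma=3.0):
    """Absolute per-pixel difference, boosted by a gamma curve.

    img_a, img_b: arrays of identical shape with values in [0, 1].
    Raising the difference to 1/gamma brightens small differences so
    they become visible, much like a gamma adjustment in a paint program.
    """
    diff = np.abs(img_a.astype(np.float64) - img_b.astype(np.float64))
    return np.clip(diff, 0.0, 1.0) ** (1.0 / gamma)
```

Identical images produce an all-black result; tiny differences that would be nearly invisible in a raw diff get pulled up toward the visible range.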

Difference between 4X AA and SLI 8X AA – GeForce 7800 GTX

Difference between 8xS AA and SLI 16X AA – GeForce 7800 GTX

These images confirm our impressions from the previous page that the difference between 4X AA and SLI 8X AA is more dramatic than the difference between 8xS AA and 16X SLI AA. I suspect that the difference between 8xS and SLI 16X would be greater if the jittered sample patterns for SLI 16X were offset further.


Doom 3

Here’s our first look at SLI AA performance, and it’s not a very promising result. Doom 3 doesn’t react well to SLI AA; scores are lower for SLI 8X than for a single 7800 GTX card in 8xS mode. The GeForce 6800 Ultra follows a similar pattern.

I was concerned that perhaps my decision to test SLI AA at 1600×1200 resolution might be the source of some kind of problem, so I decided to try a little experiment, testing various AA modes at several resolutions to see how they scale. Here’s what I found.

It turns out that performance in Doom 3 at 1600×1200 is reasonably consistent with what we see at lower resolutions. Whatever the source of SLI AA’s struggles in Doom 3, it’s not a resolution-related problem.

SLI rendering modes do tend to differ in their effectiveness with different applications, which is why NVIDIA has built profiles into its drivers for various games. The profile-selected SLI mode would appear to be quite a bit more efficient in Doom 3 than SLI AA. Perhaps the picture will be different with other games?


Far Cry

Far Cry is a different story than Doom 3—at least, it is with the GeForce 7800 GTX. The pair of 7800 GTX cards in SLI comes out quite a bit faster with SLI 8X AA than with 8xS. SLI 16X mode isn’t really playable at 1600×1200, though.

The GeForce 6800 Ultra suffers from the same fate here that it did in Doom 3. SLI AA mode just doesn’t perform very well.


Half-Life 2

Look closely at the numbers above. Things look good at first, but then you see essentially identical performance numbers out of the SLI rigs with 4X multisampling and the SLI rigs supposedly running in SLI 8X mode. The numbers are also basically identical for the SLI systems in 8xS mode and supposedly in SLI 16X mode.

So is SLI AA working in Half-Life 2? Let’s take a closer look.


Does SLI AA work in Half-Life 2?
So either SLI AA is working or it isn’t, right? I wish it were that easy. We need to sort through several issues in order to come to an answer.

One would think we should start by looking at visual quality in the game and determining whether the SLI AA modes are doing their thing. I started out there myself, and I was fairly certain that at least SLI 8X mode was working, because my screenshots of SLI 8X mode looked different from those in 4X AA mode. However, it turns out that there is a notable visual difference in Half-Life 2 between 4X AA invoked via the in-game menu versus 4X AA turned on via the NVIDIA control panel. Have a look:

4X AA – Invoked via in-game video settings menu

4X AA – Invoked via control panel

The electrical wires and cables look quite a bit thicker and darker when 4X AA is enabled via the in-game menu. If 4X is enabled via the control panel, the wires look much softer and finer.

The output from 4X AA invoked via the control panel also happens to look essentially identical to the output from SLI 8X mode.


In fact, I can’t see a difference between 4X AA via the control panel and SLI 8X AA. To assist my tired eyes, we can use a paint program “diff” operation to highlight the color differences between the pixels. The result of that operation looks like so:

Difference between 4X AA (via control panel) and SLI 8X AA

Only those electrical wires, which tend to sway in the wind when the game is running, show any differences at all between the screenshots from the 4X AA and SLI 8X AA modes. None of the fixed objects differ in the least between 4X and SLI 8X. In other words, SLI 8X mode doesn’t appear to be working properly, based on these screenshots; the output is the same as that of 4X AA.

Similarly, SLI 16X mode is visually indistinguishable from 8xS mode.

8xS AA


Difference between 8xS AA and SLI 16X AA

Again, only those swaying wires differ between our 8xS screenshot and our SLI 16X screenshot.

When I approached NVIDIA about this problem, they suggested to me that the screenshots themselves might be faulty. That’s a reasonable question to raise, because getting proper screen captures of NVIDIA antialiasing modes has long been a tricky proposition. GeForce cards tend to do some AA post-processing on scan-out, and as I understand it, the exact image that’s output to the screen may never be available in a frame buffer. In the case of SLI AA modes, the issue was further complicated by the fact that I had to use the FRAPS utility to get a screenshot, because the usual Windows PrintScreen key function was only capturing an all-black screen.

However, I’ve peered into my monitor at this scene from Half-Life 2 for quite a while now in the various AA modes. In fact, I even switched to a better monitor, just to be sure I was seeing things well enough. After a lot of looking, I am confident that the screenshot images you see above are a faithful representation of what’s onscreen in the game with these various AA modes enabled.

Given the visual evidence and the distinct lack of a performance difference between 4X AA and SLI 8X AA, and between 8xS AA and SLI 16X AA, I am fairly confident that the SLI AA modes simply aren’t working in Half-Life 2. Perhaps it’s a driver bug, an application-specific incompatibility, or some quirk of my test system setup. I don’t know, but I wasn’t able to make it work.

Update: NVIDIA has confirmed that this is a bug affecting Half-Life 2, and says a fix should arrive in a future driver.

The Chronicles of Riddick: Escape from Butcher Bay
Just to make sure we’re giving SLI AA a fair shake, I decided to run some quick tests with a couple more games on our GeForce 7800 GTX SLI rig.

Unreal Tournament 2004
UT2004 is a poster child for CPU-bound games, and it ought to offer a somewhat different perspective on SLI AA performance.

Both of these games follow the general performance pattern established by the other games. UT2004 is one case where a game hits playable frame rates with SLI 16X AA.

Conclusions
I’m a little bit ambivalent about SLI antialiasing, as you might imagine. The first thing that should be said is that this is essentially a “free” additional feature of SLI systems, and for the price, it’s a nice option to have. SLI AA will allow CPU-bound games to take advantage of the extra power of a dual-graphics system to provide somewhat higher image quality. That’s good.

The degree to which SLI AA is useful will probably vary quite a bit from one user to the next, depending on the demands of the games being played, the resolution and display type involved, and the sensitivity of the individual to visual anomalies like edge aliasing. I can envision a user not too different from myself getting regular use out of SLI AA—and SLI 8X AA in particular—if he were using a pair of reasonably fast graphics cards mated to a high-res LCD display.

However, SLI AA has its limitations, including what I would consider unexpectedly poor performance in many cases and an apparent incompatibility with Half-Life 2. I haven’t had time to test SLI AA yet as extensively as I’d like, but my brief experience with it over the past couple of days has been rather shaky. I wouldn’t call SLI AA a sensible tradeoff of performance for image quality in most situations. SLI 8X mode is too often slower than NVIDIA’s regular 8xS mode and offers a near-imperceptible improvement in AA quality. SLI 16X mode only makes sense in heavily CPU-constrained games that offer acceptable performance almost no matter what the graphics subsystem is doing. Perhaps the picture will look different given more time.

Fortunately, we should have the perfect opportunity to do some additional testing, presumably soon, when ATI’s CrossFire solution arrives. CrossFire’s Super AA may well offer some image quality benefits over SLI AA thanks to the flexibility of ATI’s antialiasing hardware. The funny thing is that when CrossFire does arrive, SLI AA will already be here, waiting for it.