Naturally, we were interested in the possibility of uncovering such a scandalous thing. We’ve dug into such juicy stories in the past, and the results have sometimes been rather enlightening. So we fired up our test rigs, busted out our sleuthing skills, and set out to see what exactly was happening with ATI’s drivers in DOOM 3. Read on to see what we found.
The problem with ATI’s drivers in DOOM 3 has to do with texture filtering, and it is more visible on some textures than others. Textures with high-contrast patterns on them, like the metal grates in the game’s Mars base, tend to show the problem most vividly. There is a clearly visible transition between mip map levels, as if trilinear filtering were not happening as it should. Here’s an example.
This is in DOOM 3’s High Quality mode, where 8X anisotropic filtering and trilinear filtering are both supposed to be active. However, as you can see, there’s a mip map transition line running across the middle of the grate on the floor in our example screenshot. This problem is visible throughout the game whenever a similar texture is used on the floor. The screenshot shows the problem, but it’s more obvious in motion. Once you’ve noticed it, it’s rather distracting, like the transition lines were with bilinear-only filtering on Voodoo cards back in the Quake 2 days. Having those things track on the floor out in front of me as I moved was annoying.
We used the Catalyst 4.9 beta drivers that ATI released specifically for DOOM 3 to take this screenshot, but the exact same problem is visible with ATI’s latest official drivers, Catalyst 4.8. (By the way, all the images from DOOM 3 in this article have been brightened up a bit by applying a gamma adjustment of 1.4 to them in Paint Shop Pro. DOOM 3’s own gamma and brightness settings don’t affect screenshot output, and the game’s output is a little too dark by default for our purposes.)
One way around this problem is to use DOOM 3’s console commands to flip out of trilinear filtering and then back into it. The command to turn off trilinear and just run bilinear filtering in DOOM 3 is similar to what you’d use in old Quake engine games, but slightly different:
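The cvar in question should be DOOM 3’s image_filter, set to an OpenGL filter name; treat the exact spelling here as our best recollection of the stock cvar rather than gospel. For bilinear filtering with nearest-mip selection:

```
image_filter GL_LINEAR_MIPMAP_NEAREST
```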
That will put the renderer into bilinear filtering (and shouldn’t affect anisotropic filtering). Then you can flip back into trilinear mode with this command:
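That command, assuming the stock image_filter cvar and its standard OpenGL filter names, would be:

```
image_filter GL_LINEAR_MIPMAP_LINEAR
```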
Once you’ve done that, the transition line on the floor magically vanishes:
There’s no need to issue a “vid_restart” command to reset the rendering engine. In fact, if you do that, the transition line comes back again.
For those of you having trouble seeing the difference, here’s the output from a mathematical “diff” operation between the two images:
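For the curious, a mathematical diff of this sort is easy to reproduce. Here’s a minimal sketch using Python and NumPy; our actual diff images were produced in an image editor, so treat the gamma step and the placeholder arrays as an approximation of our process rather than the exact tool chain:

```python
import numpy as np

def image_diff(a, b, gamma=1.4):
    """Absolute per-pixel difference between two same-sized images,
    brightened with a gamma adjustment so small differences are visible."""
    d = np.abs(a.astype(np.int16) - b.astype(np.int16)).astype(np.float64)
    # Normalize to 0..1, apply the gamma curve, and scale back to 8-bit range
    out = (d / 255.0) ** (1.0 / gamma) * 255.0
    return out.astype(np.uint8)

# Two tiny stand-in "screenshots," identical except for one pixel
a = np.zeros((2, 2), dtype=np.uint8)
b = a.copy()
b[0, 0] = 64
print(image_diff(a, b))
```

Identical images diff to pure black; any pixel that differs shows up brighter than its raw difference thanks to the gamma boost.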
So, is that the end of the story? What happens if you benchmark the ATI card before and after toggling filtering modes on the command line? Is the Radeon X800 Pro doing less work at the game’s default settings? We ran some timedemos, using our own trusty “trdemo2” demo, and found this:
Uh oh. The ATI card produces the correct image after being told to do trilinear filtering in the game’s console, but that’s accompanied by a big performance drop. Could it be that ATI’s drivers are compromising image quality for performance? Is the console command somehow defeating this optimization? Or could it be that the console commands are overriding the adaptive trilinear filtering algorithm that ATI snuck into newer Radeon GPUs?
Before we dig into what’s happening and why, we should put this filtering problem into context by showing how it affects other textures that are less likely to show obvious seams between mip maps. Here’s a pretty good example of how much this problem affects most textures in the game.
You can see some minor aliasing in one of the texture seams at the top of our sample images, but that’s about it. There are more mathematical differences between the images that show up in the full “diff” image, but they’re not massive. Whatever’s going on here doesn’t harm image quality too greatly, but it does cause problems in certain cases.
Apparently, something is wrong with ATI’s drivers, because the Radeon X800 Pro isn’t producing the correct image in DOOM 3 at the game’s defaults. There might also be something fishy going on, because performance drops after toggling the filtering mode in the game console. We contacted ATI and asked for an explanation, unsure what to expect.
ATI’s answer, it turns out, was intriguing. The problem with image quality, they said, is due to a known bug in the Catalyst 4.8 and early 4.9 beta drivers. Unlike most games, they claimed, DOOM 3 doesn’t turn on anisotropic texture filtering globally. Instead, it sets filtering on a per-texture basis, requesting filtering only on those textures that require it. Other textures in the game store important data that the game uses to render scenes, but don’t show up on screen; these textures won’t benefit from filtering. What’s more, they said, DOOM 3 is rather unique among OpenGL applications in requesting filtering on a case-by-case basis instead of doing so globally.
As a result, DOOM 3’s behavior exposes a bug in ATI’s OpenGL driver. The driver isn’t reacting properly to DOOM 3’s requests. The driver sets a minification filter, but not a magnification filter, so full trilinear blending between mip maps isn’t applied. ATI pledged to get us a new build of the Catalyst 4.9 beta drivers that included a fix for the problem, and said the fix should have a negligible impact on performance.
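To see why a missing blend between mip levels produces a visible line, consider a toy model of mip selection. This is purely an illustration of the filtering math, not ATI’s driver code, and the level values are made up:

```python
import math

def sample_mip(lod, levels):
    """Return (trilinear, nearest-mip) samples for a fractional level-of-detail.

    'levels' holds one representative value per mip level (say, average
    brightness). Trilinear filtering blends the two nearest levels;
    nearest-mip selection snaps to whichever level is closer.
    """
    lo = int(math.floor(lod))
    hi = min(lo + 1, len(levels) - 1)
    frac = lod - lo
    trilinear = levels[lo] * (1 - frac) + levels[hi] * frac
    nearest = levels[lo] if frac < 0.5 else levels[hi]
    return trilinear, nearest

# Two mip levels with very different average values, as with a
# high-contrast grate texture receding away from the camera
levels = [1.0, 0.0]
for lod in [0.0, 0.25, 0.5, 0.75, 1.0]:
    tri, near = sample_mip(lod, levels)
    print(f"lod={lod:.2f}  trilinear={tri:.2f}  nearest-mip={near:.2f}")
```

The trilinear column changes smoothly as the level of detail increases, while the nearest-mip column jumps abruptly at the halfway point: that jump is the transition line on the floor.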
So far, so good. Why, then, does using the console to toggle between bilinear and trilinear modes appear to solve this problem?
ATI said the performance drop is a result of a quirk of DOOM 3’s. Using the console switch sets filtering globally, rather than on a texture-by-texture basis. Setting filtering globally, they said, has the same effect as forcing anisotropic filtering on in the ATI driver’s control panel. Texture filtering then affects textures that shouldn’t be filtered, like the game’s specular lookup table and cube map textures, slowing the game down without really benefiting image quality. (Incidentally, they noted, this same problem is one of the big reasons the DOOM 3 shader patch making its way around the web improves performance so dramatically in some cases; specifically, in those cases where users have forced anisotropic filtering on using the ATI control panel. By substituting math operations for the specular lookup map, the patch sidesteps the unnecessary filtering happening to the specular lookup texture.)
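The trade-off behind that shader patch is easy to sketch: a specular falloff stored in a lookup texture can instead be computed directly with math, eliminating the texture fetch and, with it, any filtering applied to that texture. Here’s a rough illustration of the idea in Python, not id’s actual shader code, and the exponent is an invented stand-in:

```python
# Build a small 1D "lookup texture" for a specular falloff, then show
# that direct math reproduces it without any texture fetch at all.
SPECULAR_EXPONENT = 16  # illustrative value, not DOOM 3's actual exponent

table = [(i / 255.0) ** SPECULAR_EXPONENT for i in range(256)]

def specular_from_table(n_dot_h):
    # Nearest-texel fetch from the lookup "texture"
    return table[min(255, int(n_dot_h * 255))]

def specular_from_math(n_dot_h):
    # Equivalent direct computation, as in the shader patch
    return n_dot_h ** SPECULAR_EXPONENT

print(specular_from_table(0.5), specular_from_math(0.5))
```

The two paths produce nearly identical results; the math version just never touches a texture, so a globally forced anisotropic filter has nothing extra to chew on.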
In other words, according to ATI, the big performance drop-off we’re seeing after the console commands is a quirk of the game, and doesn’t necessarily mean anything sinister is happening in ATI’s drivers.
So are they full of it?
The explanation sounded good, but we wanted to be sure ATI wasn’t trying to use its technical mojo to pull the wool over our eyes, so we had to come up with some means of verifying what they were saying. We decided on a few obvious ways of testing their claims.
- Test the new driver. Of course, we’ll have to see whether the new driver fixes the image quality problem in DOOM 3, and we’ll want to check and see whether the fix results in a big performance loss. Ideally, if what ATI is saying is true, our Radeon X800 Pro test card should produce the proper image quality and still perform about like it did with the earlier Catalyst 4.9 beta drivers.
- Test bilinear filtering. If DOOM 3’s console settings really do turn on anisotropic filtering globally, we’d expect performance in DOOM 3 to suffer somewhat even if we just issued the command to switch into bilinear filtering mode, because the switch to bilinear should invoke anisotropic filtering on the specular lookup and cube maps, as ATI claimed.
- Test control panel aniso. ATI claimed that setting anisotropic filtering plus trilinear filtering via DOOM 3’s console commands was functionally equivalent to forcing on global anisotropic filtering using the ATI control panel. We can test that claim straightforwardly by turning on 8X aniso in the control panel and running some timedemos.
- Test NVIDIA hardware. If the console commands in DOOM 3 force global anisotropic filtering, then we should see a performance drop after issuing those same commands on a GeForce 6800 GT, right? And what happens when we turn on 8X anisotropic filtering in the NVIDIA control panel? Is the effect the same as it is on ATI hardware? Also, how does image quality compare before and after the console command toggle on NVIDIA cards?
- Use counter-mojo. We asked an NVIDIA engineer to check and see whether DOOM 3’s console commands were requesting filtering for textures that don’t need it, like the specular lookup map. He said he’d look into it and get back to us.
Equipped with a broad set of options, we set out to put ATI’s explanations to the test.
Our testing methods
Both the ATI and NVIDIA cards were left at their driver default settings for image quality, with the exception that we turned off vertical refresh sync on all cards.
Our test system was configured like so:
|Processor||Athlon 64 3800+ 2.4GHz|
|System bus||HT 16-bit/800MHz downstream, HT 16-bit/800MHz upstream|
|North bridge||K8T800 Pro|
|Chipset drivers||4-in-1 v.4.51|
|Memory size||1GB (2 DIMMs)|
|Memory type||Corsair XMS3200LL DDR SDRAM at 400MHz|
|RAS to CAS delay||3|
|Hard drive||Seagate Barracuda V ATA/100 120GB|
|Graphics||Radeon X800 Pro 256MB AGP, GeForce 6800 GT 256MB AGP|
|OS||Microsoft Windows XP Professional|
|OS updates||Service Pack 2 RC2, DirectX 9.0c|
We used NVIDIA’s ForceWare 61.77 drivers with the GeForce card, and we used ATI’s CATALYST drivers as noted with the Radeon card.
The test systems’ Windows desktop was set at 1152×864 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
If you have questions about our methods, hit our forums to talk with us about them.
Image quality on the GeForce 6800 GT
Let’s start by looking at image quality on NVIDIA’s GeForce 6800 GT. We’ll see how it looks at the game defaults, then issue the console commands to switch to bilinear and back to trilinear, then see if anything has changed.
To the naked eye, there’s really no difference between these images. You can’t see much of anything in the “diff” image, either. However, there are subtle mathematical differences between the images, as cranking the gamma way up on the diff image reveals. Taken alone, these results are ambiguous, but we’ll want to keep them in mind as we collect more info.
Image quality on the Radeon X800 Pro with 8.051 drivers
Next, we’ll take a look at image quality in DOOM 3 with the updated Catalyst 4.9 beta drivers (revision 8.051) that ATI supplied to us.
Clearly, ATI’s new drivers have fixed the image quality problem we noticed before, eliminating the obvious mip map banding present in the 4.8 and early 4.9 beta drivers.
After the console toggle, there aren’t any obvious differences in the picture. However, the “diff” once more shows some slight mathematical differences between the images, as we saw on the GeForce 6800 GT. Could it be that the purported switch to global anisotropic filtering is responsible for these differences?
Now for the moment of truth. Here are the results of our expanded benchmarks.
Let’s break things down bit by bit to see what we’ve learned.
First, the new 8.051 driver that alleviates the image quality problem in DOOM 3 doesn’t seem to harm performance at the game’s default settings, which is good news for ATI. Yes, the score does drop by one frame per second, probably because that magnification filter is doing its thing, but that’s it. These results seem to jibe with what ATI was telling us.
Next, look at the scores on both ATI and NVIDIA hardware for the game defaults, where trilinear filtering is at work, and then after we’ve issued the command for bilinear filtering from the game console. Performance drops with only bilinear filtering enabled, lending credence to the theory that DOOM 3 is turning on global anisotropic filtering whenever its console commands for filtering are used.
Also, notice how turning on 8X anisotropic filtering via the control panel has the exact same effect as issuing the DOOM 3 console command for trilinear filtering. This is true with both ATI driver revisions and on the GeForce 6800 GT.
Not long after we’d gotten these results, our NVIDIA contact phoned in with confirmation that DOOM 3’s console commands do indeed request anisotropic filtering for all textures, including the specular lookup map. An OpenGL trace program had confirmed this behavior.
So all of our tests and our counter-mojo line up to support ATI’s explanation for the DOOM 3 image quality problems and performance anomalies we’ve seen. ATI does have a bug in its Catalyst 4.8 and 4.9 beta drivers currently available to the public, and that bug affects image quality in DOOM 3. However, ATI is not trading off image quality for performance; this is an honest-to-goodness software glitch, and nothing more.
The bug also seems to be squashed in later builds of the Catalyst 4.9 driver. I’d like to see ATI update its hotfix for DOOM 3 with a newer version of the 4.9 beta, so that Radeon owners can enjoy DOOM 3 without the image quality problems we’ve noted. I’d also like to see DOOM 3 patched, if possible, to tolerate in-game filtering switches and control panel-based anisotropic filtering without suffering a big performance drop. Also, anyone using DOOM 3 for video card testing will want to keep in mind that the game requests filtering differently when in-game console commands are used.
The future: ATI and app-specific optimizations
Incidentally, we learned something else during the course of our research for this article. Although we didn’t find any application-specific optimizations in ATI’s current drivers, sources familiar with ATI’s plans indicated to us that ATI will be moving in the direction of application detection and optimization in its future driver revisions, a big departure from the company’s current policy. So, if in the future DOOM 3 is making a request for filtering on textures that ought not to be filtered, ATI’s drivers might not turn on filtering for those textures. One of the goals of ATI’s policy shift will be addressing just this sort of problem.
ATI’s decision to use app detection was also apparently influenced by its use of adaptive filtering algorithms. After the world learned of ATI’s adaptive trilinear filtering algorithm used in the Radeon 9600 Pro and newer GPUs, the company challenged people to point out obvious image quality problems caused by this algorithm. Some folks apparently found some cases where ATI’s filtering isn’t as good as “full” trilinear filtering, so ATI will use application detection to address those problems on a case-by-case basis.
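ATI has never published the details of its adaptive algorithm, but the general shape of this kind of adaptive (sometimes called “brilinear”) filtering is well understood: blend between mip levels only within a narrow window around the transition point, and use a single level alone elsewhere, saving texture bandwidth. Here’s a purely illustrative sketch of that remapping, not ATI’s implementation:

```python
def blend_weight(frac, window=0.25):
    """Blend weight between two adjacent mip levels.

    Full trilinear filtering would use 'frac' directly. The adaptive
    variant remaps it so that only a narrow 'window' around the mip
    transition (frac == 0.5) is actually blended; outside the window,
    the nearer mip level is sampled alone.
    """
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if frac <= lo:
        return 0.0   # use the lower mip level only
    if frac >= hi:
        return 1.0   # use the higher mip level only
    return (frac - lo) / (hi - lo)

for f in [0.1, 0.45, 0.5, 0.55, 0.9]:
    print(f"frac={f:.2f}  adaptive weight={blend_weight(f):.2f}")
```

On most textures the result is visually indistinguishable from full trilinear, but on pathological high-contrast cases the narrower blend window can let mip transitions peek through, which is exactly the sort of case ATI would address with an app-specific tweak.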
Of course, the underlying reason to do app detection encompasses these sorts of workarounds and special-case fixes, but it’s also something more than that. Application-specific code in video drivers can be used to achieve better performance, even in cases where the application and the GPU aren’t doing anything unexpected or problematic. NVIDIA explicitly embraced this approach some time ago, adopting some broad guidelines intended to prevent the worst sorts of abuses of this practice.
We understand ATI will be using a “controlled” application detection method that should be a little more open than NVIDIA’s, because users will have the option of disabling app-specific optimizations via the ATI control panel, should they wish to do so. App detection won’t be a part of the upcoming Catalyst 4.9 drivers slated for release in September, but it will be included in the following version, Catalyst 4.10.