A problem with ATI’s drivers in DOOM 3?

RECENTLY, WE RECEIVED a tip claiming that ATI’s drivers exhibited some odd behavior in DOOM 3 and that the company might be compromising image quality for performance. What’s more, this problem seemed to be specific to DOOM 3, raising the possibility that ATI was engaging in an application-specific optimization—a practice the firm has forsworn.

Naturally, we were interested in the possibility of uncovering such a scandalous thing. We’ve dug into such juicy stories in the past, and the results have sometimes been rather enlightening. So we fired up our test rigs, busted out our sleuthing skills, and set out to see what exactly was happening with ATI’s drivers in DOOM 3. Read on to see what we found.

The problem
The problem with ATI’s drivers in DOOM 3 has to do with texture filtering, and it is more visible on some textures than others. Textures with high-contrast patterns on them, like the metal grates in the game’s Mars base, tend to show the problem most vividly. There is a clearly visible transition between mip map levels, as if trilinear filtering were not happening as it should. Here’s an example.


Radeon X800 Pro with Catalyst 4.9 beta – Game defaults
(Click for full-screen lossless PNG version)

This is in DOOM 3’s High Quality mode, where 8X anisotropic filtering and trilinear filtering are both supposed to be active. However, as you can see, there’s a mip map transition line running across the middle of the grate on the floor in our example screenshot. This problem is visible throughout the game whenever a similar texture is used on the floor. The screenshot shows the problem, but it’s more obvious in motion. Once you’ve noticed it, it’s rather distracting, like the transition lines were with bilinear-only filtering on Voodoo cards back in the Quake 2 days. Having those things track on the floor out in front of me as I moved was annoying.

We used the Catalyst 4.9 beta drivers that ATI released specifically for DOOM 3 to take this screenshot, but the exact same problem is visible with ATI’s latest official drivers, Catalyst 4.8. (By the way, all the images from DOOM 3 in this article have been brightened a bit by applying a gamma adjustment of 1.4 in Paint Shop Pro. DOOM 3’s own gamma and brightness settings don’t affect screenshot output, and the game’s output is a little too dark by default for our purposes.)
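For the curious, the brightening amounts to running each 8-bit color channel through a standard gamma curve. A minimal sketch of the operation (our actual adjustment was done in Paint Shop Pro, not with this code):

```python
def apply_gamma(value, gamma=1.4):
    """Brighten one 8-bit channel value with a gamma curve.

    A gamma above 1.0 lifts the midtones while leaving pure black (0)
    and pure white (255) untouched; 1.4 matches the adjustment applied
    to the screenshots in this article.
    """
    return round(255 * (value / 255) ** (1.0 / gamma))
```

Applied per pixel and per channel, this lifts a midtone like 128 up to roughly 156 while the extremes stay put.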

One way around this problem is to use DOOM 3’s console commands to flip out of trilinear filtering and then back into it. The command to turn off trilinear and just run bilinear filtering in DOOM 3 is similar to what you’d use in old Quake engine games, but slightly different:

image_filter GL_LINEAR_MIPMAP_NEAREST

That will put the renderer into bilinear filtering (and shouldn’t affect anisotropic filtering). Then you can flip back into trilinear mode with this command:

image_filter GL_LINEAR_MIPMAP_LINEAR
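The two names come straight from OpenGL’s texture filter enums: the MIPMAP_NEAREST variant filters bilinearly within a single mip level but snaps to whichever level is closest (hence the visible seam where the level changes), while MIPMAP_LINEAR also blends linearly between the two adjacent levels. A simplified one-dimensional sketch of the difference, representing each mip level by a single value (which glosses over the in-level bilinear part):

```python
def mip_sample(levels, lod, trilinear):
    """Sample a mip chain at a fractional level of detail (lod).

    `levels` holds one representative value per mip level. With
    trilinear=False (GL_LINEAR_MIPMAP_NEAREST) we snap to the nearest
    level, which jumps abruptly as lod crosses a half-level boundary;
    with trilinear=True (GL_LINEAR_MIPMAP_LINEAR) we blend the two
    adjacent levels, so the result varies smoothly with lod.
    """
    lo = min(int(lod), len(levels) - 1)
    hi = min(lo + 1, len(levels) - 1)
    frac = lod - lo
    if not trilinear:
        return levels[lo] if frac < 0.5 else levels[hi]
    return levels[lo] * (1.0 - frac) + levels[hi] * frac
```

With levels [0.0, 1.0], sampling at lod 0.49 versus 0.51 jumps from 0.0 straight to 1.0 in nearest mode (that jump is the transition line on the floor), while trilinear mode returns 0.49 and 0.51.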

Once you’ve done that, the transition line on the floor magically vanishes:


Radeon X800 Pro with Catalyst 4.9 beta – After trilinear console toggle
(Click for full-screen lossless PNG version)

There’s no need to issue a “vid_restart” command to reset the rendering engine. In fact, if you do that, the transition line comes back again.

For those of you having trouble seeing the difference, here’s the output from a mathematical “diff” operation between the two images:


Radeon X800 Pro with Catalyst 4.9 beta – Diff between default and post-toggle output
(Click for full-screen lossless PNG version)
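There’s nothing exotic about the “diff” operation: it’s simply the per-pixel, per-channel absolute difference between the two screenshots, so regions where the images agree come out black. A minimal sketch, with images represented as nested lists of RGB tuples:

```python
def image_diff(img_a, img_b):
    """Per-pixel, per-channel absolute difference of two same-sized
    images (rows of (r, g, b) tuples). Identical pixels yield (0, 0, 0),
    which renders as black in the output image."""
    return [
        [tuple(abs(ca - cb) for ca, cb in zip(pa, pb))
         for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]
```

This is also why cranking the gamma way up on a diff image is sometimes necessary: small nonzero differences are nearly indistinguishable from black at normal brightness.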

So, is that the end of the story? What happens if you benchmark the ATI card before and after toggling filtering modes on the command line? Is the Radeon X800 Pro doing less work at the game’s default settings? We ran some timedemos, using our own trusty “trdemo2” demo, and found this:

Uh oh. The ATI card produces the correct image after being told to do trilinear filtering in the game’s console, but that’s accompanied by a big performance drop. Could it be that ATI’s drivers are compromising image quality for performance? Is the console command somehow defeating this optimization? Or could it be that the console commands are overriding the adaptive trilinear filtering algorithm that ATI snuck into newer Radeon GPUs?

 

Another example
Before we dig into what’s happening and why, we should put this filtering problem into context by showing how it affects other textures that are less likely to show obvious seams between mip maps. Here’s a pretty good example of how much this problem affects most textures in the game.


Radeon X800 Pro with Catalyst 4.9 beta – Game defaults
(Click for full-screen lossless PNG version)


Radeon X800 Pro with Catalyst 4.9 beta – After trilinear console toggle
(Click for full-screen lossless PNG version)


Radeon X800 Pro with Catalyst 4.9 beta – Diff between default and post-toggle output
(Click for full-screen lossless PNG version)

You can see some minor aliasing in one of the texture seams at the top of our sample images, but that’s about it. There are more mathematical differences between the images that show up in the full “diff” image, but they’re not massive. Whatever’s going on here doesn’t harm image quality too greatly, but it does cause problems in certain cases.

 

ATI’s explanation
Apparently, something is wrong with ATI’s drivers, because the Radeon X800 Pro isn’t producing the correct image in DOOM 3 at the game’s defaults. There might also be something fishy going on, because performance drops after toggling the filtering mode in the game console. We contacted ATI and asked for an explanation, unsure what to expect.

ATI’s answer, it turns out, was intriguing. The problem with image quality, they said, is due to a known bug in the Catalyst 4.8 and early 4.9 beta drivers. Unlike most games, they claimed, DOOM 3 doesn’t turn on anisotropic texture filtering globally. Instead, it sets filtering on a per-texture basis, requesting filtering only on those textures that require it. Other textures in the game store important data that the game uses to render scenes, but they never show up on screen directly, so they won’t benefit from filtering. What’s more, they said, DOOM 3 is nearly unique among OpenGL applications in requesting filtering on a case-by-case basis instead of doing so globally.

As a result, DOOM 3’s behavior exposes a bug in ATI’s OpenGL driver. The driver isn’t reacting properly to DOOM 3’s requests. The driver sets a minification filter, but not a magnification filter—so full trilinear blending between mip maps isn’t applied. ATI pledged to get us a new build of the Catalyst 4.9 beta drivers that included a fix for the problem, and said the fix should have a negligible impact on performance.
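ATI’s driver source obviously isn’t public, so we can only model the shape of the bug they described with a toy per-texture state table: the application’s request for one of the two filters is honored while the other is silently dropped, so the combination needed for full trilinear blending never takes effect. Everything below is illustrative, not actual driver code:

```python
GL_LINEAR = "GL_LINEAR"
GL_LINEAR_MIPMAP_LINEAR = "GL_LINEAR_MIPMAP_LINEAR"

class ToyDriver:
    """Toy model of a driver tracking filter state per texture."""

    def __init__(self, drops_mag_filter=False):
        # drops_mag_filter models the hypothetical bug ATI described:
        # the magnification-filter request never takes effect.
        self.drops_mag_filter = drops_mag_filter
        self.state = {}

    def tex_parameter(self, texture, pname, value):
        """Record a per-texture filter request, the kind an app like
        DOOM 3 would issue for each texture it wants filtered."""
        if self.drops_mag_filter and pname == "MAG_FILTER":
            return  # buggy path: the request is silently ignored
        self.state.setdefault(texture, {})[pname] = value

    def full_trilinear(self, texture):
        """True only when both filters are set the way full trilinear
        filtering requires."""
        s = self.state.get(texture, {})
        return (s.get("MIN_FILTER") == GL_LINEAR_MIPMAP_LINEAR
                and s.get("MAG_FILTER") == GL_LINEAR)
```

Feed identical requests to a fixed and a buggy driver, and only the fixed one ends up with full trilinear filtering active, which is consistent with the mip map seams we saw at the game’s defaults.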

So far, so good. Why, then, does using the console to toggle between bilinear and trilinear modes appear to solve this problem?

ATI said the performance drop is a result of a quirk of DOOM 3’s. Using the console switch sets filtering globally, rather than on a texture-by-texture basis. Setting filtering globally, they said, has the same effect as forcing anisotropic filtering on in the ATI driver’s control panel. Texture filtering then affects textures that shouldn’t be filtered, like the game’s specular lookup table and cube map textures, slowing the game down without really benefiting image quality. (Incidentally, they noted, this same problem is one of the big reasons the DOOM 3 shader patch making its way around the web improves performance so dramatically in some cases—specifically, in those cases where users have forced anisotropic filtering on using the ATI control panel. By substituting math operations for using the specular lookup map, the patch sidesteps the unnecessary filtering happening to the specular lookup texture.)
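The tradeoff the shader patch exploits is easy to sketch in miniature: a specular falloff term like pow(N·H, k) can come either from a precomputed texture used as a lookup table (which then gets caught by globally forced filtering) or from arithmetic in the shader itself. The table size and exponent below are arbitrary illustrations, not DOOM 3’s actual values:

```python
SPECULAR_EXPONENT = 16   # illustrative exponent, not DOOM 3's actual value
TABLE_SIZE = 256         # illustrative table resolution

# A 1-D lookup table standing in for the game's specular lookup texture.
specular_table = [(i / (TABLE_SIZE - 1)) ** SPECULAR_EXPONENT
                  for i in range(TABLE_SIZE)]

def specular_from_table(n_dot_h):
    """Fetch pow(n_dot_h, k) from the table, nearest-entry, the way a
    texture fetch would (ignoring any filtering of the texture)."""
    index = round(n_dot_h * (TABLE_SIZE - 1))
    return specular_table[index]

def specular_from_math(n_dot_h):
    """Compute the same term directly, as the shader patch does."""
    return n_dot_h ** SPECULAR_EXPONENT
```

The two agree to within the table’s quantization error, so swapping math for the lookup changes the image very little while sidestepping any filtering work done on the lookup texture.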

In other words, according to ATI, the big performance drop-off we’re seeing after the console commands is a quirk of the game, and doesn’t necessarily mean anything sinister is happening in ATI’s drivers.

So are they full of it?
The explanation sounded good, but we wanted to be sure ATI wasn’t trying to use its technical mojo to pull the wool over our eyes, so we had to come up with some means of verifying what they were saying. We decided on a few obvious ways of testing their claims.

  • Test the new driver — Of course, we’ll have to see whether the new driver fixes the image quality problem in DOOM 3, and we’ll want to check and see whether the fix results in a big performance loss. Ideally, if what ATI is saying is true, our Radeon X800 Pro test card should produce the proper image quality and still perform about like it did with the earlier Catalyst 4.9 beta drivers.
  • Test bilinear filtering — If DOOM 3’s console settings really do turn on anisotropic filtering globally, we’d expect performance in DOOM 3 to suffer somewhat even if we just issued the command to switch into bilinear filtering mode, because the switch to bilinear should invoke anisotropic filtering on the specular lookup and cube maps, as ATI claimed.
  • Test control panel aniso — ATI claimed that setting anisotropic plus trilinear filtering via DOOM 3’s console commands was functionally equivalent to forcing on global anisotropic filtering using the ATI control panel. We can test that claim straightforwardly by turning on 8X aniso in the control panel and running some timedemos.
  • Test NVIDIA hardware — If the console commands in DOOM 3 force global anisotropic filtering, then we should see a performance drop after issuing those same commands on a GeForce 6800 GT, right? And what happens when we turn on 8X anisotropic filtering in the NVIDIA control panel? Is the effect the same as it is on ATI hardware? Also, how does the console command toggle affect image quality on NVIDIA cards?
  • Use counter-mojo — We asked an NVIDIA engineer to check and see whether DOOM 3’s console commands were requesting filtering for textures that don’t need it, like the specular lookup map. He said he’d look into it and get back to us.

Equipped with a broad set of options, we set out to put ATI’s explanations to the test.

 

Our testing methods
Both the ATI and NVIDIA cards were left at their driver default settings for image quality, with the exception that we turned off vertical refresh sync on all cards.

Our test system was configured like so:

Processor Athlon 64 3800+ 2.4GHz
System bus HT 16-bit/800MHz downstream
HT 16-bit/800MHz upstream
Motherboard Asus A8V
BIOS revision 1006
North bridge K8T800 Pro
South bridge VT8237
Chipset drivers 4-in-1 v.4.51
ATA 5.1.2600.220
Memory size 1GB (2 DIMMs)
Memory type Corsair XMS3200LL DDR SDRAM at 400MHz
CAS latency 2
Cycle time 6
RAS to CAS delay 3
RAS precharge 2
Hard drive Seagate Barracuda V ATA/100 120GB
Audio Integrated
Graphics Radeon X800 Pro 256MB AGP
GeForce 6800 GT 256MB AGP
OS Microsoft Windows XP Professional
OS updates Service Pack 2 RC2, DirectX 9.0c

We used NVIDIA’s ForceWare 61.77 drivers with the GeForce card, and we used ATI’s Catalyst drivers as noted with the Radeon card.

The test systems’ Windows desktop was set at 1152×864 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

If you have questions about our methods, hit our forums to talk with us about them.

 

Image quality on the GeForce 6800 GT
Let’s start by looking at image quality on NVIDIA’s GeForce 6800 GT. We’ll see how it looks at the game defaults, then issue the console commands to switch to bilinear and back to trilinear, then see if anything has changed.


GeForce 6800 GT with ForceWare 61.77 – Game defaults
(Click for full-screen lossless PNG version)


GeForce 6800 GT with ForceWare 61.77 – After trilinear console toggle
(Click for full-screen lossless PNG version)


GeForce 6800 GT with ForceWare 61.77 – Diff between default and post-toggle output
(Click for full-screen lossless PNG version)

To the naked eye, there’s really no difference between these images. You can’t see much of anything in the “diff” image, either. However, there are subtle mathematical differences between the images, as cranking the gamma way up on the diff image reveals. Taken alone, these results are ambiguous, but we’ll want to keep them in mind as we collect more info.

 

Image quality on the Radeon X800 Pro with 8.051 drivers
Next, we’ll take a look at image quality in DOOM 3 with the updated Catalyst 4.9 beta drivers, revision 8.051, that ATI supplied to us.


Radeon X800 Pro with 8.051 beta – Game defaults
(Click for full-screen lossless PNG version)

Clearly, ATI’s new drivers have fixed the image quality problem we noticed before, eliminating the obvious mip map banding present in the 4.8 and early 4.9 beta drivers.


Radeon X800 Pro with 8.051 beta – After trilinear console toggle
(Click for full-screen lossless PNG version)


Radeon X800 Pro with 8.051 beta – Diff between default and post-toggle output
(Click for full-screen lossless PNG version)

After the console toggle, there aren’t any obvious differences in the picture. However, the “diff” once more shows some slight mathematical differences between the images, as we saw on the GeForce 6800 GT. Could it be that the purported switch to global anisotropic filtering is responsible for these differences?

 

Benchmark results
Now for the moment of truth. Here are the results of our expanded benchmarks.

Let’s break things down bit by bit to see what we’ve learned.

First, the new 8.051 driver that alleviates the image quality problem in DOOM 3 doesn’t seem to harm performance at the game’s default settings, which is good news for ATI. Yes, the score does drop by one frame per second, probably because that magnification filter is doing its thing, but that’s it. These results seem to jibe with what ATI was telling us.

Next, look at the scores on both ATI and NVIDIA hardware for the game defaults, where trilinear filtering is at work, and then after we’ve issued the command for bilinear filtering from the game console. Performance drops with only bilinear filtering enabled, lending credence to the theory that DOOM 3 is turning on global anisotropic filtering whenever its console commands for filtering are used.

Also, notice how turning on 8X anisotropic filtering via the control panel has the exact same effect as issuing the DOOM 3 console command for trilinear filtering. This is true with both ATI driver revisions and on the GeForce 6800 GT.

Not long after we’d gotten these results, our NVIDIA contact phoned in with confirmation that DOOM 3’s console commands do indeed request anisotropic filtering for all textures, including the specular lookup map. An OpenGL trace program had confirmed this behavior.

 
Conclusions
So all of our tests and our counter-mojo line up to support ATI’s explanation for the DOOM 3 image quality problems and performance anomalies we’ve seen. ATI does have a bug in its Catalyst 4.8 and 4.9 beta drivers currently available to the public, and that bug affects image quality in DOOM 3. However, ATI is not trading off image quality for performance; this is an honest-to-goodness software glitch, and nothing more.

The bug also seems to be squashed in later builds of the Catalyst 4.9 driver. I’d like to see ATI update its hotfix for DOOM 3 with a newer version of the 4.9 beta, so that Radeon owners can enjoy DOOM 3 without the image quality problems we’ve noted. I’d also like to see DOOM 3 patched, if possible, to tolerate in-game filtering switches and control panel-based anisotropic filtering without suffering a big performance drop. Also, anyone using DOOM 3 for video card testing will want to keep in mind that the game requests filtering differently when in-game console commands are used.

The future: ATI and app-specific optimizations
Incidentally, we learned something else during the course of our research for this article. Although we didn’t find any application-specific optimizations in ATI’s current drivers, sources familiar with ATI’s plans indicated to us that ATI will be moving in the direction of application detection and optimization in its future driver revisions—a big departure from the company’s current policy. So, if in the future DOOM 3 is making a request for filtering on textures that ought not to be filtered, ATI’s drivers might not turn on filtering for those textures. One of the goals of ATI’s policy shift will be addressing just this sort of problem.

ATI’s decision to use app detection was also apparently influenced by its use of adaptive filtering algorithms. After the world learned of ATI’s adaptive trilinear filtering algorithm used in the Radeon 9600 Pro and newer GPUs, the company challenged people to point out obvious image quality problems caused by this algorithm. Some folks apparently found some cases where ATI’s filtering isn’t as good as “full” trilinear filtering, so ATI will use application detection to address those problems on a case-by-case basis.

Of course, the underlying reason to do app detection encompasses these sorts of workarounds and special-case fixes, but it’s also something more than that. Application-specific code in video drivers can be used to achieve better performance, even in cases where the application and the GPU aren’t doing anything unexpected or problematic. NVIDIA explicitly embraced this approach some time ago, adopting some broadly permissive guidelines to prevent the worst sorts of abuses of this practice.

We understand ATI will be using a “controlled” application detection method that should be a little more open than NVIDIA’s, because users will have the option of disabling app-specific optimizations via the ATI control panel, should they wish to do so. App detection won’t be a part of the upcoming Catalyst 4.9 drivers slated for release in September, but it will be included in the following version, Catalyst 4.10. 

Comments closed
    • dactyl
    • 15 years ago

    horrors. ati is trying to sell video cards

    • Engell
    • 15 years ago

    This is all fine and dandy, but I get a different error, the infamous “VPU Recover” error and mostly get it while playing Doom 3. Many people I hear get this error. Anyone know anything about it?

      • indeego
      • 15 years ago

      Got it once while playing the Hell level. I bumped down detail (it was dragging anyway) and it didn’t happen again, even after bumping it back up once I was past that level. I recommend not O/C’ing box or card, checking chipset drivers, checking graphic drivers, the usual <.<

        • ExpansionSSS
        • 15 years ago

        so what you’re basically saying is that you don’t like the drivers because they save you from burning the VPU when you stuff your computer in the center of an office desk with no ventilation, in a room that’s 90 degrees, with your vid card OC’d by 40%?

          • indeego
          • 15 years ago

          Yes… uh…. that is exactly what I was saying….uh…(/me shuffles out of room quickly with nervous glance <.<)

      • NeXus 6
      • 15 years ago

      I turn that thing off whenever I install a new set of drivers. It’s not necessary and can cause problems:

      “VPU Recover attempts to stop any sort of graphics-related crashes or freezes from completely locking up or resetting your machine. While this is a valiant idea, in practice VPU Recover has been known to increase the probability of problems and crashes itself when enabled, and more importantly using VPU Recover regularly is a sign that something is wrong with your system and you need to find the source of the problem. I recommend that you untick (disable) VPU Recover to increase system stability, and then seek out and remedy the source of any problems by using the tips in the rest of this guide, such as those in the Troubleshooting section below.”

      Go here to tweak your ATi video card settings for best performance:

      http://www.tweakguides.com/ATICAT_1.html

    • daniel4
    • 15 years ago

    Scott, if you had used the Humus tweak you wouldn’t have noticed such a huge drop when full trilinear was enabled. You might have even noticed a performance increase over the game default. The way you enable full trilinear is similar to just forcing trilinear through the CP. Maybe you should try some benchmarks using it for reference.

      • Damage
      • 15 years ago

      I mentioned that tweak in the article. Didja read it? 🙂

        • daniel4
        • 15 years ago

        I skimmed through it, but I’ll read it now :(.

    • Usacomp2k3
    • 15 years ago

    hey damage…what are the odds of some mouseover images in articles like this in the future..makes it easy for guys like me to see differences

      • blitzy
      • 15 years ago

      I second that motion!

    • Pete
    • 15 years ago

    Good investigating, Scott. So I guess the 4.10s, being the first with app detection, will also be the first with ATi’s new GUI and ability for app-specific game settings?

    The slight diffs you notice b/w the game and CP tri settings in the GT and second set of Pro shots might be because of filtering applied to the light maps, no?

    The next step is to figure out why in heck the GT is twice as fast as the X800P. It’s still a mystery to me, even with all the clues I’ve received (3DC’s article saying ATi’s Z-rejection isn’t optimal for D3, and knowing that the GT can do double Z-ops per clock than the Pro [tho both should do the same with AA on]).

      • dragonsprayer
      • 15 years ago

      look at the new halflife numbers (forget where i saw them) but they’re almost the exact reverse: ati blows away the 6800. it seems obvious that doom3 is optimized for nvidia and halflife2 for ati —- doom3 just came out first.

      • Damage
      • 15 years ago

      If you mean filtering applied to the specular lookup and cube maps, yes. I don’t believe D3 uses traditional light maps.

        • Pete
        • 15 years ago

        Right, I was too lazy to verify the names of the maps in question.

    • dragonsprayer
    • 15 years ago

    these tests are for the top ati card. for us poor folks w/ lower quality cards (i have a 9600xt), we have other problems with this game — freezing and suddenly closing are the 2 main problems. I seem to have fixed this by increasing my virtual memory to max, splitting it between 2 drives and setting min and max equal. I am running a 2.4c in an hp machine w/ 1 gig ram, 8 fans (temps range from 90-122 degrees f). I have my virtual memory at 800/800 on 2 drives for a total of 1600 megs. increasing the memory on one drive didn’t work, so if you don’t have 2 hard drives good luck. Also, i have windows on one drive and doom3 on the other. I should add that i can also play at high graphics level — before, med cut out

    • indeego
    • 15 years ago

    Already finished the game, it has zero replayability, guess my eyes (and brain) were being tortured throughout due to the bad image quality. On to the mods

      • Autonomous Gerbil
      • 15 years ago

      I played a few minutes at a friend’s house and wasn’t impressed enough to continue playing it. I’m not sure why people think this game is scary. It’s “scary” the whole time, which makes it kind of normal after a while. Oh well, even if it is boring, at least a whole new generation of games will be coming along behind it.

    • vortigern_red
    • 15 years ago

    Another thing I should point out is that many people play with the control panel AF anyway, as it is set for all their games, and so did not notice the IQ problems but, of course, were getting slower frame rates than they could have got if they used in-game AF (but with IQ problems)

    In fact Humus himself was using CP AF and that is why he noticed such a big performance increase with his “tweak”.

    I also wonder how no web reviews picked up on this in the raft of Doom3 tests we had at launch? Were they using CP AF (H was not, I believe) or were all the “IQ is the same on ATI and NV” comments just wrong!

      • blitzy
      • 15 years ago

      you’d need a pretty sharp eye to spot those kind of differences IMO…. extremely subtle

      even with the diff op theres not a lot to see

        • vortigern_red
        • 15 years ago

        That is in a screen shot. All these filtering problems are far more visible in motion.


          • hardwarenewbie
          • 15 years ago

          TechReport didn’t catch this problem by themselves. They received a tip from someone, so apparently TR missed it as well. How many of us would pay that much attention to the floor while playing the game, so we can’t exactly blame them.

            • rxc6
            • 15 years ago

            and you have to factor that the gamma was pumped up so it could be noticed easier… I wouldn’t blame anybody for missing this bug, heck you can barely see a damn thing in the freaking game!!!!!!

            • atryus28
            • 15 years ago

            I did see this “bug” but I had other problems then, so I just didn’t look at the floor anymore. I noticed it when I was checking out all the detail in the game. But yes, it is only noticeable in the lighter parts of the game, which end quickly.

    • atryus28
    • 15 years ago

    I was getting worried there for a bit. What’s with them and not knowing how to count with their drivers? Why is it 4.10 and not 5.0? Gaw!!! Flippin Idiots!!

      • vortigern_red
      • 15 years ago

      It’s a software version number, not maths. 🙂 5.0 will be released in Jan 2005.

      Interestingly I’m typing this now in Firefox 0.9.2!

        • atryus28
        • 15 years ago

        [sarcasm]Oh I must have missed where because it’s software it no longer has to conform to our current number system. [/sarcasm]

        If you paid any attention, I too am using firefox, version 0.9.3; however, you will notice there is still a second decimal point after the nine. 4.10 and 4.1 are the same thing. They should call them 4.9.1 or maybe 4.1.1 at least.

          • barich
          • 15 years ago

          No, they’re not the same thing. The first number is the year, the second is the release. So 4.10 is the tenth release of 2004. It may not be the usual way, but it’s perfectly easy to understand.

          • vortigern_red
          • 15 years ago

          Where have you got the idea that the “.” is a decimal point? It is just a separator, as with my example of Firefox. Which you obviously missed.

            • indeego
            • 15 years ago

            Man you people will find ANYTHING to argue about. and don’t get me wrong: I applaud you on this <.<

            • vortigern_red
            • 15 years ago

            LOL.

            I’m thinking this one will become an annual event, it happened last year and now this year. When my grand kids are having the same argument in 25 years time regarding the Cat 29.10, I will be able to proudly say I was there at the beginning! 🙂

            • ExpansionSSS
            • 15 years ago

            These arguments are no different from the SEGA / Nintendo arguments of childhood…

            Sega had the blood, SNES had better graphics. Then Mortal Kombat 2 came out, and Sega lost.

            p.s.

            I’m on high speed so I don’t care…but I think they need to get more manageable on these driver releases… ( IE splitting Catalysts into older / newer categories, so I’m not downloading drivers packed with functionality for a freaking 7500 )

            • Entroper
            • 15 years ago

            Yeah, but SNES had enough buttons for the game, too! 😛

            Ah, those were the days… or were they?

            • adisor19
            • 15 years ago

            Yo man, i have that 7500 ! Actually, it’s a 7200.. well, it is an original Radeon 64Meg DDR VIVO that still chugs along and still needs drivers 😀

            So YEAH, keep em drivers coming..

            Adi

            • dragonsprayer
            • 15 years ago

            i have a second computer (see post 28) w/ a radeon 7500 w/ celeron 2.6, 515k, 2700 — lots fans — does doom work? i havent tried to load it. think it’ll work here? maybe i crank up the virtual memory to 1200/1200? see my post

            • ripfire
            • 15 years ago

            as long as they don’t start arguments about the year starting with 0 or 1, it’s all good.

            (btw, it started with year 1. 😛 )

    • Dposcorp
    • 15 years ago

    Outstanding job Damage. All that hard work is the reason this is my main stop for info.

    A lot of sites just do “reviews & previews.”


      • MorgZ
      • 15 years ago

      agreed that it is a good investigatory piece of writing.

      However, i don’t think you would be so forgiving if this was nVidia with the *suspect* optimisations.

      Hell the X800 Pro performs awfully with doom3 with or without its special optimisations.

        • indeego
        • 15 years ago

        It performs “fine.” Awful would be a much more dramatic reduction in frames. ATI needs better OpenGL/driver teams, IMO.

        • Dposcorp
        • 15 years ago

        MorgZ said: “agreed that it is a good investigatory piece of writing. However, i dont think you would be soo forgiving if this was nVidia with the *suspect* optimisations. Hell the X800 Pro performs awfully with doom3 with or without its special optimisations.”

        You mean me personally? Sure I would; to err is human, and to forgive is divine. I am an Nvidia fan too, as competition is good. I am basically a fan of whoever gives me the most for my money.

          • Autonomous Gerbil
          • 15 years ago

          Since this is only a bug and not a deliberate optimization, I see no need for anyone to be either forgiving or upset – at least not any more than usual for a driver problem. Since it’s only a filtering bug that hurts image quality a little bit, I’m content to just shrug my shoulders and move on. Since this is the first time ATI has needed to update their OGL drivers in a long time, I’ll be more interested in seeing how much they’re able to improve them over the next few updates as we start to think of new games using the Doom 3 engine.

          • MorgZ
          • 15 years ago

          ok fair play.

          Didnt necessarily mean you personally pal. Just i get a bit irritated by people who are ATI or nVidia fans as opposed 2 the sensible purchaser who simply buys best value for money.

          When i referred to the performance being awful i was referring to the X800 Pro vs 6800 GT @ doom3. For equiv priced gfx cards the Pro is significantly slower, especially if the hl2 benchmarks (on the cs source beta) seem to rate the cards pretty similar.

      • JustAnEngineer
      • 15 years ago

      I concur. This is an example of the sorts of investigations that hardware review sites should do. Excellent work, Damage.

    • UberGerbil
    • 15 years ago

    Well, my first thought was “it looks like a bug.” And ATI’s explanation is perfectly reasonable. Filtering all textures vs just those specified by an app has never made much difference prior to D3, and that’s exactly the kind of case where you expect to flush out latent bugs.

    On the other hand, app-specific optimizations do give me cause for concern, not least because you have the possibility of revisions to the app conflicting with independent revisions of the drivers.

    This really points to an inadequacy of the API. If OpenGL (or D3D) was sufficiently rich for app writers to express exactly what they want to do, app-specific optimizations in the driver would be unnecessary. As it stands, certain apps will get special attention and all the others will just have to make do. It’s a bad use of the software development resources at nVidia and ATI, and it’s bad software design.

      • arb_npx
      • 15 years ago

      I agree; maybe Id can standardize their “flagged filtering” routine, and put in a GL extension for it; then we can check to see if GL_ID_ARBITRARY_FILTERING is present in the driver. Much like the days when I checked to see if GL_SGIS_MULTITEXTURE was loaded in Q2.

      • Entroper
      • 15 years ago

      This isn’t an inadequacy of the API, it’s a simple driver bug. The API is already sufficiently rich to specify anisotropic filtering on a per-texture basis. The extension to enable AF is specific about this; you have to specify the maximum degree of anisotropy for each texture.

    • Krogoth
    • 15 years ago

    Interesting, although this “problem” isn’t quite as bad as the Quack 3 incident. But it does cause quite a state of alarm; I wonder if ATI did these “possible optimizations” on the R/RV3xx generation.

      • bwoodring
      • 15 years ago

      Did you finish the article?

      • dragonsprayer
      • 15 years ago

      the game/user experience is dynamic – think qualitative, not quantitative – not static. ati cards will rock w/ other games, just not doom3
