You will want to head over to the cold, HardOCP to read this article about trilinear and anisotropic filtering in Unreal Tournament 2003. The guys over there have taken a load of screenshots and explored the question of whether NVIDIA's trilinear filtering methods in UT2003 (with the already-controversial 44.03 drivers on a GeForce FX 5900) are bogus. From what I gather, this is a familiar issue, because earlier versions of GeForce FX drivers used a "lightweight" version of trilinear filtering that didn't look quite as good, but produced better performance. This was a difficult issue to address, because many reviewers (us included) were unsure how to compare the GeForce FX to its competitors and predecessors. Was NVIDIA's lightweight trilinear a welcome improvement on the old way of doing things or a cheap hack designed to get better benchmark scores?
Personally, I wavered on that issue, but eventually inclined toward the camp that said a $400 video card had darn well better produce higher image quality than cheaper, older, or competing cards. The guys at the cold, HardOCP are still wrestling with the issue. I'm not convinced they have it pinned down, though. There's still a leg hanging out there in mid-air.
Jonathan Hung at AMDMB.com tackled a related issue independently, examining Detonator 44.03 performance and image quality in UT2003. He demonstrates on this page that anisotropic filtering appears to work properly in another game (Morrowind), but on the GeForce FX 5600 with the 44.03 drivers in UT2003, anisotropic filtering is not applied.
I'm having trouble finding the original forum post, but AMDMB acknowledges that Dave Baumann and the gang at Beyond3D were the first to discover the 44.03 drivers keying on UT2003. In this post, B3D staffer Reverend mentions it:
Again, "cheating" and "optimization" seems to be the buzzwords, as a result of Dave being the first to discover the way NVIDIA drivers are detecting UT2003 and performing texture filtering that does not give the user what he specifies nor what NVIDIA claimed to offer.

So I suppose credit goes to Dave for the discovery. There seem to be two separate things going on here. First, NVIDIA's driver is keying on Unreal Tournament 2003 and activating the "lightweight" trilinear filtering method present in pre-44.03 drivers in order to boost benchmark scores. NVIDIA claimed the "Quality" setting in the 44.03 drivers would banish this lightweight trilinear method, and for the most part, that was true. Except where it was untrue. Second, at least on the FX 5600 in AMDMB's tests, 44.03 isn't doing proper anisotropic filtering. This should all ring familiar to those who read our article about NVIDIA's 3DMark03 detection and filtering "optimizations."
The Rev's post also has this interesting tidbit from Tim Sweeney, UT engine programmer guy, about how to avoid such problems in the future:
Long term I would like to see the rendering API defined in terms of exactly reproducable [sic] results, so that there isn't room for debate on things like texture filtering optimizations or mipmap biasing. Then, a given set of triangles and render state passed down to 3D hardware is guaranteed to give the exact same results on all hardware and any software emulation or fallbacks. There are general and non-controversial definitions of how all such things should work, from IEEE 752 [sic] for floating point to Fundamentals of Computer Graphics for the definition of trilinear filtering and mipmap factor calculation.

And that really is the issue, isn't it? Too much wiggle room in the API leads to subjective evaluations and graphics driver teams getting too clever by half with filtering and shader optimizations.
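For readers curious what a "non-controversial" definition of trilinear filtering actually looks like, here's a minimal sketch of the textbook approach Sweeney alludes to: derive a mipmap level-of-detail from the texel-footprint gradients, then blend bilinear samples from the two nearest mip levels. The function names and the sample_bilinear callback are illustrative only, not taken from any real driver or API.

```python
import math

def mip_level(du_dx, dv_dx, du_dy, dv_dy):
    # LOD comes from the larger of the two screen-space texel footprints
    # (the standard "rho" term in the mipmap factor calculation).
    rho = max(math.hypot(du_dx, dv_dx), math.hypot(du_dy, dv_dy))
    return max(0.0, math.log2(rho)) if rho > 0 else 0.0

def trilinear(sample_bilinear, u, v, lod, max_level):
    # Full trilinear: blend bilinear samples from the two nearest mip
    # levels, weighted by the fractional part of the LOD.
    lo = min(int(math.floor(lod)), max_level)
    hi = min(lo + 1, max_level)
    frac = lod - math.floor(lod)
    return (1.0 - frac) * sample_bilinear(u, v, lo) + frac * sample_bilinear(u, v, hi)
```

The "lightweight" method at issue reportedly narrows the blend so that most pixels are weighted entirely toward a single mip level (effectively bilinear), blending only in a thin band near mip transitions. That's cheaper, but it's not what the spec-textbook definition above produces, which is exactly Sweeney's point about pinning the API to reproducible results.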
There is another important issue here, of course, and that's trust. Yes, the API provides a little wiggle room, but NVIDIA is exploiting that wiggle room in multiple and evasive ways in order to make its products look better than they are. All indications are that NVIDIA deceived reviewers, and by proxy consumers, investors, and PC makers, when it told them the 44.03 Quality setting would do away with the funky "lightweight" trilinear method, then enabled that method on a per-application basis anyway. They dug the hole deeper when they claimed a 30% performance increase in UT2003 on the GFFX 5600. And this all happened at the same time as they were concocting multiple methods of inflating scores in 3DMark03 while waging an all-out PR assault on 3DMark's creators. I would think twice before putting my hard-earned cash down for an NVIDIA product in the midst of all of this deception.
Meanwhile, NVIDIA is still spinning and grinning. Dave Salvator has written up his reaction to a recent day-long meeting with NVIDIA about driver optimization issues in an article entitled NVIDIA's Risky Optimization Gamble. (Hmm. Does he sound convinced? Then again, does he write his own headlines?) NVIDIA invited us here at TR to attend this self-same day-long meeting, but then, bizarrely, effectively reneged by suspending all communication with us as the date approached. No explanation has been offered, but this all seems like SOP at NV HQ by now.
I haven't had a chance to digest Dave's article completely, but it looks like NVIDIA is outlining some principles for "correct" driver optimizations that allow for special-casing individual executables. Dave doesn't appear to like that idea, nor would most of us. Honestly, though, I think NVIDIA is wedded to this approach so long as it has silicon with unbalanced shader resources: too much integer and not enough floating-point power to compete well against ATI's R3x0 chips.