MOST OF YOU ARE probably familiar by now with the controversy surrounding the current drivers for ATI's Radeon 8500 card. It's become quite clear, thanks to this article at the HardOCP, that ATI is "optimizing" for better performance in Quake III Arena, and most importantly, in the Quake III timedemo benchmarks that hardware review sites like ours use to evaluate 3D cards. Kyle Bennett at the HardOCP found that replacing every instance of "quake" with "quack" in the Quake III executable changed the Radeon 8500's performance in the game substantially.
The folks at 3DCenter.de followed Kyle's trail and discovered that, on the Radeon 8500, "Quack 3" produces much better image quality, texture quality in particular, than Quake III. The FiringSquad observed the same behavior, only they did so in English.
With the publication of these articles, it became a matter of public record that ATI was intentionally sacrificing image quality in Quake III for better benchmark scores. The issue, as far as I was concerned, was settled: ATI was busted.
Only a couple of questions remained: What did ATI have to say for themselves? And how exactly did they implement their cheat? Now we have answers for both.
Yesterday, the FiringSquad asked ATI for their side of the story. ATI's response was interesting. I suggest you go read through the whole interview at the FiringSquad if you haven't already. It's all worth reading.
I'd like to focus on one particular bit of ATI's explanation of the situation. It reads like so:
Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users disable them. The current RADEON 8500 driver revision has an issue that prevents it from correctly interpreting the texture quality slider setting in Quake 3. This issue will be corrected in the next driver release.

Like some of ATI's previous PR statements, this answer is packed with tricky twists and turns: reader beware. Truth be told, ATI is doing something more than simply misinterpreting the texture quality slider setting. After a little digging, we've zeroed in on what they're doing.
Note that the texture quality setting is just one of many possible ways that users can increase image quality in Quake 3 at the expense of performance; forcing on anisotropic filtering or disabling texture compression are alternative methods. It is also important to note that the image quality obtained using all of the standard image settings in Quake 3 (fastest, fast, normal, and high quality) can be observed to be equal to or better than any competing product (try it!); it is only in the special case of selecting "high quality" AND turning the texture quality slider up to the highest setting that a potential discrepancy appears.
The cheat in action
First, some background. Like Kyle, I was able to modify the Quake III executable to purge it of all instances of the word "quake". In my case, I simply used a hex editor's search-and-replace function to replace all instances of "uake" with "uaff". The result? My own hot new game: quaff3.exe.
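The hex-editor step amounts to a byte-for-byte search and replace on the executable. For the curious, here's a minimal Python sketch of the same idea; the function name and file paths are my own illustration, not anything from the tools mentioned above:

```python
from pathlib import Path

def patch_binary(data: bytes, old: bytes = b"uake", new: bytes = b"uaff") -> bytes:
    """Replace every occurrence of `old` with `new` in raw executable bytes."""
    # The replacement must be the same length as the original string,
    # so every offset in the file stays valid and the patched
    # executable still runs.
    assert len(old) == len(new)
    return data.replace(old, new)

# Hypothetical usage -- the paths depend on where the game is installed:
# Path("quaff3.exe").write_bytes(patch_binary(Path("quake3.exe").read_bytes()))
```

The same-length constraint is the whole trick: the driver's detection keys off the name strings inside the binary, so changing those strings (and nothing else) is enough to dodge it.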
I tested quaff3.exe with the Radeon 6.13.3276 drivers on the following setup:
Processor: AMD Athlon XP 1800+ - 1.53GHz on a 266MHz (DDR) bus
Motherboard: Gigabyte GA7-DX
Memory: 256MB PC2100 DDR SDRAM
Audio: Creative SoundBlaster Live!
Storage: IBM 75GXP 30.5GB 7200RPM ATA/100 hard drive
OS: Windows XP Professional

Running quaff3.exe with the Radeon 8500 produces quite decent image quality. Like so:
Running the original quake3.exe isn't nearly as pretty. Check it out:
It's not hard to see why the Radeon 8500 produces better benchmark numbers with this driver "issue" doing its thing. The amount of detailed texture data the card has to throw around is much lower.