
Also coming soon: the Radeon 9600 XT and CATALYST: Overdrive
ATI will also be introducing an XT version of its mid-range Radeon 9600 product soon. The new RV360 chip, on which the 9600 XT will be based, is a tweaked version of the RV350 chip found on the Radeon 9600 Pro, which is essentially a four-pipeline derivative of the R300 VPU from the original Radeon 9700. Like the current Radeon 9600 Pro, the Radeon 9600 XT will be fabbed on TSMC's 0.13-micron process. Unlike the current 9600, though, TSMC will use a low-k dielectric in fabricating the new RV360 chips, which should reduce the chip's power requirements and allow for higher clock speeds. As a result, the Radeon 9600 XT should hit core clock speeds upwards of 500MHz, and ATI will pair it with memory running at over 600MHz.

ATI says the Radeon 9600 XT will have a small, quiet cooler, and unlike the 9800 XT, it won't require a secondary power connector. 9600 XT cards will cost about $199 when they first arrive sometime in October, and to sweeten the deal, all new Radeon cards will soon be shipping with Half-Life 2 in the box.

In November, ATI plans to unleash CATALYST: Overdrive, new software with dynamic overclocking capabilities built in. ATI says this is an "intelligent software solution that determines the optimal ASIC speed for maximum performance." Without crashing, of course. ATI also says the increased clock speeds will be "ATI Quality Assured," so you won't void your card's warranty by playing with this feature. As I understand it, only newer Radeon chips with built-in temperature sensors will allow for dynamic overclocking.
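
ATI hasn't detailed how Overdrive works internally, but a thermal-feedback loop like the minimal sketch below captures the basic idea: step the clock up while the on-die sensor reads cool, and back off toward stock as the chip heats up. Every function name, threshold, and step size here is an assumption for illustration, not ATI's actual logic.

```python
import random

# Hypothetical sketch of a dynamic-overclocking control loop.
# The sensor interface, thresholds, and step size are all assumptions;
# ATI has not published how CATALYST: Overdrive actually works.

STOCK_MHZ = 500.0    # assumed 9600 XT stock core clock
MAX_MHZ = 540.0      # hypothetical "ATI Quality Assured" ceiling
STEP_MHZ = 6.75      # hypothetical clock increment
LIMIT_C = 80.0       # hypothetical thermal threshold

def read_gpu_temp() -> float:
    """Stand-in for the on-die thermal sensor newer Radeon chips carry."""
    return random.uniform(55.0, 85.0)   # simulated reading

def next_clock(current_mhz: float) -> float:
    """Step the core clock up while the chip runs cool; back off
    toward stock as the sensor approaches the thermal limit."""
    temp = read_gpu_temp()
    if temp >= LIMIT_C:
        return max(STOCK_MHZ, current_mhz - STEP_MHZ)
    if temp < LIMIT_C - 10.0 and current_mhz < MAX_MHZ:
        return min(MAX_MHZ, current_mhz + STEP_MHZ)
    return current_mhz

clock = STOCK_MHZ
for _ in range(20):                      # poll periodically, as a driver would
    clock = next_clock(clock)
print(f"settled near {clock:.0f}MHz")
```

The appeal of this approach is that the clock speed follows actual chip temperature rather than a one-size-fits-all factory setting, so a card in a well-ventilated case gets more headroom than one baking in a cramped enclosure.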

I expect real overclockers will find better performance by overclocking their cards manually, but CATALYST: Overdrive should offer casual overclockers a chance at some additional performance for free, and that's what overclocking is all about. Personally, I can't wait to give this feature a shot with a well-cooled Radeon 9600 XT.

Some frank words about our testing
Because of the recent controversies over various graphics and game benchmarks, we are somewhat unsure how to handle graphics performance testing properly right now. Both ATI and NVIDIA have cheated on benchmarks in the past, but ever since the GeForce FX line first shipped, NVIDIA has been on the warpath against any benchmark that would show those chips in a negative light. We have seen successive NVIDIA driver revisions use a range of tricks, from less-rigorous filtering to outright replacement of shader code, to improve frame rates at the expense of image quality.

We would like to take the time, when a new product like this one is released, to conduct detailed testing with innovative benchmarking methods and exhaustive image quality comparisons. ATI recommends as much to reviewers, as well. However, our Radeon 9800 XT review sample showed up on our doorstep this past Friday morning, so we were not able to spend as much time as we'd have liked testing the 9800 XT against its competitors. As a result, we've had to restrict our testing to a mix of old and new games, plus a couple of interesting new graphics benchmarks.

To better understand all the issues involved in graphics evaluation today, I spent some time last week talking with NVIDIA's Chief Scientist, David Kirk. Mr. Kirk is an intriguing guy, part engineer and all politician, very much capable of telling you exactly what he wants you to hear and nothing more. During the course of our conversation, I took away a couple of important bits of information that are relevant here.

As I questioned him about why the GeForce FX cards seem to perform rather poorly in games with lots of DirectX 9 pixel shaders, Mr. Kirk repeatedly told me not to fixate on issues of color precision and datatypes. He admitted the NV3x chips are very "sensitive" to optimizations, but explained that color precision isn't the issue so much as instruction ordering is. He explained that a better compiler in the GeForce FX driver, translating DirectX API calls into NV3x instructions, could vastly improve performance, and that just such improvements are in store in the 50-series drivers from NVIDIA. When I asked him if he was confident that NVIDIA could wring good performance from the NV3x chips in general cases, without the need for application-specific optimizations, he answered affirmatively, although he conspicuously stopped short of swearing off the need for app-specific work.
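
Kirk's point about instruction ordering is easier to see with a concrete example. The toy scheduler below reorders a shader's instruction stream so that independent math can issue behind texture fetches instead of stalling on them. The instruction format, dependency model, and "fetches first" heuristic are all invented for illustration; NVIDIA's real driver compiler is vastly more sophisticated.

```python
# Toy illustration of latency-aware instruction scheduling, the sort of
# reordering a driver's shader compiler might do. The instruction set
# and dependency model here are invented for this example.

from typing import NamedTuple

class Instr(NamedTuple):
    op: str
    dest: str
    srcs: tuple

# A naive stream: each texture fetch is immediately followed by math
# that depends on it, so the pipeline stalls on every fetch.
program = [
    Instr("tex", "r0", ("t0",)),
    Instr("mul", "r1", ("r0", "c0")),        # waits on r0
    Instr("tex", "r2", ("t1",)),
    Instr("mad", "r3", ("r2", "c1", "r1")),  # waits on r2
]

def deps(prog):
    """For each instruction, record the most recent earlier write
    to each of its source registers."""
    d = {i: set() for i in range(len(prog))}
    last_write = {}
    for i, ins in enumerate(prog):
        for s in ins.srcs:
            if s in last_write:
                d[i].add(last_write[s])
        last_write[ins.dest] = i
    return d

def schedule(prog):
    """Greedy list scheduler: among instructions whose inputs have
    issued, prefer texture fetches so their latency can be hidden
    behind independent math."""
    d = deps(prog)
    emitted, order = set(), []
    while len(order) < len(prog):
        ready = [i for i in range(len(prog))
                 if i not in emitted and d[i] <= emitted]
        ready.sort(key=lambda i: prog[i].op != "tex")  # fetches first
        order.append(ready[0])
        emitted.add(ready[0])
    return [prog[i] for i in order]

for ins in schedule(program):
    print(ins.op, ins.dest, *ins.srcs)
```

In the rescheduled stream, both texture fetches issue back to back with the dependent math trailing behind them, which is exactly the kind of latency hiding a smarter compiler could extract from NV3x without touching color precision at all.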

I left the conversation with the distinct impression that I needed to test NVIDIA's 50-series drivers. As a result, I've tested the GeForce FX 5900 Ultra with NVIDIA's 51.75 drivers, which are at present a press-only release. I prefer to test with public release drivers, but I've made an exception here.

Also, I've included results for several benchmarks that we know to be compromised, one way or another, by application-specific optimizations that trade image quality for performance. Both ATI and NVIDIA seem to have taken steps to reduce the texture filtering loads on their chips in Unreal Tournament 2003. NVIDIA's legal team apparently scared Futuremark into allowing optimizations in 3DMark03, including wholesale shader replacements, that Futuremark originally disallowed. Other publications have raised questions about image quality compromises in newer NVIDIA drivers when running AquaMark3. The list goes on. Please know we are aware of these issues, and we have included benchmark results because we had to test something. At the very least, you can compare the 9800 XT to the 9800 Pro 256MB, knowing the two are almost certainly rendering a given test in the same way.

Because of the short time we've had with the Radeon 9800 XT, we've limited our testing to the situations where $499 video cards excel: high fill-rate demands, lots of edge and texture antialiasing, and lots of pixel shaders. In many cases, that means testing at 1600x1200 resolution in 32-bit color, with and without 4X edge antialiasing and 8X anisotropic texture filtering. In other cases, with newer games or more complex scenes, lower resolutions sufficed.
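
For a rough sense of why those settings are demanding, a bit of back-of-the-envelope math helps. The sketch below uses the 9800 XT's published 256-bit, 730MHz-effective memory interface; the frame-rate target, overdraw factor, and bytes-per-sample figure are assumptions, and ATI's color and Z compression would cut the real traffic considerably.

```python
# Back-of-the-envelope math on why 1600x1200 with 4X AA is demanding.
# The memory clock and bus width are the 9800 XT's published specs; the
# frame-rate target, overdraw factor, and bytes-per-sample figure are
# assumptions, and color/Z compression would reduce the real traffic.

mem_mhz_effective = 730                  # DDR, 365MHz actual
bus_bytes = 256 // 8                     # 256-bit memory bus
bandwidth = mem_mhz_effective * 1e6 * bus_bytes   # ~23.4 GB/s peak

pixels = 1600 * 1200                     # 1.92M pixels per frame
aa = 4                                   # 4X multisampling
overdraw = 2.5                           # assumed average overdraw
fps = 60                                 # assumed frame-rate target
bytes_per_sample = 16                    # assumed color + Z, read-modify-write

traffic = pixels * aa * overdraw * fps * bytes_per_sample
print(f"peak bandwidth:      {bandwidth/1e9:.1f} GB/s")
print(f"framebuffer traffic: {traffic/1e9:.1f} GB/s (before compression)")
```

Under those assumptions, framebuffer traffic alone approaches the card's peak memory bandwidth before a single texel is fetched, which is why high-resolution antialiased testing is what separates a $499 card from cheaper ones.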