Introduction — continued
The results are striking, as you can see. I believe the results were recorded at 1024x768 in 32-bit color. This is with only standard trilinear filtering and no texture or edge antialiasing.
NVIDIA's NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI's mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600; the slide was mislabeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra.
Valve even ginned up a value-for-money slide to illustrate the problem with the current price/performance proposition for NVIDIA hardware.
However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve's sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.
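To illustrate what a partial-precision hint looks like in practice, here is a minimal HLSL sketch (a hypothetical example, not Valve's actual shader code): the `half` type tells the compiler that 16-bit floating-point precision is acceptable for those calculations, which NV3x chips can execute faster than full 32-bit math.

```hlsl
// Default path: full 32-bit float precision throughout,
// which is slow on NV3x hardware.
float4 ShadePixel(float4 texColor : COLOR0) : COLOR
{
    float4 result = texColor * float4(0.5, 0.5, 0.5, 1.0);
    return result;
}

// Partial-precision path: 'half' hints that 16-bit floating
// point is sufficient here, letting NV3x chips use their
// faster reduced-precision units.
half4 ShadePixelPP(half4 texColor : COLOR0) : COLOR
{
    half4 result = texColor * half4(0.5, 0.5, 0.5, 1.0);
    return result;
}
```

In compiled ps_2_0 shader assembly, the same hint shows up as a `_pp` modifier on individual instructions, so it can be applied selectively to the calculations where reduced precision is visually acceptable.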
The "mixed mode" NV3x codepath yielded mixed results, with a fairly decent performance gain on the FX 5900 Ultra, but nowhere near enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.
Newell also expressed skepticism about the payoffs for NV3x-specific optimizations, noting that the optimization process was arduous, expensive, and less likely to produce performance gains as shader techniques advance. What's more, he said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem.
He suggested one way of dealing with the issue would be to treat all NV3x hardware as DirectX 8-class hardware, which would cut down significantly on eye candy and new graphics features, but which could yield more acceptable performance on NV3x chips. Obviously, he noted, one can always cut down visual quality in order to achieve higher performance, but in the case of Half-Life 2, falling back to DX8 would require tangible sacrifices.
Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.