Introduction — continued
The results are striking, as you can see. I believe the results were recorded at 1024x768 in 32-bit color. This is with only standard trilinear filtering and no texture or edge antialiasing.
NVIDIA's NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI's mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600; the slide was mislabeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra.
Valve even ginned up a value-for-money slide to illustrate the problem with the current price/performance proposition for NVIDIA hardware.
However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve's sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.
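The precision trade-off behind those hints can be illustrated in a few lines. This is not Valve's shader code, just a sketch of what a 16-bit partial-precision register keeps compared to full precision, using Python's standard-library half-float support:

```python
import struct

def to_fp16(x: float) -> float:
    """Round a float to IEEE 754 half precision (10-bit mantissa) and back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A pixel shader computing 1/3 at full precision vs. a 16-bit register:
full = 1.0 / 3.0            # double here; FP32 in a real shader, ~7 digits
partial = to_fp16(full)     # FP16 keeps only ~3 decimal digits

print(f"full:    {full:.8f}")     # 0.33333333
print(f"partial: {partial:.8f}")  # 0.33325195
```

The lost digits rarely matter for simple color math, which is why the hint is often safe; the cost is that every shader must be audited case by case, which is part of the optimization burden Newell describes below.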
The "mixed mode" NV3x codepath yielded mixed results: a fairly decent performance gain on the FX 5900 Ultra, but not nearly enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.
Newell also expressed skepticism about the payoffs for NV3x-specific optimizations, noting that the optimization process was arduous, expensive, and less likely to produce performance gains as shader techniques advance. What's more, he said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem.
He suggested one way of dealing with the issue would be to treat all NV3x hardware as DirectX 8-class hardware, which would cut down significantly on eye candy and new graphics features, but which could yield more acceptable performance on NV3x chips. Obviously, he noted, one can always cut down visual quality in order to achieve higher performance, but in the case of Half-Life 2, falling back to DX8 would require tangible sacrifices.
Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.