Introduction — continued
The results are striking, as you can see. I believe the results were recorded at 1024x768 in 32-bit color. This is with only standard trilinear filtering and no texture or edge antialiasing.
NVIDIA's NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI's mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600; the slide was mislabeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra.
Valve even ginned up a value-for-money slide to illustrate the problem with the current price/performance proposition for NVIDIA hardware.
However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve's sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.
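To make the partial-precision idea concrete: in DirectX 9 HLSL, declaring values as `half` rather than `float` signals that the driver may evaluate those operations at 16-bit floating-point precision, which NV3x hardware runs considerably faster than full 32-bit math. The fragment below is a minimal illustrative sketch of the technique, not Valve's actual shader code; the variable names are hypothetical.

```hlsl
// Illustrative pixel shader fragment (hypothetical, not from Half-Life 2).
// 'half' permits 16-bit evaluation on hardware that benefits (NV3x);
// drivers for other chips simply treat it as full-precision 'float'.
sampler2D diffuseMap;

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    half4 texel  = tex2D(diffuseMap, uv);   // 16 bits is ample for color data
    half3 shaded = texel.rgb * half(0.5);   // intermediate math at reduced precision
    return float4(shaded, texel.a);         // output widened back to float
}
```

The catch Newell describes is deciding, shader by shader, which intermediates can safely drop to 16 bits without visible banding or precision artifacts, which is part of why the optimization work proved so expensive.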
The "mixed mode" NV3x codepath yielded mixed results, with a fairly decent performance gain on the FX 5900 Ultra, but not near enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.
Newell also expressed skepticism about the payoffs for NV3x-specific optimizations, noting that the optimization process was arduous, expensive, and less likely to produce performance gains as shader techniques advance. What's more, he said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem.
He suggested one way of dealing with the issue would be to treat all NV3x hardware as DirectX 8-class hardware, which would cut down significantly on eye candy and new graphics features, but which could yield more acceptable performance on NV3x chips. Obviously, he noted, one could always cut down visual quality in order to achieve higher performance, but in the case of Half-Life 2, falling back to DX8 would require tangible sacrifices.
Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.