GeForce 6800 Ultra image quality investigated

Since you guys are e-mailing about this, I suppose we should address it with a news post. The guys at Driverheaven have put together an article about GeForce 6800 Ultra image quality that purports to show yet more cheating by NVIDIA in the 60.72 drivers for the new card.

They've taken some screenshots from 3DMark and Max Payne and compared the output from the DirectX 9 reference rasterizer, a Radeon, and a GeForce 6800 Ultra. With back-to-back comparisons of images using a Flash jobby, they seem to have uncovered some very minor differences in mip map level of detail between the ATI and reference rasterizer output, which look very similar to each other, and the NVIDIA output, which looks a little less similar.
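A back-to-back screenshot comparison like theirs can also be quantified numerically rather than eyeballed. Here's a minimal sketch of that idea (not anything Driverheaven published) that computes per-pixel difference statistics between two same-sized RGB captures:

```python
import numpy as np

def diff_stats(img_a, img_b):
    """Per-pixel difference stats for two same-sized RGB images
    given as H x W x 3 uint8 arrays (e.g., loaded screenshots)."""
    a = img_a.astype(np.int16)  # widen so subtraction can't wrap around
    b = img_b.astype(np.int16)
    d = np.abs(a - b)
    per_pixel = d.max(axis=-1)  # worst channel delta at each pixel
    return {
        "pct_differing": 100.0 * float(np.mean(per_pixel > 0)),
        "mean_abs_error": float(d.mean()),
        "max_abs_error": int(d.max()),
    }
```

Mip LOD shifts tend to show up as broad, low-magnitude differences concentrated in textured surfaces at a distance, rather than a few wildly different pixels.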

Before taking the screenshots, they did turn off NVIDIA's "brilinear" optimizations using the driver checkbox option. However, it looks possible that NVIDIA's driver is still employing this optimization in 3DMark03. (To see the difference between regular trilinear and "brilinear," go here.) It's hard to tell, because they didn't provide comparison screenshots with the "trilinear optimizations" left enabled, nor did they try renaming 3DMark03.exe to see what happens. It would also be helpful to see some performance testing if an image quality difference between different configs on the GeForce 6800 Ultra were detected.
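For those unfamiliar with the distinction: full trilinear filtering blends between adjacent mip levels across the entire fractional LOD range, while "brilinear" does plain bilinear filtering for most of that range and only blends inside a narrow band around the mip transition, saving texture bandwidth. A sketch of the blend weights (the band width and its placement here are illustrative, not NVIDIA's actual values):

```python
def trilinear_weight(lod_frac):
    """Full trilinear: blend linearly between mip levels n and n+1
    across the whole fractional LOD range [0, 1]."""
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    """'Brilinear' sketch: stay on a single mip level (weight 0 or 1)
    for most of the range, blending only inside a narrow band
    centered on the transition. Band width of 0.25 is an assumption."""
    lo = 0.5 - band / 2
    return min(max((lod_frac - lo) / band, 0.0), 1.0)
```

The narrower the band, the closer the result gets to pure bilinear, and the more visible the mip transitions become as bands sweeping across the floor in a scene, which is exactly the sort of artifact mip LOD screenshots are meant to expose.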

This potential controversy is a little more flammable because Futuremark recently gave NVIDIA's 60.72 drivers a special dispensation in its monthly media newsletter:

Futuremark has reviewed the ForceWare 60.72 drivers for the GeForce 6800 Series graphics cards at NVIDIA's specific request. We found that the 60.72 drivers used with the GeForce 6800 Series fulfill the optimization guidelines for 3DMark03 Build 340. Please note that the ForceWare 60.72 drivers are approved only for the GeForce 6800 Series graphics cards, and any future approved WHQL drivers will replace the 60.72 non-WHQL drivers.
So if there are driver IQ changes specific to 3DMark, Futuremark either missed them or didn't seem to mind.

Of course, minor differences in mip map level of detail between one graphics card and another, or between a card and the reference rasterizer, aren't necessarily big news. The DirectX graphics API doesn't require exact mathematical precision with respect to output; there's wiggle room built into the thing. What we can surmise is that NVIDIA is wiggling a little bit more than ATI, for whatever that's worth. Until we know more about some of the questions this article has raised, I'd say this issue's status should hover indeterminately between "tempest in a teapot" and "non-issue."
