Since we have a new graphics review going up here shortly, I should probably take up an issue we haven't adequately addressed. After I first published my GeForce FX 5800 Ultra review, there was a fair amount of discussion about whether we should have used NVIDIA's "Quality" or "Application" settings for our benchmarking. I used "Quality" in my testing, and I made the case for having done so in this comment. My basic take on the issue was this: NVIDIA is fudging a little with the "Quality" setting, but generally not in a way that makes a drastic difference to image quality, and in the end, neither the "Application" nor the "Quality" setting would change my recommendation, which was to buy a Radeon 9800 Pro instead. Basically, it was a very minor issue in the grand scheme.
I still believe that, but upon further reflection, we have decided to use the "Application" setting on GeForce FX cards in subsequent reviews. I find the arguments for enforcing standard methods, as much as possible, for things like trilinear filtering compelling, especially when it comes to benchmarking.
I want to emphasize, though, that absolute purity in matters like this one will be an increasingly difficult goal to achieve as graphics chips progress. Graphics has long been about "cheating" visually without getting caught, and until computing power multiplies by many, many orders of magnitude, it will continue to be so.
"Cheating," of course, could mean several things in this context. Hidden surface removal is one sort of "cheat" that everybody likes. Other "cheats" are more nefarious, like the old SiS drivers that outright shrank texture sizes in order to help benchmark scores. Many are in between these two extremes. NVIDIA's "Quality" mode simply isn't a sin on the order of ATI's Quake III driver optimizations or even of the R200/RV250 silicon's complete, utter inability to produce anisotropic filtering and trilinear filtering simultaneously. Still, NVIDIA is cutting corners in its current drivers in order to boost performance, and the visual difference, while generally slight, is real.