Since we have a new graphics review going up here shortly, I should address an issue we haven't covered adequately. After I first published my GeForce FX 5800 Ultra review, there was a fair amount of discussion about whether we should have used NVIDIA's "Quality" or "Application" settings for our benchmarking. I used "Quality" in my testing, and I made the case for having done so in this comment. My basic take on the issue was this: NVIDIA is fudging a little with the "Quality" setting, but generally not in a way that makes a drastic difference to image quality, and in the end, neither the "Application" nor the "Quality" setting would modify my recommendation, which was to buy a Radeon 9800 Pro instead. Basically, it was a very minor issue in the grand scheme.
I still believe that, but upon further reflection, we decided to proceed with using the "Application" setting on GeForce FX cards in subsequent reviews. I'm persuaded by the arguments for enforcing standard methods, as much as possible, for things like trilinear filtering, especially when it comes to benchmarking.
I want to emphasize, though, that absolute purity in matters like this one will be an increasingly difficult goal to achieve as graphics chips progress. Graphics has long been about "cheating" visually without getting caught, and until computing power multiplies by many, many orders of magnitude, it will continue to be so.
"Cheating," of course, could mean several things in this context. Hidden surface removal is one sort of "cheat" that everybody likes. Other "cheats" are more nefarious, like the old SiS drivers that outright shrank texture sizes in order to help benchmark scores. Many are in between these two extremes. NVIDIA's "Quality" mode simply isn't a sin on the order of ATI's Quake III driver optimizations or even of the R200/RV250 silicon's complete, utter inability to produce anisotropic filtering and trilinear filtering simultaneously. Still, NVIDIA is cutting corners in its current drivers in order to boost performance, and the visual difference, while generally slight, is real.
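To make concrete what's actually at stake in the trilinear filtering debate, here is a toy sketch of textbook trilinear filtering: a bilinear sample from each of the two nearest mipmap levels, blended by the fractional level of detail. This is an illustration of the standard technique only, not any vendor's actual hardware path; the function names and the tiny hand-built mip chain are my own inventions. The shortcut drivers can take is visible in the structure: skipping or narrowing the final blend between mip levels saves a full bilinear sample per pixel, at a usually subtle visual cost.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by fraction t."""
    return a + (b - a) * t

def bilinear_sample(mip, u, v):
    """Bilinear sample from a 2D grid `mip` at normalized coords (u, v)."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = lerp(mip[y0][x0], mip[y0][x1], fx)
    bot = lerp(mip[y1][x0], mip[y1][x1], fx)
    return lerp(top, bot, fy)

def trilinear_sample(mips, u, v, lod):
    """Blend bilinear samples from the two mip levels bracketing `lod`.

    This second bilinear sample plus the final lerp is exactly the work
    that "brilinear"-style driver shortcuts reduce or skip.
    """
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    return lerp(bilinear_sample(mips[lo], u, v),
                bilinear_sample(mips[hi], u, v),
                lod - lo)

# Hypothetical two-level mip chain: a 2x2 base texture and its 1x1 average.
mips = [[[0.0, 1.0],
         [0.0, 1.0]],
        [[0.25]]]

# At the exact mip levels, trilinear reduces to a plain bilinear sample;
# halfway between levels, it averages the two.
print(trilinear_sample(mips, 0.5, 0.5, 0.0))  # 0.5 (pure mip 0)
print(trilinear_sample(mips, 0.5, 0.5, 1.0))  # 0.25 (pure mip 1)
print(trilinear_sample(mips, 0.5, 0.5, 0.5))  # 0.375 (blend of both)
```

The point of the sketch is only to show where the extra cost lives: at integer LODs the result is a single bilinear fetch, and everywhere in between the hardware pays for two fetches and a blend, which is precisely the band where "optimized" filtering modes cut corners.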