Texture filtering quality improvements
We were pleased when Cypress and the Radeon HD 5000 series introduced revised texture filtering with some nifty properties, including angle-invariant anisotropic filtering. Although that's just as geeky as it sounds, the real-world impact is noteworthy, because texture filtering has a huge influence on image quality. If objects shimmer, sparkle, or crawl as you move around in a game, yeah, that's probably poor texture filtering.
In other words, "bad filter make Thog's Xbox suck."
We know how to filter textures to eliminate such artifacts, but doing so requires lots of sampling and is, in performance terms, rather expensive. As a result, GPU makers have devised shortcuts, attempting to produce the best compromise between image quality and performance. Some of those filtering algorithms have been pretty complex, and although they haven't all been great in every way, they've allowed us to cope. That's often the name of the game in real-time graphics.
Over time, as transistor budgets have grown, the trade-offs between performance and quality have become less stark. Cypress represented a high-water mark of sorts because it promised to eliminate one of the worst compromises in older filtering algorithms, the fact that surfaces at some angles of inclination weren't filtered as well as others, while improving filtering quality overall.
Trouble is, after the Radeon HD 5000 series had been in the market for a while, folks started noticing some problems with Cypress' texture filtering, especially in textures with lots of fine, high-contrast detail. This problem wasn't evident in every case (heck, I never noticed it myself while gaming on a Cypress card), but it turned out to be quite real. At the press event for Barts, AMD Graphics CTO Eric Demers admitted that it was an issue with Cypress-era hardware.
We've replicated an example he showed from the D3D AF Tester application using a high-frequency checkerboard texture. In the image below, you're looking down a 3D-rendered cylinder with that texture mapped to the interior walls. As the squares in the checkerboard become much too small to represent with a single pixel, the goal of good filtering is to produce a smooth, visually comprehensible representation of what's happening at a sub-pixel level. On Cypress, the image produced looks like so:
There are several very obvious transition points that form rings within the image above. Those obvious transitions represent a failure of blending, and in a game with, say, a very detailed texture of a road stretching out ahead, they have the potential to translate into visible lines that travel ahead of you, looking cheesy. (For those of us old enough to remember the bad old days of bilinear-only filtering on early 3D chips, this effect might induce flashbacks.)
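Hard transitions like these are, conceptually, what blending between filter levels is supposed to smooth over: rather than snapping from one level of the mip chain to the next, the sampler interpolates between the two levels bracketing the fractional level of detail. Here's a minimal 1D sketch of that idea in Python; the nearest-texel lookup and the box-filtered mip chain are simplified stand-ins for illustration, not the actual hardware kernels involved in Cypress' bug:

```python
def sample_mip(mip_levels, level, u):
    """Nearest-texel lookup in one mip level (a stand-in for a real bilinear tap)."""
    texels = mip_levels[level]
    idx = min(int(u * len(texels)), len(texels) - 1)
    return texels[idx]

def trilinear(mip_levels, lod, u):
    """Blend samples from the two mip levels bracketing the fractional LOD."""
    lo = min(int(lod), len(mip_levels) - 1)
    hi = min(lo + 1, len(mip_levels) - 1)
    frac = lod - int(lod)
    return (1 - frac) * sample_mip(mip_levels, lo, u) + frac * sample_mip(mip_levels, hi, u)

# A 1D high-frequency checkerboard and its box-filtered mip chain
base = [1.0, 0.0] * 8
mips = [base]
while len(mips[-1]) > 1:
    prev = mips[-1]
    mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev), 2)])

# Halfway between levels 0 and 1, the result is an even mix of the two
# levels rather than a hard snap to either one.
print(trilinear(mips, 0.5, 0.26))  # 0.75
```

When the interpolation weight changes smoothly with distance, there is no visible seam where one level hands off to the next; the rings in the Cypress image mark places where a hand-off happens too abruptly.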
Demers relayed to us the dismay he and his team felt when they realized this problem had made it into Cypress hardware. They thought they'd created a very elegant solution for a long-standing challenge, but in certain cases, it wasn't quite perfect. The problem, he said, is not blending between mip levels but a filter transition within a mip level. The transition between two kernels doesn't quite happen as it should. Demers was adamant that Cypress does not cheat on its level-of-detail calculations (a common performance optimization) and that the issue is simply the result of a mistake. Fortunately, the error has been corrected in the Barts filtering hardware, and the result is much smoother transitions.
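For context on those level-of-detail calculations: the textbook version picks a mip level from how many texels a single screen pixel covers, using the screen-space derivatives of the texture coordinates. A minimal sketch of the reference formula (this is the generic OpenGL/Direct3D-style computation, not AMD's exact hardware path):

```python
import math

def texture_lod(du_dx, dv_dx, du_dy, dv_dy):
    """Reference level-of-detail selection: log2 of the larger screen-space
    texel footprint. Cheating here (biasing LOD upward to sample a smaller,
    cheaper mip level) is the optimization Demers says Cypress does NOT do."""
    fx = math.hypot(du_dx, dv_dx)   # texel footprint along screen x
    fy = math.hypot(du_dy, dv_dy)   # texel footprint along screen y
    rho = max(fx, fy)
    return max(0.0, math.log2(rho))

print(texture_lod(1.0, 0.0, 0.0, 1.0))  # 0.0 -> one texel per pixel, base level
print(texture_lod(4.0, 0.0, 0.0, 4.0))  # 2.0 -> four texels per pixel, mip 2
```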
Nvidia's texture filtering algorithm strikes a somewhat different balance, as you can see below. (All of these tests were produced using the default filtering quality in the video drivers.)
On the Radeons, the checkerboard pattern melts into a gray circle well before the end of the cylinder, whereas the GF104 shows detail all the way to the end, with some intriguing and intricate moire patterns. Those patterns are smoother and more regular than the ones on the Radeons, which translates into less visual noise. The odd thing about the Nvidia result here is that puffy, uh, donut shape in there. (Mmmmm... donuts.) Heck, the donut isn't perfectly round, either; it's more octagonal. Switching on colored mip levels will give us a better sense of what's happening.
Now the donut on the GeForce looks like a big, red stop sign, which highlights the fact that Nvidia applies a little less filtering to objects at certain angles of inclination. Coloring the mip levels also reveals clearly that Nvidia does less accurate blending between mip levels than AMD, which is what causes the donut effect. The color gradients are much finer on the Radeons, and those smoother transitions produce no visible rings in our high-frequency checkerboard sample.
Which is better overall? I'm not sure I can say, and this is a single, static example that's very tough to handle. In games, the differences between the GPUs are much less readily evident. The reality is that AMD and Nvidia appear to be very closely matched, even more so now that Barts fixes that filter transition problem.