So what do we make of this?
Crytek's decision to deploy gratuitous amounts of tessellation in places where it doesn't make sense is frustrating, because they're essentially wasting GPU power—and they're doing so in a high-profile game that we'd hoped would be a killer showcase for the benefits of DirectX 11. Now, don't get me wrong. Crysis 2 still looks great and, in some ways at least, is still something of a showcase for both DX11 and the capabilities of today's high-end PCs. Some parts of the DX11 upgrade, such as higher-res textures and those displacement-mapped brick walls, appreciably improve the game's visuals. But the strange inefficiencies create problems. Why are largely flat surfaces, such as that Jersey barrier, subdivided into so many thousands of polygons, with no apparent visual benefit? Why does tessellated water roil constantly beneath the dry streets of the city, invisible to all?
One potential answer is developer laziness or lack of time. We already know the history here, with the delay of the DX11 upgrade and the half-baked nature of the initial PC release of this game. We've heard whispers that pressure from the game's publisher, EA, forced Crytek to release this game before the PC version was truly ready. If true, the time and budget left for adding PC-exclusive DX11 features after the fact may well have been rather limited.
There is another possible explanation. Let's connect the dots on that one. As you may know, the two major GPU vendors tend to identify the most promising upcoming PC games and partner up with the publishers and developers of those games in various ways, including offering engineering support and striking co-marketing agreements. As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia's urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out, just as many PC gamers were. The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.
Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces. The Fermi architecture underlying all DX11-class GeForce GPUs dedicates more attention (and transistors) to achieving high geometry processing throughput than the competing Radeon GPU architectures. We've seen the effect quite clearly in synthetic tessellation benchmarks. Few games have shown a similar effect, simply because they don't push enough polygons to strain the Radeons' geometry processing rates. However, with all of its geometric detail, the DX11-upgraded version of Crysis 2 now manages to push that envelope. The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%.
Radeon owners do have some recourse, thanks to the slider in newer Catalyst drivers that allows the user to cap the tessellation factor used by games. Damien advises users to choose a limit of 16 or 32, well below the peak of 64.
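To see why capping the factor helps so much: with uniform tessellation, the number of triangles generated per patch grows roughly with the square of the tessellation factor, so dropping the cap from 64 to 16 cuts the generated geometry by about a factor of 16. Here's a minimal back-of-the-envelope sketch; the simple quadratic model is an approximation of D3D11's actual partitioning schemes, not an exact triangle count:

```python
# Rough model: a quad patch tessellated at uniform factor f produces
# an f x f grid of cells, each split into two triangles. This ignores
# edge-factor details and fractional partitioning, but captures the
# quadratic growth that makes high factors so expensive.
def approx_triangles(factor):
    return 2 * factor * factor

for f in (16, 32, 64):
    print(f"factor {f:2d}: ~{approx_triangles(f):,} triangles per patch")
```

Running this shows the jump from factor 16 to the peak of 64 multiplies the per-patch triangle count by roughly 16x, which is why a driver-level cap can claw back so much performance on geometry-limited hardware.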
As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we've done with HAWX 2 in the past. We haven't decided exactly what we'll do going forward, and we may take things on a case-by-case basis. Whatever we choose, though, we'll be sure to point folks to this little article as we present our results, so they can understand why Crysis 2 may not be the most reliable indicator of comparative GPU performance.
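For the curious, excluding one title from an overall score is mechanically simple if the summary statistic is a geometric mean of per-game frame rates. A quick sketch, with entirely made-up numbers and game names for illustration:

```python
from statistics import geometric_mean

# Hypothetical average frame rates for one GPU (not real benchmark data)
fps = {
    "Crysis 2 (DX11)": 38.0,
    "Game B": 62.0,
    "Game C": 55.0,
}

# Overall score with every game included
overall_all = geometric_mean(fps.values())

# Overall score with the suspect title excluded
overall_excl = geometric_mean(
    v for title, v in fps.items() if title != "Crysis 2 (DX11)"
)

print(f"all games: {overall_all:.1f} fps, excluding outlier: {overall_excl:.1f} fps")
```

A geometric mean is the usual choice for this kind of summary because it keeps one unusually slow (or fast) title from dominating the composite the way an arithmetic mean would, though a skewed result like the tessellation-inflated one here can still shift it noticeably.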