So what do we make of this?
Crytek's decision to deploy gratuitous amounts of tessellation in places where it doesn't make sense is frustrating, because they're essentially wasting GPU power—and they're doing so in a high-profile game that we'd hoped would be a killer showcase for the benefits of DirectX 11. Now, don't get me wrong. Crysis 2 still looks great and, in some ways at least, is still something of a showcase for both DX11 and the capabilities of today's high-end PCs. Some parts of the DX11 upgrade, such as higher-res textures and those displacement-mapped brick walls, appreciably improve the game's visuals. But the strange inefficiencies create problems. Why are largely flat surfaces, such as that Jersey barrier, subdivided into so many thousands of polygons, with no apparent visual benefit? Why does tessellated water roil constantly beneath the dry streets of the city, invisible to all?
One potential answer is developer laziness or lack of time. We already know the history here, with the delay of the DX11 upgrade and the half-baked nature of the initial PC release of this game. We've heard whispers that pressure from the game's publisher, EA, forced Crytek to release this game before the PC version was truly ready. If true, it's easy to see how the time and budget left to add PC-exclusive DX11 features after the fact might have been rather limited.
There is another possible explanation. Let's connect the dots on that one. As you may know, the two major GPU vendors tend to identify the most promising upcoming PC games and partner up with the publishers and developers of those games in various ways, including offering engineering support and striking co-marketing agreements. As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia's urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out, just as many PC gamers were. The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.
Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces. The Fermi architecture underlying all DX11-class GeForce GPUs dedicates more attention (and transistors) to achieving high geometry processing throughput than the competing Radeon GPU architectures. We've seen the effect quite clearly in synthetic tessellation benchmarks. Few games have shown a similar effect, simply because they don't push enough polygons to strain the Radeons' geometry processing rates. However, with all of its geometric detail, the DX11-upgraded version of Crysis 2 now manages to push that envelope. The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%, while the competing GeForces only suffered slowdowns of 17-21%.
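To put those percentages in concrete terms, here's a quick back-of-the-envelope calculation. The 60-fps baseline is a hypothetical figure of our choosing; only the slowdown ranges come from Hardware.fr's numbers:

```python
# Hypothetical 60-fps baseline; slowdown ranges are from Hardware.fr's testing.
baseline_fps = 60.0

slowdowns = {
    "Radeon": (0.31, 0.38),   # recent Radeons lose 31-38% with tessellation on
    "GeForce": (0.17, 0.21),  # competing GeForces lose 17-21%
}

for gpu, (low, high) in slowdowns.items():
    worst = baseline_fps * (1 - high)
    best = baseline_fps * (1 - low)
    print(f"{gpu}: {worst:.0f}-{best:.0f} fps")
# Radeon: 37-41 fps
# GeForce: 47-50 fps
```

In other words, at a playable 60-fps starting point, the same tessellation toggle would push a Radeon roughly ten frames per second below a comparable GeForce.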
Radeon owners do have some recourse, thanks to the slider in newer Catalyst drivers that allows the user to cap the tessellation factor used by games. Damien advises users to choose a limit of 16 or 32, well below the peak of 64.
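Why does capping the factor help so much? The triangle count of a tessellated patch grows roughly with the square of the tessellation factor, so lowering the cap from 64 to 16 cuts the worst-case triangle load by an order of magnitude. A rough sketch (the quadratic relationship is an approximation; the D3D11 tessellator's exact counts depend on the partitioning mode):

```python
def triangles_per_patch(tess_factor):
    # Uniform integer tessellation of a triangle patch with edge factor N
    # produces roughly N**2 triangles; exact counts vary by partitioning mode.
    return tess_factor ** 2

for cap in (8, 16, 32, 64):
    ratio = triangles_per_patch(64) / triangles_per_patch(cap)
    print(f"factor {cap:2d}: ~{triangles_per_patch(cap):4d} tris/patch "
          f"({ratio:.0f}x fewer than factor 64)")
```

By this rough math, a cap of 16 yields about one-sixteenth the triangles of the uncapped factor-64 peak, which is why the slider can recover so much performance with little visible cost on surfaces that were over-tessellated to begin with.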
As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we've done with HAWX 2 in the past. We haven't decided exactly what we'll do going forward, and we may take things on a case-by-case basis. Whatever we choose, though, we'll be sure to point folks to this little article as we present our results, so they can understand why Crysis 2 may not be the most reliable indicator of comparative GPU performance.