So what do we make of this?
Crytek's decision to deploy gratuitous amounts of tessellation in places where it doesn't make sense is frustrating, because they're essentially wasting GPU power—and they're doing so in a high-profile game that we'd hoped would be a killer showcase for the benefits of DirectX 11. Now, don't get me wrong. Crysis 2 still looks great and, in some ways at least, is still something of a showcase for both DX11 and the capabilities of today's high-end PCs. Some parts of the DX11 upgrade, such as higher-res textures and those displacement-mapped brick walls, appreciably improve the game's visuals. But the strange inefficiencies create problems. Why are largely flat surfaces, such as that Jersey barrier, subdivided into so many thousands of polygons, with no apparent visual benefit? Why does tessellated water roil constantly beneath the dry streets of the city, invisible to all?
One potential answer is developer laziness or lack of time. We already know the history here, with the delay of the DX11 upgrade and the half-baked nature of the initial PC release of this game. We've heard whispers that pressure from the game's publisher, EA, forced Crytek to release the game before the PC version was truly ready. If so, it's easy to imagine that the time and budget left for adding PC-exclusive DX11 features after the fact were rather limited.
There is another possible explanation. Let's connect the dots on that one. As you may know, the two major GPU vendors tend to identify the most promising upcoming PC games and partner up with the publishers and developers of those games in various ways, including offering engineering support and striking co-marketing agreements. As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia's urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out, just as many PC gamers were. The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.
Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces. The Fermi architecture underlying all DX11-class GeForce GPUs dedicates more attention (and transistors) to achieving high geometry processing throughput than the competing Radeon GPU architectures. We've seen the effect quite clearly in synthetic tessellation benchmarks. Few games have shown a similar effect, simply because they don't push enough polygons to strain the Radeons' geometry processing rates. However, with all of its geometric detail, the DX11 upgraded version of Crysis 2 now manages to push that envelope. The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%.
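To put those percentages in perspective, here's a quick back-of-the-envelope sketch. The 60-fps baseline is a hypothetical round number of our choosing, not a measured result; only the slowdown ranges come from Hardware.fr's testing.

```python
# Hypothetical 60-fps baseline; slowdown percentages from Hardware.fr's
# tessellation-on-vs-off measurements in Crysis 2's DX11 mode.
baseline = 60.0

def after_slowdown(fps, pct):
    """Frame rate remaining after a given percentage slowdown."""
    return fps * (1 - pct / 100)

radeon_worst  = after_slowdown(baseline, 38)  # 37.2 fps
radeon_best   = after_slowdown(baseline, 31)  # 41.4 fps
geforce_worst = after_slowdown(baseline, 21)  # 47.4 fps
geforce_best  = after_slowdown(baseline, 17)  # 49.8 fps
```

Even in the best case for the Radeons, they'd land below the worst case for the GeForces from the same starting point, which is exactly the sort of gap that can skew cross-vendor comparisons.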
Radeon owners do have some recourse, thanks to the slider in newer Catalyst drivers that allows the user to cap the tessellation factor used by games. Hardware.fr's Damien advises users to choose a limit of 16 or 32, well below the peak of 64.
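Why does capping the factor help so much? As a rough sketch (ignoring partitioning modes, per-edge factors, and culling), uniformly tessellating a quad-domain patch at integer factor N produces an N-by-N grid of cells, each split into two triangles, so the triangle count grows with the square of the factor:

```python
def quad_patch_triangles(factor):
    # Uniform integer tessellation of a quad domain yields a factor x factor
    # grid of cells, each split into two triangles (edge cases ignored).
    return 2 * factor * factor

for f in (16, 32, 64):
    print(f, quad_patch_triangles(f))  # 16 -> 512, 32 -> 2048, 64 -> 8192
```

By this estimate, dropping from the DX11 maximum of 64 to a cap of 16 cuts the per-patch triangle load by a factor of 16, which is why the slider can recover so much performance on geometry-limited hardware.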
As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we've done with HAWX 2 in the past. We haven't decided exactly what we'll do going forward, and we may take things on a case-by-case basis. Whatever we choose, though, we'll be sure to point folks to this little article as we present our results, so they can understand why Crysis 2 may not be the most reliable indicator of comparative GPU performance.