Crysis 2 tessellation: too much of a good thing?

By now, if you follow these things, you probably know the sordid story of DirectX 11 support in Crysis 2. Developer Crytek, a PC favorite, decided to expand into consoles with this latest release, which caused PC gamers to fear that the system-punishing glory of Crytek’s prior, PC-only games might be watered down to fit the inferior hardware in the console market. Crytek assured its fans that no such thing would happen and, in a tale told countless times in recent years, proceeded to drop the ball dramatically. Upon release, Crysis 2 supported only DirectX 9, with very limited user adjustments, like so many other games cross-developed for the consoles. The promised DX11 support was nowhere to be found. Although the game’s visuals weren’t bad looking as released, they fell far short of fulfilling hopes that Crytek’s latest would be one of those rare games capable of taking full advantage of the processing power packed into a state-of-the-art PC. Instead, the game became known as another sad example of a dumbed-down console port.

Months passed, and rumors about a possible DirectX 11 update for the game rose and fell like the tide, with little official word from Crytek. Then, in late June, Crytek and publisher EA unloaded a massive, PC-specific update to Crysis 2 in three parts: a patch to version 1.9 of the game weighing in at 136MB, a 545MB patch adding DirectX 11 support, and a hulking 1.65GB archive containing high-res textures for the game. An awful lot of time had passed since the game’s release back in March, but the size and scope of the update sure felt like a good-faith effort at making things up to PC gamers.

With the DX11 update installed, Crysis 2 becomes one of the most visually striking and technically advanced video games in the world. The features list includes a host of techniques that represent the cutting edge in real-time graphics, including objects tessellated via displacement mapping, dynamically simulated and tessellated water, parallax occlusion mapping (where tessellation isn't the best fit), shadow edges with variable softness, a variant of screen-space ambient occlusion that takes the direction of light into account, and real-time reflections.

The highest profile of those features is probably tessellation, which seems to be the signature new capability of DX11 in the minds of many. Tessellation allows the GPU to employ its vast computing power to transform the low-polygon models used in most games into much higher-detail representations, with a relatively minimal performance cost. Used well, tessellation promises to improve the look of real-time graphics in some pleasant and impactful ways, eliminating—at long last—the pointy heads on so many in-game characters and giving difficult-to-render objects like trees much more organic external structures.


Lost Planet 2 without tessellation


Lost Planet 2 with tessellation
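
A quick note on mechanics: DX11 exposes tessellation through two new programmable stages, the hull shader and the domain shader, with a fixed-function tessellator sandwiched between them. Below is a rough sketch of how an application switches those stages on for a draw call at the D3D11 API level; the shader objects and names are placeholders for illustration, not anything lifted from CryEngine.

// A minimal sketch of enabling DX11 tessellation for one draw call. The shader
// objects are assumed to have been compiled and created elsewhere; every name
// here is a placeholder.
#include <d3d11.h>

void DrawTessellatedPatches(ID3D11DeviceContext* ctx,
                            ID3D11VertexShader*  vs,
                            ID3D11HullShader*    hs,  // decides how finely each patch is subdivided
                            ID3D11DomainShader*  ds,  // positions the new vertices (e.g., applies displacement)
                            ID3D11PixelShader*   ps,
                            UINT                 controlPointCount)
{
    // Tessellated geometry is submitted as control-point patch lists
    // rather than plain triangle lists.
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

    ctx->VSSetShader(vs, nullptr, 0);
    ctx->HSSetShader(hs, nullptr, 0);  // a bound hull shader is the telltale sign a draw is tessellated
    ctx->DSSetShader(ds, nullptr, 0);
    ctx->PSSetShader(ps, nullptr, 0);

    // The fixed-function tessellator between the hull and domain stages subdivides
    // each patch according to the factors the hull shader emits (up to 64 in DX11).
    ctx->Draw(controlPointCount, 0);
}

The presence of a bound hull shader is exactly how a frame debugger can tell, a few paragraphs down, that tessellation is in play for a given draw call.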

One of the major benefits of DX11's tessellation capability is its dynamic and programmable nature: the game developer can ramp up the polygon detail only where needed and scale back the number of polygons in places where they wouldn't be perceived—say, in the interior of objects composed of flat surfaces or in objects situated farther from the camera. Such dynamic algorithms can maintain the illusion of complexity without overburdening the GPU's geometry processing capacity.
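
The distance-based scaling described above amounts to a little per-patch math that normally lives in the hull shader's patch-constant function. Here's a simplified, CPU-side sketch of that math, with illustrative constants rather than anything taken from Crytek's code.

// Illustrative distance-based LOD: choose a per-patch tessellation factor so that
// nearby geometry gets more polygons and distant geometry gets fewer. The same
// math would normally run on the GPU in an HLSL hull-shader constant function;
// all of the constants below are made up for the example.
#include <algorithm>
#include <cmath>

float TessFactorForPatch(const float patchCenter[3], const float cameraPos[3],
                         float nearDist  = 5.0f,    // full detail inside this range
                         float farDist   = 100.0f,  // minimum detail beyond this range
                         float maxFactor = 16.0f)   // well under the DX11 cap of 64
{
    float dx = patchCenter[0] - cameraPos[0];
    float dy = patchCenter[1] - cameraPos[1];
    float dz = patchCenter[2] - cameraPos[2];
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

    // 1.0 at nearDist or closer, 0.0 at farDist or beyond, linear in between.
    float t = std::clamp(1.0f - (dist - nearDist) / (farDist - nearDist), 0.0f, 1.0f);

    // Never drop below 1 (no subdivision at all); ramp toward maxFactor up close.
    return 1.0f + t * (maxFactor - 1.0f);
}

With something like this in place, a flat slab sitting a few dozen yards away collapses to a factor of one and stays a handful of triangles. Keep that in mind as we look at what Crysis 2 actually does.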

Unfortunately, we have so far seen few examples of tessellation used well in a video game. That’s true in part because of the aforementioned scourge of console-itis (though the Xbox 360 does have limited tessellation hardware) and in part because game developers must create higher-detail versions of their underlying 3D models in order to insert them into games—not exactly a cost-free proposition.

With its DX11 update, Crysis 2 had the potential to be one of the first games to offer truly appreciable increases in image quality via tessellation. A busy summer has kept us from spending as much time with the DX11 update as we'd like, but we saw some intriguing things in Damien Triolet's coverage (in French) over at Hardware.fr. (English translation here.) We won't duplicate all of his hard work, but we do want to take a look at one aspect of it: a breakdown of Crysis 2's use of tessellation using a developer tool from AMD called GPU PerfStudio.

GPU PerfStudio is a freely available tool for Radeon graphics cards, and its integrated debugger can analyze individual frames in a game to see where GPU time is going. The work needed to construct each frame can be broken down by individual draw calls to the DirectX 11 3D API, and a visual timeline across the bottom of the screen shows which of those draw calls are taking longest to complete. PerfStudio will even let you take a look at the DX11 shaders being used in each draw call, to see exactly how the developer has structured his code.
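
However PerfStudio collects its numbers under the hood, the basic idea—bracketing a draw call with GPU timestamps and comparing the two—is something any developer can approximate with D3D11 timestamp queries. Here's a rough, error-handling-free sketch of that approach; it's illustrative, not a reconstruction of the tool.

// Rough sketch: bracketing one draw call with GPU timestamps to see how long it
// takes, the same basic idea a frame debugger reports per draw. Error handling
// and query reuse are omitted; 'issueDraw' stands in for whatever draw we want
// to measure (say, the tessellated Jersey-barrier call).
#include <d3d11.h>

double TimeDrawCallMs(ID3D11Device* dev, ID3D11DeviceContext* ctx,
                      void (*issueDraw)(ID3D11DeviceContext*))
{
    D3D11_QUERY_DESC disjointDesc  = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
    D3D11_QUERY_DESC timestampDesc = { D3D11_QUERY_TIMESTAMP, 0 };
    ID3D11Query *disjoint = nullptr, *start = nullptr, *end = nullptr;
    dev->CreateQuery(&disjointDesc,  &disjoint);
    dev->CreateQuery(&timestampDesc, &start);
    dev->CreateQuery(&timestampDesc, &end);

    ctx->Begin(disjoint);
    ctx->End(start);       // timestamp just before the draw
    issueDraw(ctx);
    ctx->End(end);         // timestamp just after the draw
    ctx->End(disjoint);
    ctx->Flush();          // make sure the GPU actually gets the work

    // Spin until the results come back; fine for a debugging tool, not for shipping code.
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT freqData;
    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(disjoint, &freqData, sizeof(freqData), 0) != S_OK) {}
    while (ctx->GetData(start, &t0, sizeof(t0), 0) != S_OK) {}
    while (ctx->GetData(end,   &t1, sizeof(t1), 0) != S_OK) {}

    disjoint->Release(); start->Release(); end->Release();
    return freqData.Disjoint ? 0.0 : double(t1 - t0) * 1000.0 / double(freqData.Frequency);
}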

When we fired up Crysis 2 in its DirectX 11 “ultra” quality mode, we saw that some obvious peaks were related to the creation of tessellated objects. Not only could we see the hull shaders used in the first stage of tessellation—proof that tessellation was in use—but we were also able to see the polygon meshes output by the tessellation process. We noticed some of the same things Damien pointed out, along with a few new ones, including one of the true wonders of this game’s virtual world.

The world’s greatest virtual concrete slab

Yes, we're talking about a concrete barrier of the sort that you'll find lining highways all across the country at this time of the year. Also known as Jersey barriers, these simple, stark structures are strewn liberally throughout the mid-apocalyptic New York cityscape in Crysis 2, providing cover and fortifying certain areas. You might not know it, and you almost surely didn't expect it, but these flat concrete blocks are one of the major targets for enhancement in the DirectX 11 upgrade to the game.

Jersey barrier in DirectX 9

Enhanced, bionic Jersey barrier in DirectX 11

Here’s a look at the DX9 and DX11 versions of the Jersey barrier. Rather than resize these first few screen shots, I’ve cropped them to provide you with a pixel-for-pixel capture of the game’s imagery.

You can see that there's not much visual difference between the two. The biggest change is the little "handles" atop the slabs. In the DX9 version, they're flat, just part of the texture. In DX11, they appear to be real structures protruding from the top of the barrier. I think there may be higher-quality textures in use in DX11, but some of that difference may simply come from the fact that I haven't duplicated the camera position precisely between the two shots. Whatever the case, the visual improvement when moving from DX9 to DX11 is subtle at best.

However, in the DX11 “ultra” mode, the handling of this particular object takes up a pretty good chunk of GPU time during the creation of this frame. Why? Well, have a look at the output of one of the most time-intensive draw calls:

The tessellated polygon mesh created in the DX11 version of Crysis 2

Yep, this flat, hard-edged slab is one of the most highly tessellated objects in the scene. The polygons closest to the camera comprise just a few pixels each. Farther from the camera, the wireframe mesh becomes a solid block of red; there, we're probably looking at more than one polygon per pixel. Those are some very small polygons indeed.

Let’s move around and have another look at the same barrier from the side, so we can get a cleaner look at its geometry.

These barriers are strewn all over the streets. Let's take a look at another one from a different part of the game.

Yes, folks, this is some truly inspiring geometric detail, well beyond what one might expect to see in an object that could easily be constructed from a few hundred polygons. This model may well be the most complex representation of a concrete traffic barrier ever used in any video game, movie, or any other computer graphics-related enterprise.

The question is: Why?

Why did Crytek decide to tessellate the heck out of this object that has no apparent need for it?

Yes, there are some rounded corners that require a little bit of polygon detail, but recall that the DX9 version of the same object without any tessellation at all appears to have the exact same contours. The only difference is those little metal “handles” along the top surface. Yet the flat interior surfaces of this concrete slab, which could be represented with just a handful of large triangles, are instead subdivided into thousands of tiny polygons.

Venice in Gotham

Another of Crysis 2's DX11-exclusive features is, as we've mentioned, dynamically simulated and tessellated water.

Gazing out from the shoreline, that simulated water looks quite nice, and the waves roll and flow in realistic fashion.

GPU PerfStudio gives us a look at the tessellated polygon mesh for the water, which is quite complex. It’s hard to say for sure, but the tessellation routine doesn’t appear to be scaling back the number of polygons dynamically based on their distance from the camera. As a result, the mesh dissolves into a solid purple band of pixels near the horizon. Still, the complexity is used impressively; the water is some of the most convincing we’ve seen in any game.

From the same basic vantage point, we can whirl around to take a look at the terra firma of Manhattan. In this frame, there’s no water at all, only some federally mandated crates (this is an FPS game), a park, trees, and buildings. Yet when we analyze this frame in the debugger, we see a relatively large GPU usage spike for a certain draw call, just as we saw for the coastline scene above. Here is its output:

That’s right. The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it’s not visible. The GPU is doing the work of creating the mesh, despite the fact that the water will be completely occluded by other objects in the final, rendered frame. That’s true here, and we’ve found that it’s also the case in other outdoor areas of the game with a coastline nearby.

Obviously, that's quite a bit of needless GPU geometry processing load. We'd have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn't doing this tessellation work unnecessarily.
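
To be clear about the sort of thing we mean, the check wouldn't have to be sophisticated. Below is a deliberately coarse sketch of one possible test; every structure, field, and threshold in it is hypothetical rather than anything from CryEngine.

// A deliberately coarse test of the kind we have in mind: skip the ocean draw
// entirely when nothing in the current view can possibly reach down to the water.
// The ViewBounds struct and its fields are hypothetical, not CryEngine's data.
struct ViewBounds {
    float minVisibleHeight;  // lowest world-space height of anything visible this frame
    float distanceToCoast;   // horizontal distance from the camera to the nearest shoreline
};

bool ShouldDrawOcean(const ViewBounds& view, float waterLevel, float maxViewDistance)
{
    // If every visible surface sits above the water level and the coastline is
    // farther away than the view distance, the water mesh is guaranteed to be
    // fully hidden, so there is no reason to tessellate it at all.
    bool waterHiddenBelowGround = view.minVisibleHeight > waterLevel;
    bool coastOutOfView         = view.distanceToCoast > maxViewDistance;
    return !(waterHiddenBelowGround && coastOutOfView);
}

A real engine would want something smarter than this; the point is only that even a crude test along these lines would spare the GPU a fully tessellated, fully invisible ocean.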

A happier example

Not all of the additional polygons in the DX11 version of Crysis 2 are wasted—far from it. For instance, we were looking for tessellated objects in the scene above, when we saw this:

Those bricks along the side of the window are incredibly detailed and are almost surely the result of tessellation with displacement mapping. In the DX9 version of the game, those bricks don't have individual contours or variation like that. Make no mistake: this is a lot of polygons, again approaching one poly per pixel in places. Still, this is an addition to the game that genuinely improves image quality.
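
For reference, the displacement half of that combination is conceptually simple: once the tessellator has produced a dense mesh, each new vertex is pushed along its surface normal by a height sampled from a texture, work that normally happens in the domain shader. Here's a CPU-side sketch of just that bit of math, with made-up types and constants.

// The core of displacement mapping, stripped to its math: push each vertex of the
// tessellated mesh along its normal by a height read from a texture. On the GPU
// this happens in the domain shader; the types and constants here are made up.
struct Vec3 { float x, y, z; };

Vec3 DisplaceVertex(Vec3 pos, Vec3 normal,
                    float sampledHeight,      // 0..1 value from the brick height map
                    float scale = 0.05f,      // how far, in world units, a full-height texel protrudes
                    float bias  = 0.5f)       // heights below this sink in (mortar), above it stick out (brick)
{
    float d = (sampledHeight - bias) * scale;
    return { pos.x + normal.x * d,
             pos.y + normal.y * d,
             pos.z + normal.z * d };
}

Used on a surface that actually has relief, like these bricks, that little bit of math is what buys the genuinely improved look; used on a surface that is flat anyway, it buys nothing.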

Unfortunately, because there’s a little patch of water out on the horizon, this entire area has a tessellated water mesh beneath it.

One complex scene

DirectX 11 tessellation and some of the other effects have been added into Crysis 2 somewhat haphazardly after the fact, and some scenes have little or no such “enhanced” content. Others, though, are chock full of it. We’ve seen a few examples of specific tessellated objects; now let’s take a closer look at a frame that ties a bunch of these elements together.

We're back in the first level of the game, having just emerged into the city for the first time. The building we were in is a tangled wreck of debris. Again, these are some pretty nice visuals for a modern game. The splintered wood and rubble are complex enough to look relatively natural and realistic, for the most part.

The wood on the floor isn't just a flat texture mapped to a flat surface. Instead, the boards are angled and warped—and tessellated heavily. Some of the boards really are just long stretches of flat surfaces, though, and even those are composed of many thousands of tiny polygons.

The displacement-mapped bricks that we saw in the prior scene are back here, covering a partially destroyed wall. Once again, they look very nice, and the polygon counts are huge.

The splintered wood supports in the scene are also heavily tessellated, so much so that the mesh begins to look like a solid surface.

A closer look at some of that tessellated wood reveals some straight, rectangular planks. The only really apparent complexity is at the splintered ends of one plank, but even those spikes aren’t terribly jagged. They’re just a handful of sharp points.

Amazingly, the flat window frame, the splintered plank, and the interior wall are all made up of incredibly dense polygon meshes.

Pulling back to our full scene once again, we note that beneath it all lies a blanket of wavy water, completely obscured but still present.

So what do we make of this?

Crytek’s decision to deploy gratuitous amounts of tessellation in places where it doesn’t make sense is frustrating, because they’re essentially wasting GPU power—and they’re doing so in a high-profile game that we’d hoped would be a killer showcase for the benefits of DirectX 11. Now, don’t get me wrong. Crysis 2 still looks great and, in some ways at least, is still something of a showcase for both DX11 and the capabilities of today’s high-end PCs. Some parts of the DX11 upgrade, such as higher-res textures and those displacement-mapped brick walls, appreciably improve the game’s visuals. But the strange inefficiencies create problems. Why are largely flat surfaces, such as that Jersey barrier, subdivided into so many thousands of polygons, with no apparent visual benefit? Why does tessellated water roil constantly beneath the dry streets of the city, invisible to all?

One potential answer is developer laziness or lack of time. We already know the history here, with the delay of the DX11 upgrade and the half-baked nature of the initial PC release of this game. We’ve heard whispers that pressure from the game’s publisher, EA, forced Crytek to release this game before the PC version was truly ready. If true, we could easily see the time and budget left to add PC-exclusive DX11 features after the fact being rather limited.

There is another possible explanation. Let’s connect the dots on that one. As you may know, the two major GPU vendors tend to identify the most promising upcoming PC games and partner up with the publishers and developers of those games in various ways, including offering engineering support and striking co-marketing agreements. As a very high-profile title, Crysis 2 has gotten lots of support from Nvidia in various forms. In and of itself, such support is generally a good thing for PC gaming. In fact, we doubt the DX11 patch for this game would even exist without Nvidia’s urging. We know for a fact that folks at Nvidia were disappointed about how the initial Crysis 2 release played out, just as many PC gamers were. The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn’t benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy’s HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.

Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces. The Fermi architecture underlying all DX11-class GeForce GPUs dedicates more attention (and transistors) to achieving high geometry processing throughput than the competing Radeon GPU architectures. We’ve seen the effect quite clearly in synthetic tessellation benchmarks. Few games have shown a similar effect, simply because they don’t push enough polygons to strain the Radeons’ geometry processing rates. However, with all of its geometric detail, the DX11 upgraded version of Crysis 2 now manages to push that envelope. The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%.

Radeon owners do have some recourse, thanks to the slider in newer Catalyst drivers that allows the user to cap the tessellation factor used by games. Damien advises users to choose a limit of 16 or 32, well below the peak of 64.

As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we’ve done with HAWX 2 in the past. We haven’t decided exactly what we’ll do going forward, and we may take things on a case-by-case basis. Whatever we choose, though, we’ll be sure to point folks to this little article as we present our results, so they can understand why Crysis 2 may not be the most reliable indicator of comparative GPU performance.

Comments closed
    • luisnhamue
    • 8 years ago

    I just built my new gaming rig, which includes a Radeon HD 6950 2GB. And I was going to play Crysis 2. After seeing this investigation by techreport, im going to play anyway, i will cap the tesselation levels as recommended.
    For the cheating that Crytek did, i feel shame.

    • Kaleid
    • 8 years ago

    Video:
    [url<]http://www.youtube.com/watch?v=IYL07c74Jr4[/url<]

      • l33t-g4m3r
      • 8 years ago

      Watching it in motion has a lot more impact than screenshots.

    • Novum
    • 8 years ago

    There is a good reason why the water is always “visible”.

    It’s derived from the water rendering algorithm of Crysis 1/CryEngine 2. It’s a camera aligned mesh that is rendered after all opaque geometry and therefore invisible parts of it are very quickly rejected by the GPUs Hier-Z mechanism. This is very efficient without tessellation, because the geometry load is minimal without it. So always rendering it was actually the fastest way.

    With tessellation it’s another story, because the geometry load actually is quite substantial with all the domain/vertex/hull shader work to do. But doing the visibility calculations on the CPU instead or using occlusion queries wouldn’t have been trivial.

    As with the other D3D11 stuff it seems to have been implemented in a rush or as an afterthought.

    Nothing wrong with that from my perspective, because it’s a free add on. I also don’t think there is some NVIDIA conspiracy behind it, but you never know 😉

      • l33t-g4m3r
      • 8 years ago

      Between this and what wtfbbqlol says, it does seem that the crysis 2’s tessellation was somewhat of a rush job. The part that seems malicious is the concrete slabs. Still, you’d have thought crytek would have known about the water, and implemented a lod or something. Crytek, kings of bloat since crysis 1.

        • Novum
        • 8 years ago

        There is some kind of LOD. Otherwise it would turn all pink much earlier.

        Also I do still believe that Crysis is just heavy on computations and was actually pretty well optimized.

        For subjective “linear” better graphics you need exponential more computing power. That’s the problem.

    • wtfbbqlol
    • 8 years ago

    I wonder if tessellation is just tricky to apply optimally and these ridiculous amounts of polygons are a byproduct of that.

    I did a quick search for this and found a developer’s blog that highlights his beef with DX11’s inherent (unless very smartly controlled) tessellation wastefulness:

    [url<]http://sebastiansylvan.wordpress.com/2010/04/18/the-problem-with-tessellation-in-directx-11/[/url<]

    edit: maybe someone who knows DX11 tessellation programming could comment. I don't know enough about these things to explain.

    • davidm71
    • 8 years ago

    Who would have thought the business of gfx tech would be so cut throat?! So Nvidia conspired with Crytek to develop a more tessellated game that favors Nvidia’s more powerful 500 series cards over ATI 6000 series cards in head to head benchmarks! I almost don’t have a problem with that if your using Crysis 2 to sort out and benchmark the best of the best which just happens to be Nvidia. Sorry ATI but that’s just how that horse race played out and Nvidia’s ahead by more than a nose here. The only problem I see here is that also hurts people with weaker Nvidia cards and shows Crytek doesn’t give squat about well optimized code!

    Personally I really don’t care as I have best of both worlds with two 480s in Sli and two 6970s in crossfire. Just wish that damn dual gpu flicker in Crysis would go away!

      • Bensam123
      • 8 years ago

      Mispost

    • spigzone
    • 8 years ago

    The smoking gun is all the other added effects were applied with some precision and attention to detail to extract the greatest graphics quality improvement with minimum gpu loading. ONLY tesselation was applied to maximize gpu loading without significant graphics quality gain.

    I would not be at all surprised if:

    1. Nvidia programmers were directly involved with the tesselation coding.

    2. Retail salespersons at Best Buy etal. are being ‘guided’ to inquire of prospective GPU buyers if Crytek games in general and Crisis 2 in particular were games they had or intended to buy and strongly push Nvidia GPUs in general and high end Nvidia GPUs in particular if Crisis 2 was going to be played based on ‘objective’ performance scores.

    The crux of the problem with this TWIMTBP approach is it WORSENS the playability for AMD gamers instead of being NEUTRAL.

    It’s one thing for Nvidia gamers to have a better gaming experience because Nvidia added Nvidia specific optimizations that didn’t affect AMD gameplay, it’s another entirely for AMD players to have a worse gaming experience because Nvidia’s optimizations made the AMD gameplay worse than if those optimizations had not been implemented.

    That meets the literal definition of sabotage.

      • clone
      • 8 years ago

      in this case TWIMTBP worsens the gaming experience for everyone and not just AMD users.

      AMD users see a 30% drop because of TWIMTBP “optimisations” while Nvidia users also see a 20% drop in performance.

      everyone loses but Nvidia users lose a little less which is all good for Nvidia apparently.

    • Fighterpilot
    • 8 years ago

    Great,a fully occluded ocean scene with enough tessellation to exceed the geometry processing power of AMD DX11 cards…..but well within that of the NVidia competition.

    Good game NVidia.

    Hopefully BF3 will include high speed # hashing in order to reload or enter/exit a vehicle…..

    • swaaye
    • 8 years ago

    Possibilities –

    1) The guys who worked on this DX11 patch didn’t really know what they were doing. Seems unlikely because the patch does function and has some fancy stuff in it.
    2) They just wanted to put something out quick to appease the bitching around the ‘net and tessellated stupid stuff as a result.
    3) NVIDIA paid Crytek/EA to make sure tessellation was overboard because it will hurt ATI much more with their lower geometry throughput. Nice way to tilt the benchies.
    4) Crytek hates AMD and wanted to punish them for some evil reason. Maybe AMD/ATI devrel made them angry. I know some developers definitely prefer NV hardware.

    I’m leaning towards #2, myself.

    • beck2448
    • 8 years ago

    Nvidia for the pros, whining fanbois for AMD.

    • willyolio
    • 8 years ago

    the scene with the splintered wood was the most obvious “tesselation for the sake of tesselation” to me. Isn’t this exactly what tesselation is made for? to add extra polygons and splinters when you zoom in close?

    nope. 100,000 triangles added to make an incredibly smooth cylinder while the broken wood still looks like bart simpson’s hairdo.

    • NikK
    • 8 years ago

    This guy even deletes comments, amazing…..

      • Cyril
      • 8 years ago

      Do you see a post missing? Our comment system doesn’t let us make comment posts disappear; when a post is actually deleted, it remains in the comment hierarchy and its contents are simply replaced with the words “post deleted.” I don’t see anything like that in this discussion thread.

      • l33t-g4m3r
      • 8 years ago

      Cyril’s right. However, it sounds like you feel guilty about your posting, knowing you’re doing something that should get your posts deleted, therefore blaming a missing post glitch on it being deleted. Perhaps if you weren’t shilling for nvidia, you’d have a clear conscience. Just my 2c.

    • The Dark One
    • 8 years ago

    I think Carmack’s take on current-gen tessellation was pretty interesting:

    [url<]http://www.youtube.com/watch?v=hapCuhAs1nA#t=21m50s[/url<]

    • NikK
    • 8 years ago

    This is incredibly silly even to read…

    On one hand we have people SCREAMING for more detail and better implementation of DX11 and Tessellation and when a company Finally does that some people start whining…

    That aside it’s very “obvious” that certain websites and game titlesfavor NVIDIA and others AMD/ATI….

    Half Life 1&2 favored ATI solutions, so did pretty much every game Valve ever released….The same applies with many titles by Codemasters…..The same applies to many other titles in the market…..3DMARK used to support PhysX but as soon as NVIDIA bought the entire company behind it and ATI pushed they discontinued support…..We also saw how AMD/ATI have reduced the IQ of their drivers when they launched series 6xxx in order to improve their performance….Even more recently AMD/ATI stopped supporting Sysmark 2012 because it didn’t favor their products…..

    Strangely enough none of the above is mentioned in this discovery….Yet we hear about the bad HAWX II and the “conspiracy” over at Crytek….

    So from now on we will say to every developer out there to use worse graphics just so cards made by AMD/ATI can compete with the ones from NVIDIA ? Are you people for real ?

      • Bensam123
      • 8 years ago

      You didn’t even read the article or what the article highlighted that people are poopooing on.

      • Chrispy_
      • 8 years ago

      “On one hand we have people SCREAMING for more detail”

      They’ve at least quadrupled the polygon count per frame, and the only visible improvement is some lumps on top of a concrete barrier. and some lumpier bricks in the occasional windows.

      We want more detail, not more polygons. Once upon a time, there was a relationship between the two…

      • eitje
      • 8 years ago

      Welcome to Tech Report!

      I notice that you registered just prior to posting this, and I wanted to make sure you felt welcome!

      • sschaem
      • 8 years ago

      We are not talking about optimization focus on one platform, but about sabotage.

      Its one thing to leverage AMD or nVidia dev relation to your advantage, its another to release poison to end users.

      The ‘charity’ dx11 pack from Crytek is a joke. Crytek reputation on the PC with Crysis2 went down the drain.

      • ElMoIsEviL
      • 8 years ago

      Your post was incredibly silly to read fyi.

      People want a “better” implementation of DX11 and thus Tessellation. People do not want the usage of such features used to unreasonable/irrational degrees just to get a one up on the competition as is the case here.

      As for these “Titles” favoring AMD (ATi) or nVIDIA… well on the surface it may look like an equal playing field but in reality it is not the same. You have to look at the details in order to understand the differences… aka the means to an end.

      Half Life 2 (Source) ran better on ATi hardware because ATi followed the Microsoft DX9 specifications down to a T whereas nVIDIA had gone down its own path in retaliation to Microsoft (GeForce FX 5800 Ultra days).

      The Codemasters games are just games built to use many DX9/10/11 features. They are not compiled for AMD or any AMD specific features. Codemaster titles generally use many simple shaders rather than few complex shaders which tends to favor AMDs architecture. This has changed recently with AMDs switch from 5D to 4D. So no they are not in cahoots.

      3DMark discontinued support for PhysX because it would have given the nVIDIA cards an unrealistically larger score (not many games use PhysX so the scores would not be reflective of gaming in general). Yes rather logical reasoning on Futuremarks part.

      I have not witnessed a degradation of IQ and have not seen any websites highlighting this. Even if it were true… to use that as some sort of attack on AMD is hypocritical as nVIDIA ALWAYS reduce IQ and have since the days of the Detonator 2 drivers.

      AMD stopped support Sysmark 2012 because they reckoned it didn’t not fairly represent their products. I can’t speak for their reasoning and you appear to simply be grasping at straws trying to find any dirt you can on AMD in order to further your own agenda and justify your obviously biased and irrational world view.

      And the argument has nothing to do with “worse graphics”… can you not read? The reasoning behind this entire article is as follows (I’ll repeat it rather slowly for you simple minded folks)… there appears to be an unreasonably large amount of tessellation used in certain titles that result in a performance hit while not improving image quality one bit. That is to say that there appears to be a deliberate attempt at increasing the usage of tessellation in order to favor nVIDIA products while resulting in no net benefit image quality wise.

      Got it?

    • clone
    • 8 years ago

    very ballsy article, impressed how you managed to keep it honest, the sumnation potent yet balanced and politically correct.

    incredible respect for the site.

    thank you.

    • j1o2h3n4
    • 8 years ago

    Without Scott’s hard work we won’t know what went wrong behind the scenes, needless to say an article to put our opinion on. I think scott was very generous to quote “Damien Triolet” and his intelligent use of GPU PerfStudio on Crysis, as this article has so much more good elaborate details. Incredible exploit, i love it.

    By the way, i’m ignorant, “GPU PerfStudio” is from AMD, i wonder why Nvidia didn’t had a similar software?

    • Rikki-Tikki-Tavi
    • 8 years ago

    I’m a little bit of an Nvidia and Crytek fanboy, but something has gone seriously awry here. Yet, it’s a bit early to jump to the conclusion that there is malicious intent here. It smells of the “fundamental attribution error”, a well known effect in psychology.

    [url<]http://en.wikipedia.org/wiki/Fundamental_attribution_error[/url<]

    Those of you who have a job, ask yourselves: have you screwed up this badly or worse at work at some point of time? With the boss breathing down your neck, asking you to finish it quickly so you can do something that actually brings in money for the company?

    That is the essence of the fundamental attribution error: We are reluctant to see the pressured newly-hired code monkey charged with way more than he can handle (situational) and are very happy to see the evil corporate conspiracy (dispositional). Not that I'm saying there couldn't be such a conspiracy, I'm just saying I, for my part are still withholding judgment on this one.

    And any way you turn this it reflects badly on Crytek.

      • Kaleid
      • 8 years ago

      Don’t need a conspiracy to know that they are after more money. That’s how most businesses operate.

        • Rikki-Tikki-Tavi
        • 8 years ago

        If two companies decide to make a secret deal, that is good for them and bad for their costumers, that is a conspiracy. It’s not how most businesses operate, because if you suggest something like this to a company, there is always the risk that someone on the other side will go to the press.

          • Kaleid
          • 8 years ago

          Along with clergymen businessmen are amongst the least honest ones. Some things are kept quiet about because everyone knows how it things operate. Mutual silence benefits both sides. Watch Inside Job on the 2008 collapse for instance, many knew exactly was what going on.

            • Rikki-Tikki-Tavi
            • 8 years ago

            Different companies. People at financial institutions got there because they want to make as much money as possible. Most game makers are driven by their desire to make the best possible product. That would be a lot of motivation to blow the whistle if they found out that their company was doing the opposite.

      • cegras
      • 8 years ago

      [quote<]We are reluctant to see the pressured newly-hired code monkey charged with way more than he can handle [/quote<] So his solution to this stress was to just over tesselate flat objects?

        • Rikki-Tikki-Tavi
        • 8 years ago

        Yes. He screwed up. It happens, and it’s not proof of a secret deal.
        Maybe they had a spreadsheet that detailed what level of tessellation they where going to apply, and someone skipped a row or something else. Maybe the spreadsheet was right, but the indexing in the code is off. That one happened to me personally before. More than once.

        It all comes down to how Crytek reacts now: if they correct it, than we can pretty conclusively say they just screwed up. If they don’t, they are probably evil.

          • cegras
          • 8 years ago

          Occams razor. Nvidia funded Crytek heavily, they optimized for nvidia gpus, etc.

            • l33t-g4m3r
            • 8 years ago

            Exactly. The easiest way to tell if a game might be compromised is if you see the TWIMTBP/nvidia logo upon starting the game.

            • Rikki-Tikki-Tavi
            • 8 years ago

            Occam’s razor? You don’t even know what that means. It’s screw-up versus corporate conspiracy. Occam’s razor cuts the conspiracy to very fine shreds.

            • cegras
            • 8 years ago

            Nvidia pays Crytek lots of money to optimize. Programmers who work at Crytek are most likely not retarded enough to tesselate flat objects more than 1 polygon per pixel.

    • PixelArmy
    • 8 years ago

    Disclaimer: nVidiot here, so mod away…

    How many ultra-smooth sidewalks or roads do you see? Sure, your concrete or whatever is a [i<]basic[/i<] flat shape, but there's lots of little variations/bumps/texture everywhere. The author accepts this as the reason why there's crazy tessellation for the bricks, but fails to extend this to the jersey wall, wood planks, etc.

    I am not a graphics programmer... Anyone have technical details on how the water is implemented? Is it one mesh? How's the LOD work? My guess is if it's one object and their LOD is distance based, you're getting the higher LOD for the [i<]entire[/i<] object which just so happens to span way back to the horizon. Similarly, the underground water, it's added to the scene geometry or it isn't.

    Want a few flat triangles (and presumably bump maps?), well that's DX9 console stuff... You know the stuff everyone complains about... But add DX11 stuff and everyone pines for the DX9 stuff. This is why we can't have nice things.

      • Aphasia
      • 8 years ago

      [QUOTE]How many ultra-smooth sidewalks or roads do you see? Sure, your concrete or whatever is a basic flat shape, but there’s lots of little variations/bumps/texture everywhere. The author accepts this as the reason why there’s crazy tessellation for the bricks, but fails to extend this to the jersey wall, wood planks, etc.
      [/QUOTE]If they actually had used the triangles for deformation that would’ve been true, but the point was, that they didnt. Take a second look at the wireframe images. In the DX11 patch you had the geometry deformation effects accourding to textures, to actually get depth in the roadtracks, etc. But with the tessalation, its not any geometry deformation. It still a flat surface with virtual depth done in the texture. Look at the extension at the side of the concrete barriers, same with the planks and the windowsill. Except with the bricks, where you actually add deformation with the tessallation.

      So if it would’ve been one triangle, or a thousand, its still a flat surface. If they had used the extra triangles for true detail, that would have been a whole other thing.

    • aug
    • 8 years ago

    Great article. I was certainly disappointed once the trio of patches were released. As a 5870 CF user, I was hoping for a HF or profile update from AMD. While some were released for general performance, I still cannot enjoy the Ultra setting on HW that should normally chew through any game. I have another PC running 2 470s in SLI and the performance difference between the 2 is laughable. I will try adjusting the tess. setting in the CCC, though I’m pretty much reserved to running on the extreme preset

    • l33t-g4m3r
    • 8 years ago

    Tessellation is the new Physx. Physx tanked because nobody with an AMD card could use it, it was pointless, and it was occasionally too slow for nvidia users. Plus, nvidia was hyping it only when they had no dx11 cards out. Tessellation, on the other hand is something that is standard for dx11, so nvidia can directly manipulate benchmarks by sponsoring dirty tricks like this. However, amd now includes a tessellation option in their control panel, so users won’t be greatly affected. The real problem will be benchmarkers not using the options.

    Another side affect of this is blowback to nvidia users. We don’t have any tessellation control, and those with slower cards like the 460 will be unable to use tessellation because of slow performance. This is a lose-lose scenario, as the only side that benefits is nvidia from 580/SLI sales. It’s planned obsolescence.

    • tejas84
    • 8 years ago

    Wow I never thought I would see TR join [H]ardOCP as AMD GPU Shills.

    How much did AMD pay you to encourage TR to write this gratuitous attack on NV hardware.

    Smacks of an axe that AMD have been grinding for a while i.e their inferior tesselation hardware

      • Meadows
      • 8 years ago

      Read the article, moron.

        • DeadOfKnight
        • 8 years ago

        Please don’t feed the trolls.

          • Meadows
          • 8 years ago

          Aww mooom, don’t take away my fun!

      • sschaem
      • 8 years ago

      nothing to do with nVidia , this is all in the face of cRytek

        • danny e.
        • 8 years ago

        it might have something to do with nVidia – we just don’t know for sure. The important part is no matter the case this isn’t pro-AMD. It’s anti-stupid.

      • ElMoIsEviL
      • 8 years ago

      AMD Shills?

      When increasing the computational load for a given feature results in no noticeable increase in the end goal (image quality in this case) then why increase the load? Just because you can?

      Just because you can is not a logical/rational argument. I see no shilling here… what I do see is an individual (yourself) emotionally attacking various institutions for daring to hold differing opinions to your own.

    • Coruscant
    • 8 years ago

    My suspicion back to Crysis 1 was that the developers were likely more lazy than sophisticated. The fact that Crysis remained unplayable at max settings, and high resolutions through better than 3 generations of video cards should have been a hint. The graphical presentation is top notch, but was it 3x better? In relative terms, it’s easy to roll out highly detail, poorly performing graphics. The difficulty is in providing the graphical detail while maintaining performance. There’s pushing the envelope, and there’s complete ignorance as to the envelope at all.

    • ohtastic
    • 8 years ago

    Meh. Crysis 3 will be set in Arizona. Problem solved. Guess this proves graphics absolutely don’t sell a game, considering the ill-will and general negativity generated around Crysis 2. I do wonder how Battlefield 3 will play out as far as performance and differences between DX10 and DX11 go. BF3 has PC as lead platform, fingers crossed. EA should learn something from the Crysis 2 shenanigans.

      • sreams
      • 8 years ago

      Arizona? All of those flat deserts will require trillions of polygons. You just made the problem worse.

    • Spotpuff
    • 8 years ago

    That water thing would be hilarious if it weren’t so wasteful.

      • Stargazer
      • 8 years ago

      I think that similar wastes (rendering large amounts of polygons that can never be seen) is actually more common than one might think. Unfortunately…

    • DarkUltra
    • 8 years ago

    This is outrageous. I’m glad I haven’t bought Crysis 2 yet. If they fix this I will buy. It would be cool if the leaves on the ground where where tessellated and their control points had a geometry shader to make them flapping in the wind… hmm

    • ModernPrimitive
    • 8 years ago

    Great article. This is the kind of reviews and information the community needs. I don’t even own a gaming desktop now but I still appreciate the effort. Thanks

    • anotherengineer
    • 8 years ago

    Well I used to work at a concrete plant and outside in the sun I can say that I have NEVER seen the top of a jersey barrier to glow like that in the sun. And whats the mess on the ground beside the barrier, are those supposed to be leaves? I have seen better detail from the source engine.

    Just fail

    Turn off that HDR!!!

    • Chrispy_
    • 8 years ago

    Very nice exposé.

    Crysis ran like crap on hardware of its day because the engine wasn’t optimised very well.
    Crysis 2 runs like crap on hardware of today, because the engine isn’t optimised very well.

    How incredibly surprising.

    Nvidia bribing Crytek to pointlessly push tesselation to the max, for noticeable performance advantage over their major competitor?

    I am, yet again, shocked by this ‘discovery’

    I miss the good old days of Carmack when you were genuinely amazed at the efficiency and speed of an engine. Today’s solution seems to be “we can’t be bothered to code properly, just throw more GPU horsepower at it.”

    If developers are capable of making optimisations to get good graphics on limited console hardware, they’re equally capable of optimising the new DX11 features to improve, rather than worsen the user experience. Personally, I’d take “very nice graphics” at 60fps over “slightly better graphics” at 40fps any day of the week.

    • drfish
    • 8 years ago

    Got a chance to actually read this. Wow, just wow.

    Reminds me of the Civ5 thing I mentioned to you awhile back… With dual 5870s (@ 5540×1050) my frame-rate goes from 18-20fps to 45-55fps just by turning tessellation to low – with no difference in graphics quality that I can detect. Would be curious to see the polygon count there… Might have to try that tool myself…

    • Pax-UX
    • 8 years ago

    Can’t wait for the SETI@home intergeneration patch to make those graphics look more awesomer while searching for real Aliens!

    It’s great fun to watch stupid people being stupid in new and interesting ways… stupid peepole FTW!

    The Sea issue seems to me like a single infinite plain with deformation added, easy thing to code, adding clipping would be costly. The problem is the order of the clipping, Tessel -> Clip or Clip -> Tessel. Assuming infinite plain, it would always have the possibility of being seen so you need add some customization to the engine to turn Tessel on & off by object and visibility… i.e. bounding boxes based of character location + height + direction as cheap Line of Sign implementation before hitting the rendering engine…. then you have to worry about stuff like reflection be correct, not hard to fix just setup everything as a camera with a LoS and render accordingly.

    • Meadows
    • 8 years ago

    In a dash of ironic comedy, the last picture in the article (the “complex scene”) shows flat wooden planks having three thousand times more polygons than, – wait for it -, the steel I-beams around the middle of the picture, which still look like we’ve never left 2007.

    I’m unsure how to describe this. At the very least, I would’ve expected them to bolster the polygon count everywhere and be done with it, but this way it looks [i<]malicious[/i<], like it's [b<]designed to suck.[/b<]

      • derFunkenstein
      • 8 years ago

      Maybe they should quit their jobs and go work for Oreck.

      • ronch
      • 8 years ago

      [quote<]it's designed to suck[/quote<] Yeah. Suck money into Nvidia's pockets, that's what. Who knows, Nvidia told these guys to use lots of tesselation (and make AMD look bad in the process) and use up GPU power unnecessarily. Guess that's the only way to make faster-than-necessary GPUs start to look old and make you want to buy a new video card.

        • stdRaichu
        • 8 years ago

        Don’t tell anyone I told you this, but nVidia’s next line of GPU’s utilise thinking aluminium and denimite mem-shards, which will automatically optimise the graphics.

        nVidia: this way we’re meant to get paid!

          • ronch
          • 8 years ago

          For a second there I thought you said ‘dynamite’. Now that’s explosive graphics if I ever heard of one! 🙂

          • ronch
          • 8 years ago

          [quote<]Don't tell anyone I told you this[/quote<] And you post it here in the forums for everyone to read?

      • sreams
      • 8 years ago

      “So you’re acting now, you’re in a vampire movie, yes? That’s good. Finally, a role that requires you to suck.” – Triumph the Insult Comic Dog

    • holophrastic
    • 8 years ago

    It’s a stupidly-well written article, I’m really impressed in every way. Looks like it took a lot of tedious work too.

    But my conclusions are somewhat different. I’m an AMD Radeon customer, very satisfied since my 3dfx days ended. So I should be on the angry side of this fence.

    But I’m not. I’m actually impressed with the technology demonstration shown here.

    Instead of seeing a poorly-executed attempt, or a one-sided competative advantage, I’m seeing the other side.

    Hey look Ma! If I push these sixteen buttons, I can get it to look even better!

    Forget about the added detail versus GPU effort, and look at the added detail versus developer/designer effort. I’ll bet it took little incremental effort to produce the added detail. That’s impressive as far as technology goes.

    Sure, with more effort and more time and more fine polish, we can take less of a performance hit. But that’s not the idea at this point. The idea is to push the boundaries of detail, not the boundaries of efficiency.

    I remember in 1997, I got a personal tour of Alias Research aka Alias Wavefront, aka the guys behind Maya aka the guys behind the software that rendered the perfect storm aka the guys who had a sizable room with a refridgerated computer with, wait for it, 16 gigs of ram!

    The impressive part was not how few cpu cycles it took to render things. They used to speak of rendering time in years, not hours, let alone real-time. They were working on the running yeti, with hair.

    But up on the wall, was a poster of a lamborghini. It was rendered. It was awesome. It was two years of computer time for the single frame. No body cared about the computer time. The impressive part was the raytraced detail. It was very much photo-realistic, and every phong counts.

    The same is true here. I don’t care about proportionately here. I don’t think that was the idea. If you want higher detail, we have higher detail. So sorry it’s not efficient today. If you want it to be efficient, you can wait a year. But here’s the detail today. Enjoy.

      • APWNH
      • 8 years ago

      I really like the things you mentioned here. It’s quite amazing to see just how incredibly tessellated these models are and it really says something about the capabilities that our hardware is capable of achieving these days. Yes, it is difficult to overstate just how wasteful it is to be generating geometry for an entire ocean when it is invisible, and subdividing FLAT surfaces, but when you think about it from another point of view, the fact that all these millions of polygons are in fact being rendered in real time, it isn’t so hard to get back that nice warm fuzzy feeling. Because one day all those polygons are gonna be put to good use by some good devs, whether it is crytek or not. And my GTX 560Ti or Radeon 6950 will render it fluidly and interactively. And it will be glorious.

      • ThorAxe
      • 8 years ago

      Agreed. It is a very well written article.

      However, I still thoroughly enjoy the detail when it’s visible in Crysis 2

      • Meadows
      • 8 years ago

      I have no idea how to give you more than 1 minus vote.

      • GrimDanfango
      • 8 years ago

      I work in the effects industry. Rendering time is certainly a major consideration, but I can say with utmost certainty that the work that goes into creating a photo-realistic image far far outstrips the rendering time. Photo-realism doesn’t come from pressing the right button and waiting for a year, it comes from experience, understanding, artistry. Just flicking on “raytracing” and upping the ray-recursion limit will only yield a very accurately calculated unrealistic image. Raytracing is just a tool, and like any tool, it only gives the best results when used by someone with experience and skill.

      That is why this is such a preposterous demonstration of a new technology. Tessellation is a hugely powerful tool, but few developers seem to have actually realised that it will still require some very carefully considered and disciplined research and application to get anything out of it. It is not an automatic “make beautiful” flag… I know this for certain, because *nothing* in computer graphics comes for free.

      Take the Battlefield 3 videos – they are absolutely stunning, really breathtaking in places. I can guarantee this isn’t because they’ve got the latest DX11 bells and whistles switched on. The Frostbite 2 engine is no doubt highly cutting-edge, but it doesn’t mean jack without a team of incredible artists who have a cutting-edge understanding of how to model and texture and light the game photo-realistically.

      I’ve said it for a long time… GPU technology really isn’t as important as people like to imagine. If you put the right team of artists and programmers together, they could build a breathtaking and photoreal game on DX8. They could do more with DX11, sure… but DX11 alone, nor any new technology, will ever magically improve graphics without someone highly skilled putting in the work to get the most out of it.

      Tessellation will only show its true potential when a developer realises it has to build the engine and create the 3D assets specifically tailored to tessellation.

        • poulpy
        • 8 years ago

        I have no idea how to give you more than 1 plus vote.

    • r00t61
    • 8 years ago

    Actually, I think those slabs could use some more polygons.

    But seriously, I could read this article over and over. Great work, TR.

    • ThorAxe
    • 8 years ago

    I know it’s fashionable to bash Crysis 2 but I’m going to buck the trend and post some good news for Crytek:

    [i<]The big winner of the evening was undoubtedly “Crysis 2” by Crytek. Europe's gamers have rewarded the FPS not only with the awards for “Best European Action Game,” “Best European Sound,” “Best European Advertisement” and “Best European Art Direction”, but the game was also voted “Best European Game” overall. Crytek’s development HQ was awarded the prize of “Best European Studio”. Because Crytek won so many categories, the host nation, Germany, ended up just before the United Kingdom in the category “Best European Game Country”.[/i<] [url<]http://www.gamasutra.com/view/pressreleases/75977/European_Games_Awards_2011_Winners_Announced.php[/url<]

      • ThorAxe
      • 8 years ago

      Wow, marked down for posting some related news…must be the under 40 set at it again. 🙂

        • wierdo
        • 8 years ago

        Probably cause it looks like an ad billboard and barely related to the actual meat of the topic.

        • Chrispy_
        • 8 years ago

        You sound like a PR marketeer for EA/Crysis. People don’t like that.

        However you try to spin it, Crysis 2 was a rushed, dumbed down console-port by Crytek that represented everything bad about what sloppy console ports are, and this is in the face of Crytek’s promises that they wouldn’t abandon their roots and sacrifice PC development to pander to the console market.

        They lied.
        They failed.
        They disappointed.

        This patch, the olive leaf they offer in peace – is a shoddy, inefficient abuse of a highly-contested DX11 ‘feature’ which, when abused as badly as it here provides negligible IQ gain outside of the improved texture pack, yet has a bigger impact on AMD hardware. Is it significant that this is a TWIMTBP game? Probably – we can only speculate. Either Crytek were lazy and rushed, or they were bribed by nvidia. Either possibility doesn’t make Crytek look like the good guys.

          • Stargazer
          • 8 years ago

          [quote<]Either Crytek were lazy and rushed, or they were bribed by nvidia. Either possibility doesn't make Crytek look like the good guys.[/quote<] Now, now, let's not rush to conclusions. It's also quite possible that they're simply incompetent.

            • Chrispy_
            • 8 years ago

            Ah, there’s me trying to apportion blame on someone, when in fact the blame is on me for expecting Crytek to do their jobs properly.

            I am so naive sometimes 😉

          • ThorAxe
          • 8 years ago

          I likes the the game. It ran like butter for me with DX11 and the HRT patch at 2560×1440 maxed (minus AA) while still looking better than any other game I have seen recently. I still have wow moments that I haven’t had since Crysis despite losing much of the sandbox freedom.

          While I appreciate the article and hope that Crytek will use tessellation more efficiently in the future, perhaps even patch it again (unlikely I know), it didn’t make the game any less enjoyable for me.

        • sweatshopking
        • 8 years ago

        welcome to neckbeard land. people aren’t rational. your post is Fine. we’re discussing crysis two, and whether or not it sucked, and it seems the awards think it’s a good game. personally i hated it, but can appreciate your post. watch out for these nerds. they’re rabid.

          • ThorAxe
          • 8 years ago

          Thanks mate.

          If Duke Nukem was DX11 and had used tessellation efficiently would the game have sucked any less?

      • yogibbear
      • 8 years ago

      Just because someone wins awards doesn’t mean they’re allowed to produce code that lab monkeys would laugh at.

      • rndmuser
      • 8 years ago

      1) Everyone can go to Wikipedia or Metacritic or whatever and find all the relevant reviews/awards for any particular game. Why repost all of such easily accessible info?
      2) Awards don’t mean shit to many people. If you want to rely on someone’s purely subjective opinion – go ahead and do that. I personally prefer to play the game myself (or observe other people actually playing it) and then make my own conclusion, as do many other people.

        • ThorAxe
        • 8 years ago

        Given that this was posted on the day the awards were announced it seemed relevant.

        These awards are voted for, so yes, like just about everything said in this thread, it’s subjective. However, in this instance, it’s the opinion of the majority of people that voted.

        I could ask why post that geometry throughput isn’t great on AMD cards? Didn’t we know that already. It’s not as if the game is unplayable on AMD hardware. I played it all the way through with a 6870 crossfire setup maxed out at 2560×1440 except for AA and it was fine.

        People complain that it’s not DX11 and then when it’s delivered they complain that it’s not efficient, and when it’s efficient they complain that it’s not detailed enough because it runs too well. Just get over it.

          • DarkUltra
          • 8 years ago

          Hi Nikk!

          I haven’t seen a proper response to your post yet. Half-Life 2 was actually optimized with render paths for both GeForce and Radeon. But the GeForce FX architecture was best at Doom 3-type graphics, with stencil shadows and per-pixel lighting. The then-current Radeons had better pixel shader 2.0 performance and were faster at Half-Life 2-type graphics.
          In addition, the GeForce FX was clocked way too high and was dreadfully loud.

          In this case with Crysis 2, we have horribly implemented tessellation with tons of polygons wasted on a flat surface. If the level designers had taken their customers seriously and done a proper implementation, then we could boast about how nice and detailed the Fermi architecture (and upcoming Radeons) make games look.

          Edit: sorry, reply was meant for Nikk

      • Bensam123
      • 8 years ago

      Brand loyalty went out with the information age. Now people can learn about what they talk about rather than garishly shouting ‘mine’s the best’… although some people still do.

    • Jigar
    • 8 years ago

    Nice find Damage, looks like bad code comments will continue for Crytek…

    • mako
    • 8 years ago

    The water cracks me up for some reason. Water, water, everywhere, nor any drop to drink.

      • jihadjoe
      • 8 years ago

      I’m sure the developers are just being considerate of the guy who might want to do a ‘deep well drilling’ mod.

        • Meadows
        • 8 years ago

        Maybe they want to one-up Minecraft at some point.

    • Bensam123
    • 8 years ago

    I’d bet more on this all being done after the game was released and none of the planning done beforehand. Someone at the top tasked them with tessellating the game and, being the analytical engineers they are, they started tessellating everything from the ground up, the way all busy bees work. Then someone at the top said ‘we spent enough time on this, release the damn thing’ and so they released what they had finished.

    The water was just a side effect of the game being made for a console.

    This line of thought results from neither the people at the top nor the people producing the work caring about what they make.

    • jihadjoe
    • 8 years ago

    So all that hidden geometry is there to make sure anyone with anything less than a top-end GPU basically chokes to death rendering unseen details.

    Great way to move those premium class GPUs!

      • NarwhaleAu
      • 8 years ago

      Anyone with less than a top-end Nvidia GPU.

    • LoneWolf15
    • 8 years ago

    “We’ve heard whispers that pressure from the game’s publisher, EA, forced Crytek to release this game before the PC version was truly ready.”

    Does Electronic Arts ever wonder why the gaming community isn’t very fond of them, or do they just not care?

    I’ve seen too many shenanigans by EA to want to buy games published by them. I can’t recall the last time I bought or played an EA-published game; I actually find myself looking for fun games that aren’t published by EA, and finding plenty to choose from.

      • kc77
      • 8 years ago

      I really wouldn’t place too much credence in the above statement. C2 was a console port, pure and simple. Back in the day you ALWAYS started with the PC version first, with all of the high-poly assets in, and then scaled them down for the console. It’s pretty damn obvious, when the PC version is no better than the console one, that the old-school approach wasn’t followed here. That’s precisely how you get tessellation in areas where you don’t need it, superfluous animation in areas where you can’t see it, etc. The DX11 path was an afterthought. They grabbed the high-poly and high-res texture assets and added a few things here and there to appease the masses, instead of developing the game from the start with DX11 in mind.

      • CasbahBoy
      • 8 years ago

      As long as they continue making bank, they simply don’t care.

      • designerfx
      • 8 years ago

      Of games not mentioned: Don’t forget Batman Arkham Asylum, where they did the same thing. Nvidia+EA have continued to do this for a long time.

      Isn’t it ironic that an AMD tool, in comparison, is helping to dig up this trend?

    • squeeb
    • 8 years ago

    Wow, talk about shady.

    • sschaem
    • 8 years ago

    Sabotage… I bet the Xbox 360 doesn’t have any of this nonsense.
    We don’t need this DX11 ‘charity’ laced with poison.

    Funny how Crytek does a half-baked, buggy, after-the-fact DX11 hack while DICE requires DX10 as a minimum for BF3 and has embraced DX11.

    Looking at the tech slides, DICE is one to two years ahead of Crytek.

      • `oz
      • 8 years ago

      Try playing COD: Black Ops on PS3/PC and then on Xbox 360… there is a clear difference favoring the Xbox.

    • Sencapri
    • 8 years ago

    Fail post!! I accept defeat!

      • Sencapri
      • 8 years ago

      oh wow my paragraph below is not there ……… EPIC FAIL.

    • Krogoth
    • 8 years ago

    Tessellation = new FSAA

    enough said

    • axeman
    • 8 years ago

    They found a way to flex all that unused Fermi muscle! On things you can’t see!

    • someuid
    • 8 years ago

    This should become a standard part of your video game and video card reviews for several reasons:
    – it shines a light on such poor coding (really, an underground ocean? wtf, Crytek)
    – it shines a light on possible dirty punches by AMD and Nvidia
    – it gives us more information to consider when we’re about to plunk down $$$ on these games and video cards.

    As others have said, it is really sad that Crytek did such a piss-poor job of this. Can you imagine how much detail they could have added to the game if they had done their 3D scene generation properly? Can you imagine how much better the frame rates could be? If these folks built cars, we’d all find cast-iron anchors hidden under the back seat. “No wonder my gas mileage is so bad. Screw you, Crytek!!!”

    It almost makes me wish for 2D sprite gaming again.

      • CasbahBoy
      • 8 years ago

      The car analogy (while always a bad idea to bring up) reminded me of the modding community. I haven’t been following whether there are worthwhile mod tools available for the game, but I wonder if a few bored tech-heads will figure out a way to rip that anchor out of the back seat, so to speak.

    • ChangWang
    • 8 years ago

    Wow… talk about sloppy coding. Hmmm… you know, this makes me wonder if they did something similar with the shaders in the first Crysis…

    • CasbahBoy
    • 8 years ago

    Oh my god, I was laughing uncontrollably when I started the second page. After that it just got kind of… sad.

    • I.S.T.
    • 8 years ago

    Sad.

    I agree with removing this from the benchmark suite. It’s just not worth it.

    • BobbinThreadbare
    • 8 years ago

    Good work going through all this. Looks like this game went from “wait for a huge sale and see if it’s not as bad as people say” to “don’t buy unless they fix this.”

      • NeXus 6
      • 8 years ago

      Gameplay is pretty good, but it does lag in spots and the graphics with the DX11 patch, while better, aren’t going to blow you away. Definitely wait for a $20 or less sale on this game.

        • sweatshopking
        • 8 years ago

        i don’t know if i agree the gameplay was pretty good. i’d say for the most part, it’s a step back from crysis 1.

          • NeXus 6
          • 8 years ago

          Parts of it were good, but the AI is definitely worse. You can tell they spent more time on level design, which is pretty awesome compared to Crysis. It’s consolized, but I didn’t think it was that bad compared to other games. The shooting mechanics are basically pointless due to the flaky AI. Lots of options, but they don’t really matter.

            • sweatshopking
            • 8 years ago

            the AI is really bad, much worse than Far Cry even. The artwork was better, but i prefer the open-world design of crysis vs the corridors of 2. it has more attention to detail this way, but i’d rather run through a cut-and-paste jungle than down a straight path, no matter how pretty.

            the physics are much worse in 2, and i’ve posted a video before comparing them. a lot of the graphics are actually worse, including things like light flares, etc.

        • wira020
        • 8 years ago

        I didn’t really enjoy it. It just felt like a generic FPS with better graphics than most. Most of the time I just meleed the Ceph creatures, lol. They die easier that way.

          • sweatshopking
          • 8 years ago

          i melee’d them the entire game. the 4 invis ones, that you can see with cryvision, just got meleed to f too. lame ending.

    • TardOnPC
    • 8 years ago

    Wow. Excellent findings Scott. No wonder my 5870 chokes up on tessellation for this game.
    d!cK move by Crytek/NVidia.

      • swaaye
      • 8 years ago

      Well, AMD’s Cypress chips were pretty slow for tessellation anyway. Wasteful use of it isn’t helping anything though that’s for sure.

    • codedivine
    • 8 years ago

    For now, I am attributing this one to stupidity (or lack of time) rather than malice. The sheer dumbness is rather appalling though.

      • Hattig
      • 8 years ago

      Except that apart from the sea, bricks and possibly some of the wood, it shows a fundamental misunderstanding of tessellation, which is meant to add detail and contours to a flat object, not tessellate a flat object into a bazillion flat objects.

      And tessellation is meant to use distance-based LOD, so the sea is again unoptimised.

      I cannot help but think that the nVidia engineers who went to help implement DX11 into the game simply have a mandate to apply stupid levels of tessellation so that their products look better. And that just harms everyone.
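
      To make the distance-based LOD point above concrete: in a DX11 pipeline, the per-patch tessellation factor is normally computed in the hull shader’s constant function and scaled down with distance from the camera. The C++ sketch below models that calculation on the CPU purely for illustration – the function name, the falloff constants, and the factor cap are assumptions for this example, not anything taken from CryEngine 3.

      ```cpp
      // Illustrative only: models the per-patch math a DX11 hull shader constant
      // function typically does for distance-based tessellation LOD. All names
      // and constants are invented for this example, not taken from CryEngine 3.
      #include <algorithm>
      #include <cmath>

      float distanceBasedTessFactor(const float patchCenter[3],
                                    const float cameraPos[3],
                                    float nearDist  = 5.0f,    // full detail inside this range
                                    float farDist   = 100.0f,  // coarsest detail beyond this range
                                    float maxFactor = 32.0f)   // D3D11 caps tessellation factors at 64
      {
          // Euclidean distance from the camera to the centre of the patch.
          const float dx = patchCenter[0] - cameraPos[0];
          const float dy = patchCenter[1] - cameraPos[1];
          const float dz = patchCenter[2] - cameraPos[2];
          const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

          // Map distance into [0, 1]: 1 = close (full detail), 0 = far (coarse).
          const float t = std::clamp((farDist - dist) / (farDist - nearDist), 0.0f, 1.0f);

          // A factor of 1 leaves the patch alone; larger values subdivide it further.
          return 1.0f + t * (maxFactor - 1.0f);
      }
      ```

      With something along those lines in place, a stretch of ocean a hundred meters away collapses to a handful of triangles instead of being diced as finely as the geometry right in front of the player.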

      • Alexko
      • 8 years ago

      No one would be dumb enough to use thousands of triangles for a strictly flat surface. This has NVIDIA written all over it.

        • Peldor
        • 8 years ago

        Well, at least one other possibility has occurred to me. Crytek could just be screwing with us. “You wanna whine about DX11 and console-itis? Okay, here’s 250,000 tessellated, 100%-certified, Grade A DX11 polygons. On the floor!”

      • Kaleid
      • 8 years ago

      I think you’re wrong. They have done things like this before, like with the 64-bit Far Cry patch for high-res content (when 64-bit wasn’t needed) and the DX10 requirement to open up the highest settings in Crysis (again, not needed).

      DX11 sells graphics cards just like 64-bit sold AMD64 CPUs.

    • flip-mode
    • 8 years ago

    This is a situation that is too stupid to adequately express. I really can’t muster a comment that packs enough ridicule of Crytek into it. I can only say that I’m completely disgusted and I’m glad I don’t have the game as I would feel bad for giving Crytek one cent for such garbage.

    Thanks for the investigative reporting. For what it’s worth, I’d vote you totally exclude such garbage software from your benchmark suite.

      • Farting Bob
      • 8 years ago

      Or you could just play the game anyway, without the DX11 ultra settings on if your GPU can’t handle them. It’ll still look as impressive as this article shows. Unless you are a massive fan of metal caps on top of concrete blocks, it would seem as though DX11 doesn’t really add much; if it’s hard to tell the difference in a screenshot, then you have no hope of seeing the difference in-game.

        • flip-mode
        • 8 years ago

        Well, that would be missing the point. The game looks plenty good without DX11 – who knows, perhaps with tessellation done right the DX11 settings would have made a meaningful difference. But the point of refusing to buy the game is to refuse to buy a product that treats the PC gamer as a second class citizen.

        I’d rather EA and Crytek exit the PC market entirely than stay in it just to offer buggy, poorly coded, poorly ported crapware games.

        However, if none of that bothers you that is not my business and I won’t begrudge you buying the game.

          • NarwhaleAu
          • 8 years ago

          I own a Radeon. Seems like the game was intended to make my card look bad. I’m considering pirating this game out of spite… but I don’t really have any urge to play it (or I would have bought it already).

      • Meadows
      • 8 years ago

      My thoughts exactly. I’m so appalled at this (and I’m an NVidia user/fan) that I’m at a loss for appropriate words.

      • dragosmp
      • 8 years ago

      While these coding errors or this lack of optimization may be interpreted as “designed to hurt Radeons”, the errors can also be seen as something that was hurried out the door to tick the “now we have DX11 and tessellation” box.
      The game launched unfinished on the PC; it wouldn’t be a stretch to say patch 1.9 is unfinished DX11. Crysis 2 should be tested because (some) people play it – this discussion is proof there’s interest in the game – but it shouldn’t be included in the overall value comparison, since it’s obviously buggy, intentionally or not.

        • ThorAxe
        • 8 years ago

        There’s no point being objective here. No one will hear you.

          • flip-mode
          • 8 years ago

          So no objective criticisms can be made here? It seems pretty objective to question the use of tessellation on a bunch of flat surfaces, the persistence of tessellated water geometry in the rendering pipeline even after the water has passed out of view, and the fact that there is no scaling of detail in the tessellated water as it recedes into the distance.
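
          For background on the “passed out of view” point: DX11 also lets a hull shader throw a patch away entirely by setting its edge tessellation factors to zero, so geometry the camera cannot see never reaches the tessellator at all. The C++ sketch below models that decision on the CPU; the crude behind-the-camera test and every name in it are invented for illustration and are not taken from Crysis 2’s code.

          ```cpp
          // Models the culling decision a DX11 hull shader constant function can
          // make: an edge tessellation factor of 0 tells the fixed-function
          // tessellator to discard the patch, so invisible water costs no geometry
          // work at all. The visibility test is a deliberately crude stand-in for
          // a real frustum test; all names here are invented for illustration.
          float tessFactorForWaterPatch(const float patchCenter[3],
                                        const float cameraPos[3],
                                        const float cameraForward[3],  // unit view direction
                                        float desiredFactor)
          {
              // Vector from the camera to the centre of the patch.
              const float toPatch[3] = { patchCenter[0] - cameraPos[0],
                                         patchCenter[1] - cameraPos[1],
                                         patchCenter[2] - cameraPos[2] };

              // A negative dot product means the patch lies behind the camera.
              const float facing = toPatch[0] * cameraForward[0]
                                 + toPatch[1] * cameraForward[1]
                                 + toPatch[2] * cameraForward[2];

              if (facing < 0.0f)
                  return 0.0f;        // zero factor: the patch is culled outright
              return desiredFactor;   // visible patch keeps its normal LOD factor
          }
          ```

          Even a check this simple would stop an ocean hidden beneath the streets from being tessellated every frame.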

            • ThorAxe
            • 8 years ago

            The article is clearly objective, Scott has done a fantastic job.

            However, some of the comments verge on the hysterical.

            • derFunkenstein
            • 8 years ago

            Scott also didn’t spend a lot of time saying it’s caused by TWIMTBP, which seems to be the overwhelming conclusion in the comments.

            • derFunkenstein
            • 8 years ago

            It’s easy to tessellate flat surfaces and not have everything look like ATi’s TruForm-patented balloon characters, which is probably why they’re doing it here in the first place. I believe there’s more “lazy” than “malice” here, but it’s ridiculous nonetheless.

        • siberx
        • 8 years ago

        It doesn’t matter if “some people play it” or not; if the game isn’t representative of the kind of performance you can expect from your hardware in most applications, there’s no point including it. I could make a game that does a physics simulation down to the atomic level and thus requires a couple of orders of magnitude more CPU power than any other game out there; if (some) people were to play that game, would it be sensible to include it in a review and subsequently bash a CPU for failing to keep up, or to laud a processor that does better, when neither has any effect in any other game on the market?

    • drfish
    • 8 years ago

    Oh oh! I remember talking with you about this over brats! 😀
