auxy wrote:I don't appreciate the implication that I am unable to determine whether anti-aliasing is being applied! I am much more astute than the average person in this regard, but even to the uninitiated, the difference between no AA and 4x MSAA or SSAA is blatant in motion at ~100 DPI.
Regardless, while it probably is silly to assume every game uses deferred rendering, I have every reason to doubt it when developers say MSAA is "incompatible" with their game, because it doesn't really make sense. You can oversample any signal, and computer graphics, in the end, are a digital signal like anything else. (¬д¬；) It may require more intelligence on the part of the driver, and/or more bandwidth/fillrate/etc, but I haven't heard anything to indicate that it can't be done.
Maybe they mean "it's too much work and not worth it", or there are other reasons, but if it can be done -- and it can in every DX9 case, as far as I have seen -- it obviously isn't literally "incompatible."
Sorry, I didn't mean to imply you can't tell whether AA is applied or not, just that what a driver actually does when AA is forced on isn't well established. You're also missing my point: MSAA doesn't oversample a signal, it oversamples coverage information and uses it to interpolate a signal. In the case of G-Buffers, you are literally interpolating geometry data like normals and positions, which will be total garbage at exactly the edges where you want AA to be effective. Whether the artifact is visible or not depends on the lighting and surface material, but don't misunderstand: you literally cannot MSAA G-Buffers and expect correct results. If the only thing forced driver AA does is MSAA the back buffer, then the end result will depend on which render passes hit the back buffer. SSAA doesn't suffer from this problem because it renders all the surfaces at a higher resolution and the pixel shader actually runs for each new pixel; perhaps this is what drivers actually do.
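To make that concrete, here's a toy example (plain C++ I just wrote for illustration, not shader or engine code) of what goes wrong when a resolve averages G-Buffer normals across an edge instead of averaging shaded colors; the vectors, the light direction, and the little diffuse term are all made up:

```cpp
// Toy illustration of why MSAA'ing a G-Buffer goes wrong: at an edge the
// resolve averages geometry attributes like normals, and lighting computed
// from that averaged normal differs from the correct per-sample average.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static float saturateDot(Vec3 n, Vec3 l) {
    float d = n.x * l.x + n.y * l.y + n.z * l.z;
    return d > 0.0f ? d : 0.0f;  // clamped N.L diffuse term
}

int main() {
    // Two samples in one pixel straddling a geometric edge (e.g. a box
    // corner): the surfaces face in perpendicular directions.
    Vec3 nA    = normalize({0.0f, 0.0f, 1.0f});  // faces the light
    Vec3 nB    = normalize({1.0f, 0.0f, 0.0f});  // faces sideways
    Vec3 light = {0.0f, 0.0f, 1.0f};

    // Correct AA: shade each sample, then average the resulting colors.
    float shadeThenAverage =
        0.5f * (saturateDot(nA, light) + saturateDot(nB, light));

    // MSAA-resolved G-Buffer: the resolve blends the stored normals, and
    // lighting runs once on that fictitious normal belonging to neither surface.
    Vec3 avg = normalize({0.5f * (nA.x + nB.x),
                          0.5f * (nA.y + nB.y),
                          0.5f * (nA.z + nB.z)});
    float averageThenShade = saturateDot(avg, light);

    std::printf("shade then average: %.2f\n", shadeThenAverage);  // 0.50
    std::printf("average then shade: %.2f\n", averageThenShade);  // ~0.71
    return 0;
}
```

That's just the normal; positions and depth stored in the G-Buffer get blended the same way at silhouettes, which is where the really ugly artifacts show up.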
ChronoReverse wrote:As for deferred rendering being everywhere, why would you believe it not to be? Unreal Engine 3 is deferred and it's super popular. Frostbite 2, CryEngine 3 and Unity are also deferred (capable). I'm no expert but it just seems to be a popular way to do lighting.
UE3 started its life on a forward renderer with a nice static lightmap system. In fact, most of the engines you listed started that way, and that is often still the case today for modern console games (and, by extension, PC ports of console games). A clever lightmapping scheme with a well-controlled dynamic light count can still produce visually impressive results on a forward renderer without the overhead of deferred, and it makes transparencies a lot easier to handle. Games like Halo, God of War, and Call of Duty were/are forward rendered, but then Killzone 2 was deferred, so there's no easy rule of thumb. As engineers, though, our job is to determine what technique will satisfy the visual bar set by the art director(s) while still hitting performance targets and resource limits; just because deferred rendering allows for large light counts doesn't necessarily mean it's the appropriate solution.
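Just to give a feel for that trade-off, here's a back-of-the-envelope count of shading work in plain C++; every number and the cost model itself are invented for illustration, so don't read anything into the absolute figures:

```cpp
// Crude comparison of per-pixel shading work: forward with a culled light
// list, forward forced to consider every light, and classic deferred.
#include <cstdio>

int main() {
    const long screenPixels    = 1280L * 720L;
    const long pixelsPerObject = 20000;  // average screen coverage per object
    const long objects         = 500;
    const long lightsPerObject = 4;      // culled dynamic lights, forward
    const long totalLights     = 64;     // what deferred would shade with

    // Forward: each covered pixel loops over the few lights assigned to
    // that object (lightmaps pick up the rest of the lighting).
    long forwardFewLights = objects * pixelsPerObject * lightsPerObject;
    // Forward with no culling: every object shades against every light.
    long forwardAllLights = objects * pixelsPerObject * totalLights;
    // Deferred: fill the G-Buffer once, then shade every screen pixel per
    // light (ignoring tiling/stencil optimizations).
    long deferred = objects * pixelsPerObject + screenPixels * totalLights;

    std::printf("forward, culled lights: %ld\n", forwardFewLights);
    std::printf("forward, all lights:    %ld\n", forwardAllLights);
    std::printf("deferred:               %ld\n", deferred);
    return 0;
}
```

With a handful of culled lights per object, forward comes out cheapest; it's only when the light count blows up that paying for the G-Buffer starts to win.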
The next-gen consoles should see some nice advances in light rendering systems. Tile-based deferred rendering and Forward+(+) are both interesting ways to handle large light counts, but clustered shading has my interest the most.
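For anyone curious what the "clustered" part actually involves, this is roughly the light-assignment half of it, sketched in C++ with a view-space AABB per cluster and a bounding sphere per light; the grid dimensions and data layout are placeholders I picked, not from any particular paper or engine:

```cpp
// Conceptual sketch of clustered shading's light-assignment step: the view
// frustum is chopped into a 3D grid of clusters, and each light is binned
// into every cluster its bounding volume touches.
#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

struct Sphere { float x, y, z, radius; };  // light bounds in view space
struct AABB   { float min[3], max[3]; };   // cluster bounds in view space

constexpr int kClustersX = 16, kClustersY = 9, kClustersZ = 24;
constexpr int kClusterCount = kClustersX * kClustersY * kClustersZ;

static bool sphereOverlapsAABB(const Sphere& s, const AABB& b) {
    float d2 = 0.0f;
    const float p[3] = {s.x, s.y, s.z};
    for (int i = 0; i < 3; ++i) {
        // Closest point on the box to the sphere center, per axis.
        float v = std::max(b.min[i], std::min(p[i], b.max[i]));
        d2 += (v - p[i]) * (v - p[i]);
    }
    return d2 <= s.radius * s.radius;
}

// Build a short light list per cluster. At shade time a pixel finds its
// cluster from its screen position and depth and only loops over that list.
std::array<std::vector<uint32_t>, kClusterCount>
assignLights(const std::array<AABB, kClusterCount>& clusterBounds,
             const std::vector<Sphere>& lightBounds)
{
    std::array<std::vector<uint32_t>, kClusterCount> lists;
    for (uint32_t li = 0; li < lightBounds.size(); ++li)
        for (int c = 0; c < kClusterCount; ++c)
            if (sphereOverlapsAABB(lightBounds[li], clusterBounds[c]))
                lists[c].push_back(li);
    return lists;
}
```

The appeal over screen-space tiles is that binning in depth as well keeps a pixel's list from being polluted by lights that only overlap it in 2D, and the same cluster lists can be reused for transparent and forward-shaded geometry.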