Unreal Engine 4.15 arrives with HDR and AFR support

The mention of Unreal Engine 4 (UE) often brings to mind impressive-looking demos that elicit more than a fair share of oohs and aahs. Epic Games isn't resting on its laurels, and it's just released UE 4.15 with support for alternate frame rendering (AFR) on Nvidia SLI configurations and experimental support for HDR output, along with a Kardashian buttload of performance optimizations and developer-oriented improvements.

High-end PC enthusiasts with multiple Nvidia cards in SLI will be happy to know that games built using UE 4.15 can take advantage of alternate frame rendering, meaning they'll likely see a hefty performance boost. According to Epic, "the largest improvement comes from the renderer copying inter-frame dependencies between GPUs as early as possible." Developers will still need to work with Nvidia and test their games in this scenario, though.

Those with fancy HDR TVs or monitors are possibly saddened by the lack of HDR content out there. Epic is aware, as it's just added experimental HDR support to UE 4.15. The engine can currently output HDR content on Nvidia cards under the Direct3D 11 API or on devices that support Apple's Metal graphics API. The company says that there are rendering paths for 1000-nit and 2000-nit displays, and that it'll be adding support for more devices and configurations in the future.
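
For the tweak-minded, the experimental output appears to hang off a couple of console variables. Going by Epic's release notes (the names and values below are our reading of them, so treat them as illustrative rather than gospel), flipping it on looks something like this:

r.HDR.EnableHDROutput 1
r.HDR.Display.OutputDevice 3

The OutputDevice value picks the display path, with separate entries for the 1000-nit and 2000-nit ST.2084 targets, so it's worth checking the engine's cvar help text before copying numbers blindly.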

That's not all the good news for gamers. The new version of UE comes with a spankin' new Texture Streaming system, which purports to "reduce CPU usage, memory usage, and load times while eliminating low resolution artifacts." More to the point, Epic says that developers can look forward to an up-to-40% reduction in texture memory usage, faster game load times, and a near-elimination of texture processing-related stalls. Owners of graphics cards with 2GB or 4GB of VRAM should be particularly happy. Maybe even Bethesda can take a hint on the texture size reduction topic, dunno.

There are a few other minor-but-important improvements. Windows games built on UE can now use non-XInput flight sticks and steering wheels, welcome news for simulator buffs everywhere. Developers can now target Nintendo Switch and Linux ARM64 platforms. Mobile VR games can take advantage of Monoscopic Far Field Rendering. Check out Epic's announcement for the nitty-gritty, especially if you wrangle game code or assets for a living.

Comments closed
    • anotherengineer
    • 2 years ago

    “…along with a Kardashian buttload of performance optimizations”

    Is that much performance optimizations even humanly possible??

      • Legend
      • 2 years ago

      Probably not if your hardware has a low profile form factor…

    • Dposcorp
    • 2 years ago

    Am I the only one who got turned on by this line?
    “………. along with a Kardashian buttload of performance optimizations and developer-oriented improvements.”

      • morphine
      • 2 years ago

      I just write the stuff. What you dirty-minded people think of it is not my problem.

    • jokinin
    • 2 years ago

    I don’t have very high hopes for UE. The last game I’ve played that uses it, XCOM 2, needs a lot of hard disk space and doesn’t always run above 60fps on an i5 3550, 16GB DDR3, and an overclocked GTX 1060 6GB.
    Maybe it is because it is not very well optimized, but the UE underneath doesn’t seem to perform very well on my machine.

      • Airmantharp
      • 2 years ago

      That’s really just XCom. There’s no reason for it to run as slow as it does, as it isn’t *doing* anything.

      But XCom isn’t developed by a premier code house; I expect better of AAA games of a more mainstream variety. As a counter-example, [url=https://en.wikipedia.org/wiki/Gears_of_War_4]Gears of War 4 is UE4[/url], and it runs great, [url=http://www.hardocp.com/article/2016/11/23/gears_war_4_dx12_performance_review/]even at 4K on decently equipped PCs[/url].

        • LostCat
        • 2 years ago

        Didn’t Gears 4 have HDR on NV cards already also?

          • Airmantharp
          • 2 years ago

          I’m not sure- I picked it up on sale, and still feel that it wasn’t worth the price. It is certainly a smooth running game.

          I’ll also have to admit that while Gears of War 4 is a UE4 title, XCOM is a UE3 title- so my comparison above doesn’t really work ;).

            • LostCat
            • 2 years ago

            I have too damn many games already. 🙁 I still kinda want it.

    • slowriot
    • 2 years ago

    I hate UE. Well not really hate. But I do strongly dislike that I can often see a game and immediately know it uses UE.

    ARK comes to mind. That game is IMO a showcase for the worst aspects of UE’s graphics. The way the models work. The way the vegetation is rendered. The way the world looks like it’s been glazed like a donut. Awful.

      • GrimDanfango
      • 2 years ago

      Hardly the fault of UE… you could easily say the same of Unity, for various different reasons – like rubbish devs’ propensity for slapping in store-bought 3D assets.

      If a game looks identifiably “Unreal Enginey”, it’s because the developers half-assed the shading, and left it with UE’s default heavily-normal-mapped shiny shader look. These days it’s perfectly capable of staggeringly realistic physically based rendering that looks nothing like that… but it takes a decent dev team to care enough about quality and actually put the legwork in.

        • slowriot
        • 2 years ago

        I’ve been noticing it for years. Around UE 2.0 is when it really went into overdrive. I think to a degree you’re right, but it’s an issue that plagues even AAA-quality games as varied as Dishonored 1 vs. the Mass Effect series vs. the Batman games vs. ARK. They all got the glaze treatment and models that are a bit too rounded or puffy.

          • synthtel2
          • 2 years ago

          UE3 was used just about everywhere, and plenty of things it was used for don’t look obviously UE at all. Bioshock Infinite, Borderlands, and Mirror’s Edge (the original) come to mind.

          As GrimDanfango was saying, it’s mainly a matter of whether devs are putting in the effort to make it look like their own or not. Any engine this accessible is going to attract plenty of devs who aren’t going to bother (and that’s not a bad thing). I’m not a fan of UE’s default look either, but I think it’s far better than Unity’s.

            • GrimDanfango
            • 2 years ago

            Hah, Mirror’s Edge was a real favourite of mine, and I truly had *no* idea it was an Unreal Engine game! I assumed they used an in-house engine or something. Really demonstrates my point.

            Bioshock Infinite really *did* have some of the UE-look though… I thought it was pretty sloppy in that regard myself.

            • derFunkenstein
            • 2 years ago

            Bioshock Infinite has that problem on PC? I only played it on PS3 and it did, but I thought surely they’d fix it for the PC version. It’s expected on old hardware like that.

            UE3 games definitely look shiny. Gears of War, Dragon Age Origins, and Batman AA and AC are some of my favorites and they all have that look. It’s unfortunate with material-based shaders available. Just give each object a material attribute and use the right shaders. Of course, you have to write the right shaders, so that takes time, but it’s a disappointment.

            • slowriot
            • 2 years ago

            Bioshock is actually another game that I personally think shows a lot of signs of UE, particularly the overly rounded, puffy model syndrome and a bit of glaze too. Borderlands shows it in the models oftentimes and in how things go transparent when you’re too close.

            The thing to me is it always showed up in very high budget, unique art style games. Bioshock is a bad offender IMO.

            • synthtel2
            • 2 years ago

            Now that you mention it in Bioshock, I suppose it does have that problem with the models, though I didn’t and don’t notice much glaze. I guess it was just stylized such that I didn’t notice it (and I maintain that it’s one of the best looking games at that tech level). Their art team was fantastic, and whatever’s going on in it, they definitely made it work.

            Could you possibly find screenshots from a big-budget game that exemplify the UE3 glaze you’re talking about? I think UE4 has a problem with that, but never really noticed it in UE3. Maybe it was there all along and it just doesn’t bother me at the lower overall tech level, but all that’s coming to mind for UE3 is low-budget stuff like Rocket League.

            Edit: looked up screenshots for some games mentioned to have it above, and my guess as to what’s going on is that they’re a more modern/photoreal art style than most of the UE3 stuff I played, so it actually becomes a bit of a bother. I guess in somewhat stylized stuff that doesn’t have UE4’s general tech level, I consider it pretty normal. I’m curious, how do you think Source looks in comparison?

            • slowriot
            • 2 years ago

            For UE4 I’d say look at anything ARK lol. That’s the game that pops into my head immediately that has all the stuff visually I don’t like taken to an absurd level.

            But for some UE3 examples: looking at screenshots of some games, I think what specifically stands out to me is a certain way lighting works, or comes packaged, with UE3 that can really highlight it. The degree to which it tends to show up varies by game, but it just seems to really stick out to me. Here are some examples.

            Bioshock 2
            1) The player’s drill weapon has it real bad here. Also the light on the bad guy’s mask shows it to a degree, as does the gold hallway entry trim at the left of the screenshot. [url]http://gameluster.com/wp-content/uploads/2016/04/Bioshock-2-review-3.jpg[/url]
            2) The bottom stairs here really show it. [url]http://gameluster.com/wp-content/uploads/2016/04/Bioshock-2-review-4.png[/url]

            Batman Arkham City… not only is Batman’s suit well oiled but so are the bad guys: [url]http://www.gamehackstudios.com/wp-content/uploads/2015/03/Batman-Arkham-City-Crack-Download-Free-Full-Version-PC-Torrent-Crack-21.jpg[/url]

            And Epic’s own Gears of War shows it pretty bad all over the place (and the overuse of motion blur makes it hard to find a semi-clear gameplay screenshot).
            1) Lots of the wood, the building on the right, and the distant stuff have it to a degree. [url]https://i.ytimg.com/vi/ab7eKlW8UY8/maxresdefault.jpg[/url]
            2) Rocks and ground textures. [url]https://3.bp.blogspot.com/-4F-78y281j8/WB32btrmT8I/AAAAAAAAAVw/rrFW_HuwbaM8ZXbxDFEtmasJ7WdtFxauQCLcB/s1600/Gears-of%2BWar%2B3%2BPC%2BDownload%2Bwww.freefullpcgame%2B%25281%2529.jpg[/url]

            • slowriot
            • 2 years ago

            Not to blame this all on Epic. It just occurred to me that Bethesda might have an even worse glaze issue. Fallout 4 has it BAD. So did Skyrim.

            • synthtel2
            • 2 years ago

            Whoa, ARK does have an extra-bad case of that. I tried it out once, but it ran too slow to use decent settings, so I guess I missed most of that before.

            Honestly, most of those UE3 screenshots look like typical specular tech in the pre-physically-based, pre-Toksvig-mapping era, though they’re a bit overdone. Maybe they had a weird specular map implementation? (That’s specular level, which was a hack from the start, not roughness.) They do also look like parallax mapping isn’t in full effect, which could be related.

            Skyrim is at a similar specular tech level, and the art is inconsistent enough that it shows up as a problem. FO4 has somewhat better tech, but again the tech isn’t good enough to support their tuning and use of it.

            A possibly-relevant thing I [url=https://techreport.com/news/31380/sniper-elite-4-sneaks-into-enemy-territory-with-directx-12?post=1020146]posted[/url] a while back:

            [quote]One school of thought says that the proper strength [for a graphics effect] is the one at which the average pixel affected by it gets modified the same amount as it would with a perfectly realistic algo. Trouble is, that ends up affecting some pixels a whole lot more than is realistic, and it often looks like garbage. I say strength should be set so that almost no pixels are affected more than they would be with a perfectly realistic algo. This doesn't make people say "wow, eye candy" but it tends to be much more pleasant to play (at least IMHO).[/quote]

            If you don't have appropriately realistic tech, specular (as anything else) should be subdued to somewhat below realistic. I think the biggest shortcomings of the stuff in question here aren't the algos themselves, but that they're tuned to full power with so-so algos. The big question then is why UE3 has a habit of that if others don't, but I'm still not convinced UE3 had that worse than its contemporaries.

            • slowriot
            • 2 years ago

            There were tons and tons and tons of popular UE3 titles, so I may just remember seeing the logo on so many startup screens that it sticks in my head for that particular era of graphics.

            But yeah, honestly, the more I think about it, the most recent really bad example is FO4, and it makes sense because it’s built off mostly last-gen tech.

            I was always a fan of the way most Source engine games look. I still feel the lighting in HL2, while clearly technically way behind these days, is so natural it holds up very well.

          • GrimDanfango
          • 2 years ago

          Models that are a bit too anything can’t possibly be the fault of a game engine, unless that engine does something weird like that thing ATI tried back in the olden days, to dynamically subdivide everything even in games that didn’t support it.

          Even modern-day tessellation should be essentially engine-agnostic, as it’s a GPU function, and you get out of it whatever you put in, in terms of the resulting detail and quality of the mesh.

            • derFunkenstein
            • 2 years ago

            From the days of all-caps marketing buzzwords, [url=https://techreport.com/review/3203/radeon-8500-vs-geforce3-ti-500/2]TRUFORM[/url] Ryzens from its grave...

            • slowriot
            • 2 years ago

            Well… why, then, do so many darn UE games have the same puffy, overly rounded models all the time? It’s just as obvious as the glaze issue.

            I think you’re kinda neglecting other things, like maybe the toolset of UE tends to push devs/artists in certain directions. It just seems that way when very diverse, high-budget games show the same traits.

            • derFunkenstein
            • 2 years ago

            I’m guessing it’s some sort of lazy tessellation implementation, but I didn’t notice Batman AA or AC having [url=https://images-na.ssl-images-amazon.com/images/G/01/videogames/detail-page/batman.aa.03.lg.jpg]puffy anything[/url].

            • synthtel2
            • 2 years ago

            The tooling is my guess too. Geometry shouldn’t end up looking distinctive at all, but it does seem to happen. Source, REDengine, and Creation all have issues like that.

      • shaq_mobile
      • 2 years ago

      Yeah, if you hop on the UE marketplace you’ll find plenty of examples both for and against what you’re saying. UE can range from Mario-esque side-scrollers to photorealistic architecture examples. That being said, I agree, lots of the UE games look similar. Glossy and emissive materials are the RGB of modern game dev. EVERYTHING MUST BE SHINY AND GLOW.

    • NTMBK
    • 2 years ago

    Only on NVidia? Seriously?

      • morphine
      • 2 years ago

      For now. Epic is adding support for more systems in the future.

      • GrimDanfango
      • 2 years ago

      How dare they finish one thing before starting the next!

        • NTMBK
        • 2 years ago

        There is absolutely no indication that they plan to add support for AMD. It just says that it supports SLI. This is blatant preferential treatment.

          • wingless
          • 2 years ago

          When they see what Vega has to offer feature-wise, they’ll be working their fingers to the bone to support it.

          • morphine
          • 2 years ago

          FTA: “and that it’ll be adding support for more devices”

          Source: ” Other devices are coming in future releases or will be available through GitHub as added. “

      • Mat3
      • 2 years ago

      Epic and Nvidia are so in bed together that the Unreal engine might as well be considered Nvidia’s own.

      • Airmantharp
      • 2 years ago

      Makes sense. Larger installed base, and Nvidia has spent a lot more effort for far longer on making multi-GPU work.

        • JustAnEngineer
        • 2 years ago

        Your bias is showing.

        ATI brought multi-GPU alternate frame rendering to market in October 1999.

        NVidia’s Scalable Link Interface was introduced in June 2004 – almost five years later.

        3Dfx introduced multi-GPU Scan-Line Interleave to the market in 1998.

        [url]https://techreport.com/review/6931/nvidia-sli-resurrects-gpu-teaming[/url]

          • Airmantharp
          • 2 years ago

          Not really. We can go on a history lesson, but you’re not addressing my points.
          1. Nvidia does indeed have a larger installed base between the two, which is relevant to which technology a developer would target first.
          2. It took TR and others to use a tool to show how Crossfire was basically a wash in performance, perhaps even a detriment, something many experienced firsthand. This was after all of your benchmark years above, and thus far more pertinent.

          Point one is as good an explanation as any for why Epic would work on Nvidia’s multi-GPU tech first. Point two is a refutation of your pettiness here, as I have no interest in promoting anyone or their tech, though it seems you’d love to shout that from the rooftops.

          I’ll end with a TR link of my own, showing AMD’s response to TR’s investigation, relative to GPUs that AMD is selling in graphics cards today:
          [url<]https://techreport.com/news/25428/driver-fix-for-crossfire-eyefinity-4k-frame-pacing-issues-coming-this-fall[/url<]

            • NTMBK
            • 2 years ago

            So your argument is that 4 years ago, they fixed a bug…?

            • RAGEPRO
            • 2 years ago

            His argument is that NVIDIA SLI is a more developed and mature technology than AMD Crossfire. The RAGE FURY MAXX is completely immaterial to this discussion.

            • Airmantharp
            • 2 years ago

            That’s part of the second part. The main argument is just numbers: if you’re going to develop something, it makes some sense to start with the largest market, and multi-GPU is a pretty niche thing already.

            It’s something I’ve delved into personally. I was bitten by the Crossfire frame-pacing issue with my first multi-GPU setup, well before AMD fixed the bug, back when AMD was more competitive with Nvidia’s top end in terms of performance and performance per watt and had cards on the market with more VRAM for less (I was targeting a higher-than-mainstream resolution at the time). A switch to SLI, in large part due to investigative work by TR and others, changed the experience completely.

            But that’s just anecdotal evidence, and not something I’d lead with. Today, I’d be happy to use AMD for GPUs (and CPUs) if they’re willing to compete in the performance brackets I’m looking for, and hell, I was an AMD/ATi fan first.

            And if the rumors floating about Ryzen and Vega are true, that might just happen!

            • Goty
            • 2 years ago

            Crossfire performance is a wash? Are you trying to imply that there are no gains to be had? Can you please provide a link to the TR article? (I could go get it myself, but I’m lazy and you seem pretty invested in this.)

            • Airmantharp
            • 2 years ago

            I’m really not that invested, so the article I linked above should get you started.

            Also, you’re using the wrong tense: I’m not saying that Crossfire *is* a wash, I’m saying that at a certain point in the recent-ish past, it *was* a wash.

            What that means is that running two cards in Crossfire could produce higher average framerates, but the frametimes were so bad that the experience was that of a framerate of just one of the cards, or even worse. TR and others tested it, and I witnessed it firsthand.

            This relates to the discussion, and to my second point above, in that while AMD was having issues with Crossfire, Nvidia had already done the legwork to make sure that the SLI experience was as smooth as possible, which lends credence to the possibility that Nvidia’s commitment to their flavor of multi-GPU might be a factor in a developer like Epic working on getting SLI functional before working on Crossfire.

            • Legend
            • 2 years ago

            Did TR ever look into the frame pacing of AMD cards that utilized the bridge connectors on top of the cards, before AMD decided to innovate and remove these in favor of just transmitting over the PCIe bus?

            I’m not saying being innovative is an excuse for the frame stuttering issues or anything else. Just curious.

            • Airmantharp
            • 2 years ago

            I’m not sure, though I don’t think so. As quickly as video cards lose their luster when replaced by newer generations, I don’t really hold it against TR, but it would be interesting to see, and I do remember asking myself the question when AMD talked about addressing the deficiencies of Crossfire.

            • Goty
            • 2 years ago

            Ah, very good, that clears things up tremendously. Frame-timing issues took a large, positive step forward in the meantime, however, and even with both AMD and NVIDIA scaling back their emphasis on multi-GPU technologies, it appears that Crossfire is [i]at worst[/i] on par with SLI when it comes to performance scaling and frametime variance. (Take a look at PC Per’s Fury X vs. 980 Ti Crossfire/SLI review for reference. They’re the only other site I know of that pays significant attention to frametimes, and they do a bit more multi-GPU work than TR.)

            • Airmantharp
            • 2 years ago

            After getting called out, with proof, AMD took a huge step forward. I’ve used Nvidia most recently (a pair of GTX 970 cards) due to their better performance/watt while typically being only a little behind in performance/price in any segment that AMD has competed in, on top of Nvidia’s drivers being generally better (though not perfect) up until the last year or so.

            Now AMD looks to have cleaned up its driver act and looks to be competing both on performance and on performance per watt with Vega, so I’m definitely looking closer at them.

            (I do have a G-Sync monitor, but I don’t hold that against AMD- it’s nice, but it’s not a monitor that I’m liable to keep for long, and while G-Sync is still technically superior to Freesync, the difference just doesn’t matter enough)

      • Pancake
      • 2 years ago

      Majority rules 🙂 *Cuddles its GTX970*

      But, hey, you got Ashes of the Singularity…

        • jihadjoe
        • 2 years ago

        Does anyone actually play Ashes of the Singularity?

          • LostCat
          • 2 years ago

          Yes. Not as much as I could, but yes.

          • chuckula
          • 2 years ago

          Ashes of the Singularity is the best benchmark in which somebody snuck in a game without permission that I have ever seen.

            • Pancake
            • 2 years ago

            I think we can all expect it will be brought out like the dead stinking cat it is to demonstrate the superiority of an AMD CPU/GPU combination in the coming weeks. Never mind that Ryzen will be slaughtered in comparison to Baby Cakes in most games people actually care about…

            Still, gonna build me an 8-core Ryzen Linux box although I’m kinda tossing up toy money between that or a MacBook…

            • chuckula
            • 2 years ago

            Lol. AMD fanboys downthumbed somebody who said he was actually buying RyZen while I — the supposed Intel fanboy — upthumb.

            • Pancake
            • 2 years ago

            As luck would have it I’ve just been given a free older dual-Xeon Mac Pro so I guess I don’t need to buy a MacBook to port/test my software on. Not that I personally give a damn about Macs and actually fervently hope Apple will die painfully but some of my users use their products.

            So, I’ll definitely be building a sweet Ryzen Linux box 🙂 Fanboyism is soo stoopid. I fully expect my Ryzen to run like a piece of crap compared to Baby Cakes with anything using 4 loaded threads or less. But it’s got 8 cores!

    • Legend
    • 2 years ago

    Hasn’t AFR always been the least preferred method of harnessing multiple gfx cards for video games (e.g. the ATI Rage Fury Maxx), and something that can be force-enabled in the driver if better low-level support isn’t available?

    Either way, great fondness for Epic Games and everything they have done, and are doing, for this industry.

      • RAGEPRO
      • 2 years ago

      AFR is actually the only way anyone does multi-GPU in DX11 or OpenGL, anymore.

        • Legend
        • 2 years ago

        Ah yes, that is correct, thanks for pointing that out. I was thrown off a bit by AFR being touted in the engine update along with performance optimizations, when in fact AFR is not the best way to do multi-GPU rendering.

        I believe I have seen experimental support for DX12 with this engine. I wonder why they decided to support AFR in DX11 rather than just put these resources into DX12/Vulkan support? I’m sure that would have a more substantial effect.

          • RAGEPRO
          • 2 years ago

          This is actually substantially less work. Using multiple graphics adapters in DirectX 12 or Vulkan more or less requires application-specific support. They probably could implement a generic solution in Unreal Engine, but it is very likely that any time a game developer made significant customizations to the renderer, it would break.

          Doing multi-GPU in DirectX 11 shifts the workload onto the driver developer. Essentially (not literally, but essentially), DirectX 11 Crossfire/SLI is transparent to the application. The application can run just as it would on a single graphics card, and it’s up to the driver to divide the work across the available GPUs. Official support like what was announced today just means that they’ve tested it and it should generally work. I used to run GTX 580s in SLI, and you could force SLI on for any application with a relatively simple tweak. Oftentimes, it would even work. 🙂

          By contrast, using multiple GPUs in DX12 or Vulkan is much more involved. I think you can still do it the old way, at least in DirectX 12. Doing it the proper way under the new APIs involves programming the game in such a way that it can dynamically distribute workloads in an atomic fashion, more or less the same way that an OpenCL or CUDA application works. As I understand it, doing it this way means that you can (in the best case) end up with much better, or at least more consistent, GPU utilization and scaling. However, I’m sure you can imagine how difficult it would be (if not impossible) to program a generic solution that works this way across various hardware and software configurations.
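
          To put a rough sketch behind that (purely my own illustration, nothing to do with UE’s actual code): under DX12’s explicit multi-adapter, even step one, enumerating the adapters and creating a device per GPU, is the application’s job, and everything past it (queues, heaps, cross-GPU copies, scheduling) is too.

          // Minimal D3D12 explicit multi-adapter starting point (link d3d12.lib and dxgi.lib).
          #include <d3d12.h>
          #include <dxgi1_4.h>
          #include <wrl/client.h>
          #include <vector>

          using Microsoft::WRL::ComPtr;

          int main()
          {
              ComPtr<IDXGIFactory4> factory;
              if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
                  return 1;

              std::vector<ComPtr<ID3D12Device>> gpus;
              ComPtr<IDXGIAdapter1> adapter;
              for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
              {
                  DXGI_ADAPTER_DESC1 desc;
                  adapter->GetDesc1(&desc);
                  if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                      continue; // skip the software (WARP) adapter

                  ComPtr<ID3D12Device> device;
                  if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                  IID_PPV_ARGS(&device))))
                  {
                      // In DX11 the driver hides the second card behind one device; here every
                      // queue, heap, command list, and cross-GPU copy is created and scheduled
                      // per device by the application itself.
                      gpus.push_back(device);
                  }
              }
              return gpus.empty() ? 1 : 0;
          }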

          All of the above could be completely wrong; I am in no way a game, driver, or hardware developer. This is just what I’ve come to understand as a user and enthusiast. In any case, I hope it helps. 🙂

            • Legend
            • 2 years ago

            It would be great if game-design content wasn’t so dependent on the renderer for compatibility. How many times has a community member made a map space for UE4 only to have a game-engine update break compatibility?

            Doing multi-GPU in DX12/Vulkan may not be quite as much extra work as you indicate, though. I believe things are a bit more streamlined within the AMD camp here. It certainly is more involved at the hardware level, but it definitely doesn’t require assembly coding by the developer on a per-GPU basis for the render path. At the moment it requires AMD GCN or Nvidia Pascal or greater, respectively. That is a fairly narrow baseline for support, leaving things like varying frame-buffer sizes and compute power to consider. Admittedly, I may be completely off base with this.

            I think part of my gripe here is that examples such as Doom and Tomb Raider are already using these render paths and multi-GPU support with great success (even on four-year-old GCN hardware), and are great games on top of that. UE4 is much more focused on game-engine tech development and has yet to integrate this into the toolset. In light of this, I would expect Epic Games to be supporting DX12/Vulkan multi-GPU much sooner.

            • RAGEPRO
            • 2 years ago

            Well, I don’t think Doom supports multi-GPU in Vulkan mode. Otherwise I think the fact that so few games actually have DX12 multi-GPU support (is it anything besides RotTR and AotS?) really lends support to my point.

            I do agree that it would be nice to see better support for explicit multi-adapter going forward.

            • Airmantharp
            • 2 years ago

            I’ve wound up using SLI for my current and previous GPU solutions, mostly because it made sense based on available products at the time. But after Nvidia released the 980 Ti, I started to question the need, and now that multi-GPU support is faltering, with emerging APIs pushing development responsibility significantly further into developers’ wheelhouses, I’m fairly certain that I’ll just save up for the next ‘Ti’ GPU or AMD equivalent.

            Of course, if they manage to turn this all around (and they could), I don’t mind running two cards when the performance target warrants it.

            • Legend
            • 2 years ago

            Apologies for that, you are correct: Doom does not support multi-GPU in Vulkan. I was somehow letting the recollection of the performance increase from Vulkan alone associate with using two GPUs… idk.

            Tomb Raider does support multi-GPU with DX12, and the result is quite impressive. So far being featured in a triple-A title is rare, as you point out, likely because it requires extra dev time. I still think Epic Games should have been on this, given the development stance they have taken in their business model. Even plain support for DX12 has been slow in coming for Epic Games, and it has nowhere near the performance improvement that is so clearly evident with the newer APIs in Tomb Raider and Doom.

            Still do enjoy the new UT game even in this early stage : )

      • defaultluser
      • 2 years ago

      No, AFR has always been the preferred way to get “up to 100%” scaling on games.

      Split-screen or tile rendering doesn’t scale perfectly, because the geometry has to be rendered by a single GPU, and only then can you split the scene.

        • synthtel2
        • 2 years ago

        SFR completely sidesteps the frame timing shenanigans, though. I’d take a 20-30% gain that behaves like one faster GPU over a 60-90% gain with even a bit of timing weirdness, and AFR is still far from being bulletproof.

        Also in SFR’s favor, AFR adds a frame of latency, which is fine if it’s giving +100% but might not be if it’s only giving +50%. In AFR’s favor, each card in SFR in modern games has to render more than its even share of the screen, since there’s a guard band for various screen-space effects, so the gains are pretty small.
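
        If you want a back-of-the-envelope feel for that tradeoff, here’s a toy model (numbers entirely made up by me, not measurements): give each card the duplicated per-frame work (geometry, shadow setup, etc.) plus half the screen and a guard band, and SFR’s ceiling drops quickly, while AFR’s ideal stays near 2x at the cost of that extra frame of latency.

        #include <cstdio>

        int main()
        {
            // Toy two-GPU split-frame model: each card repeats the duplicated work
            // and shades half the screen plus a guard band for screen-space effects.
            const double duplicated = 0.35; // fraction of frame time done on both GPUs (made up)
            for (double guard = 0.0; guard <= 0.21; guard += 0.05)
            {
                double perCard = duplicated + (1.0 - duplicated) * (0.5 + guard);
                std::printf("guard %3.0f%% of screen -> SFR speedup %.2fx (AFR ideal ~2.00x)\n",
                            guard * 100.0, 1.0 / perCard);
            }
            return 0;
        }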

    • CuttinHobo
    • 2 years ago

    It’s the Unreal engine… Should they be adding features to make the graphics more realistic?

    – Philosoraptor

      • deruberhanyok
      • 2 years ago

      Well, consider a lot of games using it are rendering sci-fi or fantasy settings, so the environments probably qualify as unrealistic. 🙂

      And for characters, only the uncanniest of uncanny valleys.

      But I don’t think the “uncanny engine” has the same ring to it. Also, Marvel might not be thrilled with that.

      • morete
      • 2 years ago

      Why should graphics developers care about realism anymore? Every time they try to make a game more visually realistic, PC gamers whine and moan that it’s “nothing more than a tech demo, full of bugs, no multiplayer, no storyline, too expensive, doesn’t have a good enough multiplayer, or no multiplayer at all”. Then those same gamers turn around and make it the most pirated game of all time on PC, and the developers lose tens of millions of dollars.
      And then you wonder why the majority of Steam’s top 100 most popular PC games are practically in 2D, and if you used a single-frame screenshot you could count the entire pixel content using your fingers and toes. Then people wonder why there are no PC monitor manufacturers willing to make OLED panels. Why should they make OLED panels when all gamers will be playing on them is Minecraft?

        • strangerguy
        • 2 years ago

        We are sick of seeing better graphics because they have been overemphasized to death for a decade already, to the point that it no longer excites anybody, only to have the immersion of realism break like wet toilet paper at the slightest touch because interactions with the environment are still mostly static.

        And don’t even get me started on how terribly unoptimized some games were for a ho-hum level of visuals. *cough* FO4, DX:MD, Dishonored 2 *cough*.
