AMD turns on multi-GPU frame pacing for DX12 titles

Here at TR, we've long considered multi-GPU rendering something of a false grail for graphics-performance scaling. Potent multi-GPU cards like AMD's Radeon HD 7990 and R9 295 X2 have tended to deliver world-beating average frame rates in our testing, but our Inside the Second frame-time measures usually put a big fat asterisk next to those cards' numbers. That's because issues with consistent frame delivery have often made those CrossFire-on-a-stick cards feel much less smooth in practice than their astronomical frame rates might suggest.
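
To make that distinction concrete, here's a minimal sketch of the frame-time idea—our illustration, not TR's actual FCAT toolchain: derive per-frame intervals from presentation timestamps, then look at the 99th percentile instead of the average.

```python
# Minimal frame-time analysis sketch (not TR's actual FCAT pipeline):
# two cards with identical average FPS can differ wildly at the 99th
# percentile, which is what your eyes notice as stutter.
import statistics

def frame_times_ms(present_timestamps_s):
    """Turn a sorted list of present timestamps (s) into frame times (ms)."""
    return [(b - a) * 1000.0
            for a, b in zip(present_timestamps_s, present_timestamps_s[1:])]

def summarize(times_ms):
    times = sorted(times_ms)
    p99 = times[int(0.99 * (len(times) - 1))]  # crude 99th percentile
    return {"avg fps": 1000.0 / statistics.mean(times),
            "99th-percentile frame time (ms)": p99}
```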

The Radeon R9 295 X2

To get around this problem, Radeons include frame-pacing algorithms in their drivers to make sure that inter-GPU timing issues don't mess with smooth frame delivery too much. In the red team's case, though, that technology apparently wasn't available in DirectX 12 titles—at least until now. AMD tapped some guy from its performance-testing department to explain the technology and its benefits in an informative, concise video that's well worth a watch.
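
For the curious, the basic idea behind pacing is simple enough to sketch. Here's a toy illustration (all names hypothetical, and emphatically not AMD's actual driver code): in alternate-frame rendering, two GPUs tend to finish frames in a lumpy short-gap/long-gap cadence, so a pacer can track a smoothed average of recent frame intervals and hold back frames that arrive early.

```python
# Toy frame-pacing sketch (hypothetical names, not AMD's driver code):
# delay early frames so presents land on a smoothed, regular cadence.
import time

class FramePacer:
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing    # weight given to the running average
        self.avg_interval = None      # smoothed frame interval, in seconds
        self.last_present = None      # timestamp of the previous present

    def pace(self):
        """Call just before presenting a finished frame."""
        now = time.perf_counter()
        if self.last_present is not None:
            raw = now - self.last_present
            # exponential moving average of recent frame intervals
            self.avg_interval = (raw if self.avg_interval is None else
                                 self.smoothing * self.avg_interval +
                                 (1.0 - self.smoothing) * raw)
            # if this frame finished early, hold it back to the cadence
            target = self.last_present + self.avg_interval
            if now < target:
                time.sleep(target - now)
        self.last_present = time.perf_counter()
```

A real driver works against GPU fences and the DX12 present queue rather than time.sleep(), of course, but the principle is the same: evenly spaced frames beat raw throughput.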

We haven't verified these results for ourselves yet—doing so would require use of an FCAT rig that I don't have set up here. Still, given our history, we're inclined to trust that AMD guy in the video above. DirectX 12 frame pacing for Radeons is available in the Radeon Software 16.10.1 release that hit the interwebs yesterday. If you've got multiple Radeons churning out frames in next-generation titles, you'll most definitely want to install that update post-haste.

Comments closed
    • ptsant
    • 3 years ago

    If the patch works as advertised, multi-GPU starts to make some sense for future titles.

    AMD has invested a lot in multichip (and multicard) designs, while Nvidia seems to gradually invest more in huge chips and much less in SLI. I know I’d take 1x big GPU over 2x small ones for the same performance, even if the big one is slightly more expensive. Things could change, however, if the price difference increases too much.

    • Pwnstar
    • 3 years ago

    The music in that video is horrendous.

    • Dysthymia
    • 3 years ago

    As one of the first people, if not the first, to call AMD out on this issue, he must find it very satisfying to deliver this announcement of a frame-pacing fix.

    • Froz
    • 3 years ago

    I wonder how the frame times compare not just to multi-GPU without frame pacing, but to a single GPU. Has the time really come when two cheaper GPUs might be a better solution than a single more expensive one?

      • Voldenuit
      • 3 years ago

      If you’re in the red camp, your options for high performance are somewhat limited until Vega comes out.

      I wonder if there’s been an uptick of XF’ed midrange parts as a result.

        • RAGEPRO
        • 3 years ago

        Almost assuredly, especially given that AMD marketed it that way versus the 1080. There’s also the fact that the RX 480 supports quad-fire, but, well, given that even dual-GPU machines make up an infinitesimal slice of the market, I doubt quad-GPU systems are even a drop in the bucket.

    • xeridea
    • 3 years ago

    Good video, no salt required

    • cygnus1
    • 3 years ago

    Holy crap, Scott really needs to level the camera or level the damn shelves in the background! Get with it AMD!!

      • DPete27
      • 3 years ago

      1) They’re plastic shelving units and the shelves are sagging
      2) Ever heard of “perspective” and “vanishing points”? The shelving units aren’t running perfectly perpendicular to the direction of the camera.

        • cygnus1
        • 3 years ago

        Sure, the shelves could be sagging, but I doubt the posts are actually leaning at such an angle. Maybe my sarcasm was a little off in saying to level things. But it’s called photographic framing, and once you understand it’s wrong, it’s annoying to see when people publish content like that. The person that set up that camera didn’t think about it. It just looks fairly unprofessional.

          • derFunkenstein
          • 3 years ago

          Thinking back to the DamageBox giveaway video last summer along with a whole bunch of podcast videos, I have to think the camera was set up in Scott’s basement. Probably looks a little amateurish because it was done by video non-pros.

          • kn00tcn
          • 3 years ago

          it’s compensating for his head tilt

      • puppetworx
      • 3 years ago

      I’m concerned that the building he is in may be about to collapse.

      • spiritwalker2222
      • 3 years ago

      I guess people are happy with what they’re hearing when the only thing to complain about is some shelves in the background.

        • cygnus1
        • 3 years ago

        Lol, yeah. Other than the weird framing/shelf angle, what the video was talking about was great news. It’s important they get multi-GPU working well and this improvement goes a long way toward that.

    • Takeshi7
    • 3 years ago

    Handling frame pacing with drivers will always be inferior to Nvidia’s dedicated hardware frame pacing.

      • RAGEPRO
      • 3 years ago

      Hardware is nice. The problem with hardware is that if it has a problem, you can only work around it, not change it.

      As long as you have the CPU to spare, a software solution has the advantage of being completely configurable. So there’s that, at least.

      Frankly I don’t think anyone is running Crossfire in underpowered machines.
      [sub<]At least, besides those wacky hybrid-crossfire dual-graphics laptops. That no-one should ever buy.[/sub<]

        • tipoo
        • 3 years ago

        Plus, we’re living in an era when any old i5 has room to spare; even modern i3s can push most games above 60 fps if the GPU allows.

        • DoomGuy64
        • 3 years ago

        It’s more likely using async compute instead of the CPU, this being DX12.

          • RAGEPRO
          • 3 years ago

          AMD also implements frame pacing in software in DirectX 9, 10, and 11 and OpenGL titles. (Vulkan?)

            • DoomGuy64
            • 3 years ago

            Define “software”. You can also run code in OpenCL and DirectCompute, meaning the driver has a number of ways it can achieve this effect. I don’t think AMD has ever stated the exact method they’re using, so you can’t make any specific claims other than speculation.

            IMHO, AMD’s frame pacing is being computed at least to some extent on the video card, because it would be inefficient if done solely on the CPU. It’s much more likely a hybrid approach, and I think Nvidia may be doing it in a similar manner as well. PhysX doesn’t have “dedicated hardware” either. It’s all code that executes through the driver, which means a hybrid approach.

        • brucethemoose
        • 3 years ago

        It isn’t just about performance. It’s about latency too.

        Theoretically, hardware frame pacing doesn’t have the latency hit of moving things to/from the CPU, but I don’t know how significant it actually is.

        Still, low latency is the name of the game. We could eliminate all screen tearing and most hitching by simply buffering some frames, but it would feel awful… frame pacing is some kind of middle ground, and the less it impacts latency, the better.
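
        (A quick worked number for that trade-off, assuming a fixed 60 Hz display: every fully buffered frame adds one refresh interval, about 16.7 ms, of input latency.)

        $$\text{added latency} = \frac{n_{\text{buffered}}}{f_{\text{refresh}}}, \qquad \frac{3\ \text{frames}}{60\ \text{Hz}} = 50\ \text{ms}$$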

          • kn00tcn
          • 3 years ago

          did anyone measure this additional latency?

      • Andrew Lauritzen
      • 3 years ago

      You can’t handle frame pacing “only” in hardware – the main component is software that evens out the back-pressure on the application itself. I’m not even sure what benefit there is to whatever NVIDIA claims to be doing in hardware, as it’s not something you can properly fix at the end of the pipe anyways.
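
      (A rough sketch of that back-pressure idea as we read it, with hypothetical names rather than any vendor's actual implementation: the driver caps how many frames the application may have in flight, and blocking the app's render loop when the cap is hit is what evens out its submission cadence. Nothing at the display end of the pipe can retroactively fix when the app sampled its simulation.)

      ```python
      # Hypothetical back-pressure sketch: cap frames in flight so the
      # application's render loop stalls when the GPUs fall behind.
      import threading

      MAX_FRAMES_IN_FLIGHT = 2              # hypothetical queue depth
      slots = threading.Semaphore(MAX_FRAMES_IN_FLIGHT)

      def present(frame):
          """App-side present: blocks when the frame queue is full."""
          slots.acquire()                   # back-pressure happens here
          submit_to_gpu(frame)              # hand the frame to the driver

      def on_frame_displayed():
          """Display-side callback: a queued frame reached the screen."""
          slots.release()                   # free a slot; the app may proceed

      def submit_to_gpu(frame):
          pass                              # stand-in for the real submission path
      ```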

    • the
    • 3 years ago

    Hey, that guy looks familiar. Where have I seen him before?

      • chuckula
      • 3 years ago

      I think he used to work for one of those online sites that reports about technology.

      Lemme think here…… ah got it! Anandtech.

        • LocalCitizen
        • 3 years ago

        i think he’s from Ars

        i can’t remember either. at the end of the video, in small print, there’s the word “Damage”. I think that’s a hint of some sort

          • Veerappan
          • 3 years ago

          You jest, but didn’t Scott work with/for Ars way back when…?

        • the
        • 3 years ago

        Purportedly Tech?

      • morphine
      • 3 years ago

      He ugly tho.

    • geniekid
    • 3 years ago

    How can we be sure that dude isn’t an evil clone from a parallel dimension? The facial hair is damning.

      • derFunkenstein
      • 3 years ago

      This is [url=http://1.bp.blogspot.com/-RrbRZQw1YIQ/UVERyQLlfsI/AAAAAAAAEoY/pon_K91JjoA/s1600/spock+beard.png<]absolutely possible[/url<].

      • EzioAs
      • 3 years ago

      …maybe this guy is the real one? …? …..????

        • wiak
        • 3 years ago

        AMD RealRender™ Technology now available
        we render you Scott Wasson in full 4D to put in your Youtube Videos

      • Captain Ned
      • 3 years ago

      He was sporting it at the BBQ, and he sure wasn’t the evil clone Scott there. 😉

        • drfish
        • 3 years ago

        Bender taught us that the bearded Flexos of the world are not always the evil ones.

      • jihadjoe
      • 3 years ago

      Hairworks 2.0 patch notes:
      – beta support for facial hair

      • gigafinger
      • 3 years ago

      Wasson, Wassoff that facial hair! 😉

    • EzioAs
    • 3 years ago

    We can’t just trust any AMD guy!!!

      • derFunkenstein
      • 3 years ago

      I still laugh every time I think about how AMD-centric subreddits railed against TR and how anti-AMD it was, and then Scott went to work for AMD.

        • wiak
        • 3 years ago

        we did? wait what? even TR redid their RX 400 reviews with proper results :=)

        /r/amd <3

          • derFunkenstein
          • 3 years ago

          I’m talking about roughly 12 months ago with the Furry reviews. They were all anti-Scott, insisting he was anti-AMD, and then a couple of months later…

            • EzioAs
            • 3 years ago

            [quote<]I'm talking about roughly 12 months ago with the Furry reviews. [/quote<] I didn't realize TR reviewed gerbils too. 🙂

            • derFunkenstein
            • 3 years ago

            Ha! Whoopsie daisy.

            • LostCat
            • 3 years ago

            There was some BS around but it didn’t seem nearly as prevalent as some wanted to believe.

            • DoomGuy64
            • 3 years ago

            To be fair, that review wasn’t quite up to TR’s normal quality. They nitpicked some issues, and game selection was poor. Seemed rushed out, and probably wouldn’t have gotten the same reaction if TR had spent an extra day on polishing it up.

            If you reviewed the Fury again today, it would be in a much better position. I think that was one of the biggest problems: the original review didn’t exactly give a balanced perspective, and read more like a worst-case scenario. The 480 review was much better.

            • chuckula
            • 3 years ago

            [quote<]If you reviewed the Fury again today, it would be in a much better position.[/quote<] Demonstrably worse than the smaller and cheaper GTX-1070 that is [b<]SUPPOSED[/b<] to lose to the almost 9 TFlop Furry X -- even in DX 12 titles? That's your definition of "much better position"?

            • RAGEPRO
            • 3 years ago

            Eh, [url=http://pcpartpicker.com/products/video-card/#c=369,319&sort=a8&page=1<]they're about the same price these days.[/url<] [url=http://www.pcgamer.com/the-geforce-gtx-1070-review/<]Not really all that far apart in terms of performance, either, at least by average FPS.[/url<] Of course, the FuryX requires a 120mm fan mount and only has 4GB of memory, so there's that. Don't get me wrong, I'm not saying ANYONE should take a FuryX over a 1070. I do think if you wrote that same review today but using the current drivers, yeah, it probably would look better than it did against the Maxwell stuff it was competing against then. Of course, the problem is, that's not the market we're in now. As Bruno just said, Vega can't come soon enough.

            • Waco
            • 3 years ago

            I don’t think anyone sane would choose a Fury X over a GTX-1070 today. It’s measurably worse in just about every metric.

            • Renko
            • 3 years ago

            I actually did. Whether I am sane or not is definitely up in the air, but I had very specific reasons for doing so.

            I just went through a very hot and humid Michigan summer living on the second floor of a house with no air conditioning. I previously had an R9 390 and a Core i5-3570K, both air-cooled. For much of the summer, I couldn’t play games, it was so uncomfortable. I also had a relatively open Corsair case.

            When I went to build my new computer this summer, the 1070 had just come out and was going for a premium. I already had an ASUS MG279 2560×1440 FreeSync monitor. The cheapest 1070 was $475-500 at the time. I found a Fury X, off of Newegg no less, for $379.

            I chose it for many reasons. I now have an AIO water-cooled 6700K at 4.6GHz and my liquid-cooled Fury X. I bought a new case that didn’t have a big side fan, leaving only the two front intake fans. I couldn’t be happier. I can barely hear my system. I have never seen my GPU or CPU above 60C. In a 70-degree room while gaming, the air coming out of my computer from both radiators barely qualifies as warm.

            I can play all games at the highest possible settings unless the game forces limits due to the 4GB of VRAM, which was the ONLY negative I would say about the card.

            I love the AIO factory cooler, and I am waiting for Vega and hoping they release another factory-cooled card with that launch.

            TL;DR – Very happy Fury X owner. Possibly sane.

            Edit: Utilities are part of rent, so no electric-bill problems, and the increase in watts needed for it doesn’t offset the temperature that a 1070 would produce unless you pay a premium for one of the few hybrid liquid cooled versions or go full on custom water loop.

            • Rza79
            • 3 years ago

            Well, it’s not that crazy, what you did. Especially at the resolution you use and in new games, it can keep up quite nicely. $379 is a bargain.

            Like in Gears of War 4, it’s only 19% slower than a GTX 1080:
            [url<]https://www.computerbase.de/2016-10/gears-of-war-4-benchmark/3/#diagramm-gears-of-war-4-auf-dem-i7-6700k-2560-1440[/url<]

            • Waco
            • 3 years ago

            Heat output is [i<]directly[/i<] related to energy usage, not exhaust temperature. The Fury X is going to be dumping quite a bit more heat than any GTX 1070 purely because it uses more power (nearly 130 watts more). My sanity poke was based on today’s prices though, where GTX 1070s routinely go for around $400.

            • Voldenuit
            • 3 years ago

            [quote<]the increase in watts needed for it doesn't offset the temperature that a 1070 would produce unless you pay a premium for one of the few hybrid liquid cooled versions or go full on custom water loop.[/quote<] Hunh? My GTX 1070 under load is 65C at stock and 70C overclocked to 2025 MHz (over summer) with the stock fan profile. Gigabyte GTX 1070 Gaming. Granted, it's a triple-fan cooler, but the MSI G1 performs about the same, and the ASUS Strix is even better. Running a 120 Hz monitor with Fast Sync, so it's about as hot as you could expect the card to get (I would expect ppl with 60 Hz monitors and V-sync on to see lower loads in most games). I suppose you might have meant that the benefit to the Fury X is that the heat is most likely being dumped [i<]outside[/i<] of the case, but water cooling doesn't magically reduce heat flux, especially once the heat capacity of the reservoir equilibrates. If a card's drawing (say) 250W of power, it's dumping out 250W of heat [i<]somewhere[/i<].

            • Renko
            • 3 years ago

            I wasn’t saying it does that. I was saying the thermal properties of water cooling allow for cooler air to be exhausted than with an air-cooled 1070. Water has the ability to absorb much more heat without its temperature increasing as much as just dumping straight heat into or out of your case.

            I’m not saying the heat from the increased wattage magically disappears, but it is more readily absorbed by the water, which in turn keeps the GPU cooler and subsequently the air being blown out of the case.

            The Founders Edition 1070 (aside from a few vendor blowers) is the only card that would dump heat OUT of my case like the Fury X, and the Founders Edition 1070s do not run cool at all, with 80C being fairly common, especially with aggressive overclocks. I have no heat from my CPU or GPU being dumped into my case with fans. I’m quite happy with that arrangement. In addition, blower-style fans aren’t exactly quiet. The fan on the Fury X isn’t even audible, and there is no pump noise at all. I routinely stick my hands above my CPU radiator at the top of my case to make sure the fans are on, because that is how little sound and warm air I feel with my tower right next to me. The graphics card exhausts out the back, which usually causes quite warm air to be trapped under my desk… I will update you when I notice it.

            Like HTPCs or mITX builds, there are reasons to choose something that maybe doesn’t make sense to the masses, but for those people in certain circumstances certain tradeoffs make sense. *shrug*

            Like htpcs or mITX builds there are reasons to choose something that maybe doesn’t make sense to the masses but for those people in certain circumstances certain tradeoffs make sense. *shrug*

            I’ll gladly take my setup over my neighbor, who is running a 1080 on a 1080p G-Sync monitor and doesn’t play competitive FPS games. 😉

            • Waco
            • 3 years ago

            The temperature of the exhaust has very little to do with the heat output. Power usage drives that.

            Water cooling versus air cooling doesn’t change the heat dump into the room at all…
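
            (The steady-state heat-transfer relation makes that concrete, with P the card's power draw, ṁ the mass flow of air through the cooler, and c_p air's specific heat:)

            $$P = \dot{m}\, c_p\, \Delta T \quad\Longrightarrow\quad \Delta T = \frac{P}{\dot{m}\, c_p}$$

            Double the airflow at the same 250 W and the exhaust's temperature rise halves, but the watts dumped into the room are unchanged.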

            • krizby87
            • 3 years ago

            Actually, water cooling makes your room hotter because it extracts heat more efficiently than air cooling; you can watch the JayzTwoCents channel for that information.
            The 1070 and 1080 run hot for their TDP because the die surface is smaller, making it harder to remove heat; however, the total heat output should be substantially lower than the Fury X’s. I doubt any GPU running at 80C is gonna die any sooner than the pump in any AIO cooler anyways.

            • Renko
            • 3 years ago

            No arguments from me about the 80C card vs. AIO cooler time of death. I just wanted something to hold me over until Vega, honestly, and as of right now I am “locked” into AMD with a FreeSync monitor. Also, I don’t have one of those handy-dandy plot graphs that TR uses, but I am pretty sure a Fury X at $380 is going to offer better price/performance in fps and latency than a 1070. Not arguing which is the better card, but which is the better buy for a guy with a FreeSync monitor, around that price, to game on high settings at 1440p (lots of caveats there :P). I know what 80C being dumped into my room feels like. The R9 390, despite Sapphire’s best efforts with its massive heatsink, was still a space heater.

            A little confused about the water cooling making my room hotter statement. Not disagreeing, just confused. If the temps of my GPU while gaming are 60C (which is the hottest I have seen; not going to run Furmark just to see how high I can make it unrealistically go), then the water running through the radiator would be cooler than 60C, and that is the heat the fan is blowing out of the case? If I am wrong, I am totally cool with that. I don’t have time to check out the YouTube video (brother is in town and picking me up soon); however, I shall when I get some time. It just seems like there is no way the fan is blowing out 80C heat, which a blower 1070 would (and yes, I am aware of why the 1070’s heat output is what it is; same reason a 6700K is harder to cool than a Sandy Bridge OCed to the same clocks).

            • cynan
            • 3 years ago

            The Fury will make your room hotter than a 1070 because 1) the chip consumes more watts overall, and 2) the Fury is less efficient (requires more watts to push the same number of frames). The water does not change this; it just acts as a buffer so that the heat is dumped into your room more gradually (instead of immediately into your case and then into your room, as with a direct-flow air cooler). But because your case is not a completely thermally closed system, it will end up in your room eventually. One way or another, the temperature of the air in the room and the water in the AIO will reach equilibrium.

            • DoomGuy64
            • 3 years ago

            The only person who would actually compare those two is Chuckula. The Fury’s competition was Maxwell.

            If you bought a Fury back then, not only do you have a card that performs better today than its Maxwell equivalent, but one that can also beat the 1070 in Vulkan and DX12. Maxwell is completely obsolete today, while the Fury still has legs.

            The Fury is not a bad card. It was just reviewed extremely poorly, using methods that did not show its full capability. Doom wasn’t available back then, and if it had been, things would have been different.

            • Waco
            • 3 years ago

            It must be interesting to be so blind to reality.

            • DoomGuy64
            • 3 years ago

            Sure thing, Mr. “AMD doesn’t do CPU upgrades” troll.

            How long did it take you to admit:
            [quote<]One generation did. Yay?[/quote<] Do we have to do this again? Because you know I'm right, and there are plenty of benchmarks available to prove it. TR did a bad review of the Fury, which is easily provable by referencing other websites. The games used then were not indicative of the Fury's actual capability, especially since they were titles better optimized for Nvidia. That's why people complained about the review. Everyone outside of the Nvidia trolls knew it was a poor game selection.

            • Waco
            • 3 years ago

            I didn’t say they didn’t, but continue to twist reality if you please. 🙂

            • Klimax
            • 3 years ago

            So what is the current product from AMD that competes with the 1080 or, say, the Titan X Pascal?

            • derFunkenstein
            • 3 years ago

            There isn’t one, but the Fury X’s price has been adjusted downward, where its competition is the GTX 1070. TR didn’t have a Fury X in its Pascal reviews, but Tom’s shows it basically trading blows with the 1070 at a much higher power consumption level.

            [url<]http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585.html[/url<]

            • BurntMyBacon
            • 3 years ago

            [quote<]The Fury is not a bad card. It was just reviewed extremely poorly using methods that did not show it's full capability. Doom wasn't available back then, and if it was, things would have been different.[/quote<] The methods were actually very good. They were quantifiable, repeatable, well thought out, and provided relevant information. For the games and APIs tested, it is hard to find a better, more insightful review. The issue isn't the method, but perhaps the model. Every reviewer selects a set of games to model PC gaming as a whole. After all, it is impractical to test every game at every setting on every card ever. You shouldn't be asking yourself if they represented the card accurately. The card represents itself and needs no model to estimate it. Rather, you should ask yourself if their game selection accurately modeled the state of PC gaming at the time of the review.

        • Srsly_Bro
        • 3 years ago

        Scott was sent to AMD as a punishment. lol

        (Owner of a 7950 and an APU)

      • UberGerbil
      • 3 years ago

      Hereby dubbed TAD: That AMD Dude.
      …or That AMD Dude, AKA Damage. TADAD.
