We retested Radeon Chill and came away intrigued

If you didn't read our recent overview of AMD's Radeon Software Crimson ReLive Edition release, you may have missed our look at one of the more intriguing graphics-card utilities to come around in some time. Radeon Chill, as it's called, can dynamically adjust frame rates to avoid doing wasted work when the user is standing still or not moving the mouse in-game, while still delivering plenty of performance and responsiveness when a user is providing input. AMD says Chill's dynamic frame metering can reduce graphics-card noise and GPU temperatures, and as a pleasant side effect, it can actually improve responsiveness to user input by leaving GPU resources at the ready when a user suddenly throws a lot of mouse and keyboard inputs at a game.

In our initial look at Chill using AMD's pre-release software, we couldn't get the feature to work quite right. We tested Chill with Counter-Strike: Global Offensive and found that the feature wanted to hold frame rates steady at around 62-64 FPS, on average. You can see that behavior in the graph above—there's little variation in frame times. That's not how Chill was supposed to work.

After the final release of ReLive, we gave the utility another shot with TR's Radeon RX 480 and found that it was working as expected. In turn, we graphed the frame times from a 40-second run through CS:GO's weapons course with Chill on and Chill off. You can see how Chill rapidly varies frame times in response to intermittent user input when I was standing still and shooting wooden targets with a lot of mouse clicks. You can also see how it holds frame times at about 16.7 ms (or 60 FPS) during periods of constant input, like running between segments in the weapons course. At the end of the run, where we're looking at the course timer while standing still, you can see how frame times climb as Chill limits CS:GO to running at 40 FPS—our configured Chill minimum.
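
For the curious, here's a rough sketch of what input-driven frame metering could look like conceptually. To be clear, this is our own illustration, not AMD's algorithm; the FPS limits, the decay window, and the sleep-based pacing are all assumptions made for the example.

```python
import time

# Illustrative sketch only -- not AMD's actual Chill algorithm. The idea: ramp
# toward the maximum frame rate while input is arriving, and decay toward the
# configured minimum once the player goes idle. All constants are assumptions.
CHILL_MIN_FPS = 40        # the configured Chill minimum from our CS:GO test
CHILL_MAX_FPS = 60        # the configured Chill maximum
IDLE_DECAY_SECONDS = 0.5  # assumed window over which input activity "cools off"

def target_frame_time(seconds_since_last_input):
    """Pick a frame-time budget based on how recently the player gave input."""
    # activity runs from 1.0 (input just happened) down to 0.0 (fully idle)
    activity = max(0.0, 1.0 - seconds_since_last_input / IDLE_DECAY_SECONDS)
    target_fps = CHILL_MIN_FPS + activity * (CHILL_MAX_FPS - CHILL_MIN_FPS)
    return 1.0 / target_fps

def pace_frame(frame_start, seconds_since_last_input):
    """Sleep out the rest of the frame budget so the GPU can sit idle."""
    budget = target_frame_time(seconds_since_last_input)
    elapsed = time.perf_counter() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)
```

The important bit is the sleep: any time the GPU isn't being asked to draw is time it spends at low power, which is where the heat, noise, and power savings described below come from.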

With Chill off, the game ran around 100 FPS on average. As you might expect, the game felt more fluid overall with Chill off, but critically, it didn't feel any more responsive. In fact, you can see some spots in the frame-time graph above where our Chilled RX 480 actually seems to put out frames faster in response to user input (especially during frames 700 to 1000 or so) when compared to the run with Chill off. That result does seem to mesh with AMD's claim that Chill can improve responsiveness by keeping more of the GPU available for times when fast rendering in response to user input is needed. Fascinating.

Typically, we'd knock a card for delivering frame times as varied as these, but aside from the expected drop in animation fluidity that comes with a move to 60 FPS from 100 FPS on average, our Chilled RX 480 felt perfectly smooth and snappy in use. If you can tolerate a slightly less fluid gaming experience at times, it might be worth turning on Chill in a game that can typically churn out multiple hundreds of frames per second and seeing how it feels. Heck, your K:D ratio might even improve.

We sadly didn't have time for a full round of formal temperature and noise testing, but I'd say our RX 480's noise levels dropped from "noticeable" to "inaudible" with Chill on. For gamers who are in shared spaces like dorm rooms or offices, Chill could let them game without disturbing others. It could also limit heat output in spaces where air conditioning isn't available. TR's Kill-a-Watt showed about 160W of system power consumption with Chill on, compared to 250-260W with it off. That's significantly less waste heat being dumped into the room, and beefier Radeons might see even larger drops.
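
For those keeping score, the arithmetic on those Kill-a-Watt readings works out like so (these are whole-system figures, not GPU-only):

```python
# Quick math on the Kill-a-Watt readings quoted above (whole-system power).
chill_on_w = 160
for chill_off_w in (250, 260):
    saved = chill_off_w - chill_on_w
    print(f"{saved} W saved ({saved / chill_off_w:.0%} less power at the wall)")
# -> 90 W saved (36% less power at the wall)
# -> 100 W saved (38% less power at the wall)
```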

Although Chill didn't work for ReLive reviewer Zak Killian in his first round of testing, he saw similar benefits for heat and noise. His Sapphire Tri-X R9 290X normally sits at a toasty 82° C while gaming, and its triple fans do their best to imitate a leaf blower. After an hour of playing Warframe with Chill enabled, however, he reports that the card's GPU core never went above 72° C, which means the fans on his graphics card never went above 39% of their duty cycle (a speed of around 1800 RPM). At that speed, they're barely audible. He was surprised by the change in character for his old Hawaii card, and the roughly 12% reduction in GPU core temperatures he saw is right in line with AMD's claim of 13% lower GPU temperatures on an RX 480.

Now that we've had a chance to see it in action, Chill is one of the handiest and most fascinating utilities we've used with a graphics card in quite a while. If you have a Radeon, it's well worth giving Chill a shot and seeing whether you can notice a difference in perceived performance. We hope AMD broadly expands Chill compatibility soon.

Comments closed
    • ermo
    • 6 years ago

    Doesn’t this Chill tech really only apply to titles where the rate of change in pixels in the scene when your input sources are idling is fairly low?

    I can't see Chill giving me much benefit in e.g. racing/flight/space sims? As in, if I'm steering straight ahead with full throttle, my input will be the same from frame to frame, but I most definitely need every frame I can get (up to my monitor's refresh rate) at high speeds, meaning that the FTC feature is probably better suited to this particular scenario?

    • derFunkenstein
    • 6 years ago

    [quote<]Yes, but it's still a lame TR joke.[/quote<] FTFY

    • derFunkenstein
    • 6 years ago

    Unfortunately Chill is still in a “whitelist” kind of mode. The AMD reps on the PCPer stream said they hope to get it to a “blacklist” or “works for everything” state eventually, but they’re not there yet.

    • Pwnstar
    • 6 years ago

    Yes, but it’s still a good TR joke.

    • synthtel2
    • 6 years ago

    If frametimes were perfectly consistent, then it could be aligned to work that way, but they aren't and it isn't. Because of the latter, the tear would be pretty stable at some random height on the screen, and because of the former, it would drift up and down a bit. Fixing all that basically involves implementing vsync, though some of this special sauce should be able to reduce vsynced power use further than usual and possibly nullify some of its other disadvantages.
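
    A toy simulation makes that drift easy to see: if presents landed exactly one refresh interval apart, the tear would sit at one fixed height, but a little frame-time jitter shifts the phase of each present within the scanout, so the tear line wanders. The numbers below are arbitrary assumptions for illustration only.

    ```python
    import random

    # Assumed numbers for illustration: a 60 Hz scanout and frames capped at
    # 60 FPS with a small amount of frame-time jitter.
    REFRESH_MS = 1000 / 60   # one scanout takes ~16.7 ms
    FRAME_MS = 1000 / 60     # frame cap equal to the refresh rate...
    JITTER_MS = 0.5          # ...but each frame lands +/- 0.5 ms off the ideal

    random.seed(1)
    present_time = 0.0
    for frame in range(6):
        present_time += FRAME_MS + random.uniform(-JITTER_MS, JITTER_MS)
        # The phase of the flip within the current scanout roughly corresponds
        # to how far down the screen the tear line appears.
        phase = (present_time % REFRESH_MS) / REFRESH_MS
        print(f"frame {frame}: tear at ~{phase:.0%} of the way down the screen")
    ```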

    • Mr Bill
    • 6 years ago

    I like this answer better.

    • DPete27
    • 6 years ago

    Assuming you find a graphics quality setting at which your GPU can consistently exceed the refresh rate of your monitor, setting Chill minimum and maximum framerates to your fixed-refresh monitor's native rate would allow you to effectively eliminate the need for VSync while also avoiding tearing, yes?

    • Jeff Kampman
    • 6 years ago

    Not at all. Your fixed-refresh display just gets frames when they come, as it would with any other graphics card. The frame rate changes just happen, as they would in any other game. You'll probably get tearing when they do, as there normally would be.

    I will say that Chill and FreeSync do work great together.

    • SHOES
    • 6 years ago
    • RAGEPRO
    • 6 years ago

    Yeah, fair enough. I’ve had this conversation many times over. I don’t really even notice tearing at 60 Hz unless it’s especially egregious, so I’m no-one to talk, heh.

    • Waco
    • 6 years ago

    I’ve had high-rate displays in the past. I’d rather lock in 60 FPS with vsync (and the very minimal input lag) than deal with the unbounded GPU usage and tearing. You may not see it, but it’s immediately obvious to me even at 100+ FPS.

    • DPete27
    • 6 years ago

    So set Chill min and max framerates to the same value, which is your monitor’s refresh rate. The GPU will throttle itself to maintain 60fps (if that’s your refresh rate) and you shouldn’t need to enable VSync.

    • RAGEPRO
    • 6 years ago

    Get a high-refresh-rate display. You’ll never notice the tearing at >100 FPS. The benefits in terms of input responsiveness can’t be overstated. You can even enjoy the high refresh rate on the desktop.

    • Waco
    • 6 years ago

    I can’t stand playing anything with tearing. :/

    • RAGEPRO
    • 6 years ago

    I have similar feelings about leaving vsync on. 🙂

    • willmore
    • 6 years ago

    I guess you could turn vsync off, but who would do such a horrible thing?

    • Jeff Kampman
    • 6 years ago

    The pre-release software didn’t work and the release software did, so we were obligated to re-test. We’d do it for anybody, not just AMD.

    • Jeff Kampman
    • 6 years ago

    Chill has a global setting (on/off) and per-game profile settings in Radeon Settings. You might have to click into a game’s profile and drill into its “Profile WattMan” settings to see the Chill configuration settings. Chill is also bound to a hotkey (F11) and not every game may launch with it turned on. You’ll hear a little series of beeps from Chill if you press F11 and the feature turns on successfully.

    • DPete27
    • 6 years ago

    Ah, I was in Global Settings. Thanks.

    • cegras
    • 6 years ago

    1) [quote<]With Chill enabled, minimum and maximum framerates can be set on a per-game basis in Radeon Settings. [/quote<] 3) Did you actually check in-game FPS after flipping the switch?

    • MrDweezil
    • 6 years ago

    Turn vsync off?

    • RAGEPRO
    • 6 years ago

    I don’t understand the source of your confusion. Do you think that a variable refresh monitor is required to go over 60 FPS?

    • willmore
    • 6 years ago

    Then I'm confused as to how we're seeing >60 FPS on here. Or are we not measuring when frames are actually being displayed? Please help me understand what's happening/being measured.

    • DPete27
    • 6 years ago

    Good point. Framerate Target Control could be better in this aspect. I tested it in Torchlight 2 this weekend and it successfully capped framerates at my monitor’s max 75Hz refresh with FTC on. Close to 200fps with FTC off. I suppose Chill has the potential to replace FTC if the user sets the max and min framerate sliders to the same values.

    Inherently Chill has the potential to save more power, which is why AMD is pimping it as their new hotness.

    • Concupiscence
    • 6 years ago

    Yeah, you’re right, I was sleepy yesterday and not thinking clearly. Mea culpa.

    That said, there are still R7 370s cluttering up the retail channel.

    • DPete27
    • 6 years ago

    1) Why do they need to have game-specific profiles for their power saving features? I think Chill and Framerate Target Control require game profiles delivered by AMD, but “Power Efficiency” doesn’t. Much like Lucid Virtu back in the day, if you don’t have broad coverage of games and/or don’t constantly put in the manpower to develop profiles for more games, nobody is going to use the feature.

    2) Again, 3 power saving features is too many. K.I.S.S.

    3) I couldn’t get Chill to work on my system this weekend. I flipped the Chill switch, but the minimum/maximum framerate sliders never showed up.

    • RAGEPRO
    • 6 years ago

    Not at all. I don’t have one.

    • Waco
    • 6 years ago

    Adaptive vsync + adaptive power control is similar at least in practice, isn’t it?

    • Fieryphoenix
    • 6 years ago

    I paid hundreds of dollars to achieve fluid. I’ll keep it, thanks anyway.

    • willmore
    • 6 years ago

    Does this technique pretty much require a VRR monitor?

    • Prestige Worldwide
    • 6 years ago

    As a Canadian, that statement finally makes sense now. Thanks for the clarification

    • AnotherReader
    • 6 years ago

    As a multi-monitor user with a 290X, I agree.

    • AnotherReader
    • 6 years ago

    Just like there was no need to improve the K8..

    • AnotherReader
    • 6 years ago

    That would bring the RX 480 closer to the 1050 Ti in power consumption. I need to try this on my 290X soon.

    • tipoo
    • 6 years ago

    So it is in fact based on mouse input…That seems like a questionable decision, vs on-screen movement. You could be watching an in-game scene and want higher framerates, or you could be sitting still sniping or something, many scenarios where you might not want this.

    I guess that’s why you can turn it off, hah. Seems interesting for notebooks I guess.

    • tipoo
    • 6 years ago

    You might use the right ratio of milk to kraft dinner

    • Prestige Worldwide
    • 6 years ago

    [quote<]If you can tolerate a slightly less fluid gaming experience at times, it might be worth turning on Chill in a game that can typically churn out multiple hundreds of frames per second and seeing how it feels. [b<]Heck, your K:D ratio might even improve[/b<].[/quote<] wat

    • Chrispy_
    • 6 years ago

    Uh, no.

    First of all, you can still predict movement at 40 FPS (25 ms per frame), and your reflexes are far slower than that at >100 ms. Getting a headshot is about timing, since your subconscious brain predicts where their head is going to be and factors in your own nervous-system delay.

    This is why people can keep in time with rhythms/beats when clapping, dancing, drumming, singing etc. You’d notice they were out of time with as little as 20ms of delay, but it’s predictable and that’s how we as humans do it despite our physiological and neurological inability to blindly react to stuff any quicker than a few hundred milliseconds.

    In the case of your headshot whine, SSK kind of has it nailed. Lower latency is actually an advantage here, so if anything, Chill is a benefit, not a hindrance.

    • psuedonymous
    • 6 years ago

    ” In fact, you can see some spots in the frame-time graph above where our Chilled RX 480 actually seems to put out frames faster in response to user input (especially during frames 700 to 1000 or so) when compared to the run with Chill off. That result does seem to mesh with AMD’s claim that Chill can improve responsiveness by keeping more of the GPU available for times when fast rendering in response to user input is needed. Fascinating.”

    This REALLY needs actual input response testing, with the fancy controller-plus-LED trigger next to a monitor filmed with a high-speed camera. While frames may be delivered with a shorter render time (which is what the render-time value reports), the render-time variance is up, so if Chill makes you wait for a long-render-time frame to finish rendering and displaying before it can produce a short-render-time frame and THEN display the result of your input, the overall input latency may increase even if a drop in frame render times is triggered.
    The render time metric alone cannot tell you the effect on input latency.
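
    In other words, input latency is the gap between the moment the input fires and the moment the first frame reflecting it reaches the screen, not the render time of any individual frame. A hedged sketch of the bookkeeping, using made-up timestamps (real testing would use the LED-and-camera rig to spot the first frame where the change is actually visible):

    ```python
    # Hypothetical timestamps in milliseconds -- invented purely to show the
    # bookkeeping, not measured data.
    input_times = [100.0, 350.0]                               # when the button/LED fired
    present_times = [95.0, 120.0, 138.0, 155.0, 360.0, 377.0]  # when frames hit the display

    def input_to_photon(inputs, presents):
        """Lower-bound latency: time until the first frame presented after each input.
        In practice the input's effect often shows up a frame or two later still,
        because the next-presented frame was usually already in flight -- which is
        exactly why the high-speed-camera method is needed."""
        return [next(t for t in presents if t > t_in) - t_in for t_in in inputs]

    print(input_to_photon(input_times, present_times))  # [20.0, 10.0]
    ```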

    • Amiga500+
    • 6 years ago

    Then turn the f**king thing off, you yappy bstart.

    Could you be any more negative?

    It's particularly annoying when you're too dim or too biased to realise it'll be of benefit to quite a few people in quite a few circumstances.

    • NTMBK
    • 6 years ago

    This could be handy for XCOM 2. My system sounds like it’s about to take off when I play that game.

    • JosiahBradley
    • 6 years ago

    Still can't find any Chill option anywhere, and I did a complete clean install. Meh, I'll stick with my 600 W+ GPU draw.

    • rechicero
    • 6 years ago

    Well, it's something that happens to a lot of people, and there's no reason for it. I mean, I can accept some extra work from the GPU for more pixels (more Hz would mean more pixels too). It's obvious that more work = more watts.

    But do they really need to keep the GPU (on a 7950) at 500 MHz and the memory at 1250 MHz when, with one monitor, 300 and 150 MHz (that's almost 10 times less memory clock!) are enough and GPU use is 0% most of the time with spikes to 5-10%? It seems like an obvious bug in the driver that nobody's talking much about.

    Or nothing at all here in TR.

    And yes, it’s a systematic problem with idle power if you connect 2 monitors (something I’d say is more common than crossfire).

    PS: I don’t know about Pascal, I’m using a Radeon 7950.

    • EndlessWaves
    • 6 years ago

    It is just a quirk of two monitors though, not a systematic problem with idle power. If you look at three monitors, then Pascal's power consumption goes up enough to make the gap equivalent to single-monitor idle, and Pascal is also more power hungry when idling connected to a 144 Hz monitor:

    [url<]https://www.computerbase.de/2016-10/geforce-gtx-1050-ti-test/6/#diagramm-leistungsaufnahme-des-gesamtsystems-windows-desktop[/url<]

    • VincentHanna
    • 6 years ago

    I believe AMD has done this specifically to fight back against The Media’s completely biased measuring of frame-rates to measure GPU performance. Now, maybe we can get back to the more fair method of re-printing press releases verbatim without verifying actual performance.

    -Donald Trump.

    • rechicero
    • 6 years ago

    Probably means it might be possible. Be what you usually are and try to make it happen ;-).

    The time spent beyond an acceptable frame time would be a great metric, and I'd say 90-100 W less IS a panacea in any power-efficiency war. If there is no downside, we are talking about 36-39% less power: that's big. If it's actually more responsive, it would be huge.

    I’d say we really need a closer look at this.

    • rechicero
    • 6 years ago

    Now, if they can manage to have two monitors connected and not need to run the memory at max speed and the GPU in its second power state… I'm going to hold on to my 7950 a little longer, but when I'm ready to change, that will be something I look for.

    Chill is great, though. And I love how well AMD GPUs age. Kudos for them. But look at idle power too…

    • Klimax
    • 6 years ago

    And I don't think we will see anything similar for GeForce anytime soon. There's no need…

    • Klimax
    • 6 years ago

    WoT and WoWS will likely be bad cases for Chill. Also, it is a solution for when the HW simply can't…

    • Klimax
    • 6 years ago

    It's a problematic SW solution for a HW problem (an inefficient chip). Don't know about CS:GO, but such jumps can be nearly terminal in World of Warships.

    • ronch
    • 6 years ago

    Well, no butthurt here, considering GCN 1.0 has been around for about 5 years already. I'm still sporting an HD 7770, and despite being behind the curve, I think it's still great. It's still a very capable and amazing graphics chip in my book when you consider how far along the evolution of 3D graphics technology it sits. (Yeah, I still have my Voodoo3 3000!!)

    • DoomGuy64
    • 6 years ago

    Wahh, my GTX470 doesn’t “support” the Ultra texture setting in LOTR, or any other game!

    Uh, it's dead, Jim. Time for a new card.

    GCN 1.0 was so far ahead of its time that it remains performance-competitive today. However, it lacks a great deal of modern features that the rest of us now take for granted. Its feature set is outdated, so if you want feature parity with newer cards, buy a newer card.

    I'm sure the resale value is still reasonable, seeing as the cards can still run modern games.

    • gerryg
    • 6 years ago

    What does the chart look like if it’s smoothed? Average it per each 5 or 10 frames for both on and off?
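
    For anyone who wants to try that on their own captured frame times, a simple rolling mean does the trick; a sketch, with the window size being whatever per-5-or-10-frames grouping you prefer:

    ```python
    def rolling_mean(frame_times_ms, window=10):
        """Smooth a frame-time series by averaging each sample with its recent neighbors."""
        smoothed = []
        for i in range(len(frame_times_ms)):
            chunk = frame_times_ms[max(0, i - window + 1):i + 1]
            smoothed.append(sum(chunk) / len(chunk))
        return smoothed

    # Example: a spiky series flattens out considerably with a 5-frame window.
    print(rolling_mean([16.7, 25.0, 16.7, 24.0, 17.0, 23.5], window=5))
    ```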

    • Ninjitsu
    • 6 years ago

    the Enemy is Within

    • Ninjitsu
    • 6 years ago

    [quote<]The thing is, you let off the controls a lot more using a gamepad than you do with a mouse and keyboard[/quote<] Huh, why? Thumb sticks are in constant use, aren't they? (like in DS3 to move around or fight enemies)

    • Vaughn
    • 6 years ago

    Why? It is available in the menu for my 7970 GHz:

    [url<]https://s29.postimg.org/nue4c751j/Chill_On_7970_Ghz.png[/url<]

    • RAGEPRO
    • 6 years ago

    Honestly, Chill is very fine-grained. I really don't see much of a problem using it in FPS games, particularly if you have a 60 Hz display. (People with >100 Hz displays should probably leave it off for now, imo.)

    However using it in gamepad games — like Dark Souls 3 — is not advisable. The thing is, you let off the controls a lot more using a gamepad than you do with a mouse and keyboard, which means the framerate bottoms out at ~40 FPS a lot more often. It’s not super great.

    If you have a compatible Radeon, you should give Chill a try. Even in a game like Dark Souls 3, which has a 60 FPS cap, I get a big reduction in fan noise and GPU core temps using it. And in a really fast game like Warframe, I really wouldn’t notice it if not for my 120 Hz display. It’s pretty wild, honestly.

    • Mr Bill
    • 6 years ago

    Ah this answers my own experiential question.

    • Mr Bill
    • 6 years ago

    +3 for using experiential in a sentence! But I guess the point is that even if you are camping a position and waiting for that target to show, you have fewer wasted frames in the pipe, which could reduce your effective local response time. However, if you have a lower frame rate during Chill, does that mean you don't notice quite as quickly that the target has exposed itself on the server side?

    • Jeff Kampman
    • 6 years ago

    Benchmarking Chilled Radeons is probably not possible in anything but an experiential sense. There’s no comparable utility for GeForce cards that’ll behave in the exact same way and Chill has a certain “built-in” level of performance that it’ll provide regardless of card (unless you’re on some extremely weak Radeon). It’s best thought of as a “try it and see” utility for those that need it, not a panacea for the GeForce versus Radeon power efficiency wars.

    • rems
    • 6 years ago

    Maybe this would be more useful in a cut-scene context, where user input is probably absent anyway. I'd rather have a high FPS count even if I'm camping in a corner, waiting on the earliest frame possible to show me where the enemy's coming from.

    • cegras
    • 6 years ago

    [quote<]That sounds so broken. Imagine playing a sniper, or waiting for an enemy to round a corner. Going down to 40 fps in these moments of relative inactivity could be the difference between getting a headshot on a fast-moving character or letting them get past you.[/quote<] Nah. Interpolation, and server tick rate, will mean more than a 40 vs 60 fps difference ever will.
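
    The rough numbers behind that argument, assuming a 64-tick server and roughly two ticks of client-side interpolation delay (both assumptions; your server and settings may differ):

    ```python
    # Back-of-the-envelope numbers for the tick-rate argument. The server tick
    # rate and interpolation delay are assumptions, not measured values.
    TICK_RATE = 64                      # assumed matchmaking-style tick rate
    tick_ms = 1000 / TICK_RATE          # ~15.6 ms between world updates
    interp_ms = 2 * tick_ms             # ~31 ms of assumed interpolation buffer

    frame_gap_ms = (1000 / 40) - (1000 / 60)   # 40 FPS vs 60 FPS: ~8.3 ms per frame

    print(f"tick interval {tick_ms:.1f} ms, interp delay {interp_ms:.1f} ms, "
          f"frame-interval gap {frame_gap_ms:.1f} ms")
    ```

    On those assumed numbers, the 40-vs-60 FPS gap is the smallest term in the chain.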

    • sweatshopking
    • 6 years ago

    frame times suggest reduced lag, not increased lag.

    • djayjp
    • 6 years ago

    But how often in those instances when it matters are you actually not going to be moving the mouse at all? As long as it really is just as responsive when user input occurs, then I’d say it’s a mighty fine development!

    • prb123
    • 6 years ago

    We need a retest of the RX 480 and Fury cards against the comparable Nvidia cards with the new drivers/software, with Chill on and off.

    • sweatshopking
    • 6 years ago

    the 290 isn't listed as being compatible in AMD's documents, and I couldn't adjust it on my 290.

    edit: after doing a driver clean and reinstalling I can now adjust chill

    • synthtel2
    • 6 years ago

    That’s my thinking too. It sounds useful in plenty of games, but there’s a reason CS:GO players want to be capped at 300 fps all the time.

    • tipoo
    • 6 years ago

    Hm, didn't the tech this is based on (the one AMD bought) support GCN 1.0?

    • Voldenuit
    • 6 years ago

    Hm… so Chill works by using player input to determine framerate demand?

    That sounds so broken. Imagine playing a sniper, or waiting for an enemy to round a corner. Going down to 40 fps in these moments of relative inactivity could be the difference between getting a headshot on a fast-moving character or letting them get past you.

    Granted, Nvidia's been dynamically power-gating their silicon for a few years now, as Tom's (I believe it was) showed when they measured the instantaneous power draw of Maxwell cards while gaming. But to their credit, Nvidia's power gating seems much more fine-grained than what Chill is described as doing.

    • Krogoth
    • 6 years ago

    It seems like there’s some “Damage” control going on. 😉

    • deruberhanyok
    • 6 years ago

    [i<]TR's Kill-a-Watt showed about 160W of system power consumption with Chill on, compared to 250-260W with it off.[/i<] Holy power reduction, Batman!

    • Mr Bill
    • 6 years ago

    Now you really can Chill with the guildies between Raids?

    • mark625
    • 6 years ago

    I don’t think the phrase “caveat emptor” really applies here, since those older cards have already been purchased, and the Chill feature is free with the new driver release. Better to say “YMMV”, since different users will see different results. But there’s nothing really to “beware” of, as in spending money and not getting what you expected.

    • Sargent Duck
    • 6 years ago

    *Cue griping that new features don’t support products created years ago*

    <- This coming from a Radeon 7870 owner…Oh look, Christmas time. Perfect time to go video card shopping!

    • Concupiscence
    • 6 years ago

    It's worth mentioning that GCN 1.0 parts don't appear to support Chill, as they don't support the fine-grained sensor interface it uses. If you're running a 7700/7800/7900/R7 270(x)/R9 280(x)/R7 370, caveat emptor.
