Radeon Software Crimson ReLive Edition: an overview

The supposed bumpiness of Radeon graphics drivers has been conventional wisdom among PC enthusiasts for years. Discussions of graphics-card performance often include at least one vague remark like “yeah, but AMD drivers.” It’s a sore spot for fanboys, and it’s an easy way to start a flamewar. Justified or not, that perception has likely hurt the reputation of Radeon cards in recent years.

AMD has done a lot of work recently to combat this perception. Two years ago, the company released its Catalyst Omega driver. This major update brought a raft of bug fixes and performance optimizations with it, as well as new features like Virtual Super Resolution and official FreeSync support. It was the first time AMD had openly acknowledged that it had an image problem surrounding its driver software, and it promised a concerted effort to turn things around.

AMD said that Catalyst Omega was the first of many annual major updates, and 2015 didn’t disappoint. Radeon Software Crimson Edition debuted last year, bringing along another big bundle of enhancements and embellishments. Crimson didn’t offer the performance uplift that Catalyst Omega did; instead, the company focused on stability and bug fixes. It also (mostly) put the venerable Catalyst Control Center out to pasture in favor of a slick new app called Radeon Settings. Now that we’ve been using Radeon Settings for over a year, we can say that was the right choice.

With a solid software foundation underneath its Radeons, AMD is setting its sights on feature parity with Nvidia’s GeForce Experience suite. One of the coolest features of modern GeForce graphics cards is the ShadowPlay video capture system. Built into GeForce Experience, ShadowPlay uses the graphics card’s onboard video-processing hardware to encode high-quality compressed video streams on the fly with low overhead. This means even folks with relatively modest hardware can stream HD gameplay to their friends.

Until now, Radeon owners have had to rely on extra software with support for their cards’ built-in video-encoding hardware, like the Gaming Evolved app or an Open Broadcaster Software plugin, to enjoy hardware-accelerated on-the-fly encoding.

Today, that all changes. We’re taking the wraps off another major Radeon update this morning: Radeon Software Crimson ReLive Edition. The name’s a bit of a mouthful, and it’s not clear whether ReLive is meant to be read as “relive” or “Re-Live” (as in live-streaming). Either way, ReLive is the name of AMD’s first-party answer to ShadowPlay. The new video-capture-and-encoding technology works on all GCN Radeons—every card from the HD 7000 series onward.

ReLive isn’t the only big feature debuting in the new driver. AMD purchased a little software company called HiAlgo earlier this year, and that company’s Chill product is being integrated right into the driver. Chill has the potential to reduce energy usage and GPU temperatures in games that can produce exceptionally high framerates, like CS:GO and World of Warcraft, without hurting fluidity or responsiveness when they’re needed.

AMD’s driver isn’t the only focus of today’s software release, either. To make benchmarking easier, the company is taking the wraps off an open-source frame-capture and analysis app called the Open Capture and Analysis Tool (or OCAT), as well as some other goodies that we’ll talk about later. OCAT builds on the widely used PresentMon utility to make frame-time benchmarking easier for hardcore reviewers and casual gamers alike.

A good first impression

AMD’s latest drivers are a little different right from the get-go. AMD has used its own installer for a while now, but along with the ReLive update comes a revised setup process with a smoother look and a slick new interface. AMD has finally included a “clean install” option in the setup process that nukes existing settings to make way for the new driver’s defaults. The new installer superficially resembles the Radeon Settings app, which gives it a more coherent look and feel.

Once the drivers are installed, you’ll find the Radeon Settings app right where it usually is: at the top of every context menu. Nvidia actually does this too, and I’ve complained about it before with them as well. I’d like to see an option to disable the context menu entry during setup. Anyway, load up Radeon Settings and you’ll notice an extra tab at the top for Radeon ReLive. Let’s check it out.

The video kids

I probably don’t have to tell you that game streaming is a big deal. Viewers watched a total of 459,366 years of streamed gameplay in 2015 on Twitch.tv alone, and the phenomenon has only grown since. More and more, young folks are tuning in to watch other people play games instead of TV or movies. A lot of people are eager to get into the scene, but until relatively recently streaming your own videogame footage required fairly beefy hardware and a significant amount of setup. Nvidia’s ShadowPlay is just one solution that has gone a long way toward lowering the bar to entry, and now ReLive brings that same ease of use to AMD hardware.

ReLive supports streaming high-quality video straight to disk, or a lower-bitrate stream to an online service. The actual recording options available will depend on what graphics card you’re using. My wizened R9 290X only supports AVC (H.264), but folks with Polaris Radeons will have the option of using the newer HEVC (H.265) codec. ReLive lets me choose resolutions up to 4K UHD and either 30 or 60 FPS, although selecting an excessively high option (like 4K60) pops up a warning that “more optimal settings will be used automatically while recording.” Actually trying to record with those settings produced a 60 FPS 1080p video. Your mileage may vary depending on your hardware and display.

I really appreciate the fact that ReLive is off by default. Too often, companies shove new products and features on the unwitting or unwilling consumer. Unlike GeForce Experience, Radeon Settings also doesn’t require users to create a whole new account or link a Google or Facebook account to a third-party service just to use it. We appreciate that openness.

Enabling the feature is a simple matter of clicking the toggle switch on the ReLive tab in Radeon Settings, which creates four new tabs in the window: Global, Recording, Streaming, and Overlay. You’ll start on the Global page, where you can customize the save location for local recordings as well as the hotkeys used to control the various ReLive functions.

Over on the Recording and Streaming pages, you can set up separate settings for the two modes. AMD includes drop-down selectors for resolution, framerate, codec, and audio bitrate, plus a slider for the video bitrate. There are automatic settings for streaming to Twitch or YouTube (each simply requires a sign-in), along with support for a custom Real-Time Streaming Protocol (RTSP) server. Users have the option to archive livestreams, saving them in the recordings folder defined on the Global page. One niggle I had is that there’s no option to configure the filename used. Recordings get saved with the starting date and time, and livestreams have “Stream” prepended.

On the Recording page, users can enable the Instant Replay function. This feature works like similar features from Nvidia, Sony, and Microsoft—when the appropriate key combination is entered, the app saves the last x seconds or minutes of gameplay to the pre-configured recordings folder. The Instant Replay period is configurable up to 20 minutes, and the ReLive configuration window provides a handy estimate of the approximate size of the saved replay file.
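That size estimate is easy enough to sanity-check. Here’s a rough back-of-the-envelope sketch (my own approximation, not ReLive’s exact math, which may also account for container overhead) of how a replay buffer scales with bitrate and length:

```python
def replay_size_mb(video_kbps: float, audio_kbps: float, seconds: float) -> float:
    """Rough replay-buffer size: (video + audio bitrate) x duration.

    Back-of-the-envelope only; a real recording adds container overhead.
    """
    total_kbits = (video_kbps + audio_kbps) * seconds
    return total_kbits / 8 / 1024  # kilobits -> kilobytes -> megabytes

# Hypothetical example: a 20-minute replay at 10 Mbps video plus 192 kbps audio
print(f"{replay_size_mb(10_000, 192, 20 * 60):.0f} MB")  # roughly 1.5GB
```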

Hotkey bindings are required to include a modifier key (Ctrl, Shift, or Alt) and use a letter or number. Trying to bind, say, Ctrl+Shift+Page Down won’t work. I found this a little frustrating because I’m used to using that very key combination to start recording video using my usual streaming app, Open Broadcaster Software. Still, the default streaming key (Ctrl+Shift+G) is easy enough to remember. ReLive’s hotkeys are captured globally, though, and that means these key combinations can’t be used in other applications.

That global hook is annoying. While I was using Paint.net to create this very article, Ctrl+Shift+S (for “Save As…”) was bound to “Save Instant Replay” by default for ReLive. AMD’s software wasn’t actually doing anything with the shortcut because I had the “Record Desktop” function disabled, but I still couldn’t use it in Paint.net. This is partly a general complaint about global hotkeys, but ReLive doesn’t offer a way to disable individual functions or hotkeys, so if you don’t think you’ll be using Instant Replay or a camera, for example, you’ll want to re-bind those functions to something ridiculous like Ctrl+Alt+Shift+P.

 

Wading into the stream

Actually using ReLive is as simple as opening the app you want to capture and pressing the appropriate key combination. A little notification will pop up in the top-right corner, and away you go. Alternatively, if you can’t remember your key combo, you can press Alt+Z to bring up the in-game toolbar. From there, you can start recording or streaming, or save a screenshot. There’s a settings option on the toolbar too, but aside from letting you choose which corner the recording indicator occupies (including none), it simply duplicates a few of the options from the ReLive window in Radeon Settings. The Alt+Z hotkey is configurable along with the rest of the hotkeys, thankfully.

As someone accustomed to streaming with Open Broadcaster Software, I found ReLive incredibly easy to use. There’s not much in the way of configuration, and while that means it’s a good bit less powerful than OBS, it also “just works.” So far, I’ve been able to load up any game (Dark Souls III, Doom, Warframe, Phantasy Star Online 2, Overwatch, and Tomb Raider among them) and start recording or streaming with a few keypresses. That kind of convenience is worth a few sacrifices. ReLive supports a simple camera overlay and a custom image overlay, too, letting streamers personalize their broadcasts.

Surprisingly, one place ReLive doesn’t require trade-offs is video quality. Historically, streamers have avoided hardware video encoders because the quality compromises they make often result in a nigh-unwatchable stream. I tested ReLive with my standard streaming settings against the gold standard: Open Broadcaster Software and its x264 software encoder. Unfortunately, I don’t have a machine handy to test ReLive’s direct competitor, Nvidia’s ShadowPlay, but I did test my Core i7-4790K’s Intel QuickSync Video encoder. I recorded several runs in Dark Souls III between two bonfires while streaming at a 4.5-Mbps bitrate. Flip through the images below to see a capture from each encoder.


The differences between these encoders are pretty minute, but we’ve pored over them and all agree that while the x264 software encode looks the best, the ReLive capture isn’t far behind. Intel’s QuickSync is the clear loser here. It arguably has the sharpest images in scenes without a lot of motion, but it really falls flat when there’s a lot going on.

Here’s a short video sample from ReLive:

And here’s a sample from QuickSync:

Finally, here’s a look at a sample from the x264 encoder:

The relatively high quality of the video capture is all the more impressive in light of the fact that ReLive has almost no impact on the game’s performance. Even on my beefy Core i7-4790K with 32GB of RAM, and even in a not-particularly-CPU-heavy game like Dark Souls III, the weight of encoding an HD video stream in real time with OBS has a significant and immediately noticeable impact on the smoothness of gameplay. Switching to the QuickSync hardware encoder helps somewhat, but by comparison, ReLive is barely noticeable. Of course, we wouldn’t be The Tech Report if we didn’t back up these impressions with some hard data. Check this out:



Dark Souls III is a little weird in that it has a 60-FPS cap, but our results still show us what we need to know about the effects that video capture has on performance. Encoding using QuickSync causes the game to spend 17% more time under 60 FPS. By contrast, ReLive is barely any slower or rougher than playing without streaming, at least on our test system. (AMD reported larger performance drops with its own test system and 8GB of RAM).
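For readers curious how a figure like “time spent under 60 FPS” gets computed, here’s a minimal sketch of the usual approach: tally up, frame by frame, however much of each frame time spills past the 16.7-ms budget. The frame-time lists below are made up purely for illustration.

```python
def time_beyond_threshold_ms(frame_times_ms, threshold_ms=1000 / 60):
    """Sum the portion of each frame that blows past the frame-time budget.

    A 20-ms frame against a 16.7-ms budget contributes about 3.3 ms;
    frames that come in under budget contribute nothing.
    """
    return sum(max(0.0, ft - threshold_ms) for ft in frame_times_ms)

# Made-up captures of the same one-minute run at a 60-FPS cap
budget = 1000 / 60
capped_run = [budget] * 3600                    # pegged right at the cap
encoding_run = [budget] * 3400 + [22.0] * 200   # a few hundred slow frames
print(time_beyond_threshold_ms(capped_run))     # 0.0
print(time_beyond_threshold_ms(encoding_run))   # ~1067 ms
```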

The red team says ReLive is so efficient because it hands the game’s framebuffer straight to the encoder in hardware. OBS, according to AMD, gets at those frames by hooking into the game software and setting up its own 3D context. That means OBS and the game end up competing for the driver’s attention, which causes the larger performance hit.

Ultimately, I’ve been very impressed with ReLive in the short time that I’ve used it. In fact, it’s very likely I’ll be sticking to ReLive as my encoder of choice when livestreaming in the future. x264 is undeniably higher-quality, but the drop in smoothness can be a real problem. Remember, for these test cases I was merely dashing past enemies in mostly-idle environments. In a real gameplay situation with multiple actors in combat, the impact from video encoding can be all the greater. Also, as I mentioned earlier, Dark Souls III is not especially taxing for the CPU. Other games that hit the CPU harder (like Grand Theft Auto V) will benefit that much more from ReLive.

 

Chilling out

Even the most hard-core Radeon fanboys will admit that the red team’s older high-end GPUs consume a lot of power. AMD knows this too, of course. With the Crimson update last year, the company supplied a band-aid for the issue with its Frame Rate Target Control feature. This tool lets users set a target framerate on a global or per-game basis, and the driver will intelligently throttle the GPU to try to keep the game at that framerate.

Radeon Chill, then, is an expansion of that concept. With Chill enabled, the driver monitors user inputs to determine whether quick motion is happening in-game. If not much is going on, the GPU will move to a lower power state, reducing the game’s framerate. Once the user is active again, the framerate will increase as the GPU ramps its clock rate back up. All of these transitions happen nearly instantaneously, and with no specific action required of the user. AMD says this dynamic frame-rate control “has the potential to reduce the GPU’s power consumption, heat production, temperatures, and cooling noise without perceptibly altering the gaming experience.”
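To make the idea concrete, here’s a toy sketch of an input-driven frame-rate governor like the one AMD describes. It’s purely illustrative: the names, the fixed ramp step, and the sleep-based cap are my own assumptions, not AMD’s implementation. The gist is simply to raise the frame-rate target while input is arriving and let it sink toward a floor when things go quiet.

```python
import time

FPS_MIN, FPS_MAX = 40, 144  # the sort of per-game range Radeon Settings exposes

def next_fps_target(current: float, input_active: bool, step: float = 8.0) -> float:
    """Nudge the frame-rate target up on input, and down toward the floor otherwise."""
    if input_active:
        return min(FPS_MAX, current + step)
    return max(FPS_MIN, current - step)

def render_loop(input_trace):
    """Toy render loop: sleep away whatever remains of each frame's time budget."""
    target = float(FPS_MIN)
    for input_active in input_trace:
        target = next_fps_target(target, input_active)
        budget = 1.0 / target
        start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of racing ahead of the display

# Hypothetical input trace: a burst of mouse/keyboard activity, then standing still
render_loop([True] * 120 + [False] * 120)
```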

Chill requires AMD to implement support for the feature on a per-game basis, and the company has already done so for 18 games. Currently, all of the supported games use the DirectX 9 and 11 APIs. AMD says it intends to support more games and APIs soon, though. Users can enable or disable Chill globally, and define a hotkey (default F11) to enable or disable the feature manually.

With Chill enabled, minimum and maximum framerates can be set on a per-game basis in Radeon Settings. The default range for all of the supported games that I own seems to be 40-144 FPS, although the upper bound can go as high as 300 FPS.

In our initial look at Chill using AMD’s pre-release software, we couldn’t get the feature to work quite right. We tested Chill with Counter-Strike: Global Offensive and found that the feature wanted to hold frame rates steady at around 62-64 FPS, on average. You can see that behavior in the graph above—there’s little variation in frame times. That’s not how Chill was supposed to work.

After my initial look at Chill, TR Editor-in-Chief Jeff Kampman gave the utility a shot with AMD’s release version of ReLive and found it worked as expected, so he graphed the frame times from a 40-second run through CS:GO’s weapons course with Chill on and Chill off. You can see how Chill rapidly varies frame times in response to intermittent input, as when Jeff stood still and shot wooden targets with bursts of mouse clicks. You can also see how it holds frame times at about 16.7 ms (or 60 FPS) during periods of constant input, like running between segments in the weapons course. At the end of the run, where Jeff is looking at the course timer while standing still, you can see how frame times climb as Chill limits CS:GO to running at 40 FPS—our configured Chill minimum.

With Chill off, the game ran around 100 FPS on average. As you might expect, Jeff says the game felt more fluid overall with Chill off, but critically, it didn’t feel any more responsive. In fact, you can see some spots in the frame-time graph above where our Chilled RX 480 actually seems to put out frames faster in response to user input (especially during frames 700 to 1000 or so) when compared to the run with Chill off. That result does seem to mesh with AMD’s claim that Chill can improve responsiveness by keeping more of the GPU available for times when fast rendering in response to user input is needed. Fascinating.

Typically, we’d knock a card for delivering frame times as varied as these, but aside from the expected drop in animation fluidity that comes with a move to 60 FPS from 100 FPS on average, Jeff says the Chilled RX 480 felt perfectly smooth and snappy in use. If you can tolerate a slightly less fluid experience, it might be worth turning on Chill in a game that can typically churn out multiple hundreds of frames per second and seeing how it feels. Heck, your K:D ratio might even improve.

Although he unfortunately didn’t have time for formal measurements, Jeff also notes that his RX 480’s noise levels dropped from “noticeable” to “inaudible” while he had Chill on. For gamers who are in shared spaces like dorm rooms or offices, Chill could let them game without disturbing others. It could also limit heat output in spaces where air conditioning isn’t available. Jeff says his Kill-a-Watt showed about 160W of system power consumption with Chill on, compared to 250-260W with it off. That’s significantly less waste heat being dumped into the room, and beefier Radeons might see even larger drops.

Although Chill’s dynamic frame-rate control didn’t work for me in our first round of testing, I definitely saw similar benefits for heat and noise. My Sapphire Tri-X R9 290X normally sits at a toasty 82° C while gaming, and its triple fans do their best to imitate a leaf blower. After an hour of playing Warframe with Chill enabled, however, the GPU core never went above 72° C, which means the fans on my graphics card never went above 39% of their duty cycle (a speed of around 1800 RPM). At that speed, they’re barely audible. It certainly was an unusual change in character for the old Hawaii card, and the 15% reduction in GPU core temperatures I saw even beats AMD’s claim of 13% lower GPU temperatures on an RX 480.

Source: AMD

Our tests aside, AMD showcased Chill with some results from World of Warcraft in its reviewer’s guide, and that game seems quite amenable to Chill’s magic. The graph above shows a closer look at how Chill produces “slower” frames at idle and dynamically responds to user input by increasing performance when needed. AMD warns that “the benefits of Radeon Chill will vary depending on the game and on the performance of the system in question, and your experience with Radeon Chill may differ across game titles,” though, so your mileage may vary.

Overall, Chill is one of the most handy and fascinating utilities we’ve used with a graphics card in some time. If you have a Radeon, it’s well worth giving Chill a shot and seeing whether you can notice a difference in perceived performance. We hope AMD broadly expands Chill compatibility soon.

 

Gotta capture ’em all

Besides ReLive and Chill, AMD is showing off some other fancy software that’s not integral to its latest drivers. Here at TR, we’ve been using a handy app called PresentMon to capture frame-time data ever since the advent of DirectX 12. PresentMon works well when it works, but it’s hard to use and prone to instability at times. Along with the ReLive Edition drivers, AMD sent over a beta version of a new app it’s working on called OCAT, which stands for Open Capture and Analysis Tool.

OCAT is essentially an open-source, GUI-equipped version of PresentMon. After you launch OCAT, you can configure a capture hotkey and a capture time period. Click “Start”, and then tap your hotkey to begin capturing frametimes for every app that creates a 3D context. This can include some apps that most would never want to benchmark, like Microsoft Excel, so OCAT has a blacklist feature. Unfortunately, editing the blacklist means editing an INI file and restarting the application, but anyone messing with OCAT is probably more than familiar with editing configuration files by hand.

Alternatively, you can configure OCAT to launch and capture a single app. In theory, this could force OCAT to capture an application it doesn’t see otherwise, but we didn’t find this to be necessary in our testing. OCAT includes an FPS and frame-time overlay, too, so you can use that for a sanity test while benchmarking. I used OCAT throughout the production of this article and regularly compared its results to PresentMon. The output was identical in all cases, but that comes as no surprise considering OCAT is built upon that app.

Even though AMD supports OCAT development, it’s completely vendor-agnostic and open-source. The app produces CSV files in the same format as PresentMon, so anyone familiar with that type of output should be right at home. If you’re not a hardcore frame-time benchmarker, OCAT can at least show you average FPS, frame times, and even 99th-percentile frame times after a benchmark in its timed-run mode.
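Since OCAT writes CSV files in the same format as PresentMon, crunching its output into the usual metrics is straightforward. Here’s a minimal sketch; it assumes the frame-time column is named “MsBetweenPresents,” as in PresentMon’s output, and “ocat_capture.csv” is a hypothetical capture file.

```python
import csv
import statistics

def frame_stats(csv_path: str, column: str = "MsBetweenPresents"):
    """Boil a PresentMon/OCAT-style capture down to average FPS and a 99th-percentile frame time."""
    with open(csv_path, newline="") as f:
        times = sorted(float(row[column]) for row in csv.DictReader(f))
    return {
        "frames": len(times),
        "avg_fps": 1000.0 / statistics.mean(times),
        "99th_pct_frame_time_ms": times[int(0.99 * (len(times) - 1))],
    }

print(frame_stats("ocat_capture.csv"))
```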

In use, OCAT generally manages to hook the application you want it to and reliably logs data. Would that we could say the same of PresentMon. For folks looking for a DX12- and Vulkan-compatible Fraps successor, OCAT looks promising.

The best of the rest

Of course, ReLive wouldn’t be an AMD driver update without a bundle of fixes, features, and performance improvements. Here’s a grab bag of features and changes AMD is throwing into its latest release.

  • The new driver adds hardware-accelerated VP9 video decode to all GCN and Polaris-based Radeons. That functionality is a “hybrid” decode run on the GPU itself rather than with fixed-function hardware, but AMD says it should reduce power consumption nonetheless.

     

  • WattMan, the detailed tweaking and tuning software that replaced the old Overdrive system for Polaris, now supports a variety of older Radeons. Cards from the R9 Fury, R9 390, R9 380, R9 290, R9 285, R9 260, R7 360, and R7 260 series will all work with WattMan.

     

  • On the RX 480 front, AMD says it has increased performance across the board by up to 8% for that card compared to the original launch driver from June.

     

  • RX 400-series Radeons are getting DisplayPort HBR3 support, too. That means those cards can drive a 4K monitor at up to 120 Hz over a single DisplayPort cable, a 5K display at 60 Hz with one cable, or an 8K display at 30 Hz. (A rough bandwidth check after this list shows how those modes fit.)

     

  • ReLive is the first Radeon driver with full support for HDR content. Radeons based on Fiji, Tonga, Hawaii, and Polaris GPUs will be able to display HDR10 and Dolby Vision content with ReLive installed.

     

  • Folks who prefer HDMI outputs will find that bad cables no longer result in frustrating black screens or other intractable issues. AMD has implemented signal-detection and fallback algorithms that step through lower resolutions and refresh rates to find a configuration a failing HDMI cable can carry, and then alert the user to the poor signal. This feature works with GCN Radeons and Kabini-and-newer APUs.

     

  • Multi-monitor FreeSync users can rejoice, too. FreeSync is now supported on applications running in borderless window mode. AMD says this will reduce input lag on these applications by up to 24%. More importantly, it means these users will be able to alt-tab in and out of their FreeSync games without breaking anything.
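As promised above, the display modes in the HBR3 bullet check out on the back of an envelope. Ignoring blanking intervals, the raw pixel rate of each mode fits within HBR3’s roughly 25.9 Gbps of effective four-lane bandwidth (32.4 Gbps raw, minus 8b/10b encoding overhead). This quick sketch shows the arithmetic:

```python
HBR3_EFFECTIVE_GBPS = 8.1 * 4 * 0.8  # 8.1 Gbps/lane x 4 lanes; 8b/10b leaves 80%: ~25.9 Gbps

def pixel_rate_gbps(width: int, height: int, refresh_hz: int, bpp: int = 24) -> float:
    """Raw uncompressed pixel bandwidth, ignoring blanking intervals."""
    return width * height * refresh_hz * bpp / 1e9

for name, mode in {"4K120": (3840, 2160, 120),
                   "5K60": (5120, 2880, 60),
                   "8K30": (7680, 4320, 30)}.items():
    rate = pixel_rate_gbps(*mode)
    print(f"{name}: {rate:.1f} Gbps (fits in HBR3: {rate < HBR3_EFFECTIVE_GBPS})")
```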

Conclusions

If Catalyst Omega was about performance and Radeon Software Crimson Edition was about stability, then Crimson ReLive Edition is about features. AMD says this is its biggest software release ever, and we see no reason to disagree. The ReLive app produces high-quality game captures with less performance impact than competing capture programs, and it “just worked” in our tests. That easy capture and streaming experience is a boon to Radeon owners, even if it is playing a bit of catch-up with ShadowPlay.

Though the early version of the Radeon Chill app we tested didn’t behave entirely as we might have expected in our tests, it promises an intriguing way to manage excessive power consumption, high GPU core temperatures, and undue noise in games that would normally run at multiple hundreds of frames per second. It doesn’t hurt that Chill and ReLive come totally free, too.

Last year, we expressed confidence that AMD had what it took to deliver quality software when we took a look at the Radeon Software Crimson Edition release. The competitive performance of Radeons in our recent reviews, AMD’s consistent launch-day or near-launch-day driver releases in 2016, and our experience with Crimson ReLive Edition only reinforce that conclusion.

Contrast that praise with the frustration we expressed with Radeon drivers just a year and a half ago, and it’s clear that AMD’s software has come a long way in a relatively short time. There are still many factors to weigh when choosing a graphics card, but at least for now, AMD has shown that the quality of driver and software support doesn’t have to be among them.

Comments closed
    • Kougar
    • 3 years ago

    [quote<] Nvidia actually does this too, and I've complained about it before with them as well. I'd like to see an option to disable the context menu entry during setup. [/quote<]
    Dunno if you're aware but both the context menu and systray icon can be turned off from the NVIDIA control panel -> Desktop menu. As long as the user is not performing a clean installation then these settings will carry over to the new driver install.

      • DoomGuy64
      • 3 years ago

      Considering how bad Windows’ control panels are for advanced options like refresh rate, having a context menu is better than not having it. Or at least it was with CCC, because Crimson no longer has advanced monitor options.

      • RAGEPRO
      • 3 years ago

      Heh, you misunderstood me — I’m talking about the “AMD Radeon Settings” context menu option that appears everywhere you right-click on an Explorer instance (whether it’s the desktop or a File Explorer window.) I can open Radeon Settings from the start menu like everything else; I don’t need it right there in every context menu.

      The Explorer context menus in Windows 10 are already bloated as heck.

    • sluggo
    • 3 years ago

    One data point. I play a lot of Civ and XCOM, and the fan on my stock-cooled RX480 would be running at a very audible clip, even after tweaking with Wattman. After loading the ReLive driver (enabling Chill at install), and making no other changes, the fan noise is gone entirely.

    Thank you AMD.

    There’s no reason to run the hardware at full tilt in these games when most of the CPU cycles are in my noggin. Twitch gamers, different story obv.

    • DoomGuy64
    • 3 years ago

    I like how there’s no noticeable link to the update in this article about the update.
    [url<]https://support.amd.com/en-us/download[/url<]

    • willmore
    • 3 years ago

    Copied from my comments over at PCPER with edits:

    Why did you compare temperature differences with percent? How does that even matter at all? Just because you can put numbers in a spreadsheet and do math on them doesn’t mean that you’ll produce results that mean anything.

    As to why this is a problem, try this: Convert the values to a different temperature scale--say Fahrenheit--and rerun the calculations. You get a different result. Try Kelvin. Different results again.

    Where’s that Jackie Chan .gif?

      • RAGEPRO
      • 3 years ago

      Well, that’s where you’re right. But – and I am only saying this because I care – there are a lot of decaffeinated brands on the market today that are just as tasty as the real thing.

        • willmore
        • 3 years ago

        I want to see more of you in the lab, RAGEPRO.

        Getting the math right is a moral imperative.

          • RAGEPRO
          • 3 years ago

          Fine, I’ll gain some weight.

            • willmore
            • 3 years ago

            Up the voltage, Brody.

            • RAGEPRO
            • 3 years ago

            I’m pretty sad that most people who see this probably won’t get it. That’s one of the best movies of all time, right up there with The Princess Bride.

            • sweatshopking
            • 3 years ago

            no to both.

            • willmore
            • 3 years ago

            Agreed. I rewatch each of them at least once a year.

            • derFunkenstein
            • 3 years ago

            The Princess Bride? You betcha. Jaws 2? Is that the reference? No way.

            • RAGEPRO
            • 3 years ago

            No, heh. Real Genius from 1985.

            • derFunkenstein
            • 3 years ago

            Oh, that makes much more sense. I could only think of the shark clamping down on the power line in Jaws 2.

            So, uh, to finish this thought: I didn’t get the reference.

            • MOSFET
            • 3 years ago

            I came up with Jaws 2 as well. Oh well.

            • derFunkenstein
            • 3 years ago

            Bro, do you even calculate?

      • JustAnEngineer
      • 3 years ago

      [quote="willmore"<] Why did you compare temperature differences with percent? How does that even matter at all? Just because you can put numbers in a spreadsheet and do math on them doesn't mean that you'll produce results that mean anything. As to why this is a problem, try this: Convert the values to a different temperature scale--say Fahrenheit--and rerun the calculations. You get a different result. Try Kelvin. Different results again. [/quote<]
      A reasonable solution is to express the GPU temperature as rise over ambient or rise over case temperature. If my GPU is running at 82 °C when my case is at 42 °C and my room is at 20 °C, I should subtract either the second or third number from the first one.

        • willmore
        • 3 years ago

        That’s exactly how to do it. It’s how TR has done it in the past for cooler reviews, etc. I wish GPU reviews did it that way. Maybe they have started to, my last memory is that they were in absolute degrees C. Still, we can all see differences pretty readily.

      • Air
      • 3 years ago

      Percent differences in temperature are perfectly fine if you use an absolute scale, like Kelvin.

      Relative scales like Celsius will give arbitrary results, which does seem to be the case in this article. Unless Chill reduces temperatures by 40+ °C…

      EDIT: the reduction from 82 to 72 °C is a 2.8% reduction in absolute temperature. Doesn’t sound impressive.

    • DPete27
    • 3 years ago

    Just an FYI, [url=http://www.tomshardware.com/reviews/amd-radeon-chill-ocat-relive,4846.html<]Tom's Hardware did some benchmarking with Chill.[/url<]

      • AnotherReader
      • 3 years ago

      That looks promising. Let’s hope that they support more games soon.

    • JAMF
    • 3 years ago

    Multi-monitor: is anyone else unable to launch the recorder in-game? Alt+Z doesn’t pop up the tool in Eyefinity, and even after disabling Eyefinity, I have to launch desktop recording before alt-tabbing back into the game. The result is a recording of a slideshow.

    • Shobai
    • 3 years ago

    Well, I’m a bit disappointed. Under the old regime I was able to downclock my R9 290’s memory to 150MHz via the slider in Radeon Settings. Under ReLive, Global Wattman gives me access to a single memory state [“State 1”] which pegs the memory at 1260MHz constantly.

    I guess I had hoped that I would just be given extra toys to play with, not have some of the old toys taken away.

      • Shobai
      • 3 years ago

      Spoke too soon: I fired up my computer yesterday and GPU-Z wanted to update. After I installed that update, I then fired up a game. My 290 had actually been running at 150MHz since boot, and clocked up to its 1260MHz default during the game. Unfortunately, once I closed the game, the memory clock kept flitting quickly between 1260MHz and 150MHz, with all the display glitching that that entails.

      No setting under Radeon Settings had any effect. I rebooted the PC and sometime after logging back into Windows, the glitching stopped as something got the card to settle on 150MHz memory clock.

      I imagine that a GPU driver wipe and clean install would sort that out, but I really do appreciate the lower clock for heat and noise and am reticent to lose that…

        • Shobai
        • 3 years ago

        As helpful as the downvote is, could you take a moment to explain why you did?

          • Chrispy_
          • 3 years ago

          Not sure on the downvote, but are you sure that the memory flitting between states is not a side effect of GPU-Z on a driver too new for it to understand?

          How well does it all behave if you uninstall GPU-Z?

          I’d be willing to bet that the problem is GPU-Z, since I’ve had it interfere before where vendor tools don’t (EVGA and Sapphire utilities for those respective cards).

          Don’t get me wrong, GPU-Z is an amazing util, but it’s managed by one person – practically unpaid – and he doesn’t have the time or resources to bug-test his work with every new generation of driver on multiple models of GPU from both vendors.

            • Shobai
            • 3 years ago

            That may very well be the case, I’ll have a look next time I have the machine on. I was running the V1.12.0 before I was prompted to update to V1.14.0 – V1.13.0 apparently “added support for Wattman overclocking on older AMD Radeon cards”, among other things, while the V1.14.0 update “[f]ixed clock reading on AMD cards which do not support Wattman”. Perhaps there’s something in there that’s not behaving?

            • Chrispy_
            • 3 years ago

            Maybe, but I notice V1.13.0 and V1.14.0 were released on the same day (9th December) so chances are it’s still in hotfix state, not all kinks ironed out yet.

            Do you know if your card is a reference board+bios or is it a custom board design? It’s the customs that I’ve had the most issue with in the past, both GPU-Z and also overclocking utilities.

            • Shobai
            • 3 years ago

            Yeah, it’s Asus’ DCuII design, so very definitely a custom board.

        • Shobai
        • 3 years ago

        [attempt 2: am on mobile, and Firefox is a crash-happy mess]

        Had a chance to play this arvo. GPU-Z had an update, to V1.16.0, but that just left me unable to clock up to 1260MHz. I also lost any option to change clocks in Radeon Settings – I had no States under memory.

        Uninstalled GPU-Z, rebooted, no change. Attempted to uninstall AMD drivers: they all went bar the graphics driver…not helpful!

        Rebooted to safe mode, DDU, reboot, download fresh ReLive, install, reboot.

        Radeon Settings is restored to what I previously saw, memory clock is at 1260MHz, but can’t clock down. Ambient temps in my West facing study in sunny Queensland have been high 30s to low 40s Celsius for the last few weeks, so I would really appreciate all the help I can get from my GPU in not heating up the room.

        I had been impressed with Radeon Settings up to this point, as no other software would let me downclock the card to 300MHz core, 150MHz memory (neither Afterburner, nor Trixx, Asus’ rubbish, etc). I hope they can fix this!

      • cegras
      • 3 years ago

      Wattman has a live graph to track states over time. What does it say for the memory speed? For my RX480, it also only has 2000 MHz, or the idle value, no ramping in between.

        • Shobai
        • 3 years ago

        That describes the behaviour I see here. After the PC starts up and settles on 150MHz, sometime after boot, Wattman’s graph shows 150MHz. When I fire up a game, it clocks directly up to 1260MHz and stays there. After I close the game, it rapidly cycles between the two states, and the display tears each time – it’s very off putting!

        If it could clock back to 150MHz after I close the game and hold there, I’d have the behaviour I’m looking for. It’s the constant transitions that ruin the experience.

      • Shobai
      • 3 years ago

      Well, it looks like this is all because I’m running dual monitors. That looks to be all of the problem.

    • JosiahBradley
    • 3 years ago

    Did a clean install using the new installer. Turned on ReLive to find I can only record in 1080p60 due to aging hardware (290x). Played some Overwatch and noticed no real difference in performance and a 3 minute instant replay cost me a mere 625MB. No BSOD yet and Wattman works on the 290x in crossfire. Cannot confirm if freesync is working yet in Overwatch borderless fullscreen but they STILL haven’t fixed the crossfire flickering bug so I can just play in fullscreen.

    Edit: I can’t enable Chill as the option doesn’t appear anywhere. I also now have an option to overclock/overvolt my monitor O_o.

      • LostCat
      • 3 years ago

      Chill is hidden in Wattman. Very confusing.

        • Ninjitsu
        • 3 years ago

        what man, chill

        • JosiahBradley
        • 3 years ago

        I looked into Wattman for it but I think I discovered the problem and reported the issue to AMD. It isn’t detecting my GPUs properly and reporting a R9 200 instead of R9 290X. This also causes games to report that I am not meeting the minimum/recommended requirements.

          • LostCat
          • 3 years ago

          All 200 series cards report a 200 to my knowledge.

          The real bug (at least on my box) is that I have to maximize Radeon Settings while in Wattman for the boxes to show up at all.

    • ronch
    • 3 years ago

    I just got an email from AMD telling me about this. That’s a great way to keep in touch with your customers and let them know you’re still concerned with their products. Good job!

    • gerryg
    • 3 years ago

    Yeah, a better shortcut for saving an instant replay could be up, up, down, down, left, right, left, right, B, A…

    • Mr Bill
    • 3 years ago

    Might be able to wring a bit more zip out of an APU system.

    • odizzido
    • 3 years ago

    AMD really has been working hard on the software side, and I think it shows. Nvidia is also slipping recently. I think their next GPUs are being made by the old ATI GPU team? If they do a good job, and given AMD’s new focus I think they might, it could be a great chance for a comeback.

    On a semi-side note, the old 28nm fury has been really impressing me recently. I found it to be pretty underwhelming at launch.

      • freebird
      • 3 years ago

      That and I’ve seen the Fury Pro selling for $375 and the Fury as low as $325 after rebates…

        • DoomGuy64
        • 3 years ago

        In the US? The black friday deals were insane. Cheaper than a 1060.

    • TwistedKestrel
    • 3 years ago

    Minor quibble with the new installer:

    [url<]http://imgur.com/a/1NDez[/url<]
    That dialog is the entirety of what you see from the installer at that point. If I didn't tell you that was a prompt for action (it wants you to click on the right box), would you know without being able to interact with it?

      • RAGEPRO
      • 3 years ago

      Huh, interesting. I wiped my existing driver before installing the new one so I never saw that. I probably would have been confused too.

      • nanoflower
      • 3 years ago

      I would not know what it wanted me to do. I’m sure I would figure it out but without some signal that it needs input for me to continue how is anyone supposed to just know that they need to click on the driver box to install the new driver?

      • DPete27
      • 3 years ago

      Really reminds me of smartphone, OS, and browser UI design these days. Remove more and more info from the screen. Apparently nondescript little icons are easier to read and understand than text…..

        • odizzido
        • 3 years ago

        I could rant for ages about this. We’re literally regressing back to cave drawings and it sucks.

          • Ninjitsu
          • 3 years ago

          Literally the wrong use of “literally”. 😛

            • odizzido
            • 3 years ago

            no it’s not. We’re going back to using pictures instead of words. Our written language will soon be forgotten.

            edit——–
            oh, if you meant the cave part yeah you’re correct. The drawings are on phones now instead of cave walls.

            • Ninjitsu
            • 3 years ago

            yeah cave part 😛

            • RAGEPRO
            • 3 years ago

            idk tbh smh fam

            [sub<][i<]Sadly the unicode emojis I used at the end got stripped.[/i<][/sub<]

        • rechicero
        • 3 years ago

        Na, but you can call it “clean and simple” design and some ppl will think you are so cool…

      • Ninjitsu
      • 3 years ago

      It took me a few seconds but I could…agreed that it’s not obvious, though.

    • Firestarter
    • 3 years ago

    Now if AMD could make their H264 encoders work well in Steam In-Home Streaming as well, that’d be great. At least they’re not hilariously broken anymore as they had been for almost a year, but it’s still slower than streaming with Intel QSV

    • K-L-Waster
    • 3 years ago

    Hmm, vast improvement in the driver / software environment plus continued improvement in the heat & noise department… should make my next GPU purchase decision more competitive than the last time around.

      • drfish
      • 3 years ago

      Yeah, now they just need a competitive high-end chip. *fingers crossed*

        • nanoflower
        • 3 years ago

        That may be discussed today at the ‘secret conference’ AMD appears to be holding at the old ATI facility in Canada. With them inviting various Youtubers and then the announcement of new AMD GPUs showing up in drivers I suspect we are getting an early Christmas announcement of something new in the GPU area.

        Not sure if they are ready to announce Vega in more detail but something is clearly coming. Maybe as part of the official Zen presentation next week since the people at the secret meeting today will have time to work up their videos/writeups by next week.

    • TwistedKestrel
    • 3 years ago

    [quote<]The new driver adds hardware-accelerated VP9 video decode to all GCN and Polaris-based Radeons. That functionality is a "hybrid" decode run on the GPU itself rather than with fixed-function hardware, but AMD say it should reduce power consumption nonetheless.[/quote<]
    I'm not sure if this was added to *all* GCN GPUs... I'm using a 7950 and I don't see any VP9 decoder available

      • Concupiscence
      • 3 years ago

      Nah, GCN 1.0 predates VP9. It may be possible to implement partial acceleration by leaning on shaders, but I would assume that’s going to increase power consumption enough to negatively impact laptop users.

        • RAGEPRO
        • 3 years ago

        GCN 1.1 and 1.2 (GCN2 and GCN3) don’t have hardware VP9 either. AMD’s slides say that VP9 decode was added to “Radeon GCN and CG Polaris enabled products.” Not sure what the deal is. We’ll get with AMD and see.

          • joselillo_25
          • 3 years ago

          On my RX 460 it consumes the same CPU in YouTube VP9 as the older driver did, so it's a strange deal.

          • TwistedKestrel
          • 3 years ago

          FWIW I have since seen some scuttlebutt that the hybrid decoder could be something like a [s<]driver-injected[/s<] OpenCL [s<]shader[/s<] thingy, and not exposed via DXVA at all. [i<]Edited b/c what I said doesn't exactly make sense[/i<]

            • RAGEPRO
            • 3 years ago

            I actually have VP9 DXVA support on my 290X, so it’s definitely exposed via DXVA.

            • TwistedKestrel
            • 3 years ago

            Oh. Well, that’s interesting!

    • Sargent Duck
    • 3 years ago

    They should have called it the DAMAGE edition.

    • sweatshopking
    • 3 years ago

    On windows my anecdotal evidence has been that over the past year AMD has had VASTLY better drivers than Nvidia. I own a 1060 6GB and a r9 290. The amd drivers have been far more stable and NVidia has been a disaster. Bad enough I wouldn’t recommend buying a 1k series GPU right now at all.

      • Chrispy_
      • 3 years ago

      As someone supporting 1000+ users with mixed GPU vendor I have to concur with this.

      AMD’s drivers have been decent for a long time now. No they’re not perfect, but they’re mostly very stable.

      Nvidia’s drivers still have all manner of silly legacy bugs that have been around for embarrassingly long now, but in the last year or so they have been joined by some really stupid driver releases that have caused far more stability problems than even the red team back in the bad old ATi driver days.

        • anotherengineer
        • 3 years ago

        Ya I have noticed the same thing and people just think I was lying or a fan boy.

        Our office had old desktops with dual core wolfdales with old radeon HD3450 cards. And IB Dell latitude laptops with i7, quadruple the ram, and Nvidia NVS 5200M and I found the radeon HD 3450 played nicer with autocad 2011, 2013 and 2015.

    • ptsant
    • 3 years ago

    Very nice review and also very timely.

    Thanks a lot. I’ll try the features out…

    • joselillo_25
    • 3 years ago

    The announced VP9 (YouTube codec) hardware acceleration in Chrome is not working on my RX 460; my CPU still spikes too much when playing videos.

    • DPete27
    • 3 years ago

    Interesting about Chill. However, I’d rather they worked to improve Framerate Target Control compatibility first. On the surface, Chill seems like a more detailed extension of FTC (and the “Power Efficiency” switch…they seriously shouldn’t need three power-saving features), but the sporadic nature of in-game movement makes me wary: when a player goes from standing still to whipping around in a fraction of a second, can Chill catch that drastic change early enough in the pipeline and ramp up clocks fast enough to avoid stuttering?

    [url=https://techreport.com/forums/viewtopic.php?f=3&t=118710<]I did some testing of FTC not long ago[/url<] in Rise of Tomb Raider and found it didn't work. Reading this article made me realize I should check to see if I was running in DX11 or DX12 mode. Seems Chill doesn't work in DX12, so maybe FTC doesn't either.

      • drfish
      • 3 years ago

      Chill reminds me of [url=http://www.benchtest.com/rain.html<]Rain[/url<]. Remember when CPUs needed help with that?

        • Firestarter
        • 3 years ago

        wow that site goes WAY back

      • RAGEPRO
      • 3 years ago

      Maybe so.

      I will say that there was absolutely no stuttering caused by Chill, though. It certainly does lower the framerate, but in an extremely, extremely smooth and consistent way. Although if it worked as it is purportedly intended, I would be more impressed.

      • cegras
      • 3 years ago

      Considering the total frame input lag from mouse to screen can be 1-10 ms, I doubt this extra processing can introduce that much lag to noticeably affect anything. Computers work on timescales outside of human perception.

      • Chrispy_
      • 3 years ago

      I’m a little hesitant to trust FTC and by extension, Chill.

      In theory it’s a great tool but I had a few performance hitches on the RX 480 cards when using both power efficiency toggles and compatibility mode. The way they manage power consumption is good but it needs tweaking with more hysteresis to stop huge, knee-jerk reactions that cause massive dips in clockspeed, which tends to be very obvious unless you’re running freesync.

      On my own RX480 at home, I’ve just aggressively undervolted it, which means it never gets close to the 150W limit anyway and makes the whole compatibility toggle redundant.

        • DPete27
        • 3 years ago

        Yeah, I’ve got my MSI RX480 @ 1.075V/1306MHz

        Still, not all games are demanding enough to need 100% GPU clocks depending on what resolution you’re using. And in conjunction with undervolting, enabling “Power Efficiency” in RoTR saved me about 15W without changing framerates. If I jump into something like LoL or Warframe, there’s no way I’d need all the available horsepower of the RX480 at 2560×1080.

        Like I said, I don’t see why AMD needs to have Chill, FTC, and Power Efficiency when they all seem like they should be doing the same thing. But if they can get one working properly in a wide variety of games, I’d gladly take advantage of the extra power savings and game with no GPU fans running. I don’t have issues with temps and noise, but MSI’s RX480 Gaming X cooler is equal only to the XFX GTR this generation. Less capable coolers may benefit in a more meaningful way from a few extra watts of power savings.

      • DPete27
      • 3 years ago

      Hmm, just checked. I’m running RoTR in DX11. FTC didn’t work. Not sure why.

    • Concupiscence
    • 3 years ago

    Now I feel like a jerk for buying a Geforce GTX 1050 Ti a couple of weeks ago. Oops.

    Admittedly it went into my Linux box because the open Radeon driver situation on Linux is still very much in flux, and I got tired of waiting for things to get better for my venerable Radeon 7970. At least the latter was passed on to a Frankenputer I slapped together, so I can put this new software to the test.

      • derFunkenstein
      • 3 years ago

      AMD doesn’t have anything that competes at that $140 mark anyway. You’d either have to spend ~$35 more for an RX 470 or $35 less for an RX 460.

        • Concupiscence
        • 3 years ago

        That’s very true, and exactly the limitation I ran into. The box in question runs Linux full-time, and most of what it’s asked to do amounts to rendering seismic data with high levels of antialiasing for clean output and easy interpretation. And the 1050 Ti’s so cool, quiet, and *tiny*… For my use case it’s really a win-win.

          • derFunkenstein
          • 3 years ago

          Yeah, for sure, especially with Linux where, again, the perception is that AMD’s drivers aren’t up to snuff, like you said.

    • RAGEPRO
    • 3 years ago

    If anyone wants the full ~50MB encoded samples I can provide them, by the way.

    • derFunkenstein
    • 3 years ago

    Oddly enough, the promise of “Radeon and Chill” is how I get the ladies to come over to my place.

    Also very…coincidental…that AMD-supplied graphs look just like TR graphs. Wonder why that is.

      • RAGEPRO
      • 3 years ago

      Because they’re both made with Excel?

      😉

        • derFunkenstein
        • 3 years ago

        /sad trombone

          • nanoflower
          • 3 years ago

          How soon they forget.

          😉

            • RAGEPRO
            • 3 years ago

            Wait, who forgot what? Did you think I didn’t get his joke?

            • nanoflower
            • 3 years ago

            This guy seems to know something about Radeon Chill
            [url<]https://www.youtube.com/watch?v=_RKJB47PoRg[/url<]
            Guess you didn't notice my smiley at the end. Ah, well. The Internet sucks for text jokes.

            • chuckula
            • 3 years ago

            Do I know that guy from somewhere?

            • derFunkenstein
            • 3 years ago

            No.

            • Mr Bill
            • 3 years ago

            For just a moment, I thought it might be Radeon and Chili. I want that Capsaicin T-shirt.

            • derFunkenstein
            • 3 years ago

            Watching the video I was thinking the same thing. It’s cool, and it’d be a real test of PC nerd-dom around you to wear it out in public.

            • Mr Bill
            • 3 years ago

            Agreed, I already have this [url=http://www.thinkgeek.com/product/3813/<]thinkgeek capsaicin shirt[/url<]. But the one Scott is wearing would be way cooler and nerdier.

            • Ninjitsu
            • 3 years ago

            Haha he looks vaguely uncomfortable about it.

            • ClickClick5
            • 3 years ago

            <sarcasm> This guy looks familiar. I think I met him at a BBQ somewhere a while back. How long has he been with AMD? </sarcasm>

      • Prestige Worldwide
      • 3 years ago

      [quote<]Oddly enough, the promise of "Radeon and Chill" is how I get the ladies to come over to my place.[/quote<]
      False. Ladies like to feel the GeForce.

        • derFunkenstein
        • 3 years ago

        Dang it, that was really good.

        • Kaleid
        • 3 years ago

        But does she really want to…ReLive it?
