Personal computing discussed

Moderators: renee, morphine, SecretSquirrel

 
DoomGuy64
Gerbil
Topic Author
Posts: 47
Joined: Mon Jun 08, 2015 4:09 pm

Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 9:04 am

State of AMD Drivers, a 2020 Review:

The overlay sound effect can't be disabled. Wattman is STILL broken. It is literally a requirement to use OverdriveNTool to get custom tuning to work. IMO, this is a "bug" that has existed for so long, while third-party tools work, that I now think AMD has broken Wattman ON PURPOSE. How else do you explain third-party editing working while the driver controls do not? Also, every card since Vega has issues hitting boost clocks without manual overrides. I have Steam friends who report the 5700 XT is completely broken, and the forums are full of error reports with NO KNOWN WORKING DRIVER. Vega HBCC will crash your system; it doesn't work. FRTC was removed, apparently because of "issues" with latency, but EVERYONE would rather use FRTC than Chill, especially since Chill is broken garbage, particularly on 5700 cards.

Drivers have had TDR BSODs since 9.x. They're reportedly fixed now, but forums still recommend setting the "TdrDelay" registry value to 8 to stop the BSODs. Not a good look for AMD. To fix black screens, press Win+Ctrl+Shift+B, which restarts the graphics driver. It's a great Windows feature, but needing it proves AMD's drivers are garbage. The 2020 driver is generally good, but will occasionally TURN THE SCREEN DARK, like low brightness, usually after exiting a low-power state. The only fix is to toggle GPU scaling on and off.

The 5700 has issues with RAM speeds not entering low-power modes (among other problems), which has been a problem ever since AMD broke the 390's power-saving features, and no tech site reported on it. The only cards where it still works are Fiji/Vega with HBM (barely). 390 power saving has been broken since Doom 2016, multi-monitor completely disables RAM power saving, and the driver team LIES ABOUT THIS, saying it's a requirement, even though IT WORKED BEFORE. You can trace ALL OF AMD'S DRIVER SHENANIGANS BACK TO THE 390. First, the 390 had driver optimizations at release that were not ported to the 290, and the same went for Windows 10 optimizations/draw calls not being ported to Windows 7. Modders ported the drivers to the 290, and AMD later backported these fixes.
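Since the TdrDelay workaround keeps getting passed around without the actual steps, here is what it amounts to. The key and value are Microsoft's standard TDR (Timeout Detection and Recovery) settings, not anything AMD documents, so treat this as a forum-grade workaround, and back up your registry first:

```shell
:: Raise Windows' GPU Timeout Detection and Recovery delay from the
:: 2-second default to 8 seconds, so a slow driver reset stops
:: escalating into a TDR BSOD. Run from an elevated Command Prompt
:: and reboot afterward for it to take effect.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /t REG_DWORD /d 8 /f
```

Delete the TdrDelay value again to restore the default behavior.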
This is where the Radeon Image Sharpening tactic ORIGINATED: new cards get a SOFTWARE FEATURE, while older cards have to wait for AMD to backport a ReShade clone or the like. This has been a KNOWN SOURCE OF BUGS since AMD bought ATI, when there were three driver teams that screwed up the RAGE launch so badly that AMD had to consolidate from three teams down to one. ARTIFICIALLY SEGMENTING SOFTWARE FEATURES IS BAD. DON'T DO IT. STOP. Speaking of which, all of AMD's APUs have horrible driver support and NO RELIVE options, which makes NO SENSE when the capability is in the hardware. The same goes for FreeSync, which is based on eDP, yet eDP FreeSync is not available. On a related note, HBCC is a pretty questionable Vega-only feature, as AMD was previously "working" on Fiji's 4 GB memory constraint. It's highly likely a *software* feature artificially limited to Vega, although it doesn't work stably anyway, at least not on my Ryzen 3700X/32GB/X370. Ryzen in general is pretty garbage with memory and requires manual timings, plus third-party tools to find said timings. Bad drivers and iffy OS support outside of Linux don't help Ryzen either.

Performance Review: All known performance issues can be linked directly to AMD's overly aggressive power saving, which CANNOT BE DISABLED. It doesn't help that most cards ship overvolted with limited power limits and fan speeds, and require undervolting plus raised fan/power limits to hit boost clocks. The CONSTANT UP-AND-DOWN BOOST CLOCKING RUINS PERFORMANCE AND CAUSES STUTTERING. You can never hold a set clockspeed; games bounce clocks up and down depending on usage, including just entering a game menu. I have seen these problems in games with built-in dynamic resolution, which leads me to believe "Radeon Boost" will be a COMPLETE FAILURE, with boost clocks jumping wildly at each resolution change. Chill works the same way. The aggressive power saving BREAKS EVERYTHING, because performance is NOT STABLE with unstable clocks.

Vega 56 fix: P6 1560 MHz @ 1048 mV / P7 1666 MHz @ 1072 mV | Mem 880 MHz @ 910 mV | Fan 65% | Power target +24% | Actual GPU clock: ~1600 MHz.
You can get more power saving at lower mV, but at the expense of clocks, and vice versa. This is my ideal compromise for performance/power/stability at a 1600 MHz target: you have to pick a MHz target, then tweak for power and stability. The PowerColor Red Dragon is one of the best Vega 56s, but uses Hynix HBM: you cannot safely BIOS-flash it or raise the HBM voltage, but it is more power efficient. This is how Vega should have been built, and proves AMD just screwed it up. AMD also screwed up all the newer cards by using thermal pads instead of paste. Replacing the pads is not a guaranteed easy fix, since the pads are themselves a "fix" for coolers not sitting flush with both the HBM and the GPU die; the worst offender is the Radeon VII.

Bugs: Multi-monitor? Don't even try it. It breaks everything, including FreeSync, and I don't think AMD cares or tests for it. Borderless window modes still have issues with the overlay; you can only "sorta" get it to work. Custom monitor overrides still don't support FreeSync ranges, requiring CRU. Nvidia isn't reported to support custom LFC ranges, so I think AMD still has the edge here; if they do, it's not being properly advertised. FreeSync is also clearly superior to G-Sync at this time, with FreeSync monitors supporting FreeSync and ULMB simultaneously, as well as having large-panel OLEDs cheaper than the BFGD. The G-Sync shills were wrong. Nvidia is rebranding these monitors as G-Sync, because their disinfo campaign was so effective that rebranding became necessary. DIAF, G-Sync shills; even Nvidia admits the truth now, albeit only through rebranding and lying by omission. Typical Nvidia. I also expect less G-Sync shilling now that site traffic has dropped and G-Sync is rebranded. I fully believe Nvidia was paying people to shill G-Sync, along with the brainwashed fanboys, but site traffic no longer justifies the cost, and reality has gotten through to most people today. Nvidia should fight on their merits, not lies, and there are decent merits today: AMD's poor power tuning and driver optimization especially.

Overall: The new drivers are nice, but AMD clearly doesn't care about fixing any of its REAL PROBLEMS, which are mostly power-saving related, along with the lesser optimization and artificial product-segmentation issues. The hardware and software are completely broken as shipped and require manual adjustment for acceptable results. Fixing power saving would resolve 90% of everyone's problems, but AMD is instead putting lipstick on a pig with new driver UIs. I will say the new UI is better, but none of the core problems are being addressed.

Side note: AMD and Nvidia CAN DO RAY TRACING ON NON-RTX HARDWARE, perhaps better than RTX. It's called PTGI (Path Traced Global Illumination), and it's proven to work on existing hardware. AMD has completely FAILED to push developers toward this technology, even though it works on their existing cards. Nvidia, on the other hand, may be coercing developers not to use it in favor of RTX.
https://www.patreon.com/sonicether
https://youtu.be/nt2iURehGkE
Also, Minecraft's OFFICIAL Super Duper Graphics Pack was cancelled, and Nvidia's RTX version came out right after. Make up your own mind about what happened, but Nvidia is well known for these tactics. Pulling an Epic Games-style exclusivity deal for RTX titles is extremely shady, and there are several RTX-exclusive games now, much like PhysX had exclusives. Not cool. At least Minecraft has modding; not everything else does.

Scott Wasson on Adrenalin 2020
https://youtu.be/KqA6ol7g10Q
The one-button configuration options seem to be mostly useless bloat, prone to future bugs, beyond giving noobs an easy single button. Better to leave this as an install option rather than putting it in the control panel, and just provide better documentation in the control panel.

GPU Metrics: No tooltip popup showing your GPU clockspeed history, and no right-click options. It only shows a graph of the current clockspeed (and only while open, with no previous history). This may be an indicator that AMD is attempting to hide your history of constant clockspeed fluctuations, which can be directly correlated with FPS drops and stuttering.

Tuning: No way to set a minimum clock; lol, that stops those pesky full-speeders. No editing of secondary memory states or voltages. Fan tuning disables Zero RPM for no reason. Best to use OverdriveNTool, which proves tuning works, just not through the control panel, and that it is deliberate. Zero RPM being disabled is clearly deliberate. Broken tuning is intended. Not saving your settings is intended. Broken tuning support *as intended* needs to be strongly frowned upon by everyone, especially when tuning is literally a requirement to get AMD GPUs to work as advertised.

Performance logging: Why can't we log to system memory, with an option to save? Maybe I'd like to view logs without saving them. Why is logging not an option IN METRICS!? Why is there no logging tab? Sounds like deliberate obfuscation. MSI Afterburner FTW. Also, logging straight to a file is completely useless to anyone who wants to look at graphs instead of raw files.

Game Advisor: Why are there no clockspeed metrics? It seems like AMD is DOING EVERYTHING POSSIBLE TO HIDE GPU METRICS, probably because their aggressive power tuning is hurting performance, and they don't want users tying frame times to power tuning, which is the reality of AMD's performance issues. Lol at the settings advisor: it tells you everything except how to fix AMD's clockspeed fluctuation issues. Have weird performance glitching? Check your clockspeeds; that's 90% of all problems. Also, the graphs are HIDDEN BY DEFAULT and require clicking an option.

Metric Overlay: the custom text color for Red is #FF0000.
It's ridiculous that this is NOT A DROPDOWN OPTION; you only get White, Blue, Green, Magenta, and Custom. Not only should Red be an option, it should be the default option.

Video quality: It's been garbage since CCC went away. There is no upscaling option, and all the available options make video MORE PIXELATED. That makes third-party upscalers like madVR a requirement for watching any video, since the default settings are unusable, and Fluid Motion doesn't smooth 30 FPS to 60 either. The video options are mostly useless.

Hotkeys: REALLY STUPID DEFAULTS. Ctrl-Shift plus ANYTHING NEAR WASD IS A SUPER BAD IDEA. Try playing Shadow of War on keyboard/mouse with the default hotkeys: you will constantly trigger every single hotkey that uses a WASD key. Push-to-talk defaults to middle mouse and can't be rebound to Ctrl-Shift-Middle or Right+Middle. Don't stream any game that uses middle mouse while the hotkey is enabled. Stupid.

Mobile support: I've had issues with early versions losing the connection. Maybe it's better now (it seems so), but if AMD can't get Wattman working, I doubt this will get proper bug fixes either. Streaming to mobile? Get it working over the internet and provide advanced configuration. Also, why is this now a hidden option outside of Home/Cogwheel? DON'T HIDE FEATURES.

VR: I got rid of my Samsung VR headset, but AMD needs to copy Nvidia's VR "optimization" and make it the default (it's partially done in software, and AMD supports most of the hardware features). Otherwise speed sucks, and so does VR in general. Also, support 3D monitors. No excuse: you can turn any 144 Hz panel into 3D with a conversion kit.
VR Streaming: Is this integrated VR streaming to mobile? Awesome. There's no documentation or easy access to the function though, not to mention controllers are limited on non-dedicated VR, unless there's one I don't know about. IMO, VR streaming to mobile is the only way forward, as the VR companies have ruined the market and it won't go anywhere under their control. AMD's bitrate ranges are pretty terrible, though. Apparently this is officially for mobile-based VR headsets rather than phone VR specifically, but it clearly proves VR streaming works, and mobile VR is basically phone VR. Price-gouged VR sets are now dead.

General Video recording: AMD has three quality profiles for hardware encoding (low/medium/high), and I don't mean bitrate. The default is usually medium, which is not that great; high quality rivals CPU encoding. Hiding this option annoys me. I shouldn't have to edit the registry for it, if there even is a googleable setting, or rely on third-party tools. This is AMD's hardware equivalent of x264's veryfast/fast/etc. presets. The option should either be exposed, or high should be the default for hardware encoding.

Control Panel Layout: Better and worse. The design is better, while the layout now hides all advanced options BEHIND THE COGWHEEL. STOP. The cogwheel view should be the DEFAULT, or there should be an option to disable NOOB MODE. Noob mode makes everything harder: every advanced feature is now hidden behind the cogwheel setting, which doesn't improve ease of use, IT MAKES IT WORSE. The only saving grace of forced noob mode is that there is usually a shortcut to the cogwheel *somewhere*. Also, most advanced settings were dumped into Cogwheel > General, an absolute mess of everything thrown into one tab with NO organization. If you don't need quick access, fine, but this is ridiculous and in no way "better". One step forward, two steps back.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 10:33 am

The whole list is mostly pedantic nitpicking over trivial non-issues.

Boost clocking has always been opportunistic since GCN/Kepler. Clockspeeds are dynamic and depend on the workload/thermals/power limit. The days of GPUs running at maximum clockspeed 24/7 with a single idle state are long gone. Get over it. Wattman has been working fine for a year or so. It is supposed to revert to factory settings if it detects a possible hardware fault. That's why saving/reloading profiles is a thing. :roll: Besides, boost clocking has never been guaranteed. It is entirely YMMV.

Power saving works fine here with my multi-monitor setup, as does Freesync. Chill works great too, but it does require a bit of tweaking to get the optimal framerate/power consumption curve. I don't recommend using it when you are overclocking, but why are you using it in the first place? Overclocking and Chill are trying to achieve opposite goals.

The overlay beeping is entirely a non-issue and a good indicator that Anti-Lag, Chill, etc. are being triggered in your game. It is arguably better than some graphical pop-up or extra lines of text on a monitoring tool. Tuning works fine; the UI is just a bit different. Re-learning item locations is a little annoying, but nothing worth crying about. The one-button settings are meant for non-tweakers/enthusiasts and work fine for their needs. I don't understand why you have to gripe about it. The majority of PC gamers don't care, or simply don't have the time/patience to fiddle around with their hardware to fine-tune it.

GPU metrics in the game advisor don't show clockspeed graphs for one simple reason: outside of pure academic interest, they are completely irrelevant. Frame time is what really matters for gamers, and it is what the metrics are primarily focused on (thanks, Damage). If you are so keen on graphing clockspeed, there are several third-party tools that easily do this. I suppose that a chart tracking GPU utilization/temperature/power consumption/fan speed may matter for some.

Ray-tracing is professional-only, despite what Nvidia marketing tries to spiel. The computing costs are still far too high for consumer-tier hardware unless gamers are willing to regress back to late-1990s-era 3D models, resolutions and assets. RTX is nothing more than a gimmick to justify the tensor hardware on Turing (really just a repurposed Volta for graphical workloads). It is part of a long-term bid to fight off integrated GPUs eating away at lower-end discrete GPU SKUs (the bread and butter of Nvidia's discrete GPUs via sheer volume).
Last edited by Krogoth on Tue Dec 17, 2019 10:42 am, edited 1 time in total.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Igor_Kavinski
Minister of Gerbil Affairs
Posts: 2077
Joined: Fri Dec 22, 2006 2:34 am

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 10:38 am

DoomGuy64 wrote:
Side note: AMD and Nvidia CAN DO RAY TRACING ON NON-RTX HARDWARE, perhaps better than RTX. It's called PTGI (Path Traced Global Illumination), and it's proven to work on existing hardware. AMD has completely FAILED to push developers toward this technology, even though it works on their existing cards. Nvidia, on the other hand, may be coercing developers not to use it in favor of RTX.
https://www.patreon.com/sonicether
https://youtu.be/nt2iURehGkE


That's so cool. Thanks for sharing. I hate Minecraft's blocky graphics but this is one awesome reason for me to play it.

Damage wrote:
...

Hey man! Any particular reason AMD is not interested in PTGI? Is this an architectural issue?
 
DoomGuy64
Gerbil
Topic Author
Posts: 47
Joined: Mon Jun 08, 2015 4:09 pm

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 4:33 pm

Krogoth wrote:
The whole list is mostly pedantic nitpicking over trivial non-issues.

Boost clocking has always been opportunistic since GCN/Kepler. Clockspeeds are dynamic and depend on the workload/thermals/power limit. The days of GPUs running at maximum clockspeed 24/7 with a single idle state are long gone. Get over it. Wattman has been working fine for a year or so. It is supposed to revert to factory settings if it detects a possible hardware fault. That's why saving/reloading profiles is a thing. :roll: Besides, boost clocking has never been guaranteed. It is entirely YMMV.

No. Wattman has not been working "fine", ever. There are clearly issues in the control panel making it not work acceptably and requiring third-party tools to adjust. Also, the revert-settings function is either bugged or overly sensitive. Disabling Zero RPM with a custom fan speed is clearly broken. That may be deliberate, but it feels broken in practice. Wattman has also occasionally had issues not going idle after adjustment, among other things. It's not reliable, stable, or fully featured. Multi-monitor also breaks power saving on certain cards. Both the Radeon VII and the 5700 had a lot of problems on release, it hurt reviews, and I still don't know if they're fixed. None of AMD's new cards are worth buying over Nvidia when nothing works right, and I've also heard reviewers say the 5700 is deliberately limited to stop enthusiasts from overclocking it to 5700 XT levels.
Krogoth wrote:
Power saving works fine here with my multi-monitor setup, as does Freesync. Chill works great too, but it does require a bit of tweaking to get the optimal framerate/power consumption curve. I don't recommend using it when you are overclocking, but why are you using it in the first place? Overclocking and Chill are trying to achieve opposite goals.

Why? Learn to read: I never said I was. AMD removed FRTC, and Chill is the only remaining option. Chill affects power saving, and isn't particularly helpful when you are trying to achieve stable clockspeeds and framerates. Radeon Boost will probably be even worse; I've tried in-game dynamic resolution, and it screwed performance. AMD would have to keep clocks stable through dynamic resolution changes, and I don't know if they are capable of doing so. Either way, FRTC was removed and Chill is now mandatory for framerate control. I suppose that's acceptable on a per-game basis when new games won't run past 144 FPS on a Vega 56, but it's still a feature downgrade, regardless.
Krogoth wrote:
The overlay beeping is entirely a non-issue and a good indicator that Anti-Lag, Chill, etc. are being triggered in your game. It is arguably better than some graphical pop-up or extra lines of text on a monitoring tool. Tuning works fine; the UI is just a bit different. Re-learning item locations is a little annoying, but nothing worth crying about. The one-button settings are meant for non-tweakers/enthusiasts and work fine for their needs. I don't understand why you have to gripe about it. The majority of PC gamers don't care, or simply don't have the time/patience to fiddle around with their hardware to fine-tune it.

Overlay beeping is annoying for people who want it disabled, and you can't disable it. The UI is not acceptable either. Like I said, there are two different, conflicting UIs. Much like Windows 8 and full-screen Metro, conflicting UIs are BAD design. Simple is fine, as long as you aren't forced to use both simultaneously whenever you need features from both modes. It makes fiddling worse; the old control panel was simpler. Not to mention a bunch of options that should be separated are now all thrown into one big list under General.
Krogoth wrote:
GPU metrics in the game advisor don't show clockspeed graphs for one simple reason: outside of pure academic interest, they are completely irrelevant. Frame time is what really matters for gamers, and it is what the metrics are primarily focused on (thanks, Damage). If you are so keen on graphing clockspeed, there are several third-party tools that easily do this. I suppose that a chart tracking GPU utilization/temperature/power consumption/fan speed may matter for some.

They're not irrelevant: clockspeed directly affects frame times. IMO, it's being deliberately obfuscated to hide AMD's broken power tuning. I already mentioned MSI Afterburner (you don't read), and that's not even the point. The point is that AMD is trying to hide the correlation between power-tuning metrics and frame times, and is also making what they do show hard to find. The frame-time graphs are hidden by default. I think AMD knows they have problems, and they don't want to make them obvious to the end user, even with what they do support.
Krogoth wrote:
Ray-tracing is professional-only, despite what Nvidia marketing tries to spiel. The computing costs are still far too high for consumer-tier hardware unless gamers are willing to regress back to late-1990s-era 3D models, resolutions and assets. RTX is nothing more than a gimmick to justify the tensor hardware on Turing (really just a repurposed Volta for graphical workloads). It is part of a long-term bid to fight off integrated GPUs eating away at lower-end discrete GPU SKUs (the bread and butter of Nvidia's discrete GPUs via sheer volume).

Yes, and RTX's proprietary solution isn't even a standard, meaning it's DOA. The point I was making is that PTGI gives you playable ray-traced graphics on existing non-RTX hardware, and nobody is officially supporting it. AMD is dead silent on supporting any form of ray tracing on existing hardware, even though it works, which makes me feel they are planning on pulling an RTX on existing users. Also, I don't put much hope in any non-independent game devs supporting ray tracing, as publishers control everything that goes into a game. If publishers want lootboxes, we get lootboxes; if publishers get bribed by Nvidia, we get GameWorks. The only hope for ray tracing today is independent devs, unless you buy into RTX.

Igor_Kavinski wrote:
Any particular reason AMD is not interested in PTGI? Is this an architectural issue?

Yes: money from selling new cards. No: PTGI runs on everything. Crytek also has a ray-tracing demo they ran on a Vega 56, but no further info. I have a strong suspicion all of this is tied into the gaming-industrial-complex mafia and their lootbox-profiteering agenda. PC games are mostly console ports. Next-gen consoles may support ray tracing, but not the current ones. Older game engines do not support ray tracing, limiting graphics for developers who don't code their own engines. Publishers don't care about coding for PC unless Nvidia bribes them and hands them pre-baked effects, aka GameWorks. This is also partially why AMD had such horrible PC optimization before winning the console monopoly: devs did not care, and AMD's drivers were worse than Nvidia's. Only now is AMD relatively acceptable, as Nvidia completely dominated all DX10 and early DX11 performance on PC. Ray tracing won't go anywhere until publishers want it to, outside of honest, proactive developers who take the initiative. The state of PC gaming is rather grim when so few developers are working on playable, vendor-neutral ray tracing.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 5:34 pm

Zero RPM is bloody stupid unless you have a massive heatsink and are severely undervolting/underclocking your GPU (well below the factory idle state). It is just an open invitation to kill your hardware. AMD RTG and Nvidia are protecting users from killing their own hardware by accident; that's why the option only exists in third-party tools. Custom fan control works unless you have a third-party cooler that doesn't run off the GPU's fan header and/or uses a proprietary interface (one of the drawbacks of non-reference cards). Wattman works fine and has more than enough memory and GPU states for fine-tuning. It has been working fine since 2019 for me and countless other AMD users on various platforms. The idle problems are not from Wattman; they are bugs from WDDM and earlier builds of Windows 10 (I have seen them first-hand) that were taxing the GPU.

FRTC wasn't removed; it was superseded by Chill. Just set the minimum framerate value to match the maximum if you want the same effect as the old FRTC. FRTC was really just proto-Chill: it only limited the max framerate to reduce power consumption and/or stay within the monitor's adaptive range. Chill added a minimum framerate target for further power savings and another way to stay within your monitor's adaptive range.

Overlay beeping is literally a non-issue. It has been a thing since the move to the Adrenalin suite, and there has never been a built-in option to disable it. I suspect the recent uproar is from users being unaware of it until 19.12.2, which started enabling anti-lag with the one-button gamer/esports profile. Before anti-lag, it was only triggered by Chill.

There's no grand conspiracy with ray-tracing. Ray-Tracing is just an interesting gimmick outside of the professional graphics world. Gamers simply don't care for more accurate reflections/shadows. They prefer more complicated models, scenery, higher screen resolutions and textures for graphical fidelity over it. It has been this way since the beginning of digital graphics. Ray-tracing is simply too dang expensive on computing power. The overwhelming majority of gamers aren't willing to sacrifice framerate, model complexity, screen resolution and texture quality for its sake.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 5:53 pm

This is all ranting about trying to get more out of something that isn't guaranteed. Bringing up overclocking being limited is just asinine.

(I do own a Radeon VII, none of these things are issues for me)
Victory requires no explanation. Defeat allows none.
 
DrCR
Gerbil XP
Posts: 350
Joined: Tue May 10, 2005 7:18 am

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 8:23 pm

What OS, besides Windows, are you expecting them to support outside of Linux? Just curious. (I simply did a Ctrl+F for "linux" since that's my scope of interest, and I found the below. I'm currently happily running the Nvidia binary driver in linux, but always consider the option of ATI AMD for my next card.)

DoomGuy64 wrote:
Bad drivers and iffy OS support outside of Linux don't help Ryzen either.
 
DoomGuy64
Gerbil
Topic Author
Posts: 47
Joined: Mon Jun 08, 2015 4:09 pm

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 9:38 pm

Krogoth wrote:
There's no grand conspiracy with ray-tracing. Ray-Tracing is just an interesting gimmick outside of the professional graphics world. Gamers simply don't care for more accurate reflections/shadows. They prefer more complicated models, scenery, higher screen resolutions and textures for graphical fidelity over it. It has been this way since the beginning of digital graphics. Ray-tracing is simply too dang expensive on computing power. The overwhelming majority of gamers aren't willing to sacrifice framerate, model complexity, screen resolution and texture quality for its sake.

Yes, there is. I'm not speaking about RTX or DXR. GPUs have had global illumination for years, and games still don't max it out, as Gamers Nexus illustrated: https://youtu.be/CuoER1DwYLY It absolutely is a conspiracy that nobody is using it. Ray tracing wouldn't look half as appealing as it does if modern games were maxing out current-gen capabilities. They're not, and that is the biggest selling point for RTX, which only exists in sponsored titles. That said, Nvidia has more high-profile RTX titles than there are games that fully utilize dynamic global illumination. That is PATHETIC.

Also, to counter your other point: games with complicated models and 8 GB Ultra texture modes are EQUALLY RARE. Modern PC games are console ports with minor PC lipstick on top. The current state of PC gaming is ridiculously sad; very few games need more than 4 GB of VRAM. Then, on the hardware side, both AMD and Nvidia have been stagnating on decent, affordable mid-range products. We've been stuck with 32-ROP 1080p cards for too long, and you can't enjoy Ultra textures at 1080p. Crippled sub-256-bit memory buses also need to DIAF. Both developers and hardware manufacturers are holding back progress. Where's the massive game library that needs a 1070 Ti? It doesn't exist. You can get by with a 1060 for all PC games, or an APU for esports, which is ridiculous and proves my point about stagnation. The only games my Vega 56 has trouble playing are games with complete garbage optimization, which is another point of discussion. Games now take performance hits with no visual fidelity to show for it because of garbage optimization, and nobody bats an eye; people tolerate this swill and just upgrade to brute-force past it.

Consoles are clearly superior in every aspect for gaming at this point. They get first access to new games like Red Dead 2, and superior optimization from the game developers. The level of performance devs are pulling out of consoles is ridiculous: 4K uber-quality AAA titles on the equivalent of an RX 480 or lower. Meanwhile, PC gaming is complete trash. I'm immune to this gaslighting argument, bro. It's way too obvious we're being screwed. Anyone who spends more than two seconds thinking about what we're getting, versus what we're capable of getting, should know better.

Krogoth wrote:
Overlay beeping is literally a non-issue. It has been a thing since the move to the Adrenalin suite, and there has never been a built-in option to disable it. I suspect the recent uproar is from users being unaware of it until 19.12.2, which started enabling anti-lag with the one-button gamer/esports profile. Before anti-lag, it was only triggered by Chill.

I don't particularly care, but it IS an issue, because you can't disable it, and it shows AMD's lack of insight. IMO, the default Ctrl-Shift-WASD hotkeys are VASTLY more stupid, but at least I can rebind or disable hotkeys. Shadow of War will instantly prove how stupid the defaults are when you play it, since SoW's combat revolves heavily around pressing AMD's default hotkey combinations. You can't disable the sound effect, though. Also, 19.12.2's dark-grey-screen-on-resume bug is annoying, and 9.x's ridiculous BSOD/TDR instability proves AMD is asleep at the wheel on QC. Don't even get me started on multi-monitor, AMD's poor APU support, or the lack of built-in FreeSync custom overrides. I've practically given up on AMD ever working properly with multi-monitor gaming: so many things don't work right, you never know when problems will get better or worse, and you can't tell which hardware works while other hardware doesn't. The quality control is non-existent. AMD releases broken crap to the public like we're their beta-test team, then completely ignores real bug reports and instead focuses on a new UI to cover up how broken everything else is.

Can you "FIX" most AMD problems? Yes. The PROBLEM is that the people who can tolerate and fix AMD's issues are the people who DON'T NEED their stupid NOOB UI and already know how to use 3rd-party tools. The real noobs are all on Nvidia, and this is the absolute worst way to increase mindshare: you start with EXISTING users, THEN you expand, not the reverse. If a noob bought an AMD card, they would play with it for a week, then return the sucker for Nvidia due to all of AMD's problems. AMD does not work OOTB, and you literally have to troubleshoot everything, including which vendor model to buy, since stock AMD cards are never up to an acceptable public release. The few AMD friends I have on Steam who own a 5700 XT ALL COMPLAIN about how broken their cards are, and the support forums are flush with complaints. You can't honestly tell me these products and drivers are public-ready and noob-friendly when everyone I know has problems, and AMD is more concerned with introducing a new UI that adds new problems to the equation. These "non-issues" like the sound effect aren't important by themselves; they're simply the straw that breaks the camel's back. The new UI is AMD's "let them eat cake" moment, and the community is not interested in cake at this point.

DrCR wrote:
What OS, besides Windows, are you expecting them to support outside of Linux? Just curious. (I simply did a Ctrl+F for "linux" since that's my scope of interest, and I found the below. I'm currently happily running the Nvidia binary driver in linux, but always consider the option of ATI AMD for my next card.)

I don't think you got my meaning. Linux supposedly works better than Windows; they took Ryzen support far more seriously than Microsoft did. Both Microsoft and AMD could provide better Ryzen support: AMD needs better drivers and BIOS support for RAM, while Microsoft could optimize its scheduler better. It's all very half-baked, and patches are massively delayed well past when they should be released. AMD graphics cards are another story. I have no idea how current support is, but it has supposedly gotten better.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Radeon Software Adrenalin 2020 Edition Review

Tue Dec 17, 2019 11:40 pm

Hate to burst your bubble, but the PC hasn't been driving the gaming market for over a decade now. Consoles have been dictating everything and setting the baseline. That is the main reason why quad-core CPUs that are nearly a decade old and second-generation DX11 hardware (the Kepler/GCN 1.x generation) have remained viable for gaming for so long.

The PS5/Xbox Series X will finally raise the bar, moving beyond 4GiB-VRAM GPUs and quad-core CPUs as the acceptable minimum toward 8GiB of VRAM and eight-core CPUs. The lack of dedicated ray-tracing hardware on either console will ensure that ray-tracing remains professional-only, with some consumer-tier tech demos, for the foreseeable future.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: Radeon Software Adrenalin 2020 Edition Review

Wed Dec 18, 2019 12:16 am

DoomGuy64 wrote:
I don't think you got my meaning. Linux supposedly works better than Windows; they took Ryzen support far more seriously than Microsoft did. Both Microsoft and AMD could provide better Ryzen support: AMD needs better drivers and BIOS support for RAM, while Microsoft could optimize its scheduler better. It's all very half-baked, and patches are massively delayed well past when they should be released. AMD graphics cards are another story. I have no idea how current support is, but it has supposedly gotten better.

No, not even close. The "they" you're talking about, in terms of Linux patches, was Microsoft.

I'm not quite sure what you think they should do better on Windows. Do you have specific complaints (and "this isn't optimized" isn't valid) or is this just more fanboy drivel without any actual backing facts?
Victory requires no explanation. Defeat allows none.
 
DoomGuy64
Gerbil
Topic Author
Posts: 47
Joined: Mon Jun 08, 2015 4:09 pm

Re: Radeon Software Adrenalin 2020 Edition Review

Wed Dec 18, 2019 6:57 am

Waco wrote:
No, not even close. The "they" you're talking about, in terms of Linux patches, was Microsoft.

I'm not quite sure what you think they should do better on Windows. Do you have specific complaints (and "this isn't optimized" isn't valid) or is this just more fanboy drivel without any actual backing facts?

What are you smoking? It's only now that Windows 10 has "acceptable" Ryzen support. Ryzen's launch was pretty pathetic and required AMD to ship its own power profile, since Microsoft's default was hurting performance. Linux had the equivalent patches out earlier than Windows did. AMD's 3xxx series also had issues hitting boost clocks, and AMD took too long to release an update that was merely "acceptable". Plug-and-play RAM is still not a thing outside of "AMD certified" sticks. The whole platform is half-baked, both Windows and AMD, while Linux is more direct about fixing it.
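On the power-profile point, you can at least check which plan Windows is actually running, and switch to the AMD-provided "Ryzen Balanced" plan if the chipset driver installed it, using the stock powercfg tool. The GUID below is a placeholder (the real one is machine-specific and printed by /list), so this is just the procedure, not copy-paste:

```shell
:: List every installed power plan; the active one is marked with an asterisk.
powercfg /list

:: Print the currently active plan and its GUID.
powercfg /getactivescheme

:: Switch to the "AMD Ryzen Balanced" plan, if present.
:: Replace the placeholder with the GUID that /list printed for it.
powercfg /setactive 00000000-0000-0000-0000-000000000000
```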

As for Microsoft being involved in Linux, that is a problem, not a help, for the Linux ethos. Microsoft is simply infiltrating the Linux community through Embrace, Extend, Extinguish. Linux does need financial and infrastructure support, but Microsoft is not a "safe" sponsor, and it is clearly affecting Linux at every level it infiltrates. This is where we get the new Linux CoC, "neutral" programming terms, and the suspension of high-level coders and managers over PC nonsense. Microsoft may be "helping" Linux, but at the same time it is completely transforming the community. You might as well call Linux a Microsoft-controlled corporate project at this point. That said, Linux still pushes out hardware patches faster than Windows, although GPU support is still not up to par, most likely because Linux is more business-oriented than desktop-oriented. I don't use Linux for that reason, but it is pretty clear that Windows support for Ryzen was slower, and is still not as good as Intel's.

Krogoth wrote:
Hate to burst your bubble, but the PC hasn't been driving the gaming market for over a decade now. Consoles have been dictating everything and setting the baseline. That is the main reason why quad-core CPUs that are nearly a decade old and second-generation DX11 hardware (the Kepler/GCN 1.x generation) have remained viable for gaming for so long.

The PS5/Xbox Series X will finally raise the bar, moving beyond 4GiB-VRAM GPUs and quad-core CPUs as the acceptable minimum toward 8GiB of VRAM and eight-core CPUs. The lack of dedicated ray-tracing hardware on either console will ensure that ray-tracing remains professional-only, with some consumer-tier tech demos, for the foreseeable future.

You're not bursting my bubble; you're contradicting yourself and proving my point. You were the one saying gamers want HD textures, and then you admit that the reason graphics are stagnant is consoles. That's my point. We aren't getting full DX11 support on PC, nor are we getting any hardware optimization. Games look bad and perform worse. Nvidia is at least doing something about it through RTX, but my point is that we don't need RTX; the real problem is the lack of PC support. So what then? We wait for next-gen consoles to raise the graphics level on PC, because it's not happening organically? Regardless, that doesn't excuse AMD from having subpar drivers and trying to cover it up with a new coat of UI paint that just introduces new problems.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Radeon Software Adrenalin 2020 Edition Review

Wed Dec 18, 2019 10:35 am

I'm not contradicting myself. Gamers and game artists have long preferred allocating the graphical processing budget toward more complicated models/animations, textures, and higher resolutions over more realistic shadowing and reflections. The cost of ray-traced rendering is simply too high for consumer-tier platforms. The majority of gamers don't care enough about more accurate shadowing and reflections to justify taking a regressive hit on models, screen resolution, and textures. This has been the conundrum for ray-tracing ever since 3D graphics became possible on consumer-tier hardware. How hardware prowess advances for consumer-tier platforms is a different matter.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: Radeon Software Adrenalin 2020 Edition Review

Wed Dec 18, 2019 11:02 am

DoomGuy64 wrote:
What are you smoking? It's only now that Windows 10 has "acceptable" Ryzen support. Ryzen's launch was pretty pathetic and required AMD to ship its own power profile, since Microsoft's default was hurting performance. Linux had the equivalent patches out earlier than Windows did. AMD's 3xxx series also had issues hitting boost clocks, and AMD took too long to release an update that was merely "acceptable". Plug-and-play RAM is still not a thing outside of "AMD certified" sticks. The whole platform is half-baked, both Windows and AMD, while Linux is more direct about fixing it.

So AMD going above and beyond to develop a new power profile is "pathetic"? You're pissed that Microsoft didn't tailor their default power profile for a brand new processor? RAM is absolutely "plug and play" outside of AMD certified RAM if you stick to JEDEC specs. I would hate to try to please you on anything. The only real complaint I've seen you present is the boost clock issue, which turned out to be minimal at best.

DoomGuy64 wrote:
As for Microsoft being involved in Linux, that is a problem, not a help, for the Linux ethos. Microsoft is simply infiltrating the Linux community through Embrace, Extend, Extinguish. Linux does need financial and infrastructure support, but Microsoft is not a "safe" sponsor, and it is clearly affecting Linux at every level it infiltrates. This is where we get the new Linux CoC, "neutral" programming terms, and the suspension of high-level coders and managers over PC nonsense. Microsoft may be "helping" Linux, but at the same time it is completely transforming the community. You might as well call Linux a Microsoft-controlled corporate project at this point. That said, Linux still pushes out hardware patches faster than Windows, although GPU support is still not up to par, most likely because Linux is more business-oriented than desktop-oriented. I don't use Linux for that reason, but it is pretty clear that Windows support for Ryzen was slower, and is still not as good as Intel's.

...that's not how development of the Linux kernel works. There's no "infiltrating" by Microsoft. The community changes weren't driven by Microsoft in any way - it was bringing a toxic community into modern times. Have you ever submitted patches to the Linux kernel? Written a single line of module code? Dealt with Linus? Yeah, thought not.

The "lack of support" you keep harping about is literally a nonissue aside from people staring at clock rates and not understanding what they're seeing.
Victory requires no explanation. Defeat allows none.
 
Igor_Kavinski
Minister of Gerbil Affairs
Posts: 2077
Joined: Fri Dec 22, 2006 2:34 am

Re: Radeon Software Adrenalin 2020 Edition Review

Thu Dec 19, 2019 4:01 am

Let's all agree that things COULD be better. Games could be more advanced graphically. We seem to have hit a visual plateau due to corporate greed.
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: Radeon Software Adrenalin 2020 Edition Review

Thu Dec 19, 2019 11:54 am

Igor_Kavinski wrote:
Let's all agree that things COULD be better.

Emptiest statement of the year.
Victory requires no explanation. Defeat allows none.
 
synthtel2
Gerbil Elite
Posts: 956
Joined: Mon Nov 16, 2015 10:30 am

Re: Radeon Software Adrenalin 2020 Edition Review

Sat Dec 21, 2019 8:18 pm

Just going from the thread title, my own experience with this driver release is more good than bad. Being able to globally enable RAL rather than having to set it for each game is great, and while the UI changes seem pointlessly radical, the UI is a lot snappier and less buggy now.[1] On the bad side, not being able to set min/max states in WattMan is annoying, the UI didn't make it clear that RAL and Boost can't be enabled at the same time, having a graphical overlay notification on every game boot even with the overlay disabled is moderately annoying, and it made the output glitchiness some early 460s/470s/480s have at idle worse.

WattMan has always done what I want it to do without hassle (and you're not giving much detail - I don't know what brokenness to look for), and low power consumption at idle and full performance when gaming both work. Boost clock stability is about +/- 1% for me, which isn't ideal but also is basically nothing compared to other factors that mess with frametime consistency.

[1] The VRAM clock setting in WattMan reads minimum for a second on first load, the temperature/utilization tooltip on the graph at the top of WattMan sometimes persists after the window is closed, and, while it isn't technically a bug, you constantly open a browser by clicking in the wrong area.
