Personal computing discussed
Moderators: renee, morphine, SecretSquirrel
techguy wrote:I have a Freesync monitor as well.
Still went with Nvidia. I would rather keep FPS above the low numbers where VRR technologies become truly valuable, than get a slower graphics card and rely on my monitor to bail me out.
Have a friend who was in the same boat and did the same thing. He desperately wanted to buy Vega and waited years to upgrade his system, clinging to an R9 290 that was just barely getting by in modern games on his 1440p Freesync monitor. Upgraded his system to a 1080 Ti and hasn't looked back. Doesn't even talk about settings or FPS any more, it's just not a concern.
There's nothing wrong with VRR technologies, I think they're great. Given graphics card prices lately though I wouldn't recommend Vega to anyone but miners. And 1080 is seriously old hat. I wouldn't spend $550+ on a GPU that came out 2+ years ago unless it was made of gold.
/my 2 cents
Topinio wrote:Kretschmer wrote:We're talking about gaming, not Linux.
Why not both?
Srsly, though, card availability hasn't lagged the Linux kernel support by that many days, even if YMMV on getting decent gaming performance without booting Windows.
- Radeon RX 580: 89 days
- Radeon Instinct MI25: 87 days; Radeon RX Vega 64: 141 days
- today: 49 days and counting...
Gandolf wrote:Kretschmer wrote:If you have the PSU and cooling to support a Vega64 AND plan on sticking with your monitor grab the Vega. I prefer Pascal, but adaptive sync is really nice. Just be aware that Vega could be the new Hawaii and you might not be able to upgrade for several years.
I have a Corsair Obsidian 450D case and an Antec High Current Pro 850 power supply. My CPU is liquid cooled. Is this enough? Could my bottleneck be the case being too small?
Kretschmer wrote:Gandolf wrote:Kretschmer wrote:If you have the PSU and cooling to support a Vega64 AND plan on sticking with your monitor grab the Vega. I prefer Pascal, but adaptive sync is really nice. Just be aware that Vega could be the new Hawaii and you might not be able to upgrade for several years.
I have a Corsair Obsidian 450D case and an Antec High Current Pro 850 power supply. My CPU is liquid cooled. Is this enough? Could my bottleneck be the case being too small?
Your PSU is adequate, assuming that it is in good shape. For a Vega64 you'll need to ensure that your case has a lot of big (not necessarily fast) fans to move all that damn heat from the Vega64. Your case has 16.9" of GPU clearance if you're not using the extra drive bay (or 11.5" with said drive bay). Perusing a few random triple-slot Vega64 cards on Newegg suggests that you'll need about 13" to install.
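For what it's worth, the "is 850 W enough?" question can be sanity-checked with back-of-the-envelope arithmetic. The wattages below are ballpark assumptions (reference Vega 64 board power, a stock quad-core CPU, generic platform overhead), not measurements of Gandolf's actual build:

```python
# Rough PSU headroom check. All wattages are ballpark assumptions,
# not measured values for this exact system.
COMPONENT_DRAW_W = {
    "RX Vega 64 (board power)": 295,
    "quad-core CPU (stock)": 91,
    "motherboard + RAM + SSD": 50,
    "fans + pump + misc": 30,
}

PSU_CAPACITY_W = 850  # Antec High Current Pro 850

total = sum(COMPONENT_DRAW_W.values())
headroom = PSU_CAPACITY_W - total
print(f"Estimated load: {total} W, headroom: {headroom} W "
      f"({headroom / PSU_CAPACITY_W:.0%} of capacity free)")
```

Even with generous padding for transient spikes and overclocking, an 850 W unit in good shape has comfortable margin here; case airflow, not the PSU, is the thing to watch.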
Krogoth wrote:It is because G-sync for mobile Nvidia GPUs uses VESA’s VRR implementation over DisplayPort. Freesync 1 is the same thing, so any Freesync monitor can do VRR with mobile Nvidia GPUs.
Topinio wrote:Gandolf wrote:I have a Corsair Obsidian 450D case and an Antec High Current Pro 850 power supply. My CPU is liquid cooled. Is this enough? Could my bottleneck be the case being too small?
This? http://store.antec.com/highcurrentpro/h ... tinum.html
Should be fine.
Get a blower card, then you don't need to care about the case, just its placement.
Chrispy_ wrote:Krogoth wrote:It is because G-sync for mobile Nvidia GPUs uses VESA’s VRR implementation over DisplayPort. Freesync 1 is the same thing, so any Freesync monitor can do VRR with mobile Nvidia GPUs.
And this is why I hate that Nvidia won't do VESA VRR over DisplayPort (AKA Freesync 1) on desktop. I own a Freesync TV. I hook a GTX 970 up to it.
WHY WON'T NVIDIA DO THE OBVIOUS THING THEY NEED TO DO?!
Kretschmer wrote:A blower can only work with the ambient temperature of the case in question. I need to rehouse my 7700K/1080Ti system, as the current Node 304 with its paltry 92mm intakes gets toasty enough in the summer to throttle the GPU.
Kretschmer wrote:Chrispy_ wrote:WHY WON'T NVIDIA DO THE OBVIOUS THING THEY NEED TO DO?!
This is maddening but won't change until the market punishes Nvidia. They should just call VESA VRR "GSync Lite" to mollify GSync monitor manufacturing partners and flog the incremental difference as a huge boon for "GSync Plus" to save face.
Usacomp2k3 wrote:Kretschmer wrote:Chrispy_ wrote:WHY WON'T NVIDIA DO THE OBVIOUS THING THEY NEED TO DO?!
This is maddening but won't change until the market punishes Nvidia. They should just call VESA VRR "GSync Lite" to mollify GSync monitor manufacturing partners and flog the incremental difference as a huge boon for "GSync Plus" to save face.
That is actually a good idea that I haven't heard before.
Kretschmer wrote:Honestly, I'm less concerned about the few-hundred-dollar GSync premium than I am about how few GSync monitors hit the market.
Usacomp2k3 wrote:Kretschmer wrote:Honestly, I'm less concerned about the few-hundred-dollar GSync premium than I am about how few GSync monitors hit the market.
I actually view that as a good thing: it keeps the quality high. It seems to me the specs are rigid enough that they reject the low-end monitors that AMD is happy to throw a Freesync label on.
Gandolf wrote:techguy wrote:I have a Freesync monitor as well.
Still went with Nvidia. I would rather keep FPS above the low numbers where VRR technologies become truly valuable, than get a slower graphics card and rely on my monitor to bail me out.
Have a friend who was in the same boat and did the same thing. He desperately wanted to buy Vega and waited years to upgrade his system, clinging to an R9 290 that was just barely getting by in modern games on his 1440p Freesync monitor. Upgraded his system to a 1080 Ti and hasn't looked back. Doesn't even talk about settings or FPS any more, it's just not a concern.
There's nothing wrong with VRR technologies, I think they're great. Given graphics card prices lately though I wouldn't recommend Vega to anyone but miners. And 1080 is seriously old hat. I wouldn't spend $550+ on a GPU that came out 2+ years ago unless it was made of gold.
/my 2 cents
The Vega 64 is currently the same price as the 1080, and with the newest drivers, benchmarks have it faster than the 1080.
techguy wrote:1) if what you say is true then it should be easy enough to demonstrate across a WIDE RANGE of games and not just DX12 - i.e., show me the link
2) even if it is true - big deal. Vega 64 needs 100W more than a 2+ year old GTX 1080 and it's just now, finally beating it? let me know when AMD has something that competes against Nvidia's fastest and I'll be interested again
Gandolf wrote:Update: I ended up ordering the Vega 64. It should be here Saturday. I am going to check how hot it gets, I don't normally have the side of my case on. But if I have to I will be ordering different fans with higher airflow.
Topinio wrote:Vega 64 looks competitive to me. (Not done a proper look.)
Or a performance drop and losing VRR for a GTX 1080 and keep the monitor, @ £450.
dragontamer5788 wrote:techguy wrote:1) if what you say is true then it should be easy enough to demonstrate across a WIDE RANGE of games and not just DX12 - i.e., show me the link
2) even if it is true - big deal. Vega 64 needs 100W more than a 2+ year old GTX 1080 and it's just now, finally beating it? let me know when AMD has something that competes against Nvidia's fastest and I'll be interested again
Quickie question: are you here to help Gandolf with his decision, or are you here to argue? The point of this topic, as far as I can tell, is to help Gandolf with his decision. It's not Gandolf's job to prove anything to you. If you want a flame war, I suggest trying Reddit. There are plenty of people over there who like to argue.
In any case, Gandolf has already purchased a Vega64. So that debate has been settled. I think the extra heat issue is well known, and most of the recent messages in this topic are about heat management and proper fan locations in a case.
techguy wrote:dragontamer5788 wrote:techguy wrote:1) if what you say is true then it should be easy enough to demonstrate across a WIDE RANGE of games and not just DX12 - i.e., show me the link
2) even if it is true - big deal. Vega 64 needs 100W more than a 2+ year old GTX 1080 and it's just now, finally beating it? let me know when AMD has something that competes against Nvidia's fastest and I'll be interested again
Quickie question: are you here to help Gandolf with his decision, or are you here to argue? The point of this topic, as far as I can tell, is to help Gandolf with his decision. It's not Gandolf's job to prove anything to you. If you want a flame war, I suggest trying Reddit. There are plenty of people over there who like to argue.
In any case, Gandolf has already purchased a Vega64. So that debate has been settled. I think the extra heat issue is well known, and most of the recent messages in this topic are about heat management and proper fan locations in a case.
He already made his choice; there's no "help" left to provide at this point, it's all academic now.
Not looking for a flamewar, just stating my opinion.
Kretschmer wrote:Topinio wrote:Vega 64 looks competitive to me. (Not done a proper look.)
Or a performance drop and losing VRR for a GTX 1080 and keep the monitor, @ £450.
How can you not have checked the benchmarks AND assert that the 1080 loses to a Vega64?
Topinio wrote:Quick Google gives https://www.tomshardware.com/reviews/as ... 520-3.html
Vega 64 looks competitive to me. (Not done a proper look.)
[...]
Fair enough, if you have the $$$ and/or a G-SYNC monitor.
Gandolf has a https://www.asus.com/uk/Monitors/ROG-Strix-XG35VQ/ and isn't paying at the GTX 1080 Ti price level, so the Vega is a no-brainer really. To get at least the same performance as a RX Vega 64 and keep VRR would mean buying e.g. GTX 1080 Ti and PG348Q @ £1650 rather than spend £500 for the Vega.
Or a performance drop and losing VRR for a GTX 1080 and keep the monitor, @ £450.
Gandolf wrote:Yea, I wasn't trying to start any arguments here either. I was just going by the Tom's Hardware review posted above, and it has the Vega 64 as being faster in almost every game they tested. I just figured that plus the Freesync would be the best way for me to go.
NVidia's GPUs tend to perform better under DirectX 11, which is far more common right now.
There's an expectation for the overall market to shift towards DirectX 12, but it's way slower this time around than previous shifts.
K-L-Waster wrote:This: NVidia's GPUs tend to perform better under DirectX 11, which is far more common right now.
.. and this: There's an expectation for the overall market to shift towards DirectX 12, but it's way slower this time around than previous shifts.
... may be causally linked. Game publishers want to maximize sales, so when they see that NVidia has a larger install base they tailor their games to attract that larger market.
Not "fair" or "forward thinking", of course, but understandable that they want to go after the largest pool of customers.