Personal computing discussed

Moderators: renee, morphine, SecretSquirrel

 
odizzido
Gerbil Team Leader
Posts: 211
Joined: Fri May 06, 2005 6:10 am

Re: Vega Meta Analysis

Sat Aug 26, 2017 11:46 pm

ptsant wrote:
The more I read about the chip, the more I realize that it is first and foremost a compute architecture. The Instinct (the deep-learning card based on Vega) and the FE are very serious and competitive products. The RX is a bit of an afterthought. At its nominal price, the RX 64 is a bit useless, but the RX 56 (again, at the theoretical $400 price) would make a very compelling addition to the RX range.

I am still very happy with my RX480 which runs everything very smoothly, but an undervolted RX56 makes a lot of sense for people who care about twitchy games and 144Hz.


The 56 looked really nice from what I recall. If I were buying a new card that would probably be what I would get. If I could actually find one of course. I just checked and it's not even listed at the local shop I go to.
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: Vega Meta Analysis

Sun Aug 27, 2017 12:47 am

odizzido wrote:
The 56 looked really nice from what I recall. If I were buying a new card that would probably be what I would get. If I could actually find one of course. I just checked and it's not even listed at the local shop I go to.

I don't think they've even announced its release date yet. The launch was all about the 64. Like what they did with Ryzen, only with fewer options.

I could be wrong, but that was the impression I got.
Meow.
 
CScottG
Graphmaster Gerbil
Posts: 1252
Joined: Fri Dec 01, 2006 9:53 pm

Re: Vega Meta Analysis

Sun Aug 27, 2017 1:07 am

Lordhawkwind wrote:
..I had my Fury Pro for two years which is the longest I've had a graphics card and I'll probably keep this Vega card for at least the same time. For £450 I'm not complaining.


This places you in an unbelievably small minority. :wink:

(..but it is good that you are getting value out of it, even if the "tax" is brutal right now.) :D
 
CScottG
Graphmaster Gerbil
Posts: 1252
Joined: Fri Dec 01, 2006 9:53 pm

Re: Vega Meta Analysis

Sun Aug 27, 2017 1:16 am

LostCat wrote:
odizzido wrote:
The 56 looked really nice from what I recall. If I were buying a new card that would probably be what I would get. If I could actually find one of course. I just checked and it's not even listed at the local shop I go to.


I don't think they've even announced its release date yet. The launch was all about the 64. Like what they did with Ryzen, only with fewer options.

I could be wrong, but that was the impression I got.



One more day (Monday the 28th).

I think I remember reading/viewing something that said the 56 was the card to get, not simply because of price/performance, but because it was the only card with any real headroom for overclocking performance gains.

BTW, about $175 of Vega's cost in *any* edition is directly related to its memory. Gamersnexus had a good video on this and on why HBM2 was actually needed (..and it wasn't just thermal/power draw).
 
odizzido
Gerbil Team Leader
Posts: 211
Joined: Fri May 06, 2005 6:10 am

Re: Vega Meta Analysis

Sun Aug 27, 2017 1:51 am

Oh, I didn't know the 56 wasn't for sale yet. I guess AMD wanted to get all of the impatient 64 sales they could. I suppose I might have made the same decision if I were AMD, so I won't really hold it against them, especially since it was only for a short time and I do like their open-standards approach.
 
CScottG
Graphmaster Gerbil
Posts: 1252
Joined: Fri Dec 01, 2006 9:53 pm

Re: Vega Meta Analysis

Sun Aug 27, 2017 2:00 am

..I had similar problems with their schedule. :oops:

The only thing I think is weird about it is that the reviews didn't coincide with its actual availability.

Note: if you can't find the card on Monday at the usual sites for its retail price, try looking directly at the manufacturers' websites. My guess is that more than a few board partners will keep product for their own webshops, pocket the difference they would have paid the retailer, and enforce an "order one card only" policy.
 
HERETIC
Gerbil XP
Posts: 488
Joined: Sun Aug 24, 2014 4:10 am

Re: Vega Meta Analysis

Sun Aug 27, 2017 4:46 am

Chrispy_ wrote:
It's a failure to me because it's so power hungry..


When I first read the numbers (1070: 151 W, Vega 56: 237 W) I was very disappointed: 50% more power for the same result.
Recently I read this:
https://www.techspot.com/review/1476-am ... age12.html
Comparing a stock Vega 56 to an aftermarket OC 1070, we see similar performance and a total-system power difference of 331 - 294 = 37 W.
If we were to game 4 hrs a day for a month, that's 120 x 37 / 1000 = 4.44 kWh, which at $0.24/kWh adds about $1.07 to the monthly power bill.
So definitely NOT a failure, just a little disappointing...........
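The cost arithmetic above is easy to sanity-check with a few lines of Python. The wattage delta, hours, and the $0.24/kWh rate are the figures quoted in this post, not universal values; swap in your own local rate:

```python
# Extra monthly electricity cost of the Vega 56's higher system draw.
# Figures from the post: 331 W vs 294 W total-system draw,
# 4 hours of gaming per day over a 30-day month, $0.24 per kWh.
extra_watts = 331 - 294                             # 37 W difference
hours_per_month = 4 * 30                            # 120 hours of gaming
extra_kwh = extra_watts * hours_per_month / 1000    # Wh -> kWh
cost = extra_kwh * 0.24                             # at $0.24/kWh

print(f"{extra_kwh:.2f} kWh extra -> ${cost:.2f} per month")
# 4.44 kWh extra -> $1.07 per month
```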
 
JustAnEngineer
Gerbil God
Posts: 19673
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: Vega Meta Analysis

Sun Aug 27, 2017 5:41 am

Residential electricity prices have recently hit their summer peak at a US national average of 13.2¢ per kW-hr. Looking at year-round average retail pricing to all sectors, electricity averaged 10.4¢ per kW-hr across the U.S. The cheapest power is in Washington, at 7.4¢ per kW-hr, while the folks in Hawaii pay much more, at 26.2¢ per kW-hr.
Last edited by JustAnEngineer on Sun Aug 27, 2017 6:22 am, edited 1 time in total.
· R7-5800X, Liquid Freezer II 280, RoG Strix X570-E, 64GiB PC4-28800, Suprim Liquid RTX4090, 2TB SX8200Pro +4TB S860 +NAS, Define 7 Compact, Super Flower SF-1000F14TP, S3220DGF +32UD99, FC900R OE, DeathAdder2
 
ermo
Gerbil
Posts: 55
Joined: Tue Jan 25, 2011 7:35 pm

Re: Vega Meta Analysis

Sun Aug 27, 2017 6:05 am

Krogoth wrote:
ermo wrote:
Duct Tape Dude wrote:
+1
I wanted a flagship card and would have been ok with taking Vega64+Freesync vs a 1080Ti+Vsync if the power consumption was lower.


Exactly this.

Unless RTG and GloFo can pull a rabbit out of their collective as***^H hats with future steppings, Vega will likely be consigned to the history books as borderline irrelevant.

For something that was hyped as the shining beacon of the future @RTG, that's a pretty poor showing.


It is performing as expected. The crowd who are disappointed are those who were expecting a "Pascal killer". The reality is high-end GPUs have been hitting the walls of physics for some time. GP100 and GP102 aren't exactly that far behind in terms of power consumption.


It's performing as expected in compute (where it trades blows with the 1080Ti and Titan Xp in some scenarios), but NOT in gaming. This is my beef with the RX Vega line.

FWIW, I still want one. Just not with launch-spec silicon + drivers.
Linux: 2x FX-8350, 16/32GB, GTX970/HD7970, Solus+KDE/Exherbo+Xfce
Hackintosh: i7-2600K, 32GB, HD7970, High Sierra
HTPC: Q9400, 8GB, GTS 450, Solus/GNOME
Server: FX-8350, 16GB ECC, iGP, F30/Srv
Wintendo: R7 2700X, 32GB, 2x RX Vega 64 (air), W10Pro
 
ptsant
Gerbil XP
Posts: 397
Joined: Mon Oct 05, 2009 12:45 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 2:22 am

HERETIC wrote:
Chrispy_ wrote:
It's a failure to me because it's so power hungry..


When I first read the numbers (1070: 151 W, Vega 56: 237 W) I was very disappointed: 50% more power for the same result.
Recently I read this:
https://www.techspot.com/review/1476-am ... age12.html
Comparing a stock Vega 56 to an aftermarket OC 1070, we see similar performance and a total-system power difference of 331 - 294 = 37 W.
If we were to game 4 hrs a day for a month, that's 120 x 37 / 1000 = 4.44 kWh, which at $0.24/kWh adds about $1.07 to the monthly power bill.
So definitely NOT a failure, just a little disappointing...........


Vega has unfortunately been tested with the "balanced" (lol) profile. Simply putting it on "powersave" massively reduces power consumption with only a few percent performance loss. In my book, the Vega 56 is practically equivalent to the 1070 in power/perf with much better compute, Freesync and eventually the potential for performance improvements down the road, as the architecture matures.

So, if someone is not tied to G-Sync or CUDA, the Vega 56 (AT MSRP!!!) is a perfectly acceptable alternative.
 
Jigar
Maximum Gerbil
Posts: 4936
Joined: Tue Mar 07, 2006 4:00 pm
Contact:

Re: Vega Meta Analysis

Wed Aug 30, 2017 3:48 am

A lot of people viewed Vega as a failed high-end alternative because it didn't beat Nvidia's GTX 1080 Ti. Other cons, like high power consumption, only added fuel to the fire.
 
Topinio
Gerbil Jedi
Posts: 1839
Joined: Mon Jan 12, 2015 9:28 am
Location: London

Re: Vega Meta Analysis

Wed Aug 30, 2017 4:35 am

ptsant wrote:
Vega has unfortunately been tested with the "balanced" (lol) profile. Simply putting it on "powersave" massively reduces power consumption with only a few percent performance loss. In my book, the Vega 56 is practically equivalent to the 1070 in power/perf with much better compute, Freesync and eventually the potential for performance improvements down the road, as the architecture matures.

So, if someone is not tied to G-Sync or CUDA, the Vega 56 (AT MSRP!!!) is a perfectly acceptable alternative.

I think that might not be right, based on the first plot on the overclocking page of [H]'s review at https://www.hardocp.com/article/2017/08 ... _review/16 -- at defaults, the thing is either struggling to maintain clocks or aggressively powersaving and I'm concerned this might have a detrimental effect in games.

If so, it'll be seen in plots of frame time variation, so something to check for on installation, I guess.
Desktop: 750W Snow Silent, X11SAT-F, E3-1270 v5, 32GB ECC, RX 5700 XT, 500GB P1 + 250GB BX100 + 250GB BX100 + 4TB 7E8, XL2730Z + L22e-20
HTPC: X-650, DH67GD, i5-2500K, 4GB, GT 1030, 250GB MX500 + 1.5TB ST1500DL003, KD-43XH9196 + KA220HQ
Laptop: MBP15,2
 
ptsant
Gerbil XP
Posts: 397
Joined: Mon Oct 05, 2009 12:45 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 5:52 am

Topinio wrote:
ptsant wrote:
Vega has unfortunately been tested with the "balanced" (lol) profile. Simply putting it on "powersave" massively reduces power consumption with only a few percent performance loss. In my book, the Vega 56 is practically equivalent to the 1070 in power/perf with much better compute, Freesync and eventually the potential for performance improvements down the road, as the architecture matures.

I think that might not be right, based on the first plot on the overclocking page of [H]'s review at https://www.hardocp.com/article/2017/08 ... _review/16 -- at defaults, the thing is either struggling to maintain clocks or aggressively powersaving and I'm concerned this might have a detrimental effect in games.

If so, it'll be seen in plots of frame time variation, so something to check for on installation, I guess.


I was looking at the Tech Report review, which included a separate evaluation of the RX Vega 56 with the power-saver profile. The end result is -30 W from the default, and I could probably shave off another 20-30 W by undervolting (YMMV). The price to pay is an increase in mean frame time (in a single game, Hitman) from 15.5 ms to 16.2 ms. In that same game, the stock 1070 manages 16.4 ms but consumes 232 W vs 296 W for the Vega.

Another way to put it: the stock Vega 56 is a bit faster than the stock 1070. Making them similar in performance (either by power-saving on the 56 or overclocking the 1070) probably narrows the power difference even further.

I do agree that Vega will always consume a bit more power, but not necessarily the insane amounts you see in most reviews. As I said before, the RX should have been clocked a bit lower by default, which results in massive power savings.
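Converting the quoted frame times into frames per second and efficiency makes the trade-off easier to eyeball. A minimal sketch using only the numbers above (the -30 W powersave delta is the review's figure; treat the results as rough):

```python
# FPS and FPS-per-watt from mean frame times (ms) and system power (W),
# using the Hitman figures quoted above.
def fps(mean_frame_time_ms):
    """Frames per second from a mean frame time in milliseconds."""
    return 1000.0 / mean_frame_time_ms

scenarios = {
    "Vega 56, stock":     (15.5, 296),       # (frame time ms, system W)
    "Vega 56, powersave": (16.2, 296 - 30),  # -30 W per the review
    "GTX 1070, stock":    (16.4, 232),
}

for name, (ft, watts) in scenarios.items():
    rate = fps(ft)
    print(f"{name}: {rate:.1f} FPS at {watts} W ({rate / watts:.3f} FPS/W)")
```

Undervolting (the further 20-30 W mentioned above) would close more of the remaining FPS/W gap to the 1070.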
 
Kretschmer
Gerbil XP
Posts: 462
Joined: Sun Oct 19, 2008 10:36 am

Re: Vega Meta Analysis

Wed Aug 30, 2017 7:11 am

Why Vega is a Failure with a capital F:
1) You won't be able to buy one anywhere near MSRP for months. Unless you've spent tons of dough on a FreeSync display, it makes sense to just grab Pascal.
2) Power consumption is stupid for the performance. (Lesser) enthusiast sites are recommending that buyers pair Vega 64 cards with 1,000 W PSU upgrades. That destroys any value proposition.
3) AMD will struggle with their margins, because they're selling a huge chip with HBM2 that trades blows with Nvidia's smaller die/GDDR5X parts.
4) Released over a year after Pascal, with buyers defaulting to Nvidia in the meantime.
5) Buying into the FreeSync/AMD ecosystem means that you'll be at the mercy of the RTG for future upgrades. A lot of frustrated 290 buyers had to wait years for a viable upgrade path.
Last edited by Kretschmer on Wed Aug 30, 2017 7:17 am, edited 1 time in total.
 
Kretschmer
Gerbil XP
Posts: 462
Joined: Sun Oct 19, 2008 10:36 am

Re: Vega Meta Analysis

Wed Aug 30, 2017 7:16 am

ptsant wrote:
Another way to put it is that stock Vega 56 is a bit faster than stock 1070. Making them similar in performance (either by powersaving on the 56 or overclocking the 1070) probably narrows the power difference even further.


If you're going to hand-tweak your cards, though, it would seem to make sense either to lower the voltage on the 1070 (for similar performance) or to overclock the 1070 for another 10+% performance. It seems odd to compare apples and oranges.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Vega Meta Analysis

Wed Aug 30, 2017 10:15 am

Kretschmer wrote:
Why Vega is a Failure with a capital F:
1) You won't be able to buy one anywhere near MSRP for months. Unless you've spent tons of dough on a FreeSync display, it makes sense to just grab Pascal.
2) Power consumption is stupid for the performance. (Lesser) Enthusiasts sites are recommending that buyers pair Vega 64 cards with 1,000 watt PSU upgrades. That destroys any value proposition.
3) AMD will struggle with their margins, because they're selling a huge chip with HBM2 that trades blows with Nvidia's smaller die/GDDR5X parts.
4) Released over a year after Pascal, with buyers defaulting to Nvidia in the meantime.
5) Buying into the FreeSync/AMD ecosystem means that you'll be at the mercy of the RTG for future upgrades. A lot of frustrated 290 buyers had to wait years for a viable upgrade path.


Please drop the green tint shades. They are making you look silly.

The crypto-currency craze is inflating the entire market even on GPUs that are not good at mining. 1070 is hard to come by due to sheer demand.

The power-consumption argument holds little water unless you are looking for near-silence and/or shooting for an SFF build. The 1080 Ti isn't that far behind.

AMD is hardly struggling with margins here. They are making a handsome profit per unit. High-end GPUs always follow high margin/low volume strategy. Nvidia is just making healthier margins and this shows in their recent fiscal reports.

The Nvidia crowd is also at the mercy of the G-Sync ecosystem. Pick your poison.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Vhalidictes
Gerbil Jedi
Posts: 1835
Joined: Fri Jan 07, 2005 2:32 pm
Location: Paragon City, RI

Re: Vega Meta Analysis

Wed Aug 30, 2017 10:18 am

Jigar wrote:
A lot of people viewed Vega as a failed high-end alternative because it didn't beat Nvidia's GTX 1080 Ti. Other cons, like high power consumption, only added fuel to the fire.


Well, it's AMD's fault for positioning it that way. To make matters even more confusing, Vega *is* a high-end card from a "compute" perspective.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 11:07 am

Kretschmer wrote:
Why Vega is a Failure with a capital F:
1) You won't be able to buy one anywhere near MSRP for months. Unless you've spent tons of dough on a FreeSync display, it makes sense to just grab Pascal.
2) Power consumption is stupid for the performance. (Lesser) Enthusiasts sites are recommending that buyers pair Vega 64 cards with 1,000 watt PSU upgrades. That destroys any value proposition.
3) AMD will struggle with their margins, because they're selling a huge chip with HBM2 that trades blows with Nvidia's smaller die/GDDR5X parts.
4) Released over a year after Pascal, with buyers defaulting to Nvidia in the meantime.
5) Buying into the FreeSync/AMD ecosystem means that you'll be at the mercy of the RTG for future upgrades. A lot of frustrated 290 buyers had to wait years for a viable upgrade path.


Krogoth wrote:
Please drop the green tint shades. They are making you look silly.

The crypto-currency craze is inflating the entire market even on GPUs that are not good at mining. 1070 is hard to come by due to sheer demand.

Power consumption argument holds little water unless you are looking for near-silence and/or shooting for SFF build. 1080Ti isn't that far behind.

AMD is hardly struggling with margins here. They are making a handsome profit per unit. High-end GPUs always follow high margin/low volume strategy. Nvidia is just making healthier margins and this shows in their recent fiscal reports.

The Nvidia crowd is also at the mercy of the G-Sync ecosystem. Pick your poison.



You managed to sound offended without really countering his points, congrats!

With the mining craze, AMD should have increased production to meet increased anticipated demand, right? No? That's on them. See, I can actually buy a GTX1070, GTX1080, or GTX1080Ti, for inflated but reasonable prices.

The power consumption argument holds in that an AMD card is going to dump significantly more heat into your room for the same performance. It matters for those it matters to, and it remains and will remain a valid point until AMD gets their efficiency up.

If you want to talk about margins, well, you admit that they're worse off than Nvidia by a long shot, but evidence would be needed either way. I'm betting that the best anyone can do is infer.

And hooray, FreeSync is free! Except it isn't: it comes attached to lower-end monitors, and that's okay because AMD simply cannot make a card that's fast enough to warrant a high-end gaming monitor.

And that's on AMD too.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 11:58 am

Airmantharp wrote:
With the mining craze, AMD should have increased production to meet increased anticipated demand, right? No? That's on them. See, I can actually buy a GTX1070, GTX1080, or GTX1080Ti, for inflated but reasonable prices.


How exactly was AMD supposed to increase production on Vega?

Airmantharp wrote:
If you want to talk about margins, well, you admit that they're worse off than Nvidia by a long shot- but evidence would be needed either way. I'm betting that the best anyone can do is infer.


Yup. Gaming Vega is almost, if not exactly, a loss-leader for AMD. But it was never intended to be competitive with Nvidia on a profit basis. It was a stepping stone to Navi (which AMD will have a chance to be competitive with) to retain mindshare in the high end gaming sector while AMD launched Ryzen. In the short term, AMD will take a hit due to gaming Vega. But gaming Vega will serve AMD's long term strategy just fine, at least as long as HBM2 supply ramps up sooner rather than later.

Airmantharp wrote:
And hooray, FreeSync is free! Except it isn't, comes attached to lower-end monitors, and that's okay because AMD simply cannot make a card that's fast enough to warrant a high-end gaming monitor.


You don't consider this a high-end monitor? Or this? I wish G-Sync monitors had 3840x1600 panels.
Last edited by cynan on Wed Aug 30, 2017 12:11 pm, edited 1 time in total.
 
The Egg
Minister of Gerbil Affairs
Posts: 2938
Joined: Sun Apr 06, 2008 4:46 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:02 pm

All these Vega threads are getting really obnoxious.

The cards have been out barely 2 weeks. I'm sure they're being produced as fast as possible. Resellers are jacking up the prices, not AMD. Yes, they use more power, but a Freesync monitor will also cost you substantially less.

If you don't like that, or anything else about the cards, don't buy one. You can calmly exit the discussion. If you want a card and are frustrated that they aren't available (or not at MSRP), unfortunately there's not much you can do. You can get mad at miners and retailers, but they're just doing what they can to make a buck. In reality, we should be mad at the cryptocurrency creators who forced the use of GPUs rather than allowing ASICs to be used. They're the ones who are ultimately at fault. Why does it matter to them which method is used to mine?
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:04 pm

cynan wrote:
You don't consider this a high-end monitor? Or this? I wish G-Sync monitors had 3840x1600 panels.


How are we even supposed to take this seriously? You're grasping at crazy straws...

Look, the relative lack of affordable 16:10 is a pet peeve of mine, sure, but once we get to 21:10 and the like for gaming, can you get any more weirdly specific?

I'm seriously at the point where it'd be 16:9 for me for gaming because it's not worth it otherwise, and that's from someone with multiple 16:10 monitors currently because I'm a stickler for it when it comes to desktop work.

But for gaming? At this point, I'd let it go. And on a wildly widescreen monitor? Yeah, I'd *really* let it go. :roll:
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:06 pm

The Egg wrote:
If you don't like that, or anything about the cards, don't buy one. You can calmly exit the discussion.


The thread is literally entitled "Vega Meta Analysis", which is also exactly what you think shouldn't be part of the "discussion".

Maybe you should exit the discussion if your only point here is that we shouldn't be having one that's actually directly on-topic for once.
 
The Egg
Minister of Gerbil Affairs
Posts: 2938
Joined: Sun Apr 06, 2008 4:46 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:11 pm

Glorious wrote:
The Egg wrote:
If you don't like that, or anything about the cards, don't buy one. You can calmly exit the discussion.


Thread is literally entitled "Vega Meta Analysis", which is also exactly what you think shouldn't be part of the "discussion".

Maybe you should exit the discussion if your only point here is that we shouldn't be having one that's actually directly on-topic for once.

Except that there really isn't much of any analysis going on at this point, just scat-slinging over issues beyond the control of those in question.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:23 pm

Glorious wrote:
cynan wrote:
You don't consider this a high-end monitor? Or this? I wish G-Sync monitors had 3840x1600 panels.


How are we even supposed to take this seriously? You're grasping at crazy straws...

Look, the relative/affordable lack of 16:10 is a pet peeve of mine, sure, but once we get to 21:10 and stuff for gaming, like, can you get anymore weirdly specific?

I'm seriously at the point where it'd be 16:9 for me for gaming because it's not worth it otherwise, and that's from someone with multiple 16:10 monitors currently because I'm a stickler for it when it comes to desktop work.

But for gaming? At this point, I'd let it go. And on a wildly widescreen monitor? Yeah, I'd *really* let it go. :roll:


What does 21:9 vs 21:10 matter? This (3840x1600) is the resolution widescreen gaming is headed toward. Maybe premature now, but with the next evolution in GPU performance they'll be commonplace (at least as commonplace as 3440x1440 is now). It's logical: you get the full horizontal resolution of 4K, plus a vertical resolution that already exists on 16:10 panels. Yes, 1440 is currently more common thanks to 27" 16:9 panels, but 1440 as a vertical resolution against 4K's 3840 horizontal makes for too wide an aspect. So 1600 it is.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:24 pm

It is kinda hard to ramp up production when your memory supply is tight. This is also affecting Nvidia with their GP100 and soon GV100 SKUs. Nvidia won its gambit this round by sticking with GDDR5X on their non-GPGPU SKUs.

The 1070, 1080 and 1080 Ti are still tough sells for those looking for an upgrade without spending an arm and a leg. The 1070 is actually hard to come by due to the mining craze, which is forcing prospective buyers to opt for a 1080, which isn't that much more at current market prices.

Outside of small niches, power efficiency has never been an issue for performance GPUs since the Voodoo 2, Rage 128 and TNT.

FreeSync is technically "free" as there are no royalties or proprietary hardware involved. Nvidia has no reason to drop the G-Sync ecosystem in light of Vega's lackluster launch and inability to disrupt the high-end market.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:32 pm

cynan wrote:
What does 21:9 vs 21:10 matter? This (3840x1600) is the resolution wide screen gaming is headed to. Maybe premature now, but with the next evolution in GPU performance, they'll be commonplace (at least as commonplace as 3440x1440 is now). It's logical. You get the full horizontal resolution of 4k, plus a vertical resolution that already exists on 16:10 panels. Yes 1440 is currently more common due to 27" 16:9 panels, but 1440 as a vertical resolution for 3840 horizontal of 4k is too wide.


I'm sorry, I've been (unfortunately) living on a planet where, for the last decade, 16:9 has increasingly supplanted 16:10 to the point where finding a decent 16:10 is a disheartening prospect.

I have no idea what your planet is like, but on this matter, at least, it appears to be in line with my preferences.

How do I get there?

cynan wrote:
So 1600 it is.


You need to show me the interdimensional portal you just stumbled out of, because "3440x1600" has *LITERALLY TWO ORDERS OF MAGNITUDE FEWER* hits on Google than "3440x1440".

I am talking ~10k versus ~2 million.

So, no, in *THIS* OBJECTIVE REALITY, "1600" it isn't.

I mean, "it's logical"? What?

Cynan, buddy, please show me your portal.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:36 pm

Glorious wrote:
cynan wrote:
What does 21:9 vs 21:10 matter? This (3840x1600) is the resolution wide screen gaming is headed to. Maybe premature now, but with the next evolution in GPU performance, they'll be commonplace (at least as commonplace as 3440x1440 is now). It's logical. You get the full horizontal resolution of 4k, plus a vertical resolution that already exists on 16:10 panels. Yes 1440 is currently more common due to 27" 16:9 panels, but 1440 as a vertical resolution for 3840 horizontal of 4k is too wide.


I'm sorry, I've been (unfortunately) living on a planet where, for the last decade, 16:9 has increasingly supplanted 16:10 to the point where finding decent 16:10 is disheartening prospect.

I have no idea what your planet is like, but on this matter, at least, it appears to be in line with my preferences.

How do I get there?

cynan wrote:
So 1600 it is.


You need to show me the interdimensional portal you just stumbled out of, because "3440x1600" *HAS LITERALLY TWO ORDERS OF MAGNITUDE LESS* hits on google than "3440x1440".

I am talking ~10k versus ~2 million.

So, no, in *THIS* OBJECTIVE REALITY, "1600" it isn't.

I mean, "it's logical"? What?

Cynan, buddy, please show me your portal.


Well sure, I've never heard of a monitor with 3440x1600 either. :wink:

I never said 3840x1600 was currently popular, so I don't get your point. Only that that's where wide-screen gaming is headed, for the reasons provided.
 
Wren
Gerbil
Posts: 80
Joined: Wed Jul 09, 2014 4:15 pm
Location: England

Re: Vega Meta Analysis

Wed Aug 30, 2017 12:48 pm

Lots of Vega hate. :( And I can sorta see why, to be honest. I was expecting more out of it, but I think it will get considerably better with time. It's the first GPU to feature the highest DX12 feature tiers (I think?) and has features like the NGG fast path and primitive shaders waiting to be activated and used. Also, Rapid Packed Math should help it in certain titles, where I think it will surpass the 1080 and potentially go toe to toe with the Ti.

I am quite a big AMD fan, so I went ahead and bought the Liquid Edition for £650, lol. From a thermal/acoustics point of view the cooler really does keep the GPU in check very well. But power consumption at stock and in Turbo mode is... something else. I measured over 650 W at the wall (150 W from my idle system) in Turbo mode at ~1700 MHz, which is like 500 W from the card itself. I was quite shocked, actually. But undervolting helps MASSIVELY: I can shave off 200 W for only a few percent of performance by dropping the vcore from 1.2 V to 1.05 V and the clocks from 1.7 GHz to 1.6 GHz.

Overall, aside from knowing that I overpaid for the (current) performance compared to the Nvidia cards, I am very happy with this card. It handles everything at 1440p that I play. I have very high hopes that Vega will eventually match/beat the 1080 Ti when its features are fully unlocked.

Also, I wanted to ask a question. What's the prevailing theory behind why Vega (and other GCN GPUs, for that matter) can't turn their raw teraflops into FPS like Nvidia can? At max clocks my card is putting out ~13 TFLOPS, similar to a lightly overclocked 1080 Ti. I know there are other things like bandwidth, pixel fill rate and geometry, but it has always stumped me why AMD has far lower "TFLOP efficiency".

Sorry for the long post.
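On the TFLOPS question: the headline number is just shader count x clock x 2 (a fused multiply-add counts as two FP32 operations per cycle), so it is a peak that assumes every shader is busy every cycle. A quick sketch with the published shader counts (4096 for Vega 64, 3584 for the GTX 1080 Ti); the clocks here are illustrative, not measured:

```python
# Theoretical peak FP32 throughput: shaders * clock (GHz) * 2 ops per FMA.
def peak_tflops(shaders, clock_ghz):
    """Peak single-precision TFLOPS, counting a fused multiply-add as 2 ops."""
    return shaders * clock_ghz * 2 / 1000.0

print(f"Vega 64 @ 1.6 GHz:     {peak_tflops(4096, 1.6):.1f} TFLOPS")  # 13.1
print(f"GTX 1080 Ti @ 1.8 GHz: {peak_tflops(3584, 1.8):.1f} TFLOPS")  # 12.9
```

The gap to delivered FPS comes from how often that peak is actually reachable: geometry throughput, fill rate, bandwidth and scheduling determine how well the shader array stays fed, and that is where GCN has historically trailed Pascal in games.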
Ryzen 7 1800X @ 4.025 GHz | 16GB @ 3200 MHz C14 | MSI B350 PC MATE| Asus Turbo GTX 1070 Ti | Samsung Polaris-thingy M.2 SSD | Samsung 850 PRO 128GB | 3x 1TB HDDs |650W Seasonic G-Series Gold |NZXT H440 |Creative Soundblaster Z
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Vega Meta Analysis

Wed Aug 30, 2017 1:07 pm

cynan wrote:
Well sure, I've never heard of a monitor with 3440x1600 either.


fair enough, I'm getting a little confused and I definitely misread you, but maybe that's because I don't really understand what you are saying.

You were getting really specific in regards to how there aren't 3840x1600 g-sync monitors, and that just seems like cherry-picking to me, because there are plenty of 3840x2160 g-sync monitors, right?

I mean, like I said, non 16:9 aspect-ratios are always strange beasts, and in particular, 3840x1600 literally cannot offer anything that 3840x2160 with black bars cannot, which makes it fairly niche, no?

cynan wrote:
I never said 3840x1600 was currently popular, so I don't get your point. Only that that's where wide-screen gaming is headed, for the reasons provided.


But yet there won't be any g-sync 3840x1600 monitors once we get to where "wide-screen gaming" is heading?

Why not?

And, like I said, it's really niche, so how is this not a strange cherry-pick? Why would I want 3840x1600 instead of 3840x2160? Black bars? So what? I deal with those all the time on 16:10 and even 16:9 anyway. And for gaming, why not render at 2160p? If performance is a problem, OK, black bars or a lower resolution at the same aspect, no?
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: Vega Meta Analysis

Wed Aug 30, 2017 1:20 pm

I really don't get people saying FreeSync locks you to AMD cards. Sure, I can't use it right now on my 1070, but it's hardly a necessity if you can keep your framerates up anyway.

I'd have been happy to stay with AMD because of it, but I didn't lose any money or end up with a useless monitor because of it. Whereas with G-Sync, I'm still seeing nothing I'm even a little bit interested in.
Meow.
