
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 9:08 am

Krogoth wrote:
Please just stop and drop the green-tinted shades.


I don't have green-tinted shades; I have reality-tinted shades. Vega is a poor buy at MSRP for most, and the people most interested in it are either those buying it just because they want to play with it (many owning 1080 Tis themselves) or those who also want to play with it for mining.

Yes, mining.

And then there are those zealots that have only one system and Vega is their upgrade this cycle. Some of those got lucky and got cards at or near MSRP.

And overall, that doesn't add up to much demand, but given that AMD massively undersupplied these cards, we have pricing indicators that appear to show much higher demand.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 10:13 am

Airmantharp wrote:
Krogoth wrote:
Because it is the fastest FreeSync-capable GPU on the market right now that also happens to be a very powerful consumer-tier GPGPU. Vega 64 easily matches a fully enabled Titan Xp at most GPGPU stuff (with the most recent Titan drivers) for less $$$$. Not everything revolves around silly games. The tech does scale up to Pascal/Maxwell levels at the hardware level; the problem is that it's difficult to translate that into games. Vega 10 is pretty much AMD RTG's GF100: hot, with somewhat to completely underwhelming gaming performance, but it blew most of the competition away at GPGPU stuff.


You do realize that 'the tech scales for compute not games!' is literally what AMD does every generation, right?

Nvidia has done the same thing ever since Fermi with the first version of their silicon. They just distill the design into "lesser" products to go after other markets. They learned their lesson with the GF100 mishap. AMD RTG doesn't have the capital nor the luxury to create a separate dedicated high-end gaming GPU. It doesn't really matter in the long run, since discrete GPUs are starting to become a small niche.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
NoOne ButMe
Gerbil Elite
Posts: 707
Joined: Fri May 15, 2015 9:31 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 12:00 pm

Supply is the stupid one.
GP104 supply ended up trashed due to reasons outside of Nvidia's control.
With Vega, it looks like AMD didn't want to wait for a batch of HBM2/Vega chips that would be ready in October, if the rumor is correct.

Late September/early October is enough time to hit holiday seasonal sales.

It might also be that AMD is just unwilling to produce more than X units at a time, and given the performance and power, they didn't think it would sell so well. But given that mining exists, that seems odd.

I'm just PO'd at "normal" mode basically being 'use 50-100 W more power for 1-2 FPS'.

We should also add that sales of higher-end dGPUs are increasing. It's profitable. And unless sales fall off, it will keep on.

I look forward to facepalming as people upgrade from a 1060 6GB to a 1160 (somewhere close to 1070 performance seems reasonable) when using 1080p 60Hz monitors.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 12:20 pm

NoOne ButMe wrote:
Supply is the stupid one.
GP104 supply ended up trashed due to reasons outside of Nvidia's control.
With Vega, it looks like AMD didn't want to wait for a batch of HBM2/Vega chips that would be ready in October, if the rumor is correct.

Late September/early October is enough time to hit holiday seasonal sales.

It might also be that AMD is just unwilling to produce more than X units at a time, and given the performance and power, they didn't think it would sell so well. But given that mining exists, that seems odd.

I'm just PO'd at "normal" mode basically being 'use 50-100 W more power for 1-2 FPS'.

We should also add that sales of higher-end dGPUs are increasing. It's profitable. And unless sales fall off, it will keep on.

I look forward to facepalming as people upgrade from a 1060 6GB to a 1160 (somewhere close to 1070 performance seems reasonable) when using 1080p 60Hz monitors.


Not enough revenue exists in the high-end gaming market to cover the massive R&D involved in designing a GPU. That's why performance GPUs are essentially crippled or distilled versions of GPGPU designs. The bulk of revenue for discrete GPUs comes from mid-range stuff via volume.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 12:47 pm

Krogoth wrote:
Airmantharp wrote:
Krogoth wrote:
Because it is the fastest FreeSync-capable GPU on the market right now that also happens to be a very powerful consumer-tier GPGPU. Vega 64 easily matches a fully enabled Titan Xp at most GPGPU stuff (with the most recent Titan drivers) for less $$$$. Not everything revolves around silly games. The tech does scale up to Pascal/Maxwell levels at the hardware level; the problem is that it's difficult to translate that into games. Vega 10 is pretty much AMD RTG's GF100: hot, with somewhat to completely underwhelming gaming performance, but it blew most of the competition away at GPGPU stuff.


You do realize that 'the tech scales for compute not games!' is literally what AMD does every generation, right?

Nvidia has done the same thing ever since Fermi with the first version of their silicon. They just distill the design into "lesser" products to go after other markets. They learned their lesson with the GF100 mishap. AMD RTG doesn't have the capital nor the luxury to create a separate dedicated high-end gaming GPU. It doesn't really matter in the long run, since discrete GPUs are starting to become a small niche.


You just contradicted yourself, and I'm not entirely sure you're even addressing my point.
 
blahsaysblah
Gerbil Elite
Posts: 581
Joined: Mon Oct 19, 2015 7:35 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 1:16 pm

NoOne ButMe wrote:
I'm just PO'd at "normal" mode basically being 'use 50-100 W more power for 1-2 FPS'.

They likely had to do that to increase supply. While most reviewers had no issues with power save mode, there could be a not-insignificant portion of the supply that won't be stable at power-save-mode power levels. Don't assume you can use power save mode.
 
NoOne ButMe
Gerbil Elite
Posts: 707
Joined: Fri May 15, 2015 9:31 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 1:46 pm

blahsaysblah wrote:
NoOne ButMe wrote:
I'm just PO'd at "normal" mode basically being 'use 50-100 W more power for 1-2 FPS'.

They likely had to do that to increase supply. While most reviewers had no issues with power save mode, there could be a not-insignificant portion of the supply that won't be stable at power-save-mode power levels. Don't assume you can use power save mode.

SMH
Read what you wrote, then reread any review that covers power-saver mode.
 
blahsaysblah
Gerbil Elite
Posts: 581
Joined: Mon Oct 19, 2015 7:35 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 3:00 pm

NoOne ButMe wrote:
SMH
Read what you wrote, then reread any review that covers power-saver mode.

Can you spell it out for me? You know I have really bad reading comprehension...
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 4:45 pm

Can you provide documentation to show how Vega might be unstable in Power Save mode?

I'd theorize that voltage might be dropped in Power Save below what a specific unit is stable at, for example, but I'd like to see where this has been shown to be true. It would run counter to AMD's habit of running their cards hot, with higher voltages than necessary.
 
Mr Bill
Gerbil Jedi
Posts: 1819
Joined: Mon Jan 21, 2002 7:00 pm
Location: Colorado Western Slope
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 5:04 pm

Lordhawkwind wrote:
I'm not a fanboy or apologist, but as a Vega 64 AC owner who games at 1440p with a 144 Hz FreeSync monitor, the card is pretty awesome. Yes, it sucks a lot of power and I had to buy a new top-tier 850 W PSU (needed it anyway as I overclock my 7700K), but at 1440p it is a very good gaming card. I'm getting between 25% and 50% performance improvements over my Fury Pro, which ticks all my upgrade boxes, TBH. I had my Fury Pro for two years, which is the longest I've had a graphics card, and I'll probably keep this Vega card for at least the same time. For £450 I'm not complaining.
If you are still happy with yours, you might revisit your comment on the TR review page and maybe say something.
X6 1100T BE | Gigabyte GA-990FXA-UD3 AM3+ | XFX HD 7870 | 16 GB DDR3 | Samsung 830/850 Pro SSD's | Logitech cherry MX-brown G710+ | Logitech G303 Daedalus Apex mouse | SeaSonic SS-660XP 80+ Pt | BenQ 24' 1900x1200 IPS | APC Back-UPS NS-1350 | Win7 Pro
 
Mr Bill
Gerbil Jedi
Posts: 1819
Joined: Mon Jan 21, 2002 7:00 pm
Location: Colorado Western Slope
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 5:09 pm

Krogoth wrote:
blahsaysblah wrote:
I say the 1.75% RX cards vs. 17% GTX 1xxx cards split will continue with Vega, because it's a year late and has only one slight advantage.

Why would you shell out $400-$600 of today's dollars for:
- performance from a year ago
- missing a year of driver maturity
- missing a year of devs/engines getting optimized/comfortable with it
- missing strength in numbers for your bug/issue (pick 5% or 17%, both are way bigger than 1.75%)
- tech that still can't scale to 1080 Ti or Titan levels

I bought the "we are aiming for mass market" PR spin for RX and left it at that. But fool that I am, because no one else bought it. Same with the Vega spin. It's a year late. It's still not good enough tech to scale to 1080 Ti/Titan levels. The majority of folks will not choose Vega.

It's a false equivalence to say Vega and the GTX 1070/1080 are equals. They clearly do not have the same number or quality of cons and pros.

Nothing against AMD/ATI, especially because you can't discount the 100+ million Polaris/Vega chips in the console space today.


Because it is the fastest FreeSync-capable GPU on the market right now that also happens to be a very powerful consumer-tier GPGPU. Vega 64 easily matches a fully enabled Titan Xp at most GPGPU stuff (with the most recent Titan drivers) for less $$$$. Not everything revolves around silly games. The tech does scale up to Pascal/Maxwell levels at the hardware level; the problem is that it's difficult to translate that into games. Vega 10 is pretty much AMD RTG's GF100: hot, with somewhat to completely underwhelming gaming performance, but it blew most of the competition away at GPGPU stuff.
The Techgage review sorta brings home what Krogoth is saying.
X6 1100T BE | Gigabyte GA-990FXA-UD3 AM3+ | XFX HD 7870 | 16 GB DDR3 | Samsung 830/850 Pro SSD's | Logitech cherry MX-brown G710+ | Logitech G303 Daedalus Apex mouse | SeaSonic SS-660XP 80+ Pt | BenQ 24' 1900x1200 IPS | APC Back-UPS NS-1350 | Win7 Pro
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: Vega Meta Analysis

Thu Aug 31, 2017 5:20 pm

Mr Bill wrote:
The Techgage review sorta brings home what Krogoth is saying.

Why did they even include 'gaming' without any games?

That was the most pointless thing I've seen in ages.
Meow.
 
Mr Bill
Gerbil Jedi
Posts: 1819
Joined: Mon Jan 21, 2002 7:00 pm
Location: Colorado Western Slope
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 6:00 pm

LostCat wrote:
Mr Bill wrote:
The Techgage review sorta brings home what Krogoth is saying.

Why did they even include 'gaming' without any games?

That was the most pointless thing I've seen in ages.
Point is, the review included a good assortment of professional graphics cards and Vega acquitted itself very well as a GPGPU and was well within the pack for gaming.
X6 1100T BE | Gigabyte GA-990FXA-UD3 AM3+ | XFX HD 7870 | 16 GB DDR3 | Samsung 830/850 Pro SSD's | Logitech cherry MX-brown G710+ | Logitech G303 Daedalus Apex mouse | SeaSonic SS-660XP 80+ Pt | BenQ 24' 1900x1200 IPS | APC Back-UPS NS-1350 | Win7 Pro
 
strangerguy
Gerbil Team Leader
Posts: 262
Joined: Fri May 06, 2011 8:46 am

Re: Vega Meta Analysis

Thu Aug 31, 2017 8:55 pm

NoOne ButMe wrote:
I look forward to facepalming as people upgrade from a 1060 6GB to a 1160 (somewhere close to 1070 performance seems reasonable) when using 1080p 60Hz monitors.


That will only prove how much better NV is than AMD at the GPU-selling business. NV gives zero **** about winning pointless GPU SJW battles.
8700K 4.3GHz @ 1.05V | Cryorig H7 | MSI Z370M AC | 32GB Corsair LPX DDR4-3200 | GTX 1070 @ 0.8V | 500GB Evo 850 | 1TB M550 | 3TB Toshiba | Seasonic G650 | Acer XB271HU
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Vega Meta Analysis

Thu Aug 31, 2017 9:13 pm

strangerguy wrote:
NoOne ButMe wrote:
I look forward to facepalming as people upgrade from a 1060 6GB to a 1160 (somewhere close to 1070 performance seems reasonable) when using 1080p 60Hz monitors.


That will only prove how much better NV is than AMD at the GPU-selling business. NV gives zero **** about winning pointless GPU SJW battles.


They do care enough to release updated drivers for their Pascal Titan to "ungimp" them. Nvidia loves the prosumer dough as seen with the success of their Kepler-based Titans.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
NoOne ButMe
Gerbil Elite
Posts: 707
Joined: Fri May 15, 2015 9:31 pm

Re: Vega Meta Analysis

Thu Aug 31, 2017 11:46 pm

blahsaysblah wrote:
NoOne ButMe wrote:
SMH
Read what you wrote, then reread any review that covers power-saver mode.

Can you spell it out for me? You know I have really bad reading comprehension...

It's an official mode made by AMD. It lowers the power the card can draw, just like how the high-power mode increases the power the card can draw. Or is high-power mode going to have GPUs failing on it too?

Some cards might lose more performance than others, that is all.

It's a feature. What you're saying is as stupid as saying that turning the mode from normal to high performance is going to cause problems.

Think of it like cTDP up/down on Intel's low-power CPUs. It changes the power target; the difference is that Intel lists clocks (I believe) for its cTDP up/down states, while AMD does not.
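The power-target point can be illustrated with a toy dynamic-power model (all numbers are made up for illustration, not AMD's actual voltage/frequency curve): dynamic power scales roughly with C·V²·f, and voltage climbs steeply near the top of the clock range, so giving back ~10% of the clock can shed a third of the power.

```python
def power_watts(freq_mhz, capacitance=1.2e-7):
    """Toy dynamic-power model: P ~ C * V^2 * f.

    The voltage curve below is hypothetical, but shaped like real
    silicon: voltage rises steeply toward the top of the clock range.
    """
    voltage = 0.8 + ((freq_mhz - 1000) / 600) ** 2 * 0.45  # volts (made up)
    return capacitance * voltage ** 2 * freq_mhz * 1e6

balanced = power_watts(1550)  # a "balanced"-like clock target
saver = power_watts(1400)     # a "power saver"-like clock target

# ~10% lower clock sheds ~35% of the power, which is why power-save
# mode costs so little performance relative to the watts it saves.
print(round(balanced), round(saver))  # → 258 168
```

That asymmetry is the whole point of the mode: the last few percent of clock are bought with disproportionate voltage and power.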
strangerguy wrote:
NoOne ButMe wrote:
I look forward to facepalming as people upgrade from a 1060 6GB to a 1160 (somewhere close to 1070 performance seems reasonable) when using 1080p 60Hz monitors.


That will only prove how much better NV is than AMD at the GPU-selling business. NV gives zero **** about winning pointless GPU SJW battles.

I'm more concerned about the idiots wasting their money on an overkill GPU. The experience is bettered far more by a G-Sync monitor, or just 120/144 Hz, than by going from 1060 to 1070 performance at 1080p.
 
Kretschmer
Gerbil XP
Posts: 462
Joined: Sun Oct 19, 2008 10:36 am

Re: Vega Meta Analysis

Fri Sep 01, 2017 8:56 am

Krogoth wrote:
Because it is the fastest FreeSync-capable GPU on the market right now that also happens to be a very powerful consumer-tier GPGPU. Vega 64 easily matches a fully enabled Titan Xp at most GPGPU stuff (with the most recent Titan drivers) for less $$$$. Not everything revolves around silly games. The tech does scale up to Pascal/Maxwell levels at the hardware level; the problem is that it's difficult to translate that into games. Vega 10 is pretty much AMD RTG's GF100: hot, with somewhat to completely underwhelming gaming performance, but it blew most of the competition away at GPGPU stuff.


If everyone buys these for GPGPU/mining, AMD's gaming marketshare will further recede, and fewer game developers will bother to optimize their software for AMD hardware. At this rate, they could exit the gaming GPU market entirely.
 
Kougar
Minister of Gerbil Affairs
Posts: 2306
Joined: Tue Dec 02, 2008 2:12 am
Location: Texas

Re: Vega Meta Analysis

Fri Sep 01, 2017 8:54 pm

Kretschmer wrote:
If everyone buys these for GPGPU/mining, AMD's gaming marketshare will further recede, and fewer game developers will bother to optimize their software for AMD hardware. At this rate, they could exit the gaming GPU market entirely.


Aye, there's that risk.

Steam hardware survey for August is out. From March 2016 to August 2017, AMD has dropped from 25.2% to 18.63% market share. August alone was a 1.6% drop, which is even more telling given that NVIDIA hasn't launched anything while AMD had launched the 500 series and Vega. NVIDIA went up 3.7% last month. With AMD GPU prices being kept higher than NVIDIA's by miners, gamers are slowly being forced to go green.
 
strangerguy
Gerbil Team Leader
Posts: 262
Joined: Fri May 06, 2011 8:46 am

Re: Vega Meta Analysis

Fri Sep 01, 2017 9:48 pm

Kougar wrote:
Kretschmer wrote:
If everyone buys these for GPGPU/mining, AMD's gaming marketshare will further recede, and fewer game developers will bother to optimize their software for AMD hardware. At this rate, they could exit the gaming GPU market entirely.


Aye, there's that risk.

Steam hardware survey for August is out. From March 2016 to August 2017, AMD has dropped from 25.2% to 18.63% market share. August alone was a 1.6% drop, which is even more telling given that NVIDIA hasn't launched anything while AMD had launched the 500 series and Vega. NVIDIA went up 3.7% last month. With AMD GPU prices being kept higher than NVIDIA's by miners, gamers are slowly being forced to go green.


The $700+ 1080 Ti is only barely behind the RX 480 in share and is also gaining at 4x the rate.
The 1080 is a lousy mining card, but gamers will find it compelling to step up to when 1070 prices get jacked up by miners.
The 1070 is an excellent all-around gaming + mining card.
The 1060 is gimped for mining but makes a great value card for the gamer masses.
The 1050s are there to capture the <75 W add-in GPU market and HTPCs.

It's funny how NV played their hand perfectly, albeit somewhat unintentionally, while AMD did the opposite.
8700K 4.3GHz @ 1.05V | Cryorig H7 | MSI Z370M AC | 32GB Corsair LPX DDR4-3200 | GTX 1070 @ 0.8V | 500GB Evo 850 | 1TB M550 | 3TB Toshiba | Seasonic G650 | Acer XB271HU
 
Kougar
Minister of Gerbil Affairs
Posts: 2306
Joined: Tue Dec 02, 2008 2:12 am
Location: Texas

Re: Vega Meta Analysis

Sat Sep 02, 2017 12:38 am

strangerguy wrote:
It's funny how NV played their hand perfectly, albeit somewhat unintentionally, while AMD did the opposite.


It didn't help that AMD stacked the deck against itself by choosing to push free adoption of HBM2 instead of adopting a licensing model. NVIDIA's adoption of it contributed to Vega's delay (I'd love to know by how much) and also set NVIDIA up to have two very successful enterprise compute designs. At this point I only see history repeating itself a third time with HBM3.
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: Vega Meta Analysis

Sat Sep 02, 2017 1:00 am

So tempting to sell off the 1070 and get a Vega 56, but I guess I don't actually need one.
Meow.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Sat Sep 02, 2017 1:45 am

strangerguy wrote:
It's funny how NV played their hand perfectly, albeit somewhat unintentionally, while AMD did the opposite.


If I had to assign one attribute to Nvidia's GPU release strategy, it's execution. They've wound up coming from behind with technically lagging, inferior products more than once and have not just caught up to but surpassed their competition: 3dfx put the hurt on them until the first GeForce, then ATi did with the Radeon 8500 and again with the 9700. After each misstep they've continued to iterate and innovate to get back out ahead.

Kougar wrote:
It didn't help that AMD stacked the deck against itself by choosing to push free adoption of HBM2 instead of adopting a licensing model. NVIDIA's adoption of it contributed to Vega's delay (I'd love to know by how much) and also set NVIDIA up to have two very successful enterprise compute designs. At this point I only see history repeating itself a third time with HBM3.


If Hynix and now Samsung (and someone else?) weren't allowed to sell HBM freely, would they have even bothered to produce it?

Would licensing have stopped Nvidia from using it, or even remotely made a dent in the margins of their enterprise products?

Maybe Vega will win AMD some enterprise marketshare, but it's clearly a detriment to their consumer release.
 
Kougar
Minister of Gerbil Affairs
Posts: 2306
Joined: Tue Dec 02, 2008 2:12 am
Location: Texas

Re: Vega Meta Analysis

Sat Sep 02, 2017 4:48 am

Airmantharp wrote:
If Hynix and now Samsung (and someone else?) weren't allowed to sell HBM freely, would they have even bothered to produce it?

Would licensing have stopped Nvidia from using it, or even remotely made a dent in the margins of their enterprise products?


AMD co-invested in the R&D to co-develop HBM with Hynix, so almost certainly Hynix would've been happy to sell it to anyone even if it required a licensing agreement. And obviously Samsung saw $$$ in licensing the tech from Hynix to produce it themselves, so there is definitely high interest in it. Almost certainly NVIDIA would've kept the decision to use HBM as long as the licensing was minimal or still favorable, especially given the margins (and prestige) NVIDIA is making off HBM2-infused products.

I'm sure there would have been a middle ground where AMD could've slowed down NVIDIA's eager adoption, and even if that was impossible, it could at least have recouped its investment cost and made some profit. Because at this point AMD is not only helping fund NVIDIA's R&D; in doing so it is also undermining its own ability to deliver products on time. It's basically the best-case win-win scenario NVIDIA could have dreamed of. I just really do not understand any of AMD's business decision-making, especially when it seems HBM3 will just be a repeat of the last two.

I'm no expert, but I do seem to recall chip experts stating emphatically that dual-sourcing chips from multiple fabs was a very bad, expensive last resort. And suddenly AMD is dual-sourcing not just a chip, but an entire freaking complex Vega package. That has resulted in four-plus different Vega packages now in circulation that don't even share the same package dimensions, let alone power or thermal characteristics. It's probably been a nightmare for AMD's engineers to pull the mess together and simply get Vega out the door.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Sat Sep 02, 2017 5:38 am

Mostly, the use of HBM (any version) is completely tied to the current Vega architecture.

That means that the use of GDDRwhatever would require the design of a different GPU. Best explanation that I've heard (repeated here, even) is that AMD is simply too poor to spin up multiple big-chip designs, as Nvidia did with GP100 (enterprise compute GPU with HBM2) and GP102 (high-end consumer GPU with GDDR5X in latest Titans and 1080Ti).

Now, the only reason that this matters is that Nvidia's investment and leadership in the compute space, particularly their creation of a language (CUDA) and the tools to use it, has led to very high demand of whatever they can produce. Thus, when Nvidia chooses to use HBM2, it stands to reason that they'd likely get preference over AMD. At the very least, they can outbid AMD for preference, because the parts that they're producing with HBM2 are bound for the HPC space, not consumers.

I think AMD simply gambled on the yields of HBM and HBM2, and lost. This while they lack the infrastructure to compete in HPC, but produce every GPU with extra compute resources that simply aren't used in the gaming space. And that means that the extra bandwidth that HBM provides goes to waste for gaming. They're exceedingly lucky that the cryptocurrency market hasn't crashed.
 
Kougar
Minister of Gerbil Affairs
Posts: 2306
Joined: Tue Dec 02, 2008 2:12 am
Location: Texas

Re: Vega Meta Analysis

Sat Sep 02, 2017 9:25 am

I don't buy the cash-strapped reasoning at all. It is expensive to dual-source chips from two completely unrelated fabs, and that's before sourcing something as complex as the HBM2 stacks mounted on a Vega package.

It was already posted in this thread that Hynix has publicly stated customers will now pay up to 2.5x the price of HBM1 for HBM2. Prices of HBM2 have risen by around 30% within the last quarter, if the AT article is to be believed.

I am more curious about the co-investment in HBM R&D that AMD has going with Hynix. One would assume it gives AMD favorable supply, better pricing, or at least something. At the very least, AMD should've known Hynix's own projections for HBM2 yields and supply timelines thanks to its partnership. But on the face of it, it really seems like AMD is getting nothing out of this arrangement at this point.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: Vega Meta Analysis

Sat Sep 02, 2017 11:50 am

Kougar wrote:
I don't buy the cash-strapped reasoning at all. It is expensive to dual-source chips from two completely unrelated fabs, and that's before sourcing something as complex as the HBM2 stacks mounted on a Vega package.

It was already posted in this thread that Hynix has publicly stated customers will now pay up to 2.5x the price of HBM1 for HBM2. Prices of HBM2 have risen by around 30% within the last quarter, if the AT article is to be believed.

I am more curious about the co-investment in HBM R&D that AMD has going with Hynix. One would assume it gives AMD favorable supply, better pricing, or at least something. At the very least, AMD should've known Hynix's own projections for HBM2 yields and supply timelines thanks to its partnership. But on the face of it, it really seems like AMD is getting nothing out of this arrangement at this point.


Well, hopefully it gives AMD first dibs (i.e., over Nvidia) on HBM supply. New process tech misses projected timelines all the time, so I don't think something like the HBM2 delay, while unfortunate, is all that rare; nor is it necessarily something AMD could have effectively foreseen.

It's simple. AMD designed a single high end compute architecture to compete with Nvidia's compute cards. This required something like HBM2. The only way it would be at all cost effective for them to compete with it in the gaming space against Nvidia is for HBM2 to reach mass production. Which it didn't. And now supply and AMD's profit margins, on Vega, are suffering.

And as far as their competitiveness in the professional compute market goes, hopefully their open source approach to the software will win them some marketshare.
 
Lordhawkwind
Gerbil
Posts: 73
Joined: Tue Jan 05, 2010 10:16 am

Re: Vega Meta Analysis

Sat Sep 02, 2017 2:02 pm

Airmantharp wrote:
Mostly, the use of HBM (any version) is completely tied to the current Vega architecture.

That means that the use of GDDRwhatever would require the design of a different GPU. Best explanation that I've heard (repeated here, even) is that AMD is simply too poor to spin up multiple big-chip designs, as Nvidia did with GP100 (enterprise compute GPU with HBM2) and GP102 (high-end consumer GPU with GDDR5X in latest Titans and 1080Ti).

Now, the only reason that this matters is that Nvidia's investment and leadership in the compute space, particularly their creation of a language (CUDA) and the tools to use it, has led to very high demand of whatever they can produce. Thus, when Nvidia chooses to use HBM2, it stands to reason that they'd likely get preference over AMD. At the very least, they can outbid AMD for preference, because the parts that they're producing with HBM2 are bound for the HPC space, not consumers.

I think AMD simply gambled on the yields of HBM and HBM2, and lost. This while they lack the infrastructure to compete in HPC, but produce every GPU with extra compute resources that simply aren't used in the gaming space. And that means that the extra bandwidth that HBM provides goes to waste for gaming. They're exceedingly lucky that the cryptocurrency market hasn't crashed.


From the above comment it's pretty clear that you don't understand how businesses work. AMD and Hynix co-developed HBM, and probably in lieu of taking a licensing fee AMD would have negotiated preferential terms for most, if not all, of Hynix's HBM production. This obviously left Nvidia in a bit of a hole, and that's why they worked closely with Micron on GDDR5X/6; as it turned out, this was a very wise move. Whilst Hynix has struggled to get mass production in gear, Samsung has stepped into the breach, no doubt paying Hynix a licensing fee for the privilege, but it's an arrangement that suits both parties.

Both companies want volume business and at the moment that's to AMD's advantage as they use HBM for their consumer and HPC cards. As Nvidia only use it for their HPC cards they can't provide the volume required and this means they will be in line behind AMD and most likely paying a high cost in excess of what AMD are charged per module. Nvidia probably don't mind because their profit margins in this market are huge.

I also don't buy that all the miners are buying Vega cards, because the overall economics don't add up. Who I can see buying Vega, even at $699, are content creators and small studios who can pick up each RX Vega for $300 less than a Vega FE, and probably save even more versus a Titan purchase. If a small company needs 20 cards, that's a saving of at least $6,000, thank you very much; not exactly small change. Maybe this is why Nvidia suddenly released this 'miraculous' driver for Titan in the last month or so that has increased its compute performance. Whilst Vega is not going to hurt Nvidia too badly in the 1080 Ti gaming market, I bet it's beginning to erode Nvidia's share of the 'prosumer' market, which overall is far more profitable for both companies.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Sat Sep 02, 2017 11:15 pm

Lordhawkwind wrote:
AMD and Hynix co-developed HBM, and probably in lieu of taking a licensing fee AMD would have negotiated preferential terms for most, if not all, of Hynix's HBM production. This obviously left Nvidia in a bit of a hole, and that's why they worked closely with Micron on GDDR5X/6; as it turned out, this was a very wise move. Whilst Hynix has struggled to get mass production in gear, Samsung has stepped into the breach, no doubt paying Hynix a licensing fee for the privilege, but it's an arrangement that suits both parties.


As we've seen, GDDR5(X)/6 have all been (will be in the case of v.6) more than needed to feed even the highest performance cards for gaming. They're technologies that the big memory makers are already very good at producing in volume, and Nvidia (and AMD) are very good at implementing. Very safe bet here.

HBM, while a proven technology in terms of deliverable performance, is rather new. Nvidia's choice to limit HBM use to their big-chip compute product line makes a whole lot of sense here, and AMD's gamble- which could certainly have paid off- appears foolish.

Lordhawkwind wrote:
Both companies want volume business and at the moment that's to AMD's advantage as they use HBM for their consumer and HPC cards. As Nvidia only use it for their HPC cards they can't provide the volume required and this means they will be in line behind AMD and most likely paying a high cost in excess of what AMD are charged per module. Nvidia probably don't mind because their profit margins in this market are huge.


Sure, volume is good. The challenge here is that AMD simply cannot make very many of these cards; not only is production of large GPUs hard, but AMD is relying on a technology that is apparently unproven, and they're doing it with their consumer GPUs.

Would you argue that HBM wouldn't have been available for anyone to use if AMD hadn't committed to use it for their consumer products?

Lordhawkwind wrote:
I also don't buy that all the miners are buying Vega cards, because the overall economics don't add up. Who I can see buying Vega, even at $699, are content creators and small studios who can pick up each RX Vega for $300 less than a Vega FE, and probably save even more versus a Titan purchase. If a small company needs 20 cards, that's a saving of at least $6,000, thank you very much; not exactly small change. Maybe this is why Nvidia suddenly released this 'miraculous' driver for Titan in the last month or so that has increased its compute performance. Whilst Vega is not going to hurt Nvidia too badly in the 1080 Ti gaming market, I bet it's beginning to erode Nvidia's share of the 'prosumer' market, which overall is far more profitable for both companies.


The miner economics don't add up on paper, but they're definitely buying what cards they can and tweaking them for profitability, despite the initial numbers not looking so great.

I see others buying them to tinker with (a few actually got cards at MSRP), those that are invested in FreeSync, and a few zealots that will buy AMD no matter how badly the numbers stack up. But mostly, I don't see regular gamers buying Vega.

As for the pro markets... yeah. You'd have to provide sourced numbers to make any kind of compelling argument, showing both that the volume is there and that Vega is actually a good buy for a commanding percentage of those markets, and hell, that those buyers can actually get the cards in the first place.

And no, $6,000 is barely even pocket change for a small business. For you and me, sure, it's real money. But even a small business would readily spend that $6,000 if it increased productivity.
 
HERETIC
Gerbil XP
Posts: 488
Joined: Sun Aug 24, 2014 4:10 am

Re: Vega Meta Analysis

Mon Sep 04, 2017 8:19 am

VEGA MINES LIKE THE DEVIL...
https://www.techpowerup.com/236748/rx-v ... eum-mining
And as a bonus, it's not too bad at gaming either...
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Vega Meta Analysis

Mon Sep 04, 2017 8:29 am

Wattage is more like 240W, but that's the kind of return that miners have been banking on.
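For reference, the back-of-the-envelope arithmetic miners run looks roughly like this (a sketch only: the 43 MH/s hashrate is the ballpark from the TechPowerUp piece, the 240 W figure is from above, and the revenue-per-MH rate, electricity price, and $500 card price are purely hypothetical placeholders):

```python
def daily_profit(hashrate_mhs, card_watts, usd_per_mh_day, usd_per_kwh):
    """Net daily mining profit for one card."""
    revenue = hashrate_mhs * usd_per_mh_day            # gross revenue per day
    power_cost = card_watts / 1000 * 24 * usd_per_kwh  # electricity per day
    return revenue - power_cost

# ~43 MH/s at ~240 W; $0.08/MH/day and $0.10/kWh are placeholder rates.
profit = daily_profit(43, 240, 0.08, 0.10)
print(round(profit, 2))     # net profit per card per day → 2.86
print(round(500 / profit))  # rough payback in days on a $500 card → 175
```

The takeaway is that at these hashrates, electricity is a small fraction of revenue, so the extra wattage barely dents the miner's math; the payback period is dominated by the card price and the (volatile) revenue rate.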
