RX Vega prices inch downward in our latest graphics-card spot check

A couple of weeks back, I posited that the market for powerful graphics cards was cooling off a bit. That trend appears to be continuing, as Radeon RX Vega graphics cards are inching toward their bare-board suggested prices. A glance at Newegg's wares today shows four RX Vega 56 cards available for $460 to $470 from PowerColor, MSI, Gigabyte, and Sapphire. That's down from the $500 or more those cards demanded when Newegg was selling them as part of Radeon Packs, although buyers no longer get the two "free" games that came with those bundles. (In fact, Radeon Packs seem to have disappeared from Newegg entirely.) Newegg will even kick back $20 on Gigabyte's RX Vega 56 card for a grand total of $450 if you chance a mail-in rebate.

We found that the reference Radeon RX Vega 56 held its own against a hot-clocked GeForce GTX 1070 in our testing, so we'd have welcomed the apparent decline in RX Vega 56 prices as a competitive development as recently as a couple of weeks ago. However, GeForce GTX 1070 prices are also coming back to earth of late. Surveying all of Newegg's GTX 1070 offerings shows that it's possible to get one of those cards for as little as $400 right now, just $20 over Nvidia's suggested price at launch. Most custom GTX 1070s seem to top out at about $450 at the moment, so buyers have a wide range of alternatives to the reference RX Vega 56 for similar money.

RX Vega 64 cards also seem to be on the edge of a price decline. Newegg is selling both the reference and limited-edition Vega 64 cards from Sapphire for $570 right now, even as many other such cards sell for $610 or $620. That's a drop from launch pricing at retail, but it still isn't enough to make the Vega 64 an appealing choice against the GeForce GTX 1080. In fact, GTX 1080 prices are bad news for the GTX 1070, the RX Vega 56, and the RX Vega 64 alike. Newegg has a triple-fan MSI GTX 1080 for just $490 at the moment (with a free copy of Destiny 2, to boot), and our initial testing suggests the GTX 1080 will still deliver the smoothest and most fluid gaming experience around for that kind of money. For $530 or so, one has an enviable choice of custom cards from EVGA and Gigabyte that will likely run quieter and dump less waste heat than the RX Vega duo.

While these data points are just a tiny slice of the graphics-card market as a whole, they do suggest that the demand crunch we saw this summer is continuing to ease across the board. We're still far from the glory days when an RX 480 4GB cost as little as $150, but at least prices are no longer downright oppressive. Perhaps the market will continue to cool as the leaves continue to fall around the TR labs.

Comments closed
    • End User
    • 2 years ago

    Where are the cool 3rd party cooler cards? 🙁

    • techguy
    • 2 years ago

    Nice to see prices coming down a bit from their absurd levels, but they’re still not at MSRP so they remain uncompelling to all but the most diehard of fans and to miners, perhaps.

    • spiketheaardvark
    • 2 years ago

    Prices seem to be coming in for a softer landing than the last time this nonsense happened. Back then, prices crashed pretty hard and fast just as miners were dumping used AMD cards onto eBay. With a little more stability, Nvidia and AMD might be more willing to push up production the next time this happens, knowing that it's less likely for the bottom to fall out of the market.

    • Chrispy_
    • 2 years ago

    Honestly, I can't see the point of Vega64 for gamers. It's a power-hungry compute monster that barely outperforms Vega56, and no matter how much you overclock it, it's still slower than the equivalent 1080.

    Vega56 is generally better than the 1070, even when it’s running cool and quiet at about 185W on the 2nd BIOS. If you want Vega, get the 56 version. If you want faster than a Vega56, then AMD can’t really help you.

      • chuckula
      • 2 years ago

      Man, the saltiness levels of the true-believers are high today when even mostly-positive-to-AMD posts are getting hit.

      • synthtel2
      • 2 years ago

      I would be interested in Vega 64 for gaming (the budget is the limitation). Power use just isn’t much of a deterrent – it’d be hooked up to a 600W PSU driving <90W of non-graphics components, a blower means there’s no worry of it heating up other components despite SFF, full-load fan noise from the card is mostly irrelevant because I wear headphones when gaming, and power isn’t that expensive considering the time it’ll spend at full load. To the extent power use is an issue, it’s mainly due to reliability concerns, but Vega’s voltages are reasonable and AFAIK that blower does its job.
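
      (Back-of-the-envelope on that budget – a rough sketch assuming AMD's published 295W board-power spec for the air-cooled Vega 64, not a measurement:)

      [code<]
      # Rough PSU headroom check. Assumes AMD's 295 W board-power spec for
      # the air-cooled RX Vega 64; the ~90 W non-GPU figure is my build above.
      GPU_BOARD_POWER_W = 295
      NON_GPU_LOAD_W = 90
      PSU_CAPACITY_W = 600

      total_w = GPU_BOARD_POWER_W + NON_GPU_LOAD_W
      print(f"Worst case: {total_w} W, {total_w / PSU_CAPACITY_W:.0%} of the PSU")
      # -> Worst case: 385 W, 64% of the PSU
      [/code<]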

      What I need before considering a purchase is good Vulkan support from AMDGPU non-Pro (which might already exist – I haven’t checked in a while). Beyond that 56 versus 64 is mostly just about performance per dollar.

      (Nvidia made the #2 spot on my boycott list. Saying why would cause collateral damage. Work may make it necessary to own one, but my 960 can handle that work for a while yet, and when it can’t it’ll be whatever card they’re likely making the least margin on.)

        • Chrispy_
        • 2 years ago

        There are musings floating around the BIOS-flashing forums that a Vega56 flashed with the 64 BIOS outperforms a Vega64 for gaming: even though it still only has 3584 shaders instead of 4096, it can clock higher with the unlocked voltage control of the Vega64 BIOS.

        There are no conclusive tests to prove this, but it would seem Vega56 is not shader-bound in gaming, and the reduced thermal/power draw of the 3584-shader array means more clock headroom in a board/silicon/VRAM product that is otherwise identical to Vega64.

        So in that respect, Vega64 really is a dud for gamers – it’s compromised because AMD wanted to cram the largest compute-friendly shader array they could into the chip, when more shaders don’t always mean more gaming performance.

          • synthtel2
          • 2 years ago

          TR’s performance results show the 64 as being faster than the 56 by more than the nominal boost clock difference on all games, both average and 99th percentile, save Doom, which had some kind of 99th percentile regression on the 64. It’s true that it looks a bit bimodal though, and the low side of the bimodal distribution is definitely not as far above the clock difference as would be expected. Games are all shader-bound enough that there should be a bigger difference here, even with resource ratios like this.

          There has got to be something else going on here; if you can get a given performance level with a wider shader array, that should always be more efficient than higher clocks on a smaller one, since clock increases raise power consumption much more than linearly.
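
          (To put rough numbers on "much more than linearly": dynamic CMOS power scales roughly with V² · f, and higher clocks generally demand more voltage. A toy model with illustrative numbers, not Vega's actual voltage/frequency curve:)

          [code<]
          # Toy model: dynamic power ~ C * V^2 * f.
          # Illustrative numbers only, not Vega's actual V/f curve.
          def rel_power(freq_scale: float, volt_scale: float) -> float:
              return freq_scale * volt_scale ** 2

          # +10% clock needing +10% voltage: ~33% more power for ~10% more perf.
          print(f"{rel_power(1.10, 1.10) - 1:+.0%}")  # -> +33%

          # 64 CUs vs. 56 at fixed clock and voltage: ~14% more power
          # for up to ~14% more throughput -- width is the efficient route.
          print(f"{(64 / 56) * rel_power(1.0, 1.0) - 1:+.0%}")  # -> +14%
          [/code<]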

          If I had to guess at Vega's problem, I'd guess it's Infinity Fabric. It reportedly was used in eyebrow-raising places within Vega, and it doesn't appear to be terribly efficient (die-area- or power-wise) even in Zen, which is a much less demanding application. If some IF link(s) got undersized in an effort to save power and can't reliably keep 64 CUs fed, that would explain some stuff.

          On the original topic, the price gap between 56 and 64 is unlikely to be similar to the performance gap, but if it were I’d have nothing against the 64.

            • Krogoth
            • 2 years ago

            The problem with the Vega architecture is that gaming mode isn't using primitive shaders. It is acting like an overclocked Fiji/Hawaii. The primitive shaders are working for general compute and professional graphics suites (Frontier/FirePro), where Vega can easily outpace the GP104 and match the GP102.

            • synthtel2
            • 2 years ago

            A lack of primitive shaders does not make it an upclocked Fiji, and it does perform significantly better than an upclocked Fiji would, primitive shaders or no. Typical games (at least ones not mangled by TWIMTBP) don’t even spend that long waiting on geometry processing, so don’t expect any miracles from it.

            • Krogoth
            • 2 years ago

            I expect it to at least outpace the GP104 family and probably match the "GV104", but I doubt it will ever catch up with the GP102 family.

            • tipoo
            • 2 years ago

            A few things like that make me think it'll be some classic AMD FineWine in a few years, but that doesn't help it any in sales today, nor would I recommend buying on speculation that drivers will improve performance over time versus what we can already see.

            • freebird
            • 2 years ago

            Maybe so, but TR didn't load the 64 BIOS on the Vega[super<]56[/super<] for their testing, did they? Loading the Vega[super<]64[/super<] BIOS on the 56 allows for higher default HBM2 speeds and higher overclocking, thanks to increased voltage for the HBM2 and a higher voltage limit for the card, which IS the point Chrispy_ made in his comment. (And my comment about the TR review IS NOT a complaint, as some trolls on here are trying to state.)

            With the standard Vega[super<]56[/super<] BIOS I could only OC the HBM2 to 955 or 960; after flashing, I had no trouble running at 1025 and mining for days at a time. Actually, I could consistently clock it higher, to around 1080-1100, but that speed actually lowered the hashing rate. For gaming, higher HBM2 speeds are better. Best results with either Vega come from UNDERVOLTING it, which actually allows it to reach higher stable clocks; WattTool 0.92 is one way to achieve this. There are tons of YT videos and other sites that all report the same thing, and if you don't like it, flashing back is as simple as the flash to the Vega[super<]64[/super<] BIOS was. [url<]https://www.youtube.com/watch?v=NRXd3WY_rcI[/url<]

            Strictly speaking of gaming performance, a Vega[super<]56[/super<] with the 64 BIOS will perform nearly as well as a stock Vega[super<]64[/super<]. Only games that are DX12 and utilize the extra shaders or TMUs will see any performance improvement with the Vega[super<]64[/super<] over the Vega[super<]56/64b[/super<]. Currently, there seem to be few games that actually do, but that could be a driver issue or something else. It would be nice to see a benchmark that could scale shader requests to track the performance.

            Maybe there will be some Vega 56s down the road that are "gimped" more, can't clock as high, or have HBM2 that performs worse, but right now everyone seems to be having the same experience as I am: load the 64 BIOS on a Vega[super<]56[/super<] and it is nearly the same as the Vega[super<]64[/super<]. I personally am looking into modding my Vega[super<]56[/super<] with a modded G10 Kraken to cool it and reduce the fan noise, but I'm also looking forward to this card running even better in the future with DX12 games and if games start leveraging Vega's primitive-shader ability.

            • chuckula
            • 2 years ago

            Complaining that TR’s review of Vega is inaccurate because they didn’t do a shady BIOS hack is like complaining that TR’s review of Coffee Lake wasn’t accurate because they didn’t delid the processor.

            In other words: It’s not a valid complaint.

            • freebird
            • 2 years ago

            Where did I say TR's review was inaccurate, or complain about it? I was just pointing out that synthtel2 was using the TR review to respond to Chrispy_'s comments about a Vega[super<]56[/super<] performing like a Vega[super<]64[/super<] when running the 64 BIOS. I didn't and don't expect TR to perform a standard review of the cards using a "BIOS hack"; that is another discussion or article, of which MANY can be found on YT or other sites.

            I was attempting to show that the performance of the Vega 56 is limited more by the BIOS's artificial "limits" than by the chip itself. That may change in the future, but currently you can get nearly identical performance out of a Vega[super<]56[/super<] compared to a Vega[super<]64[/super<] just by a simple BIOS flash, since it affects voltages and clock speeds for the GPU and, probably more importantly, the MAX speed the HBM2 is able to run at. So your entire comment, "in other words", is NOT a valid comment. 😛

            P.S. The MAIN point is that the AIR-cooled 64 should be avoided; just get a Vega[super<]56[/super<] if you aren't afraid of flashing a BIOS to it. As I stated before, it can easily be reversed, which I have already done once when I mistakenly loaded the 64 Liquid BIOS instead of the 64 Air version.

            • Redocbew
            • 2 years ago

            TL;DR the whole thread here.

            The only thing I have to say is about you using actual superscripts the same way AMD does in all their marketing material. I’m trying not to read too much into this, but really? Who does that?

            • freebird
            • 2 years ago

            I do, since some NimRod on here started flaming me in some comments back when Vega was 1st released… when I said the actual name for Vega64 & 56 was Vega[super<]64[/super<], and when I posted an example from AMD's website, he suggested it was just a "formatting" problem with their website…

            Besides, cut & paste makes it quick…. Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<] Vega[super<]64[/super<]

            Besides, who reads comments in posts and responds stating that a post is TL;DR???? L@zy people.

            • Anonymous Coward
            • 2 years ago

            [quote<]Besides, who reads comments in posts and responds...[/quote<] In the age of information excess, I for one appreciate the TL;DR summary as an aid to efficiency. I'm going to skip the middle section of this thread, in this case. Unless I get bored.

            • synthtel2
            • 2 years ago

            I had core clocks foremost in mind for that comparison, and our conclusions mostly aren’t incompatible. The main point was that if 56 CUs can get better performance per watt than 64, there’s a serious problem somewhere.

            For my own comparisons, I don’t care what tweaks V64 allows, I care what the V64 BIOS defaults are like compared to V56’s, and that may take more research. (It’ll mainly be used in Linux and I don’t know what the OC tooling is like.)

            • freebird
            • 2 years ago

            Which is fine and I agree with you that 56 isn’t > 64.

            My TL;DR response was trying to point out that the difference between the 56 & 64 (reference) is negligible with the same power/clock limits, since the 56 performs nearly identically to the 64 when running the same BIOS (and probably because most games don't 100% saturate the shaders & TMUs; in cases that do, results may differ).

            BTW, flashing the BIOS from 56 to 64 (which I only know how to do in Windows, since I gave up on Linux some time ago) should operate perfectly fine in Linux. It would change the default clock and power for the HBM2 from 800MHz to 950MHz and up the boost limit on the GPU as well.
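
            (For what that default bump is worth in bandwidth terms – a quick sketch using Vega's 2048-bit HBM2 bus, which transfers twice per clock:)

            [code<]
            # Bandwidth effect of the 800 -> 950 MHz HBM2 default change.
            # Vega's HBM2 bus is 2048 bits wide and transfers twice per clock.
            BUS_BYTES = 2048 // 8

            def bandwidth_gb_s(mem_clock_mhz: float) -> float:
                return BUS_BYTES * 2 * mem_clock_mhz * 1e6 / 1e9

            print(f"56 BIOS (800 MHz): {bandwidth_gb_s(800):.0f} GB/s")  # ~410 GB/s
            print(f"64 BIOS (950 MHz): {bandwidth_gb_s(950):.0f} GB/s")  # ~486 GB/s
            print(f"Gain: {950 / 800 - 1:.0%}")                          # 19%
            [/code<]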

            • synthtel2
            • 2 years ago

            Yes, the BIOS flash should work perfectly, but how much of the gain from it is inherent and how much is only due to tweaking options it opens up?

            The vast majority of shader-heavy work in any modern game should be able to trivially saturate 4096 SPs and 256 TMUs. It isn't like we're running into Amdahl's law here – as far as Amdahl's law is concerned, each pixel is usually a naturally independent unit of work, meaning we're talking about work that's naturally over two *million* wide at 1080p. It isn't always quite this parallel in practice, but it's enough so that it definitely shouldn't be this kind of problem for the efficiency of those last 8 CUs relative to the first 56, even if the renderer was set up with only 12 to 18 in mind.
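
            (The arithmetic spelled out – the SP counts are AMD's published specs for the two cards:)

            [code<]
            # Pixels per frame vs. shader count: the parallelism argument.
            PIXELS_1080P = 1920 * 1080   # 2,073,600 mostly independent work items
            VEGA64_SPS = 4096            # 64 CUs * 64 SPs
            VEGA56_SPS = 3584            # 56 CUs * 64 SPs

            print(f"{PIXELS_1080P:,} pixels per frame at 1080p")
            print(f"~{PIXELS_1080P // VEGA64_SPS} pixels per SP per pass (Vega 64)")
            print(f"~{PIXELS_1080P // VEGA56_SPS} pixels per SP per pass (Vega 56)")
            # Hundreds of pixels queued per SP either way, so the workload
            # itself shouldn't be what starves the last 8 CUs.
            [/code<]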

            Throwing more SPs at a graphics problem is a valid strategy independent of renderer details, because all the graphics infrastructure we’ve been using for decades has been set up to take advantage of all the parallelism you can throw at it. If more SPs aren’t working, that’s a problem with the hardware or driver.

            This does not mean that hardware limitations can’t be worked around in software. For instance, if Vega’s problem is an internal bandwidth thing, minimizing certain types of memory traffic could be the software change we’re looking for. This may look similar to end users, but is still not the same as games not being designed to take advantage of 4096 SPs.

            (This is not solely directed at you, the sentiment that the sheer shader count is the problem is all over the place.)

            • freebird
            • 2 years ago

            So, how did this BIOS flashing experiment play out in benchmarks? Well, here are the results that were obtained in 3DMark Fire Strike Extreme:

            Radeon RX Vega 56 (stock): 9428 points
            Radeon RX Vega 56 (with Vega 64 BIOS): 10340 points
            Radeon RX Vega 64 (stock): 10479 points
            Radeon RX Vega 56 (with Vega 64 BIOS, overclocked): 11322 points

            Read more at [url<]https://hothardware.com/news/amd-radeon-rx-vega-56-unlocked-vega-64-bios-flash#tYxL9AF2yVspfy1T.99[/url<] [url<]https://www.youtube.com/watch?v=SlpkmPEYTGE[/url<]
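
            (Normalizing those scores against the stock 56 – just arithmetic on the numbers quoted above:)

            [code<]
            # Relative deltas for the Fire Strike Extreme scores above.
            scores = {
                "Vega 56 stock": 9428,
                "Vega 56, 64 BIOS": 10340,
                "Vega 64 stock": 10479,
                "Vega 56, 64 BIOS + OC": 11322,
            }
            base = scores["Vega 56 stock"]
            for name, pts in scores.items():
                print(f"{name:>22}: {pts} ({pts / base - 1:+.1%})")
            # The flashed 56 lands within ~1.3% of a stock 64, and passes it
            # once overclocked.
            [/code<]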

            • synthtel2
            • 2 years ago

            Who uses just Firestrike for this kind of testing and calls it a day? o_O

            Really though, this is all much more rhetorical than I think you think. I don’t have $500 to drop on a graphics card right now (spent it all on the monitor and have no regrets), and the picture may be substantially different by the time I do.

            I also think we have some substantial philosophical differences on graphics card purchases. You do seem to think that performance may differentiate a bit more over time (if changes happen at any level that let those last 8 CUs come into play more), but you nevertheless seem to be strictly buying performance here-and-now. I’m most interested in performance at the end of the card’s life; what’s going to make it outdated and send me looking for a new card? If those last 8 CUs can extend the card’s useful life by 20%, then they’re absolutely worth 20% more money to me. I haven’t run any numbers to tell how much they would likely extend the useful life (again, I’m not close to actually buying a card), but it seems plausible that +8 CUs might be worth that in the end (both Vega cards are right on the edge of the long-term performance level I’m looking for, and it may not take much absolute boost to keep it over that level for a whole lot longer).

            • freebird
            • 2 years ago

            Well, if I could have found reviews where someone did a full-blown comparison of the stock 56 vs. 64 BIOS, I would've posted the link. But I doubt you'll find too many "reviewers" that will do that if they expect to continue to receive parts from Team Red. The Red guys might feel that would cut into their higher-end unit sales…

            U r right about what I buy; I usually go for the sweet spot for cost/perf. That's why I had 2 GTX 260s (no 280s), a 4850 (not a 4870), and 3 5850s (no 5870s) – I cross-fired and tri-fired for quite a long time on those, and then 2x 290s again in CrossFire. Also have two pairs of 1070s and now a lonely 56 looking for a playmate…

      • Krogoth
      • 2 years ago

      It is the fastest FreeSync solution on the market, it is faster than Vega 56 at stock, and it trades blows with the 1080 ATM. Its only real fault is that its power consumption is a little higher than the 1080Ti's.

      For general compute performance, the Vega 64 rivals the 1080Ti and Titan X Pascal.

      • Hsldn
      • 2 years ago

      Why did AMD release the Vega64, really? What were they thinking?

        • Krogoth
        • 2 years ago

        They have to release products to keep the shareholders happy. They had to do it before Volta reaches public channels, which would likely have made Vega even more pointless.

        Vega 64 is definitely faster than Vega 56 at stock. The Vega 56 only matches the 64 if you overclock it, whereas the 64 is pretty much at its overclocking ceiling at stock and doesn't get much return from further tuning.

        • freebird
        • 2 years ago

        They needed several price slots for vendors, and they need to make more money than they would selling them all for 56 prices. Currently, tear-downs of the Frontier Edition, Vega[super<]64[/super<] Air/Liquid, and Vega[super<]56[/super<] all show the same board design and parts. It was probably cheaper to just use the same design for all of the reference boards, but that also has the side effect of allowing the Vega[super<]56[/super<] to perform like a Vega[super<]64[/super<] with a simple BIOS flash.

        Let's be honest, most people won't want to mess with a BIOS flash, even with the dual BIOS, and if you flash it back to the original BIOS, I doubt they would be able to tell and void your warranty.

        Side rant… I could have sworn I checked the warranty of the PowerColor Vega[super<]56[/super<] before purchase and it was listed as 3 years when I bought it in early September, but now on Newegg.com it is 1 year??? I will definitely go MSI or someone else next time.

        • dragontamer5788
        • 2 years ago

        It's a solid compute card at a solid price. Nvidia is making more money in the datacenter than from gamers today… not very difficult when you consider that "compute GPUs" often cost $4000 or more!

        AMD has the hardware now, but to take over the datacenter, they need to switch tons of programmers from CUDA to OpenCL, which is easier said than done.

      • DoomGuy64
      • 2 years ago

      The biggest problem with Vega 64 is that drivers and games aren't optimized enough to utilize its extra resources, so an overclocked 56 can easily match the 64 because the card is not yet shader-limited.

      Reference coolers aren’t the best design either, but that standardizes the hardware so much that bios flashing a 56 to a 64 works really well, making the 64 pointless.

      I personally would prefer a custom triple fan card, but since nobody is offering that, bios flashing is the main perk. There isn’t any point to the 64 when you can just flash the 56 to hit 64 clocks and it performs the same.

        • Airmantharp
        • 2 years ago

        If history is any indication, they never will be. Vega is GCN with more compute, which is not what games crave, but is what will sell AMD parts at higher markups to professionals. Great for AMD’s continued survival, but they won’t rival Nvidia for the top gaming spot.

        [note that Nvidia strips a lot of compute from their consumer parts, which makes a whole lot of sense given their intended uses]

          • freebird
          • 2 years ago

          Yeah, and supposedly there is a 12nm refresh due in 1H2018, plus rumors of Navi (probably professional-only) on 7nm in 2H2018. If they DID manage to get a 7nm Navi out in 2H2018, it would probably be in LIMITED amounts and expensive; ergo a professional version.

        • Chrispy_
        • 2 years ago

        I don't have a problem with reference coolers; the Vega cooler is very well made, and I believe that any GPU generating a couple hundred Watts of waste heat should be dumping it outside the case if possible.

        I know open coolers can be made larger to allow either quieter fans or higher voltage/clocks, but that only serves a relatively small demographic – those that build their own PCs with enough case airflow and CPU cooling to cope with the extra 200+ Watts of ambient heat being dumped into the case. For the enthusiasts that prefer quiet PCs, small PCs, or multi-GPU PCs – and of course all the people with compatible pre-built PCs – a reference cooler that exhausts heat outside the case without stressing the internal cooling of everything else is not only welcome, it's often [i<]mandatory.[/i<]

        Edit: I'm not sure why you're being downvoted so much – there are definitely architectural improvements that no game engines take advantage of yet, making Vega behave much like a higher-clocked Fury.

          • Waco
          • 2 years ago

          He’s getting downvoted because those types of resources will likely *never* be utilized for gaming in the card’s useful lifetime. It implies that things will get better when they very likely won’t be.

            • DoomGuy64
            • 2 years ago

            Uh, no? A lot of Vega's optimization features are use-case-specific and dependent on drivers. I believe the release drivers didn't even support all of the features; some had to be added in later. Both developers and AMD can vastly improve Vega's speed by tuning for the architecture, while old DX11 games are not going to see much of an improvement over the Fury.

            It's 2017, and games still don't use DX12 by default. With better DX12 adoption, Vega WILL get better, and you are outright wrong for ignoring that very important point. Vega is more dependent on optimization than Pascal, and games that do the work will see noticeable improvement.

            If anything, extra resources can be utilized by modding with the likes of ReShade and SweetFX, because the card has the headroom for it. Of course, that headroom is questionable if it's being used in DX11 and not supported with asynchronous shaders.

            Basically, Vega's performance comes down to the old hyperthreading issue of yesteryear. AMD needs to backport async and other enhancements to DX11 for their future-designed cards to see any benefit today. Can they do that? Maybe. Will they do it? Well, they're not Nvidia, and Nvidia certainly would, while AMD seems overly content not to improve last-generation technology. But it can be done, and it is highly dependent on AMD doing its due diligence in driver optimization.

            • Waco
            • 2 years ago

            I’ll believe it when I see it. I had a 2900XT that had many of the same types of unutilized units when it launched. The promises are much the same and are rarely realized.

            • DoomGuy64
            • 2 years ago

            That card was junk from the get-go and had real issues pointed out by reviewers. AMD didn't have a good DX10 card until the 4870 because of poor design choices that most people knew were bad.

            Vega is completely different, aside from power use. The architecture is there, but the software is not utilizing it. That's why the 56 easily matches the 64: it's NOT utilizing those extra resources, and only AMD is to blame for the drivers not doing anything.

            Nvidia would simply hack these features into DX11 or GameWorks, while AMD is segmenting these performance enhancements to DX12 only. I think it's lazy programming, because you could clearly run some DX11 shader recompiler that would increase performance. DX11 is modifiable enough that this is possible. For example, if a game runs a ton of post-processing, run it asynchronously against the base graphics instead of synchronously. Simple concept, but AMD isn't doing it. That's 100% of the performance issue. AMD is relegating these features solely to DX12/Vulkan, and DX11 games are suffering because of it. It's not bad hardware, just short-sighted developers who aren't taking advantage of it.

            • Waco
            • 2 years ago

            You keep telling yourself that. Time will tell.

            • DoomGuy64
            • 2 years ago

            It's a fact, Jack, based on the actual hardware features of the card and the real-world gains that happen when they're utilized. A+B=C. Vega's design is an improvement on Fury, which was also designed for DX12.

            None of the asynchronous stuff works in dx11, afaik. This is something we’ve known since Hawaii, and we also know how well games like Doom run when you can take advantage of those features.

            You are the one staring reality in the face, and going, “Nope! Reality doesn’t exist.”

            AMD made Vega for DX12, and [i<]if games don't use DX12, you won't see the gains.[/i<] That, or AMD needs to backport those DX12 features to DX11, which they aren't doing, aside from maybe the DSBR, and I question how well utilized even that is. Vega's performance is 100% a question of getting game developers to support DX12 and enable performance enhancements like async.

            UE3 games still running DX9 just aren't going to see the benefit, and yes, TIME WILL TELL, as developers move away from DX9 UE3-based titles, because that's been a ridiculously outdated engine pushed to use modern graphics in outdated APIs for far too long. VERY INEFFICIENT. Colonial Marines is a good example of this garbage lazy development. UE3 needs to die, or at least enable DX10 mode, because it's completely unacceptable to still be using DX9. It may "work", but it is also the worst way to make a game in 2017, and it needs to stop. DX11/12/Vulkan, or go home, game devs.

            Edit: looks like Waco's got other accounts for vote trolling besides his gold account, because I doubt anyone else is reading this, and if someone were, voting without a reply shows that you have nothing to say and are just being an opinionated knee-jerk-reaction fanboy who can't take the heat in a debate. Not that Waco's much better, because he has nothing to say, and says it anyway. His best argument is "time will tell", which is true, but it's also the three monkeys, where you are admitting you are close-minded.

            Ever since Kepler, Nvidia has designed their video cards for last-gen performance, which drops off a cliff through planned obsolescence, while AMD has designed theirs for future performance. Time will indeed tell, but it won't be what you think it is. Vega is the next Hawaii in terms of forward design longevity. The downside is that AMD isn't optimizing for DX11, but that's not indicative of DX12 at all, so falsely equating the two is a non-argument.

            • Waco
            • 2 years ago

            Okay. Like I said, this exact situation has happened a few times in the past for AMD. I'll believe it when I see it.

      • watzupken
      • 2 years ago

      Vega 56 is better, but there are two ways to close this gap:
      1) An overclocked GTX 1070 – there is quite a fair bit of OC headroom on this card, which will bridge the gap to some extent.
      2) The GTX 1070 Ti is just around the corner.

      Long story short, I feel Vega is too little and too late. No matter how Raja tries to sell this, i.e. like fine wine which takes time to mature, the wine is a year late and has a lot of catching up to do.

        • DrDominodog51
        • 2 years ago

        GPU Boost 3.0 uses almost all of the overclocking headroom a 1070 has. One won’t gain much from overclocking it.

      • End User
      • 2 years ago

      Some of us are stuck with FreeSync displays.

        • Chrispy_
        • 2 years ago

        So use a Vega56 like the rest of us.

        At stock speeds it’s never more than 10% slower than a Vega64, and in some cases (like Witcher 3) the Vega56 is faster.

        Vega cards are power-limited, and in most cases the extra shaders of Vega64 are [i<]not[/i<] providing extra performance, but they [i<]are[/i<] using power that the Vega56 is instead spending on higher clocks.

        Whether you BIOS flash or not, there really isn't any significant benefit to buying a Vega64 right now, other than misplaced bragging rights (if you're going to brag, where is your 1080Ti SLI?)

          • End User
          • 2 years ago

          I see now you were specifically referring to the 64. I was going by the title of the article and referring to Vega in general. My bad.

          In all honesty, I have yet to read a Vega review. I've been waiting for non-reference coolers to appear before I look into Vega seriously.

          I game at 2560×1440, so a multi-card setup is necessary. It would be a waste of money to buy one 1080Ti now. Two would be madness.

    • cynan
    • 2 years ago

    [url=https://www.massdrop.com/buy/msi-geforce-gtx-1080-armor-8g-oc<]Massdrop's got a GTX 1080 for $470[/url<]. Not sure if that includes Destiny...

    Edit 2: And there's an additional $10 MIR from MSI.

    Edit: These things (high-end Nvidia GPUs) have been selling out so fast that, unlike a lot of drops, it's shipped out fairly quickly (within a couple of weeks).

    • Kretschmer
    • 2 years ago

    If you’re snagging Vega 64, you’d better hope that PSU prices are coming down, too.

      • Krogoth
      • 2 years ago

      The power-consumption angle is way overstated. Performance GPUs have given up on being easy on power since Fermi/Tahiti. A decent 500W PSU has become the bare minimum to handle any performance GPU.

        • Airmantharp
        • 2 years ago

        For once I agree with Krogoth!

        Also, since it's hard to buy a decent high-quality ATX PSU below 650W, given just how closely the 550W-750W range is usually priced, I don't see any rational single-GPU build being problematic.

        And even for multi-GPU, 650W should be fine for most cards. Running a 6700K and twin GTX 970s on an old Seasonic X-650 right now!
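
        (Quick sanity check on that build using nameplate TDPs – 145W per GTX 970 and 91W for the 6700K; the ~50W allowance for everything else is a rough assumption:)

        [code<]
        # Nameplate-TDP sanity check for the X-650 build above.
        # The ~50 W allowance for board/RAM/drives/fans is a rough guess.
        GTX_970_TDP_W = 145
        I7_6700K_TDP_W = 91
        REST_OF_SYSTEM_W = 50

        load_w = 2 * GTX_970_TDP_W + I7_6700K_TDP_W + REST_OF_SYSTEM_W
        print(f"~{load_w} W worst case, {load_w / 650:.0%} of a 650 W unit")
        # -> ~431 W worst case, 66% of a 650 W unit
        [/code<]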

        • Freon
        • 2 years ago

        The entire Pascal line is great – the most performance- and power-efficient GPUs the planet has ever seen. The 1070, for example, basically delivers 980 Ti levels of performance at 150W instead of 250W, and yet you protest? And that's on top of Maxwell's already more-than-competitive efficiency.

        You seem to have just moved the goalposts on performance to try to make your point.

          • Krogoth
          • 2 years ago

          GP100 and GP102 do sacrifice some of that remarkable efficiency to get to their levels of performance. The point is that performance GPUs have not been easy on power for the last decade and have required powerful PSUs to keep them happy.

          You seem to have forgotten there was a period when performance GPUs didn't require massive coolers and external power cables to keep them happy.

          • freebird
          • 2 years ago

          If you are only running one GPU, it doesn't matter that much to most people… for gaming, and it's been that way since the Pentium 4 days…

          But since I also use multiple GPUs to mine, I agree with you on that note. The 6GB GTX 1060s and GTX 1070s rule the roost in hashrate/price/efficiency there.

        • psuedonymous
        • 2 years ago

        I’m running a 1080Ti on a 400W SFX supply, and it’s not even being taxed (I could even have gotten away with the 350W SFX if I was willing to run it at full tilt with its tiny loud fan).

        You really don't need the monster PSUs that are popular; power consumption is vastly overestimated.

      • Puiucs
      • 2 years ago

      You are totally fine with a 500W PSU, even if you decide to OC the HBM2 memory and the GPU a bit (if you undervolt it).

    • chuckula
    • 2 years ago

    Sign of the Apocalypse #8,304: An Nvidia Card is considered to be a good deal on a price/performance basis.

      • Airmantharp
      • 2 years ago

      Once you get faster than AMD, everything’s good price/performance 😀

        • Krogoth
        • 2 years ago

        Not really.

        The price/performance ratio starts to hit diminishing returns once you go beyond the RX 470 and 1060, and it plummets as you go beyond the RX Vega 56 and 1070, with a minor bump at the 1080Ti before sloping down again as you approach the Titans and Frontiers.

          • Airmantharp
          • 2 years ago

          You missed the joke: since AMD chooses not to compete in the high-end gaming GPU market, there is no price/performance comparison to be made 😀

    • ptsant
    • 2 years ago

    VEGA 56 at $350 for Black Friday. That would be interesting. Unless of course nVidia comes out with a 1070Ti at $400…

      • DPete27
      • 2 years ago

      Sounds like they're expecting an October 26 launch for the 1070Ti. Not sure about pricing, although Nvidia isn't typically one to start a price war without provocation. Vega isn't exactly putting pressure on Nvidia from a pricing standpoint, and I doubt it ever will, considering how much more it costs to manufacture.

        • Voldenuit
        • 2 years ago

        Multiple rumors point to a $429 MSRP (presumably for the reference card?), but given availability, market demand, and AIB custom cards, it could be hard to guess actual street prices shortly after launch.

        That Massdrop deal for a 1080 at $450 after rebate is looking like a sweet deal.

          • Freon
          • 2 years ago

          Wonder what supply/price of the 1070 will do after the Ti release. Maybe yields for the die are higher and production of the 1070 will simply drop and the card will become a lower volume niche between the 1060 and 1070 Ti… Hard to tell.

            • Marees
            • 2 years ago

            The 1070 is based off the 1080 (GP104) board.
            The 1070Ti is based off the 1080Ti (GP102) board.
            So the yields of the 1070 & 1070Ti are independent of each other.

            BTW, Nvidia restricts overclocking of the 1070Ti in software so that it doesn't exceed 1080 performance.

    • Bumper
    • 2 years ago

    I like that little bit of poetry at the end. Nice touch.

      • dpaus
      • 2 years ago

      They have to be subtle when referencing Chinese hoaxes…

        • cynan
        • 2 years ago

        I suppose if anything is going to [i<]globally[/i<] warm your PC, it's going to be an air-cooled RX Vega. Especially an overclocked AIB model featuring a direct-flow cooler.
