Rumor: Retailers reveal Radeon Vega FE pricing

AMD's Radeon Vega cards may be the most anticipated product family of the year, and now we may have an idea of how much the first cards in that series will sell for. Retailers SabrePC in the USA and Scan in the UK have product pages up for the Frontier Edition, as spotted by the eagle-eyed folks at VideoCardz. Assuming those product pages are accurate, the air-cooled Frontier Edition will run $1199, while the liquid-cooled version of the card will run a whopping $1799. Across the pond, the air-cooled card will go for £1139.99, while the liquid-cooled FE will go for £1656.49.

The product pages otherwise don't reveal anything we haven't already learned about Vega up to this point, so all that's really left to do is to count down the days before the Frontier Edition's June 27 launch date. Stay tuned for more info as we hear it.

Comments closed
    • southrncomfortjm
    • 2 years ago

Guessing miners will be buying these up like crazy if they can find 'em.

    • moose17145
    • 2 years ago

Awesome! I can spend a boatload of money to get one of these and hook this display up to it!

    [url<]https://www.newegg.com/Product/Product.aspx?Item=9SIA0ZW5F71608[/url<]

    On a more serious note... how did TR miss this little gem being released? 27", IPS, and a 30-144Hz FreeSync range... no word on the gamut range... but it looks like it is ticking all the right boxes that people have been asking for for a while now...

      • ImSpartacus
      • 2 years ago

      I’ll wait for Freesync 2.0 and guaranteed LFC.

      Also, 3440×1440 is pretty appetizing. Also HDR.

      This isn’t a bad monitor by any stretch, but it doesn’t check [i<]all[/i<] of the boxes.

        • ferdinandh
        • 2 years ago

        HDR is the new 3D.
        1. Higher framerate
        2. FreeSync
        3. Blur reduction
        4. Resolution
        If you have these and still have money to burn, get HDR, but don’t enable it until games and movies truly support it. But keep in mind that in five years you will probably hear as much about HDR as you do today about 3D.

          • synthtel2
          • 2 years ago

          3D doesn’t work well without glasses and substantial amounts of extra processing power, it’s generally a series of awkward hacks, and everyone has rightly jumped straight to VR instead. HDR is basically downside-free and tries to provide benefits that apply in a much wider range of situations. The number-crunching cost is minuscule (since the heavy stuff is all done in RGBA16F anyway), and it solves annoyances in render pipelines rather than creating them. There’s a bit of room for improvement in the standards, but what we mainly need now is just cheaper displays.

          At some point we won’t be hearing about it because it’ll be ubiquitous, though I think five years is a bit optimistic for that. 😉

      • RAGEPRO
      • 2 years ago

      Eh. No blur reduction, and IPS (which means poor contrast). I’ll pass.

    • ronch
    • 2 years ago

    I wonder how long it would take Vega to replace Polaris in the midrange.

      • ptsant
      • 2 years ago

      Fury is still not midrange.
      I would guess Vega won’t. Maybe HBM will trickle down with the next iteration.

      • kalelovil
      • 2 years ago

      AMD have stated Polaris GPUs are for the <$300 market, Vega GPUs are for the >$300 market.

      Navi has scalability as one of its bullet points, so it might span the whole spectrum.

    • Tristan
    • 2 years ago

    lol, cooling worth more than a 1080, for 10-15% more perf.

      • ImSpartacus
      • 2 years ago

      It’s no sillier than the Titan lineup.

      Honestly, I think this will become AMD’s Titan.

        • Krogoth
        • 2 years ago

        Bingo

    • ronch
    • 2 years ago

    Frontier Edition was obviously inspired by Founders Edition. Should’ve been Fixer Edition instead.

    • DavidC1
    • 2 years ago

    If you guys are talking about mining performance in Ethereum, it won’t be noticeably better than Polaris.

    Ethereum is not a big fan of HBM. Fury was barely faster despite its much greater bandwidth.

    The ideal Ethereum card would be 4096SP, 512-bit GDDR5 based on Polaris.

      • Chrispy_
      • 2 years ago

      So two RX 470s for $150 each...?

      Oh, wait. The damn miners have pushed RX470 prices to $400 a pop!

      • Krogoth
      • 2 years ago

      Memory bandwidth doesn’t matter for cryptocurrencies. It is compute resources on the GPU itself that matter. Mining tends to fare better on AMD architectures as of late simply because they use more shader units than their Nvidia counterparts.

        • synthtel2
        • 2 years ago

        That used to be true, and then everyone figured out that it let ASICs show up and ruin the party. Being at least a little bit BW-bound is a goal of a lot of this stuff these days.
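The bandwidth-bound design goal described above can be illustrated with a toy sketch. This is purely illustrative (not real Ethash, and all names here are made up): each hashing round forces a pseudo-random read into a large dataset, so throughput ends up bounded by memory bandwidth rather than raw hashing speed, which is what blunts an ASIC's advantage.

```python
import hashlib

def memory_hard_hash(header: bytes, nonce: int, dataset: list, rounds: int = 64) -> bytes:
    """Toy memory-hard proof-of-work mix (illustrative only, not real Ethash).

    Each round uses the current mix to pick a pseudo-random slot in a
    large dataset, so the random reads, not the hashing, dominate."""
    mix = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    for _ in range(rounds):
        idx = int.from_bytes(mix[:4], "little") % len(dataset)
        mix = hashlib.sha256(mix + dataset[idx]).digest()  # random read each round
    return mix

# A small stand-in dataset; the real Ethash DAG was multiple gigabytes,
# which is exactly why it wouldn't fit in cheap on-chip ASIC memory.
dataset = [hashlib.sha256(i.to_bytes(4, "little")).digest() for i in range(1024)]
digest = memory_hard_hash(b"block-header", nonce=42, dataset=dataset)
```

In the real scheme the dataset is too large to cache, so a custom chip gains little over a commodity GPU with fast memory.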

    • ET3D
    • 2 years ago

    That’s quite low for professional cards. I expected higher prices. At this price I expect some gamers will get it.

      • ptsant
      • 2 years ago

      It seems to be positioned like the Titan, as a “semipro” or “prosumer” card. The real “pro” cards are prefaced with “Pro” (Pro WX series) and are proportionally more expensive. I believe they said that the frontier edition will run with pro drivers but I don’t know if it will get the same amount of validation and it probably doesn’t have some of the other pro features (ECC RAM most notably).

      Otherwise, the same Vega is already on sale as a pro compute card at a (probably) obscene price point.

        • RAGEPRO
        • 2 years ago

        It’s interesting to note that, as far as I know, there is no such thing as ECC GDDR5 memory or HBM. Existing GPUs that support “ECC” do so by using a portion of the local storage to emulate the extra bits that ECC requires.

        I presume this is a function of the memory controller that is disabled in the consumer GPUs; in that case, I imagine there’s little reason this card couldn’t support an ECC mode albeit with reduced capacity. But it probably won’t, because I suspect the margin on this card is actually pretty poor despite the high price.
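The capacity cost of emulating ECC in-band is simple arithmetic. A sketch, assuming a SECDED code with 8 check bits per 64-bit word; the actual ratio is an implementation detail of the memory controller and may differ:

```python
def ecc_usable_capacity(total_gib: float, data_bits: int = 64, check_bits: int = 8) -> float:
    """Capacity left when ECC check bits are carved out of ordinary memory.

    With in-band ECC, check_bits / (data_bits + check_bits) of the total
    is reserved for the code, leaving the rest for data."""
    return total_gib * data_bits / (data_bits + check_bits)

# A 16 GiB card would present roughly 14.2 GiB with in-band SECDED enabled.
usable = ecc_usable_capacity(16.0)
```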

      • Kretschmer
      • 2 years ago

      Pro drivers often don’t carry the gaming optimizations. It would be a dumb purchase (especially at 3x the price of a 1080 Ti).

        • Srsly_Bro
        • 2 years ago

        I’m not sure you know the price of 1080 Tis.

          • Voldenuit
          • 2 years ago

          I bought my 1080 Ti for $635 a couple weeks ago, so the liquid cooled Vega FE is 2.83x the price.

            • Srsly_Bro
            • 2 years ago

            Please don’t be so ignorant as to generalize a bargain price for the average retail price of this GPU. The Vega FE is 2.83x the price of YOUR card that YOU purchased.

            Follow the link and see the current retail prices for 1080Tis.

            [url<]https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=1080+Ti&ignorear=0&N=-1&isNodeId=1[/url<]

            I swear....

            • Voldenuit
            • 2 years ago

            I mean, seeing as RX 580s are going for $700 a pop right now, the air-cooled Vega is a bargain at $1200!

            EDIT: It’s worth noting that this Vega card pricing is also a single data point, and for all we know, is unrepresentative of the ASP of the card once it hits the market. It might even be on the high end of the pricing scale.

    • RoxasForTheWin
    • 2 years ago

    I am smitten by that blue cooler, it looks beautiful

    • lilbuddhaman
    • 2 years ago

    Dear AMD,
    Use your mining hash-rate as a selling point, cuz your actual gfx performance is kinda meh.
    -Cryptobandwagoner

      • ptsant
      • 2 years ago

      I haven’t noticed AMD talking about mining hash rate, but nVidia has even made mining versions of its cards.

    • the
    • 2 years ago

    Those prices are steep, and at first I thought they were being set by the cryptocurrency market, which has once again spurred GPU prices through the roof. A quick search at SabrePC, which has a Vega listing, doesn’t indicate price inflation there (though they are pretty close to MSRP and sold out). For a pro card, these aren’t expensive, but AMD had better have something special to be priced higher than the Titan Xp.

    • NoOne ButMe
    • 2 years ago

    Higher than expected…
    I expected $1000/$1250…

      • Pwnstar
      • 2 years ago

      Don’t forget the price includes 20% VAT, which we don’t have. So expect $1,000/$1,500.

        • Jeff Kampman
        • 2 years ago

        This is incorrect; there is no VAT associated with the dollar amounts in the piece taken from the USA retailer.

          • Chrispy_
          • 2 years ago

          They’re probably adjusting in advance for our weak £ given that we’re continuing down the ignorantly destructive path of Brexit as of Monday :\

            • derFunkenstein
            • 2 years ago

            I had assumed (perhaps incorrectly) that the “surprise” result of the recent election would mean that England would work to reverse the previous Brexit decision. Is that not going to be the case?

            • Chrispy_
            • 2 years ago

            Nope. The blue team still won, so nothing has changed. They have egg all over their faces because the whole point of the election was to get MORE power in Parliament. It turns out they lost 13 seats while their red competition gained 30, and the loss of those 13 seats means they’ve now lost total control: they have to rely on an outlier group called the DUP to even get a majority in Parliament now.

            Britain is even more divided than ever, hence the weak currency – the markets hate uncertainty.

    • ermo
    • 2 years ago

    These cards appear to be priced with an attitude.

    They’d better be able to back that attitude up in real world benchmarks.

      • ImSpartacus
      • 2 years ago

      Dude, just look at the clocks and the compute resources.

      If you had a 64CU Polaris clocking at 1.6 GHz with ~480 GB/s of bandwidth, then it’d compete with Nvidia’s best.

      So unless Vega is worse than Polaris, AMD will be fine.
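The back-of-envelope above can be made concrete. A minimal sketch, assuming GCN's 64 shaders per CU and counting an FMA as two flops per clock:

```python
def gcn_fp32_tflops(cus: int, clock_ghz: float, sp_per_cu: int = 64) -> float:
    """Peak FP32 throughput for a GCN-style part.

    Each shader retires one FMA (2 flops) per clock, and GCN packs
    64 shaders into each compute unit."""
    return 2 * cus * sp_per_cu * clock_ghz / 1000  # GFLOPS -> TFLOPS

# The hypothetical part above: 64 CUs at 1.6 GHz.
vega_like = gcn_fp32_tflops(64, 1.6)   # ~13.1 TFLOPS
# Same math for an RX 580: 36 CUs at a ~1.34 GHz boost clock.
polaris = gcn_fp32_tflops(36, 1.34)    # ~6.2 TFLOPS
```

The first figure lands right around the 12-13 TFLOPS quoted for Vega FE elsewhere in the thread, and the second matches the ~6 TFLOPS commonly cited for the RX 580.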

      • ptsant
      • 2 years ago

      These are semi-pro cards which are targeted at people doing actual work. The gaming-oriented Vega is coming later. The price is very high, but not unreasonable for workstation hardware.

      Otherwise, I agree. Performance and especially drivers must be good.

    • tay
    • 2 years ago

    And you thought nGreedia Founders Edition were expensive.

      • MathMan
      • 2 years ago

      When your father was a teenager, he used the Micro$oft spelling.

        • UberGerbil
        • 2 years ago

        [url=http://i.imgur.com/amS1bP6.jpg<]obligatory[/url<]

      • Topinio
      • 2 years ago

      lol wut.

      Scan has several choices for more expensive professional GPUs, up to and including an [url=https://www.scan.co.uk/products/32gb-amd-firepro-w9100-pcie-30-(x16)-gddr5-524-tflops-workstation-gpu-supports-upto-6-4k-displays-6x<]AMD £3.1k[/url<] and a [url=https://www.scan.co.uk/products/16gb-pny-nvidia-quadro-gp100-hbm2-pascal-3584-cores-4096bit-717gb-s-103-tflops-singlep-nvlink-dp14-d<]NVIDIA £7.1k[/url<] option.

        • curtisb
        • 2 years ago

        That’s apples to oranges. The FirePro and Quadro lines are professional cards, not consumer cards. For starters, the drivers are tuned completely differently. You won’t get the same gaming performance on them as on the Radeon or GeForce lines, because that’s not what they’re designed for or targeted at.

          • MathMan
          • 2 years ago

          Quadro and FirePro GPUs have some extra HW features enabled, so there’s really no good reason why they should be slower than an equivalent gaming GPU.

          And that seems to be the case for the current Pascal Quadro GPUs:
          [url<]https://hothardware.com/reviews/nvidia-quadro-p6000-and-p5000-workstation-gpu-reviews[/url<]

          • LostCat
          • 2 years ago

          They’ve been saying for a while that this first release is not meant for gamers.

            • curtisb
            • 2 years ago

            They’re marketed as Radeons. They can say that, but it won’t stop people from buying them for gaming, because that’s what the Radeon line is. Who knows why they’re saying that, but one reason is likely that they want their piece of the miners’ pie…and miners will pay through the nose for them. It could also be that the drivers aren’t optimized for gaming yet, but they’ve talked about Vega so much that they know they need to get something onto the market before people start screaming “paper launch” and “vaporware.” Vaporware has already been brought up on this very site…

            • NoOne ButMe
            • 2 years ago

            Will you also argue the Radeon WX series is aimed at gaming?

            It’s meant to be a fill-in between professional and gaming cards.

    • Kretschmer
    • 2 years ago

    Note that these are NOT the gaming GPU cards. Expect these SKUs to be insanely marked up.

    • chuckula
    • 2 years ago

    This is actually brilliant (and I’m not being sarcastic).

    AMD is finally pricing coin-miner cards in a way that guarantees it a profit, instead of selling the cards to retailers who make all the money.

    And if the prices are too high for the miners, it means the cards will at least be available for real people to buy, so it’s a win-win.

      • designerfx
      • 2 years ago

      Wha? If this card is even 3x as good at mining as an RX 580 people will buy them until they are out of stock. Price isn’t even going to be a concern at $1200, let alone if it was $1800US. Have you looked into cryptomining?

        • defaultluser
        • 2 years ago

        3x the performance of an RX 580? The latest specs say 12 TFLOPS, which is 2x the FP32 of an RX 580!

        This may find some buyers, but AMD will have resolved the availability issues by the end of July. Not seeing how this will make big sales if it’s 5x the price of an RX 580, for 2x the performance.

          • NoOne ButMe
          • 2 years ago

          Mining is not just about that…
          I remember seeing the 1080 < 1070, because of the latency of GDDR5X or something I read.

          Don’t see HBM2 help that much, but could be 3x speed still 🙂

            • Voldenuit
            • 2 years ago

            Yep. [url<]http://www.legitreviews.com/wp-content/uploads/2017/06/ethereum-best-mining-gpu.jpg[/url<]

          • Voldenuit
          • 2 years ago

          It’s possible they may have a different proportion of shaders to pixel pushing than the 580.

          Right now, the 580 at 6 TFlop/s performs similarly in games to a 1060 at 4 TFlop/s.

          If Vega is indeed 12 TFlop/s, it would line up with a 1080 at 8 TFlop/s unless they change the allocation of resources.

            • NoOne ButMe
            • 2 years ago

            4.5-5 TFLOPS you mean. Some custom cards are even over 5 TFLOPS at their shipping configs.

            FE cards are at least 4.5 in every review I can find with clocks; same for all AIB cards.

      • pdjblum
      • 2 years ago

      i give you a hard time for all your negative AMD comments, the least i can do is thank you for showing you can be objective and positive

      i have noticed other comments from you over the past week that also were pleasant to read

      i will be able to read and learn from your negative comments now that i know you offer up positive ones about AMD as well

      personally, i find most of the tech companies pretty amazing for the crazy cool stuff they bring us at prices most of us can afford, especially amd, as they truly have been a disruptive force with their amazing offerings at awesome price points as of late

      thanks again

      • DavidC1
      • 2 years ago

      3x? More like 20% better.

      TFlops are one factor, memory is another. The popular Ethereum mining won’t be much faster than Polaris.

      • ptsant
      • 2 years ago

      It depends on electricity vs upfront cost. Vega is certainly more power efficient than 2x RX580, but it is so much more expensive. If you have cheap electricity, the RX580s are better. If you don’t have cheap electricity, you probably shouldn’t be mining.

        • Zizy
        • 2 years ago

        With current mining, electricity is not really much of a factor. Prices are too high for it to matter. Sure, when the bubble bursts, electricity costs will matter, but at that point everyone will be selling their GPUs anyway.

        The main advantage of Vega here is that you can squeeze the same number of them in a system and enjoy better mining/hour.

      • ratte
      • 2 years ago

      This... again with another rather balanced comment.

      Maybe it’s all the heavy drugs I’m on at the moment but have another thumbs up

      • Voldenuit
      • 2 years ago

      A bit premature here. These are retailer store prices, not AMD official prices.

      For all we know, the MSRP is $699 and SabrePC decided to gouge miner *and* early-adopter prices.

      • jts888
      • 2 years ago

      Why doesn’t AMD just sell cards with stripped down memory for mining?

      It can’t take much bandwidth or even more than a few MB of capacity to just do brute force partial hash inversions.

      If HBM2 delivery really is the problem with the Vega launch sliding so far, it would seem like an even bigger win-win to sell Vega cards with only 1 stack of moderately clocked 1-high HBM, and maybe even with the display outputs missing for better ventilation, just so they can sell more ASICs overall and not have the miners ruin the graphics market and market share.

    • Waco
    • 2 years ago

    At that price they better be amazingly good…

      • chuckula
      • 2 years ago

      That blue color…

      And a $600 markup just for liquid cooling….

      Are we [b<]SURE[/b<] Intel didn't buy out the RTG and nobody told us?

        • derFunkenstein
        • 2 years ago

        Intel-style pricing was part of the CLA

      • xeridea
      • 2 years ago

      This card is geared more towards workstations, and these types of cards always cost a lot more. The gamers version should be a lot cheaper.

        • Waco
        • 2 years ago

        Sure, but these cards don’t have any features that the regular consumer/gamer cards don’t also have (unless I’ve missed something).

          • DancinJack
          • 2 years ago

          Unless they’re cutting down FP capabilities like Nvidia? Haven’t heard that though.

            • Waco
            • 2 years ago

            They’ve never done it in the past, and they’re following the “unlocked” route with their server chips as well.

            It’d be a surprising departure from tradition, one that would likely burn some of their good faith mindshare.

            • ImSpartacus
            • 2 years ago

            Vega 10 has slow DP, just like GP102 or GP104.

          • NoOne ButMe
          • 2 years ago

          Pro drivers I think…

      • Chrispy_
      • 2 years ago

      At compute it looks like they’re going to outperform a Tesla P100 by about 25%.

      Before you say ‘meh’, the P100 costs over $7000.

        • chuckula
        • 2 years ago

        Bear in mind that at 32-bit compute the Pascal Titan XP also outperforms the P100 by hitting 11Tflops. The P100 is really designed for 64 bit first and foremost with the other numbers being there mostly for Nvidia marketing.

        We’ll see if Vega is a fully-enabled FP64 card (meaning roughly 6.5 TFlops of FP64 if it’s a true 2:1 design).

        The Pascal Titan XP is a $1200 card direct from Nvidia, and I don’t think the lower-end price of $1200 for Vega is a coincidence:
        [url<]https://techreport.com/news/31706/nvidia-titan-xp-retakes-its-rightful-place-on-the-throne[/url<]
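The "2:1 design" arithmetic above is worth spelling out. A sketch, taking ~13.1 TFLOPS FP32 for Vega FE as an assumption (the exact peak was not confirmed at the time):

```python
def fp64_tflops(fp32_peak: float, divider: int) -> float:
    """FP64 peak given the FP32 peak and the design's FP64 rate divider."""
    return fp32_peak / divider

# Assuming ~13.1 TFLOPS FP32 for Vega FE:
two_to_one = fp64_tflops(13.1, 2)       # ~6.55 -- the "roughly 6.5" figure
one_sixteenth = fp64_tflops(13.1, 16)   # ~0.82 if Vega 10 is only 1/16 rate
```

The gap between those two outcomes is why the FP64 rate matters so much for the card's compute positioning.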

          • NoOne ButMe
          • 2 years ago

          Seems Vega 10 is 1/16 rate native. Vega 20 brings 1/2-rate FP64.

            • Chrispy_
            • 2 years ago

            Hmm, hadn’t read this, but it seems you are right.

            I guess the deep-learning and algorithm companies will be gobbling these up but they’re not going to be faster than Fury/FuryX for the sort of environmental/solar sims we’ll be running. If only Teslas didn’t cost a bajillion bucks each.

        • MathMan
        • 2 years ago

        If you absolutely need fast FP64 support and a considerably higher BW than Vega, price is probably not a huge issue.

        • Waco
        • 2 years ago

        The relevant comparison for “pro” cards is FP64, not FP32.

          • Chrispy_
          • 2 years ago

          If you’re doing scientific compute, and accuracy is essential, yes.

          I hate it when people pigeonhole “compute” to FP64. We use FP32 for simulations and environment modelling all the time. A lot of them are trying to model random behaviours anyway so a tiny amount of (consistent) error doesn’t have a huge effect on the result.

            • MathMan
            • 2 years ago

            Fair enough.

            But why, then, do you compare against a P100 and not a Titan Xp?

            • Chrispy_
            • 2 years ago

            HBM2 has much lower latency which our wind/solar simulations seem to care about a great deal. We tried an old Titan (non-X, non-XP) and the Fury X with HBM ran circles around it for simulations (was about 5x quicker).

            It depends on the specific task you’re going to be running. You’re falling into the same hole as Waco by trying to pigeonhole and generalise “compute” into a specific set of requirements. You simply can’t do that!

            • MathMan
            • 2 years ago

            Individual use case trumps everything else. If your Fury X ran 5x faster than a Titan, then go for it.

            But let’s not use that as proof for anything at the architecture level.

            You and I both know that there’s no good reason for it to be like that (not even HBM, which doesn’t have lower latency to begin with), and that a much more reasonable explanation is that some programmer didn’t really know what he was doing or never bothered to optimize for an Nvidia GPU. (Which, again, is fine from an individual product point of view.)

            • Waco
            • 2 years ago

            ECC is a requirement for real use, and this card doesn’t have it.

            • Chrispy_
            • 2 years ago

            Please explain. If throughput is more important than accuracy, why would ECC matter?

            • Waco
            • 2 years ago

            I never said throughput was more important.

            My opinions are colored by my experience, which is HPC physics simulation. Wrong answers are useless, but many other workloads don’t need precision to the same degree and others don’t even care about ECC.

            • Chrispy_
            • 2 years ago

            Ah, there you go. I thought by “real use” you were generalising. In your case if that’s physics sims, then I guess that’s it.

            Like I said in another response above, it’s really difficult to pigeonhole and generalise GPUs for non-gaming use, simply because (unlike gaming) no two use-cases will be the same:

            Int, FP16, FP32, FP64, bandwidth, latency, RAM size, architecture, scalability, power-efficiency, driver features – all of these can have order-of-magnitude implications for non-gaming usage and the only way to really know is to try that specific usage on a bunch of cards and see what influences it the most.

          • ptsant
          • 2 years ago

          Not if you are doing deep learning. I have no idea what the requirements of the usual suspects (3D, CAD, video processing) are, but the market in general mostly seems to want FP32. You could be right about the absolute high end, of course.

        • sreams
        • 2 years ago

        The P100D is even faster, but it costs over $140,000.

          • Chrispy_
          • 2 years ago

          Bargain! I’ll take three please.

      • ImSpartacus
      • 2 years ago

      Think of this as AMD’s Titan.

      • ronch
      • 2 years ago

      I suspect they either will or some price correction will immediately ensue.
