Updated: Radeon Vega Frontier Edition launches today for $999 and up

AMD's Radeon Vega Frontier Edition graphics card launches today, and we have official answers for most of the questions that have arisen about the Vega GPU's specs and capabilities over the past couple of months. The Frontier Edition family will comprise a pair of cards: an air-cooled model that's launching today for $999, and a liquid-cooled version that's launching in Q3 for $1499. AMD isn't talking about the specs or performance of the liquid-cooled card today, but it's divulging enough for me to conclude that my predictions of the chip's clocks and peak performance in May were correct for the most part.

First up, the air-cooled Vega FE will have a "peak engine clock" of 1600 MHz. With 4096 stream processors spread across 64 "nCUs," that'll yield 26.2 TFLOPS of FP16 performance and 13.1 TFLOPS of FP32 math. AMD further revealed that the card will perform FP64 operations at 1/16 the FP32 rate, or 819 GFLOPS. Given this card's focus on pro visualization, game development, and deep-learning tasks, the fact that it's running FP64 at the minimum rate possible for a GCN-derived part isn't that great of a surprise.
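Those peak figures follow directly from the shader count and clock. Here's a quick sanity check; the two-FLOPs-per-SP-per-clock FMA factor is the standard assumption for GCN parts, not something AMD spelled out:

```python
# Back-of-the-envelope check of AMD's quoted throughput numbers.
STREAM_PROCESSORS = 4096
PEAK_CLOCK_HZ = 1600e6          # 1600 MHz "peak engine clock"

# Each SP retires one fused multiply-add (2 FLOPs) per clock at FP32.
fp32_tflops = STREAM_PROCESSORS * PEAK_CLOCK_HZ * 2 / 1e12
fp16_tflops = fp32_tflops * 2           # packed math: two FP16 ops per lane
fp64_gflops = fp32_tflops / 16 * 1000   # 1/16 the FP32 rate

print(f"FP32: {fp32_tflops:.1f} TFLOPS")   # 13.1
print(f"FP16: {fp16_tflops:.1f} TFLOPS")   # 26.2
print(f"FP64: {fp64_gflops:.0f} GFLOPS")   # 819
```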

Each Vega Frontier Edition card will have 16GB of "high-bandwidth cache," AMD's new internal term for what we used to refer to as graphics RAM. This conceptual HBC is likely HBM2 memory in practice, given its 2048-bit bus width and 1.89 Gb/s per-pin transfer rate (double the 945 MHz clock rate). In aggregate, that means the card has 483 GB/s of memory bandwidth to tap.
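The aggregate figure is just that per-pin rate multiplied across the bus, then converted from bits to bytes. A minimal sketch:

```python
# Aggregate memory bandwidth from the per-pin rate and bus width
# quoted above (2048-bit bus, 1.89 Gb/s per pin).
BUS_WIDTH_BITS = 2048
PER_PIN_GBPS = 1.89             # double the 945 MHz base clock (DDR)

bandwidth_gb_s = BUS_WIDTH_BITS * PER_PIN_GBPS / 8  # bits -> bytes
print(f"{bandwidth_gb_s:.1f} GB/s")  # 483.8, which AMD quotes as 483
```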

In an interesting disclosure, AMD says the Vega FE will have peak polygon throughput of 6.4 GTris/s, a figure that may confirm the fundamental organization of the Vega GPU as four main shader clusters. AMD further told me that the card can handle 256 texels per clock, and that it has 64 ROPs. The company projects the air-cooled card will slot into a thermal envelope of less than 300W. The air-cooled card requires two eight-pin PCIe connectors, and it offers three DisplayPorts and one HDMI port.
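The geometry figure is what hints at the four-cluster layout: assuming one triangle per clock per shader engine (the usual GCN arrangement, not something AMD confirmed outright), the quoted rate divides out evenly at the peak clock:

```python
# Inferring the shader-engine count from the quoted triangle rate,
# assuming one triangle per clock per engine (a GCN convention,
# not an AMD disclosure).
PEAK_CLOCK_GHZ = 1.6
QUOTED_GTRIS_PER_S = 6.4

tris_per_clock = QUOTED_GTRIS_PER_S / PEAK_CLOCK_GHZ
print(tris_per_clock)  # 4.0 -> consistent with four shader engines
```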

None of the usual suspects in the PC hardware media got a Vega Frontier Edition card to test, and we expect that's primarily because of the card's pro graphics focus. AMD envisions Frontier Edition cards crunching through deep-learning prototypes before they're deployed out to Radeon Instinct server farms, powering photorealistic visualization of giant data sets, and underpinning every step of the game-design workflow. AMD says pro users will be able to flip between a "Radeon Pro Mode" and a "Gaming Mode" in the card's drivers to serve those tasks.

Product pages for the air-cooled and liquid-cooled versions of the Frontier Edition are live on Newegg now, although prospective buyers can only sign up and wait for availability news at this time. If you've got a line on a Frontier Edition, let us know if you can be without it for a few days.

Comments closed
    • abiprithtr
    • 2 years ago

    “…my predictions of the chip’s clocks and peak performance in May were correct for the most part.”

    juanrga, is that you?

    • chuckula
    • 2 years ago

    Interesting thread in the forums: https://techreport.com/forums/viewtopic.php?f=3&t=119788&view=unread#unread

    Funny how downvotes tend to accumulate over any suggestion that these Vega parts should actually be tested in the real world. Must be Ngreedia fanboys who are afraid of Vega... yeah, that's it. It's also interesting how nobody was screaming at TR not to test the 7900X even though the reviews went live a week before the official launch. Here we have a product that purportedly launched on Monday, but apparently any real-world numbers must stay secret.

    • psuedonymous
    • 2 years ago

    13.1 FP32 TFLOPs for Vega to Fury’s 8.6 FP32 TFLOPs scales with the exact same 1.52x as the clock rate: 1600MHz vs. 1050MHz. A clock bumped Fury with the memory controller swapped from HBM to HBM2 (half the width, a bit less than double the per-pin speed) is looking pretty likely. The SPs themselves have that doubled FP16 performance from packed operations, and we’ve seen that same change in Polaris for consoles.

      • stefem
      • 2 years ago

      Is the Vega FE clock a boost or a base figure?

      • Voldenuit
      • 2 years ago

      I don’t think we can assume identical architecture from theoretical maximum TFLOPs.

      IIRC, Fury was never able to fully utilize its shader resources because of the scheduler.

      In theory, AMD could keep the same CU architecture yet still drastically improve real-world performance by expanding or redesigning the scheduler. So I think it’s a bit too soon to call Vega a die-shrunk Fury.

      • brucethemoose
      • 2 years ago

      That would make the die pretty small, right? I thought Vega was a big chip.

    • Unknown-Error
    • 2 years ago

    Rumored to be fast, very fast but very very power hungry.

      • Krogoth
      • 2 years ago

      GP102 and GP100 aren’t exactly that much better either.

        • stefem
        • 2 years ago

        Hmmm… last time I checked there was a big difference in efficiency

          • Krogoth
          • 2 years ago

          GP100 and GP102 still pull a ton of current to operate and they were not as efficient at gaming as their lesser Pascal brethren.

          1080 and 1070 are still the king at power efficiency with gaming workloads.

          It doesn’t matter though as GP100 and GP102 were designed with high performance in mind not power efficiency.

    • Wall Street
    • 2 years ago

    If Vega really is still limited to four triangles per clock as the 6.4 GTris/s rate suggests, then Vega is in for a world of hurt and will probably remain noncompetitive in many gaming workloads. It sounds like geometry shaders aren’t active for all workloads.

      • HisDivineOrder
      • 2 years ago

      AMD prematurely releases product with huge hype that doesn’t game as well as it does everything else? Wow! This is so unprecedented…

        • Hattig
        • 2 years ago

        Note that the Game Mode drivers run alongside the Pro Mode drivers (that is the exact use case given by AMD: use a pro app to design the game, then launch the game in Game Mode to see how it behaves on game-optimised drivers).

        That is going to affect performance, because the likely way they do that is some form of virtualisation, which may mean an upper limit on resource allocation that is less than the full Vega card, plus any virtualisation overhead. The result may be 3/4 of a Vega, for example.

        The fact is that the guy who owns an FE also ran a pro benchmark, and the card did excellently there, when it had access to 100% of the GPU.

      • Krogoth
      • 2 years ago

      Wait until gamer-oriented versions of the silicon come out. The original GTX Titans (GK110) weren’t exactly stellar performers at gaming at their debut. They were somewhat faster than the 680 and 7970, and they were eclipsed by gamer-oriented versions of GK110, a.k.a. the 780 and 780 Ti.

      Their gimmick was general compute performance (mainly FP32).

        • Freon
        • 2 years ago

        I don’t think AMD has enough resources to really spin off significantly different silicon for the pro v gamer parts. I would expect them to just burn off a fuse to hamper gamer parts in FP64 or whatever.

      • brucethemoose
      • 2 years ago

      Wrong reply.

    • torquer
    • 2 years ago

    more quadrilaterals

    • ronch
    • 2 years ago

    6,400 million triangles per second.

    Hey I still have my Voodoo3 3000 here. 7 million triangles per second.

    • deathBOB
    • 2 years ago

    Seems weird not to provide review samples. Do they just not have cards available?

      • DancinJack
      • 2 years ago

      This is usually how it works for “pro” cards. Nvidia doesn’t usually give out Quadro cards either. AMD hasn’t historically, at least as far as I know, given out FirePros for review either.

      I’m sure there have been exceptions, but IIRC this is the regular process.

        • BobbinThreadbare
        • 2 years ago

        Pro card drivers are usually more accuracy focused than speed focused so the results aren’t apples-to-apples with consumer cards.

      • Flapdrol
      • 2 years ago

      Considering the news of the past few months, I don’t think they have a lot of them.

      And new series of GPUs seem to sell out more often than not, so they might as well sell all of them.

    • Voldenuit
    • 2 years ago

    “This conceptual HBC is likely HBM2 memory in practice, given its 2048-bit bus width and 1.89 Gb/s transfer rate (double the 945 MHz clock rate).”

    Jeff, why report transfer rate per channel bit? That makes it hard to compare card bandwidths. 1.89 Gb/s x 2048-bit bus gives 3870 Gb/s, which is 483 GB/s of total GPU memory bandwidth. It’s a lot easier for readers to compare with existing card specs (484 GB/s for the 1080 Ti, 320 GB/s for the 1080, etc.) using the latter number.

      • Pwnstar
      • 2 years ago

      Exactly. Nobody cares what the bitrate is. We want bytes.

      • jihadjoe
      • 2 years ago

      Even the unit used is incorrect. The first part should be GT/s (gigatransfers per second); multiply that by the bus width to get the transfer rate in gigabits/s, then divide by 8 (or perhaps 10, depending on how the GPU does its memory signalling) to get gigabytes/s.

    • Anovoca
    • 2 years ago

    That yellow R in the corner just reminds me too much of the Rockstar Games logo.

      • morphine
      • 2 years ago

      Just wait a couple days, they’ll start suing anyone that adds a third-party cooler to their card.

    • chuckula
    • 2 years ago

    “If you’ve got a line on a Frontier Edition, let us know if you can be without it for a few days.”

    I think we need some AMD power to get TR a review unit. Here it goes: Wassonjuice Wassonjuice WASSONJUICE!!

      • juampa_valve_rde
      • 2 years ago

      Buffalo?

        • chuckula
        • 2 years ago

        No, these Vegas are launched (at least as far as pulped wood is concerned). The Buffalo has moved on to greener pastures.

    • chuckula
    • 2 years ago

    “The Frontier Edition family will comprise a pair of cards: an air-cooled model that’s launching today for $999, and a liquid-cooled version that’s launching in Q3 for $1499.”

    Now that’s what I call aggressive pre-pre-launch price cutting: https://techreport.com/news/32105/rumor-retailers-reveal-radeon-vega-fe-pricing

      • blahsaysblah
      • 2 years ago

      It’s a real advantage that they aren’t required to maintain insane margins, or else it gets read as a failure and the stock price takes the hit.

        • chuckula
        • 2 years ago

        We’ll see about that.

        AMD’s stock price is mostly based on assumptions that their margins are going to actually hit the same level of Nvidia/Intel in the foreseeable future.

        For all the hype about AMD, when Lisa Su actually announced real margin target numbers at their investor’s day press conference, it precipitated the stock’s largest single-day drop in over 12 years: http://www.marketwatch.com/story/amds-stock-plunges-toward-biggest-loss-in-over-12-years-2017-05-02

          • NoOne ButMe
          • 2 years ago

          It is not based on those margins. AMD was clear at FAD 2017 that they aim to get company-wide margins over 40%…

            • chuckula
            • 2 years ago

            1. Lisa Su announced AMD’s target margins.
            2. The next day the stock tanks.
            3. Do you think that 1 & 2 occurred in that order because investors were excited at what Lisa Su said?

            • NoOne ButMe
            • 2 years ago

            The stock did drop $1.50, but it has since climbed back up, only falling again in the last few days.

            • cynan
            • 2 years ago

            …And crapping all over itself along with the rest of the tech sector today. Yuchh.
