AMD’s early Vega graphics card takes a turn in San Francisco

Although Ryzen CPUs were most definitely the star of the show at AMD's event in San Francisco this week, the company had another tantalizing product hidden in plain sight in its demo room. The company finally did away with the gaffer tape and solid cases and showed off a Vega graphics card in the buff. 

The cooler itself bears a strong family resemblance to the reference Radeon RX 480 design before it, but this early card carries some extra length to accommodate the (presumably diagnostic-related) USB 3.0 plug on its snout.

One can also see a ton of components behind the GPU itself, much like the Radeon R9 Fury's board layout. 

The card uses an eight-pin-plus-six-pin PCIe power arrangement.

We've had ample opportunity to play with Vega-powered hardware before now, but we've never been able to see the card that was pushing those pixels. The fact that AMD is cooling a presumably large, HBM-equipped graphics chip with nothing more than a beefy blower cooler may bode well for Vega's performance per watt. We'll just have to wait and see.

Comments closed
    • Legend
    • 3 years ago

    Is it typical for the cpu cooler to be moving air top to bottom instead of front to back?

      • Chrispy_
      • 3 years ago

      No, it’s against the direction of airflow in a typical case; they probably did that to make the GPU more visible, since that’s the cooler and fan orientation that is least “in the way”.

      It could be to enhance cooling on the back of the pre-production graphics card, but I’d imagine the air coming off the CPU isn’t terribly useful in that regard, being only a few degrees cooler than the back of the GPU in the first place, hence why I think it’s primarily just to show the card off better.

        • BurntMyBacon
        • 3 years ago

        I believe you are correct in that this orientation is to enhance the cooling on the graphics card.
        Though, I think there is perhaps a bit more of a difference between the air temperature and the graphics card than you give credit for.

        GPUs this large often get hotter than CPUs. There exist many examples of cards with some pretty extreme hot spots on the back of the card.
        I’ve selected a relatively tame, power efficient, midrange, Founder’s Edition GTX1060 for discussion purposes:
        [url<]http://www.guru3d.com/articles-pages/geforce-gtx-1060-review,10.html[/url<]

        The hot spot here is ~86C. My CPU never breaches the 60C mark, but let's assume 70C for this setup. There is also the thermal resistivity of the heatsink to account for, so the CPU heatsink will be significantly cooler on average than 70C where the fan forces air across it. I've seen thermal images that suggest 45C is a good estimate for the hottest point on the fin portion of a tower cooler, though I can't vouch for the method, so let's give a lot of leeway and assume 60C.

        Air itself is considered an insulator, so it will only pick up a portion of the heat energy at the fins. In a closed chassis with poor airflow, this heated air will make its way back around and be reheated until it reaches equilibrium. In an open setup, or a setup with good airflow, the air only makes a single pass, or possibly two, before being exchanged for air at or near room temperature. This setup is most likely open.

        Even if we use the temperature at the fins and disregard the thermal resistivity of air, we are left with 60C as the maximum air temperature. That gives a 26C difference between the air temperature and the hot spot. The difference is made less significant by the fact that air will also be slow to pick up heat from the video card, but it is still noteworthy. A hotter GPU, a more powerful fan on the CPU tower (greater temperature difference between base and fins), or accounting for the fact that the air temperature will actually be lower than the fin temperature would all increase this difference.
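
        For anyone who wants to poke at these numbers, here is a minimal sketch of the same back-of-the-envelope arithmetic (the values are the rough assumptions above, not measurements):

        # Rough assumptions from the discussion above -- not measurements.
        gpu_hotspot_c = 86     # ~hot spot on the back of the GTX 1060 FE in the linked review
        cpu_fin_temp_c = 60    # generous upper bound for the tower cooler's fin temperature

        # Worst case: treat the exhaust air as being as warm as the fins themselves.
        # In reality the air will be cooler, since it only picks up part of the heat.
        air_temp_c = cpu_fin_temp_c

        delta_c = gpu_hotspot_c - air_temp_c
        print(f"Air-to-hotspot difference: ~{delta_c} C")   # -> ~26 C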

    • house
    • 3 years ago

    That RADEON logo better be RGB.

      • nico1982
      • 3 years ago

      I like radioluminescent paint more.

    • Billstevens
    • 3 years ago

    I never thought I would say it, but Nvidia’s and AMD’s gambles on G-Sync and FreeSync tech have pushed me to hold out for Vega. If not for a pricey FreeSync monitor I would have had a 1080 a long time ago…

    I hope Vega ends up being strong competition right out of the gate.

      • Firestarter
      • 3 years ago

      gsync is dead in the water IMO, it’s only a matter of time until Intel announces support for adaptive sync in their upcoming iGPUs

        • GrimDanfango
        • 3 years ago

        Gsync is very much alive, for the moment… but there will presumably come a point where nVidia will have to intentionally avoid supporting DisplayPort standards, beyond the point where they’re no longer optional.
        I wouldn’t put it past them to do even that… but I don’t think they’d be able to keep it up indefinitely. At some point they’ll have to give it up.

        That is assuming they don’t manage to keep leaning on VESA to keep the adaptive-sync component “optional” in perpetuity.

    • ronch
    • 3 years ago

    I think AMD’s struggles with energy efficiency in the Bulldozer and latter Thuban days ultimately pushed them to take efficiency very seriously. Few folks believed they could do it, but it seems all the things they’ve learned with Bulldozer and later iterations, what to do and what not to do, are really paying off now. Guess you could say everything happens for a reason, eh?

      • Airmantharp
      • 3 years ago

      In this way they’re catching up to Intel and Nvidia. And they have to: Intel and Nvidia didn’t go down the optimization-for-performance/watt route just for the hell of it; they were competing only with themselves, and generation over generation it was one of the main things that got them more business.

      Even if AMD is able to man up to Intel/Nvidia top-end performance (they’re close) and compete positively on price (which they’ve always been good at), they’ll still be limited in terms of market penetration if they cannot get very close on performance/watt.

        • ronch
        • 3 years ago

        Higher efficiency is a very good way to convince buyers that your product is good. True for cars, true for airplanes, true for bathroom cleaners, true for computers. It’s all about economics. It shows the product isn’t wasteful with resources, or at least makes better use of what you put in.

        • Spunjji
        • 3 years ago

        What confuses me here is that power efficiency used to be AMD’s forte, but back then nobody gave a damn.

        For the longest time (with exceptions like the 7000 series), Nvidia had focused on producing the largest, fastest chips possible for every generation at any cost – monsters like the GTX 280 and 480. Meanwhile, after the lemon that was the HD 2900, ATi / AMD had spent a lot of time releasing chips that offered serious power and cost efficiency.

        Similarly, the Athlon 64/FX and especially the early X2s clubbed the Pentium 4 and later Pentium D to death in terms of power efficiency – but nobody seemed to really care.

        The way I see it, AMD are finally getting back to a position where they have access to decent, relevant manufacturing processes on the CPU side and the opportunity to get their house back in order on the GPU side, to do what they always did well – low-cost, moderate efficiency, decent performance.

        I guess I’m still taken aback that power consumption suddenly became an “enthusiast” metric. I always cared because I like building quiet PCs, but that used to be seen as weird. Times change 😀

          • Airmantharp
          • 3 years ago

          This was really only true with Netburst, a failure that AMD emulated spectacularly with their Bulldozer project.

          In terms of AMD vs. Nvidia, Nvidia chose to focus on HPC with their big chips, literally up until the last generation where they started building ‘low-precision’ big chips that we now know as Titans and -Ti variants. Those HPC big chips required more juice to compete with AMD’s more streamlined mid-sized parts, but as of the GTX 600 series Nvidia’s mid-sized parts have outrun every AMD offering, and those parts have always been more efficient for gaming workloads.

          But back to the present: it is exciting to see AMD taking performance/watt seriously; it’s also a necessity, because at any given tier of implementation (size/noise sensitivity/price), TDP has a limit, so better efficiency translates directly into more performance!

          • freebird
          • 3 years ago

          Yep, nearly ALL the enthusiast sites back in the days of the P-IV mini heater barely cared an iota about perf/watt until Core 2 came along… how convenient. I’m sure Intel marketing money played no part in it though….

            • jihadjoe
            • 3 years ago

            Because back then things really weren’t consuming that much power. The first Presshot was 89W, and later on the Extreme Edition topped out at 115W. Performance increases were easy to do: just make the chip consume a bit more power.

            These days though we really have run into the TDP wall, and every increase in performance has to come from some sort of efficiency gain.

            • freebird
            • 3 years ago

            Actually, the code name was Prescott, which came after the original Northwood P4. The Intel P4 660 was a 115W part, as was the P4 EE 3.73, which was listed at a 115W TDP but could actually have a max dissipation of 148W.

            Not consuming that much power? How much power are Intel’s offerings today rated for? Only the 6- and 8-core parts are listed over 100W.

            Whereas the Athlon 3700+ was listed at an 89W TDP, and the Athlon 3500+ the following year dropped to only a 59W TDP. A lot of that had to do with AMD running lower clocks and using SOI, whereas the P4s needed high clocks to do the same work, which required more power.

            AMD was also introducing features such as x64 and virtualization into their consumer line earlier and across nearly all of their lineup, whereas Intel segmented their lines with 3-digit codes to tell the series (500 vs. 600), the speed (530 vs. 660), and whether certain other features were enabled or not (650 vs. 651).

            There is always going to be a thermal “sweet spot” for ANY CPU microarchitecture, which also depends on what manufacturing features were used to build which chip features.

            • I.S.T.
            • 3 years ago

            It’s funny how you’re forgetting that the Phenom and Phenom IIs had higher TDPs than 89 watts, and then when FX happened the proverbial **** hit the fan.

            It’s just so, so funny.

            No, no it’s not. Drop the fanboy [i<]bullcrap[/i<]. Ryzen (SUCH A DUMB NAME) is gonna return AMD to its rightful place as a proper Intel rival, like back in the Pentium 3 to early Pentium 4 days, then the Athlon 64 days. Revel in that. I know I am.

            You are living in the conspiracy-theory past. Embrace the present and love the good future that is coming. Not everyone is against the poor underdog AMD. They can make a crappy product. [i<]Everyone can[/i<]. Intel is not the only one.

            The Pentium 4 was only good for a short time, between the 2.4 GHz and 3.2 GHz HT days. So, about a year, year and a half. Before then, AMD was a rival or better in most ways; then when the Athlon 64 came, it beat Intel significantly in all but media creation (the one good spot for the P4 arch), and even there it had acceptable performance for most situations. Unless you were doing hardcore media work (including re-encoding of audio/video) and/or professional media work, the Athlon 64 was the better choice. Then the Core 2 hit.

            Get over these facts, already.

            • I.S.T.
            • 3 years ago

            Have you ever considered the idea that the heat generators of the late P4/P-D era might have possibly educated people on the issue of performance per watt?

            Nope. Instead you go into some grand paranoid conspiracy theory that everybody is in Intel’s pocket and AMD is the poor victim, instead of Intel outperforming AMD and AMD having screwed up a good four or five years of CPUs so badly that they had to do a 100% restart.

            Have you also considered that as hot as those old Intel CPUs got, the highest end SKUs didn’t get so hot that they had to be packaged with a watercooler? Because AMD’s highest end current SKU sure as hell does.

            Nope. AMD CPUs burn as much power and dump as much heat as the old crappy P4s/P-Ds in your eyes. All the empirical data means nothing to you, nosiree.

            For the love of God, man. Learn how to read. Apparently, you don’t know how if you have missed the last seven or more years of AMD not matching Intel in performance and heat, and then getting far, far worse when the FX series hit. Surely, someone with reading issues is the only explanation as to why you have missed so much data and have resorted to conspiracy theories as to why people dislike AMD so much right now.

            Right?

            Right?

            Right?

            I’m ****ing wrong, aren’t I?

          • chµck
          • 3 years ago

          Wasn’t it because AMD tried to save engineering costs by planning traces using automated methods rather than manually?

            • freebird
            • 3 years ago

            That was surely part of their problem… but that was strictly a Bulldozer problem. The die was less efficient because of poor management decisions, i.e. the modular design and thinking they could build faster/cheaper that way. In the case of designing CPUs, saving pennies on design can cause 100x worth of problems.

            My point was AMD had built some very power efficient CPUs back in the Athlon 64 vs P4 days, but perf/watt wasn’t talked about… mainly in my view because Intel didn’t want it talked about… websites back then wouldn’t be around long if they got on Intel’s bad side.

            • K-L-Waster
            • 3 years ago

            [quote<]My point was AMD had built some very power efficient CPUs back in the Athlon 64 vs P4 days, but perf/watt wasn't talked about... mainly in my view because Intel didn't want it talked about... websites back then wouldn't be around long if they got on Intel's bad side.[/quote<]

            I think you're assigning more importance to Intel's influence on this than is really the case.

            Up until the P4 vs. Athlon battle, power and heat weren't a major concern because they weren't causing problems. I do remember PC magazines talking about the power and heat increases, but mostly as an abstract issue -- i.e. the increases were noted and some hand-wringing made about how it couldn't continue, but it was still one of those down-the-road problems that consumers didn't need to take specific action about.

            (It's a bit like how fuel efficiency in cars is always a talking point, but most customers don't actually base their buying decisions on it until gas prices go through the roof. It only becomes a decision-making determinant when it actually causes you pain.)

            Prescott is where it actually became a problem: Intel had to back off from releasing the 4 GHz parts due to heat and power usage, so their approach of simply cranking up the clock speed to stay ahead of AMD was no longer viable. And that is also where AMD went from being that other company that only low-budget users and cranks considered to being a real option for a much broader range of customers.

          • K-L-Waster
          • 3 years ago

          Intel got a lot more power efficient because when they abandoned Netburst the architecture they fell back on was a derivative of their laptop chips. They then worked on getting more compute out of a lower power core. This served them well when a) laptops became more important than desktops in sales to consumers, and b) IT teams started concentrating on maximizing the density of CPUs in their data centers rather than relying on individual monster servers.

          I’m not as familiar with the history for NVidia, but the fact they have been aggressively chasing GPGPU computing and super computer / data center business as well as laptop design wins would again give them reason to consider efficiency in addition to raw power.

          Enthusiasts may not mind having systems that double as space heaters, but data centers and laptop owners do — and those are the areas with the most sales dollars available.

          AMD seems to have learned this later than the others, but at least they *have* learned it. Maybe now we can get products from them that don’t require noise and/or heat compromises.

      • freebird
      • 3 years ago

      Energy efficiency is difficult when you are working several nodes behind your competitor… SOI was able to make up for some of the difference back in the days of the Athlon 64.

      Everyone seems to ignore the fact that AMD also had to “compete” with another disadvantage that has only somewhat dissipated after 20 years… constantly working from a manufacturing process node that was a generation or two behind Intel’s. Now that they are on semi-equal footing (i.e. Intel 14nm > GF 14nm, and the benefits of 10nm not looking so great), AMD may truly be able to “compete” with Intel. Let’s just hope Intel doesn’t revert to the anti-competitive practices they used when AMD was competitive approx. 15 years ago…

      At least GF shouldn’t have to worry about “idle” production lines this year…and we’ll see if indeed AMD leverages Samsung for additional production capacity.

        • BurntMyBacon
        • 3 years ago

        I feel it appropriate to repost this:
        [quote=”BurntMyBacon”<]I've compiled a list of dates (according to Wikipedia) of the first processor architectures launched by AMD and Intel on each process node since 90nm. Let me know (preferably with a source) if I copied a date incorrectly.

        Intel
        90nm - Feb. 2004 (Prescott)
        65nm - Jan. 2006 (Cedar Mill)
        45nm - Jan. 2008 (Penryn)
        32nm - Jan. 2011 (Sandy Bridge)
        22nm - Apr. 2012 (Ivy Bridge)
        14nm - Sep. 2014 (Broadwell)

        AMD
        90nm - Nov. 2004 (Winchester)
        65nm - Feb. 2007 (Lima)
        45nm - Jan. 2009 (Deneb)
        32nm - Sep. 2011 (Bulldozer)
        22nm - N/A
        14nm - Mar. 2017-Est. (Zen)

        Three things of note:
        1) Not once did Intel reach the next-gen node before AMD reached the equivalent node until 14nm.
        2) Not once did AMD beat Intel to a next-gen node.
        3) For most nodes, AMD spent as much or more time competing against Intel's next-gen node as the equivalent node. (90nm being the notable exception)

        Many don't realize that AMD's first 65nm chip was Lima (Athlon64) and not Agena (Phenom). Agena didn't launch until Nov. 2007, notably only two months prior to Intel's 45nm Penryn.[/quote<]
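
        For a rough sense of the gaps, here is a quick sketch that turns the dates quoted above into how many months AMD trailed Intel on each equivalent node (month precision only, with the first of the month assumed):

        from datetime import date

        # First CPUs on each node, per the dates quoted above (month precision).
        intel = {90: date(2004, 2, 1), 65: date(2006, 1, 1), 45: date(2008, 1, 1),
                 32: date(2011, 1, 1), 22: date(2012, 4, 1), 14: date(2014, 9, 1)}
        amd = {90: date(2004, 11, 1), 65: date(2007, 2, 1), 45: date(2009, 1, 1),
               32: date(2011, 9, 1), 14: date(2017, 3, 1)}  # no AMD 22nm part

        for node, amd_date in amd.items():
            lag = (amd_date.year - intel[node].year) * 12 + (amd_date.month - intel[node].month)
            print(f"{node}nm: AMD arrived roughly {lag} months after Intel")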

    • ronch
    • 3 years ago

    Just a thought: in the computer industry there are three main players: Intel, AMD, and Nvidia. Of the three, only AMD does serious CPUs and graphics. And of the three, AMD is the smallest in terms of resources, assets, etc. And AMD competes with both Intel and Nvidia in these two companies’ areas of main expertise. With boatloads of cash and other resources, Intel pours money into CPU development and manufacturing, and Nvidia pours money into graphics and other side projects that largely have to do with their graphics chips. AMD, with their limited resources and being the David, competes against not one but TWO Goliaths. And that’s obviously so hard that few companies can pull it off, especially in a cutthroat, fast-paced industry that needs very specialized, rocket-science skill sets to compete in. And in light of Ryzen setting the Internet on fire for the last few days, and now Vega starting to make some more noise as well, you really have to applaud AMD for achieving what they have done. Nothing short of a miracle. This should make for a good case study in academia.

      • Airmantharp
      • 3 years ago

      Your fanaticism is showing.

      While their survival against these two juggernauts may appear heroic, understand that when you say ‘compete’, you’re generally wrong. AMD isn’t ‘competing’, for example, in HPC, in datacenters, in super computers, in high-end gaming, in anything that requires robust internal software development and external developer relations like VR and HPC, and so on.

      This isn’t a criticism of AMD; they’re doing [i<][b<]extremely[/b<][/i<] well given their resources, but they are limited in reach. We're celebrating their return to the consumer space, and your main point does explain part of their survival as it allowed them an edge in competing for the console contracts that they won, but they have a long road to go if they are going to fully compete with Intel and Nvidia.

        • ronch
        • 3 years ago

        [quote<]Your fanaticism is showing[/quote<] And your arrogance is showing.

          • K-L-Waster
          • 3 years ago

          This isn’t a movie or a video game: it’s business in the real world.

          Audiences of entertainment vehicles may like plucky underdogs who somehow manage to stick it to a more powerful opponent just before the credits roll. But in business, you either produce a profit and good returns for your investors / owners, or you go out of business. Capital doesn’t care about feel-good stories, it only cares about the return.

            • ronch
            • 3 years ago

            I think we’re being a bit dishonest when we insist that we aren’t fans of any company. I may not care what brand of shoes I buy or what brand of soap I use, but one way or another we are fans of certain brands, or at least put more trust in or preference towards a brand. Most folks don’t care which CPU they’re using as long as the computer works, but gerbils do. Some insist on Intel, some root for AMD. Some pony up the cash for relatively expensive phones like the iPhone even when perfectly good but cheaper alternatives exist. I think it’s human nature to want to be a part of something. A movement, an alliance, a story. And in the x86 CPU market there is a story that has been going on for a long time.

            • Airmantharp
            • 3 years ago

            I think you’re being a little reductionist when you apply your personal paradigm to others. I care about the product. I’ve skewed toward Intel/Nvidia not because I’m a fan of their products, but because they’ve worked hard to earn my business.

            That’s not fanboyism, that’s just being smart with my money.

            If AMD steps up (and they will have to step up and earn it), they may very well get my business.

            • K-L-Waster
            • 3 years ago

            Same here. I’ve owned AMD and Intel CPUs in the past, and I’ve owned GPUs from NVidia, ATI (pre-AMD buyout) and Matrox. When I buy new equipment I do so based on which product matches my needs and budget. Recently that’s meant Intel and Nvidia because they’ve had the best overall combination for my needs — however, with RyZen and Vega that may change. Or it may not: depends on how the products stack up.

            Being a fan really belongs in things like sports, music, video games, etc. Equipment purchases should really be made on a cost/benefit basis rather than a rah-rah-go-team basis. (“If it had no logos on it, would I still want it?”)

      • Freon
      • 3 years ago

      I think you’re putting the cart before the horse. We have yet to see any independent benchmarks with this product. Prerelease hype and actual performance are not the same thing. Nor is actual commercial success, revenue, profit, etc. AMD has been failing at the latter for years.

      As a consumer I hope they’ll do very well, but the realist in me says they’re unlikely to beat Intel in gaming benchmarks as Intel is likely going to retain a significant IPC advantage, and only some specific conditions will truly take advantage of their higher core count. I.e. why we see so much Cinebench and Blender stuff published, but pretty much nothing else.

      They’re going to have to market it as a multitasker, and keep pushing the specific conditions and benchmarks where they win out. It’s really yet to be seen how successful these efforts will be.

    • Waco
    • 3 years ago

    [url<]https://youtu.be/rYzc_H9cgqM?t=94[/url<] 🙂

      • freebird
      • 3 years ago

      You meant to post this YouTube video, I think… which is what I’ll be singing when I add it to my Ryzen system…

      [url<]https://www.youtube.com/watch?v=ui0EgRsFVN8[/url<]

        • Waco
        • 3 years ago

        I’d rather go with “Hail to Vega” if it’s good. 🙂

    • Kougar
    • 3 years ago

    I’m going to assume the bit on the end is because it’s a development/engineering board. But even with that removed the card looks way larger than Fury X… what’s all that space for if it’s HBM2 equipped, just for a large cooler?

      • Jeff Kampman
      • 3 years ago

      It’s air-cooled, not liquid-cooled.

        • Turd-Monkey
        • 3 years ago

        That’s fair, but it seems odd that the PCB is that long.

        The PCB for the 480 does not extend the full length of the cooler, and it doesn’t have the layout benefits of HBM.

        [url<]https://techreport.com/r.x/rx480review/bottom.jpg[/url<]
        From: [url<]https://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/3[/url<]

        Also, is that a USB 3 Type-B connector at the end?

          • ImSpartacus
          • 3 years ago

          I think the dev cards would be forced to extend the PCB all the way along the cooler because they need to support that little debug nub at the end.

          Linus did a video about these dev cards during CES and got Raja to talk about that debug part.

          [url<]https://youtu.be/Y8tDaPLHxiE?t=3m15s[/url<]

          • Kougar
          • 3 years ago

          Yep, exactly! Seen plenty of coolers larger than the cards.

          Wondering if there’s extra beefy power circuitry going on. The PCB size and 8+6 power arrangement gives me the impression they’re pushing this one for aggressive clocks.

            • ImSpartacus
            • 3 years ago

            Vega 10 is strongly rumored to be a 4096 SP part with clocks in the 1500 MHz range and possible “IPC” architectural improvements on top of that.

            So yeah, it’s definitely going to be a 225-300W part.

            • Kougar
            • 3 years ago

            Would be very cool if so, but too much of a gamer’s wishlist in those rumors for me.

            4096SP is the same as Fury X. Also AMD already stated they redesigned GCN to remove the 4 shader engine design limitation, so there is no reason they are capped at 4096 anymore.

      • Lordhawkwind
      • 3 years ago

      My Fury Pro is a massive card, although the actual PCB is only the same size as a Nano’s. The extra length is the shroud for the cooler, and I have the Sapphire Tri-X cooler, which is awesome. Virtually silent even under heavy load. Wish I could say the same for my i7-7700K’s tower air cooler, but at least it keeps it under 30C at idle at 4.6GHz on all 4 cores.

      • Wall Street
      • 3 years ago

      When LTT was originally shown the engineering sample, Raja said that this Vega card had more probes than any previous AMD engineering sample to make sure that they got the power and temp characteristics right. I bet that they had to expand the board to add an ammeter and voltage check points to each of the VRMs and the current sources from the 8+6-pin, motherboard 12V, and 3.3V supplies. There is also a strong possibility that a sample like this is designed to be much more configurable than a final product and is also possibly overbuilt. I would assume that they could make the PCB smaller if it didn’t need to have so much testing equipment.

      • Prestige Worldwide
      • 3 years ago

      Nope. It’s AMD’s response to [url=https://upload.wikimedia.org/wikipedia/commons/thumb/4/47/Nintendo-64-Memory-Expansion-Pak.jpg/1280px-Nintendo-64-Memory-Expansion-Pak.jpg<]this[/url<].

    • ImSpartacus
    • 3 years ago

    Would it have killed you to take a crystal clear image of the power connectors?

    I mean, when [url=https://youtu.be/Y8tDaPLHxiE<]Linus got a video with those Vega cards[/url<], they taped the hell out of it to hide the power connectors. Seems like a great scoop to be able to get a pristine picture of what AMD was previously trying to hide.

    I mean, none of us would be terribly surprised to see an 8+6-pin setup, but it still seems like this was a good opportunity for TR.

      • morphine
      • 3 years ago

      You know about physics and light, right?

        • willmore
        • 3 years ago

        I am betting ‘no’.

        • ImSpartacus
        • 3 years ago

        Haha, I was intending to talk more about the angle than any actual clarity metric, but I understand how that was poorly worded.

      • Jeff Kampman
      • 3 years ago

      I dug around on my SD card and found a picture that does indeed confirm the card uses an eight-pin-plus-six-pin power arrangement. I hope the quality meets your standards.

        • willmore
        • 3 years ago

        Thanks! It meets mine!

        • ImSpartacus
        • 3 years ago

        Savvy of you to take extra pics.

        And it looks perfectly sufficient to me. I can’t say I’ve ever been the type to bicker too much about photo quality, though I know there are fans that get concerned with that, so I feel your pain.

        I wouldn’t be surprised if that picture gets picked up and posted elsewhere as a rock-solid confirmation of a 225-300W TDP for Vega 10 (though none of us are terribly surprised). Good exposure for TR.

      • chuckula
      • 3 years ago

      [quote<]Would it have killed you to take a crystal clear image of the power connectors? [/quote<] YES! Now stop trying to kill Kampman before the RyZen review gets published!

    • cldmstrsn
    • 3 years ago

    Even though I bought a 1080 last year I am very interested in Vega. If they hit it out of the park with Ryzen and Vega I definitely want to do an all AMD build just like the old days.

    • UberGerbil
    • 3 years ago

    In that first picture, what is that 4-digit LED display supposed to be telling me?

      • JosiahBradley
      • 3 years ago

      It’s the temperature in Hex. It is very hot in that case.

        • Ryu Connor
        • 3 years ago

        I’m guessing there is a sarcasm tag in your response? This is where we make the joke about the product being named after a star?

        02A7

        0000 0010 1010 0111

        512 + 128 + 32 + 4 + 2 + 1 = 679

        The component solder would be failing at those kinds of temperatures.

        I’d guess those LEDs are a POST code display.
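
        For anyone following the arithmetic, here is a tiny sketch of the same conversion (treating the “02A7” readout from the photo as a hexadecimal value):

        code = "02A7"            # the readout from the photo, treated as hexadecimal
        value = int(code, 16)

        print(f"{value:016b}")   # -> 0000001010100111 (the bit pattern above)
        print(value)             # -> 679, i.e. 512 + 128 + 32 + 4 + 2 + 1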

          • JosiahBradley
          • 3 years ago

          Massive sarcasm tag. I knew that was a huge hex number; that’s why I said it was hot.

            • freebird
            • 3 years ago

            says who? maybe they are taking their temp reading in Kelvin… 😀

            • K-L-Waster
            • 3 years ago

            679 Kelvin = 762 Fahrenheit… that’s going the wrong way….

      • LoneWolf15
      • 3 years ago

      Mainboard diagnostic LED.

      • davidbowser
      • 3 years ago

      I think it is ASCII hex for “buffalo”

      • Krogoth
      • 3 years ago

      On a more serious note, it is just a diagnostic code for debugging/troubleshooting for the motherboard.

      You can find such displays on higher-end motherboards.

        • thereason8286
        • 3 years ago

        I thought it was just stating that of the 8+6 pins, only the 8-pin is being used currently. So whenever it draws more than that amount, the 6-pin will also light up.

      • mdkathon
      • 3 years ago

      LoneWolf15 is right, it’s going to most likely be some sort of POST or diagnostic code. Hard to say what it means without knowing more about the board itself.

      • BurntMyBacon
      • 3 years ago

      It’s the power usage of the card. It’s between 225 and 300 so it must be the power usage of the card.

      On a serious note, I concur with some of the others here. It is most likely a motherboard debug display. It is probable that the motherboard is in a testing mode, so the motherboard may be a common board, but the code may not be something an end user would see.

    • tahir2
    • 3 years ago

    2nd brightest star in the northern hemisphere spotted 25 light years from Earth.

    Viva Vega.

      • jokinin
      • 3 years ago

      I wonder what they are saving “Sirius” for…

        • derFunkenstein
        • 3 years ago

        Satellite radio

        • chuckula
        • 3 years ago

        Sirius turned out to be a real dog.

      • Krogoth
      • 3 years ago

      That’s not right though.

      The second-brightest star that is visible from the northern hemisphere sky is actually Arcturus, a.k.a. Alpha Bootis.

      Vega was just one of the prototypes for “1” on the magnitude scale and forms one of the points of the famous “Summer Triangle”.

      /astronomy nerd

        • Peter.Parker
        • 3 years ago

        I’m impressed
