Radeon RX 480 reference card gets dissected at Chinese site

Everyone loves a naked graphics card with all its parts laid bare for the world to see. The latest beauty to grace the front pages of websites everywhere is none other than the new hotness of the moment, the Radeon RX 480. Chinese website PConline somehow got its hands on what appears to be a Sapphire RX 480 8GB card, and it proceeded to do what anyone does in that situation: disassemble it and take pictures for voyeurs to gawk at.

Image: PConline

Much like the VisionTek card we covered earlier, the Sapphire RX 480 looks like a dual-slot card with a bog-standard blower-type cooler. A single 6-pin PCIe power plug is apparently enough to feed the Polaris GPU inside. The heatsink in the card PConline dissected is a standard aluminum affair, albeit with a copper center. However, it's best described as "tiny," as graphics card heatsinks go. Instead of a single aluminum piece covering almost the entirety of the card, the main heatsink appears to be roughly the size of the blower fan in front of it, and it touches only the small Polaris GPU. A secondary metal baseplate looks to dissipate heat from the RAM chips and VRMs. There aren't any heatpipes, golfball-textured fan blades, or LEDs in sight, either.

The 8GB of GDDR5 on board this purported RX 480 card is made up of Samsung chips, and the power section is a six-phase affair. Finally, the output section comprises three DisplayPort connectors and a lone HDMI output. According to PConline, multiple store listings for RX 480s are already showing up in China at ¥1999, just slightly north of $300. Given that the country's VAT rings in at 17%, the card's pre-tax price is roughly $260, not too far off from the $229 price the rumor mill is predicting for 8GB RX 480s. Look forward to June 29 for the official release of the RX 480.
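For the curious, the pre-tax figure above is easy to sanity-check. The sketch below assumes an exchange rate of roughly 6.6 CNY per USD (a mid-2016 ballpark, not a figure from the article); the ¥1999 listing and 17% VAT come from the story itself.

```python
# Back out the pre-tax price of a ¥1999 listing under China's 17% VAT.
CNY_PER_USD = 6.6   # assumed mid-2016 exchange rate
VAT_RATE = 0.17     # China's value-added tax

listed_cny = 1999
pre_tax_cny = listed_cny / (1 + VAT_RATE)   # strip the VAT from the sticker price
pre_tax_usd = pre_tax_cny / CNY_PER_USD

print(f"Listed price:  ${listed_cny / CNY_PER_USD:.0f}")   # about $303
print(f"Pre-tax price: ${pre_tax_usd:.0f}")                # about $259
```

At that assumed rate, the math lands within a rounding error of the article's "roughly $260" figure.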

Comments closed
    • sluggo
    • 3 years ago

    Wonder if I could refit half of the cooler off my old Fermi card?

    • the
    • 3 years ago

    Hrm, got a malware warning from visiting PConline.

      • mesyn191
      • 3 years ago

      Probably from the ads. uBlock, or an equivalent, is almost a requirement these days for relatively safe web browsing.

        • auxy
        • 3 years ago

        Just don’t use it on TR! (ΦωΦ)

    • brucethemoose
    • 3 years ago

    For reference, here’s the stock 6870/7870 heatsink:

    [url<]http://i.neoseeker.com/neo_image/197396/article/AMD_HD_7870_7850/HD%207870%208.jpg[/url<]

    • DPete27
    • 3 years ago

    Prepare for a rehash of the Hawaii launch: a reference cooler so inadequate that it's the only thing that sticks in people's minds. It kills sales of even custom cards, because the chip gets tagged as a "space heater/dustbuster."

      • auxy
      • 3 years ago

      I’m afraid of this too. The Hawaii processor was and is so amazing, and yet it has this stupid reputation, like the GTX 480 (which was also a great GPU!). Ugh. (´Д⊂ヽ

      • mesyn191
      • 3 years ago

      If it's true that it only typically uses 100W, then that HSF looks fine to me.

      It will probably benefit greatly from a better third-party HSF if you want to add volts and OC much past default, though.

    • USAFTW
    • 3 years ago

    That has got to be the cheapest hunk of aluminium I’ve seen in a graphics card in a very, very long time.

    • unclesharkey
    • 3 years ago

    I am as excited as when ATI released the R300 in the Radeon 9700.

      • Waco
      • 3 years ago

      I’m sitting here hoping it’s not the R600 all over again…and I even bought an R600. 😛

        • USAFTW
        • 3 years ago

        I think if it performs close to a Fury, it would look like a rehash of RV670 (HD 3870). That was as fast as a lame-duck 512-bit GPU and drew half the power. This would do the same versus the 390X, not that that card was as bad as a 2900 XT.

          • Waco
          • 3 years ago

          I meant the hype train. R600 was hyped to high heaven and sucked. 🙂

    • WhatMeWorry
    • 3 years ago

    Just ship the damn card already!

    • brucethemoose
    • 3 years ago

    I hope AMD learned their lesson from their older blower coolers… Everything after the reference 6850 was notoriously loud.

      • Chrispy_
      • 3 years ago

      That looks loud to me; it's a small aluminium block for what is rumoured to be around 110W of thermal load.

      Over the last decade of GPU models, once you move into >100W territory you have two options: heatpipes, or noisy fans at much higher RPM.

      Even a lot of the 90W GTX 950 cards have heatpipes to keep noise levels down, so I think AMD is just going to launch with a rubbish, cheap cooler to get the early cards out at a low, low price and then rely on the vendors to fit their third-party coolers over the coming months.

      I dunno, maybe I'm wrong – I'd like to be wrong – but it has now been 13.5 years since the first noisy GPU cooler, and a lot of lessons have been learned since then. You can have quiet or you can have cheap; choose one.

        • Magic Hate Ball
        • 3 years ago

        I do find it interesting that the RX 480 blower takes in air from both the top and the bottom.

        Most blowers we see have only one intake (on top), because they have PCB under them.

          • auxy
          • 3 years ago

          The GTX 480 did this too. I think this was being done as far back as the 9800 GTX or 9800 GX2! (*’▽’)

          • anotherengineer
          • 3 years ago

          Well, typically GPU blower coolers are very poorly designed.

          Ideally, the internals should look more like this, with tight clearance on one side:

          [url<]http://www.comtherm.co.uk/fanscroll.gif[/url<]

          and the intakes should have a nice radius into the center of the fan, not into the blades like the RX cooler:

          [url<]http://www.canarm.com/HVAC/Equipment_Blowers/GDD_Series_Centrifugal_FC_DD_DWDI_Blowers[/url<]

          fully open on both sides, and then with turning vanes inside the cooler to direct air evenly over the full width of the heatsink.

          Aftermarket blowers like these were being made for the Radeon 9800 Pro (and Nvidia cards too) way back in 2003, and they were way better than the stock coolers, but wattage was small compared to today.

          [url<]http://pclab.pl/zdjecia/artykuly/surmacz/coolery-gpu/silencer8.jpg[/url<]
          [url<]https://www.arctic.ac/worldwide_en/nv-silencer-4-rev-2.html[/url<]

        • brucethemoose
        • 3 years ago

        But how much more would a slightly larger slab of aluminium cost? $1? $2? Maybe $5 for a bit of extra copper?

        Past a certain point, penny pinching just isn’t worth the reputation hit.

          • anotherengineer
          • 3 years ago

          True, but then there's the fabrication/manufacturing cost and time. Extruding a chunk of aluminium into a finned heatsink, I reckon, is cheaper and faster than making heat-pipes, fins, and a baseplate, soldering the heat-pipes to the base, punching holes in the fins to slide over the heat-pipes, and then putting all the fins on the heat-pipes with proper spacing.

        • travbrad
        • 3 years ago

        [quote<]I think AMD is just going to launch with a rubbish, cheap cooler to get the early cards out for a low, low price and then rely on the vendors to fit their third-party coolers over the coming months.[/quote<]

        Better than launching with rubbish coolers and charging a $100 price premium, I guess (see Founders Edition).

      • Shobai
      • 3 years ago

      Notice the foam around one end of the heatsink – it looks like AMD's gone to a bit of trouble to ensure that all air from the blower travels through the heatsink, then exits over the VRM sink integrated into the baseplate. With the curve on one side of the heatsink, the path of least resistance is through the shorter channels – directly over the die. It's nice to see that sort of thought going into a design.

      Having said that, I had an HD4890 before my reference R9 290, and both of those blowers were loud and clicky. Here’s hoping that they’ve managed to find a quieter solution that performs as they need it to.

    • auxy
    • 3 years ago

    3x DP and 1x HDMI seems weird for a card targeting the low-midrange segment. I doubt most people considering $200 GPUs have even one DP monitor, much less three. DP is still dishearteningly uncommon on low-end monitors. ( ;∀;)

      • grazapin
      • 3 years ago

      But DP is trivial to adapt to HDMI or DVI with a passive adapter. Going the other direction required a more expensive active adapter, last time I checked.

        • Chrispy_
        • 3 years ago

        Yep. DP to anything else digital is trivial, and I'd expect all but the cheapest cards to come with an adapter in the box.

        Does anyone know if Polaris will do analogue natively? I’d love it if they’d finally ditched that legacy burden.

        With AMD lowering the cost of entry, it seems reasonable for people with a godawful old screen to spend some of the savings on something from this decade. Newegg has 27 screens with a DVI port for under $100, and some of those at $80 are 1080p, IPS, and have low pixel response times. Why on earth would you want to keep using some old piece of junk?

        I know there were some good, expensive screens around with D-SUB only, but they're likely to be too slow to game on, and whilst you can still buy a D-SUB-only monitor today, they're lemons to be avoided. It'll save you $5 at most, and they're not even remotely good enough to justify even $50, let alone $75.

          • xeridea
          • 3 years ago

          Still using a Samsung 2048×1152 23″ screen from 2009:

          [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16824001317[/url<]

          I will continue to use it because I like the size and resolution. 2048×1152 seemingly disappeared, but I like it for work and games (a bit higher than 1080p, but not high enough to require a larger screen without a magnifying glass). It's old, but it still works flawlessly, and the picture is fine (I am no artist, but it seems comparable or better than many current monitors). Never had any issues, and it has had pretty much 4-12 hours of daily use since I got it.

          That being said, I do remember a lot of subpar monitors from that time.

            • Magic Hate Ball
            • 3 years ago

            Even my 1920×1200 from 2008 (man, time flies) that’s serving as a secondary monitor to a 144hz 1080p seems out of place.

            I never saw much of anything ever use 2048×1152… That’s a really squirrelly one.

            • UberGerbil
            • 3 years ago

            I have the same panel, in the form of the [url=http://accessories.us.dell.com/sna/productdetail.aspx?%3F~lt=popup&c=us&cs=22&l=en&s=dfh&sku=320-7641&validate=false&~lt=popup&~tab=specstab<]Dell SP2309W[/url<] from around the same time frame.

            It's still a very good monitor, with several features ("proximity" touch controls, an integral webcam) that aren't common on more modern displays. And I'll echo xeridea's comments about the sweet spot for size/resolution. Being slightly higher-res than 1080p means there's space for video controls while running at 1:1; it's also a good resolution to use in portrait mode.

          • auxy
          • 3 years ago

          See my reply to grazapin!

        • auxy
        • 3 years ago

        That’s not true… (´Д⊂ヽsigh.

        DVI and HDMI require a TMDS clock; DP has no such thing. To adapt DP to DVI or HDMI, the GPU either has to be able to send a DVI/HDMI signal over the DP connection (which enables a passive adapter), or you have to use an active adapter, which generates the required clock signal.

        Unless these things have two or three TMDS clocks, I think it will be a problem for some users.

          • Chrispy_
          • 3 years ago

          You use all the fancy acronyms, but I've never had a problem running two DVI screens off two DP outputs using passive adapters.

          I reckon I must have at least 50 machines that match that [i<]exact[/i<] setup using GCN-based Radeons, and hundreds more that run HDMI/DVI/DP interchangeably depending on which desk they end up at (we move the PCs but not the pairs of screens on their VESA mounts).

          Honestly, in the last several years, the only time I've ever had to use an active adapter is to provide an old D-SUB output from an all-digital card. There must be several DP-only graphics cards (the Eyefinity-6 editions) in circulation at work that get by with passive adapters.

            • auxy
            • 3 years ago

            Two, three, or even four displays isn't surprising. Most GPUs from the previous generations have had two or three TMDS clocks. Besides that, two DVI or HDMI connections can share the same clock if they are connected to monitors with the same timings (the same resolution and refresh rate is sometimes enough, especially if they are the same model).

            So nothing you said changes anything I said. (;’∀’)

            • Klimax
            • 3 years ago

            For HDMI compatibility with passive adapters you need DP++ (https://en.wikipedia.org/wiki/DisplayPort#Dual-mode); otherwise you need active adapters. The same goes for dual-link DVI.

            I know it catches a lot of people off-guard. Unless a DP port is marked as DP++, aka Dual-mode, it is strongly inadvisable to assume that it is.

    • TwistedKestrel
    • 3 years ago

    Cooling aside, the design of the reference card itself looks clean. All the RAM is on one side, the card isn't long, and there's a huge area around the GPU die and memory that is swept clear of stuff that would interfere with a heatsink… I see pads for a DVI port, too. The only thing I could see partners really wanting to change the design for would be pads for additional power connectors.

    • OneShotOneKill
    • 3 years ago

    It has been a long time since I found myself stuck to my desk chair exploring every corner of the interwebs looking for tiny morsels of video card news.

    It feels a bit nostalgic, I must admit!

      • TwistedKestrel
      • 3 years ago

      Is this hype management done correctly? I feel like the GTX 1080 buzz was over in a day, and I can’t even remember anything about the card

        • OneShotOneKill
        • 3 years ago

        As long as it is not followed by a massive flop as we have gotten used to in recent years.

        RX 480 >= R9 390x would be the goal here.

          • Demetri
          • 3 years ago

           That's the problem: I'm seeing people in other forums pissed because someone said the 480 won't OC enough to match a stock 1070, saying that it will be a massive failure if it can't. The hype train is getting so out of control that it's setting people up for disappointment. Everything I've seen suggests that we should expect performance between the 390 and 390X.

            • rxc6
            • 3 years ago

            Anybody expecting a 1070 level of performance for the price of the 480 deserves all the disappointment they get and more.

            • Chrispy_
            • 3 years ago

            I’m going to find immediate use for a few hundred 480 cards if they are as cheap, efficient and fast as rumored – and by rumored I mean $200-250, 100-110W, and Hawaii-ish

        • Chrispy_
        • 3 years ago

        1080 buzz was either:

        [quote<]Wooo, amazing card! Oh, how much? Nevermind...[/quote<]

        or:

        [quote<]I'm rich! What do you mean they're all out of stock? Nevermind...[/quote<]

        Nvidia has been slowly pushing up the price of its middle-sized die that, for a decade beforehand, had served the midrange $200 market. GF104 gave us the $200 GTX 460, whilst GF100 was the expensive big chip. With GK104, GM204, and GP104, prices have crept north to the (now silly) $700 for a GTX 1080, and all AMD needs to do to win favour is beat these quite frankly ridiculous prices. I don't see that being too hard right now.

        The only thing that could upset the applecart would be Nvidia pulling a $250 GP106 out of its ass on June 29th. It's very unlikely they've kept the lid on that so well, given that we were hearing about the GTX 1080 several months ago, so a GP106 is likely a good few months away, which will hopefully give AMD a chance to sell some graphics cards.

          • Klimax
          • 3 years ago

          Just don’t forget to adjust for inflation and process costs.

            • Chrispy_
            • 3 years ago

            You mean deflation? The absolute cost of computing has actually decreased over time, starkly defying both inflation and people who forget to account for it.

            • Klimax
            • 3 years ago

            Process costs went severely up.

        • derFunkenstein
        • 3 years ago

        I remember that it’s supposed to be really really fast.

      • bwcbiz
      • 3 years ago

      … and counting the minutes until the embargo lifts and we get some real numbers… But it hasn’t been that long. I was scanning for news about the GTX 1070, too.

    • chuckula
    • 3 years ago

    [quote<]Chinese website PConline somehow got their hands on what appears to be a Sapphire RX 480 8GB card, and it proceeded to do what anyone does in that situation: disassemble it and take pictures for voyeurs to gawk at.[/quote<] Well I would have run benchmarks but apparently PConline doesn't swing that way. [i<]Not that there's anything wrong with that(TM).[/i<]

      • wizardz
      • 3 years ago

      maybe the card has some super-sekrit-bios that phones home when you plug it in to let AMD know who leaked what? hence the heatsink pr0n.

      or maybe i should just go back to lunch..

      edit: typo

      • CuttinHobo
      • 3 years ago

      Seriously. We all know size doesn’t matter, it’s how you use it!

      I sincerely hope the reason they didn’t bench it is because they found the thing in AMD’s dumpster and the card simply didn’t function. :/

      • NeelyCam
      • 3 years ago

      I was going to post the same thing. Why would they disassemble before benchmarking?

        • OneShotOneKill
        • 3 years ago

        Reverse engineering is in their DNA.

          • Shobai
          • 3 years ago

          Whereas posting benchmarks is in their NDA?

    • tipoo
    • 3 years ago

    That really is a tiny heatsink. Given rumors that some partners are clocking it at 1500MHz, maybe the default cooler size is limiting the chip's clocking potential (or maybe AMD just chose the clock based on the peak-efficiency point, but still).

    It's particularly at odds with the other rumor that the chip wouldn't clock as high as AMD wanted, though, in which case a bit more metal could go a fair way… It will be interesting to see which is the case.

      • blahsaysblah
      • 3 years ago

      Hey all this conflicting information is more PR. 🙂 Good for AMD.

        • tipoo
        • 3 years ago

        Ah, you suspect they’re Trumping 😉

          • tipoo
          • 3 years ago

          Found the TR Trump supporter!

          • derFunkenstein
          • 3 years ago

          Your current rating suggests that people are allergic to that word. Lol

            • tipoo
            • 3 years ago

            Maybe so, as my follow up comment did better, lol. Well, better that than what I first thought 😛

      • Srsly_Bro
      • 3 years ago

      Sapphire basically confirmed 1500 MHz on its Twitter page.
