Dual-Tahiti Radeon HD 7970 X2 pictured

Nvidia’s GeForce GTX 690 is the first dual-GPU card of the new generation. It won’t be the last, though. AMD is reportedly working on its own dually, this time with a pair of the very same Tahiti graphics chips that power the Radeon HD 7970. According to Donanim Haber, the appropriately named Radeon HD 7970 X2 will debut at Computex next week.

The Turkish site nabbed some pictures of PowerColor’s spin on the X2. The card looks like a beast, with three fans perched atop its triple-slot cooler. There’s a trio of eight-pin PCIe power connectors, too, purportedly allowing the card to draw as much as 525W.

A big red button buried among the display outputs activates a turbo mode with higher clock speeds, but there’s no word on what those are or how fast the card runs in its default config. The 7970’s stock core clock is 925MHz, although a number of manufacturers are already selling 1GHz flavors. All of the 7970 derivatives we’ve seen come with 3GB of memory, making it likely the X2 will boast 6GB of RAM. As with other multi-GPU configs, though, only half of that memory would be accessible to each graphics chip.

PowerColor’s take on the Radeon HD 7970 X2 appears to be considerably different from Nvidia’s design for the GeForce GTX 690. The GeForce occupies only two slots, has just two eight-pin power connectors, and relies on a single cooling fan (which does an excellent job, we might add). It will be interesting to see whether any of the X2 cards undoubtedly coming from other AMD partners can get away with fewer power connectors and smaller coolers.

Comments closed
    • Airmantharp
    • 8 years ago

    I really don’t see anything wrong with this card; in the end, it will have to compete on price/performance like everything else.

    I’m also surprised to see it before AMD’s own HD 7990, as the HD 6990 was a fairly decent product. It does make sense, though, given how badly AMD is losing the performance-per-mm² and performance-per-watt comparisons, so I have to wonder whether AMD is waiting for a re-spin or collecting better bins in order to launch a true 7000-series dual-card solution, which would probably be better than this.

    You really can’t blame PowerColor for trying though. If people will buy it, there’s money to be made, right?

    • cynan
    • 8 years ago

    To all exclaiming OMG! 525W!

    1) This is not the manufacturer-rated TDP. It is the amount of power available to the card per the PCIe spec. Power consumption at idle and load for the GTX 680 vs. the HD 7970 is not far off ([url=http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-19.html]average power use across 6 games[/url]). There is no reason to think that a “stock”-clocked dual-7970 card will draw substantially more power than the GTX 690, though it will be somewhat higher.

    2) This is almost certainly PowerColor’s design and not AMD’s. There is nothing to indicate that AMD won’t release an HD 7990 with a lower power spec.

    3) By using 3×8-pin on the 7970 X2, PowerColor has allowed a similar increase in power supply for the dual-chip card as the GTX 690 got over the GTX 680 (a 67% increase for the GTX 690 over the GTX 680, and a 75% increase for the 7970 X2 over an HD 7970). While the manufacturer-rated TDPs of the GTX 680 and HD 7970 are close (195W and 210W, respectively), the GTX 680 has 2×6-pin (225W) while the HD 7970 has 1×6-pin and 1×8-pin (300W), leaving the AMD card with more power headroom (going by the published TDP specs). PowerColor is simply following suit here.

    4) This is obviously an enthusiast-oriented card (look at the size of it!). If I were interested in getting one of these behemoths, I’d want one that could overclock at least almost as well as the single-chip cards, not one limited just because the manufacturer decided to cut corners on power circuitry, crippling clocks. I’m sure most enthusiasts (other than the few who buy these dual-chip cards to shoehorn the most graphics power they can into a micro-ATX or mini-ITX system) would gladly trade the extra PCIe plug for a “fully functional” dual card.

    5) Crippling the power supply (and achievable clocks) would likely render such a card unable to match or best a GTX 690. Why would PowerColor throw away the marketing allure of potentially being able to claim the fastest card on the planet?

    • tootercomputer
    • 8 years ago

    My God, that puppy looks like it could heat a small house. They should include a pack of extra-large condoms.

    • puppetworx
    • 8 years ago

    but can it play crysis???!?
    lel 😀 😀 😀

    6GB of RAM? My PC has 4GB. I guess this is what progress looks like. Are there any GOOD games that can fully take advantage of such a behemoth? Maybe some custom texture packs, I guess. Given the speed of PC hardware development over the last few years, I can’t help but feel that PC game development is being left a long way behind.

      • BobbinThreadbare
      • 8 years ago

      It’s a dual card, so the memory is mirrored. Only 3 gigs of usable memory.
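      For the curious, that’s because CrossFire (and SLI) alternate-frame rendering keeps a full copy of every texture and buffer in each GPU’s local memory, so the usable pool is one GPU’s share rather than the sum. A toy sketch of the arithmetic in Python (the function below is made up purely for illustration):

      [code]
      def usable_vram(total_mb, num_gpus):
          # Under alternate-frame rendering, each GPU mirrors the full
          # working set, so the usable pool is one GPU's share.
          return total_mb // num_gpus

      print(usable_vram(6144, 2))  # 7970 X2: 3072 MB usable, despite "6GB"
      [/code]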

    • derFunkenstein
    • 8 years ago

    I like the self-destruct button, myself. “ZOMG my gaming is classified! i must make her blow before the FBI raids my house!”

    • Meadows
    • 8 years ago

    <Neppa> What, 525?! There’s no way that could be right! </Neppa>

    This is disgraceful.

    • Silus
    • 8 years ago

    525 watts!!!!

    If this was NVIDIA’s card, this thread would have 300 posts with all the usual jokes of nuclear power plants needed to power the thing…

    I’m assuming that this thing’s clock frequencies are above the stock HD 7970’s, or at least the same for both chips. No way this type of wattage is conceivable with anything lower than 925MHz, unless there’s some problem with the Tahiti chip. NVIDIA made the GTX 590 with two larger chips with everything enabled, but with lower frequencies, and its power rating was much lower…
    However, if this is remotely true, then it explains why AMD can’t get the phantom 7990 card out the door… a card that has been rumored since even before we ever heard about the GTX 690, yet the GTX 690 is already out and the 7990 is nowhere to be found…

      • Deanjo
      • 8 years ago

      [quote]If this was NVIDIA’s card, this thread would have 300 posts with all the usual jokes of nuclear power plants needed to power the thing…[/quote] and a few “frying eggs and bacon” videos.

      • cynan
      • 8 years ago

      We don’t know it will actually use all 525W at stock clocks. That is just the max rated draw per the PCIe spec.

      PCIe slot = 75W (60W @ 12V, 15W @ 3.3V)
      6-pin PCIe plug = 75W
      8-pin PCIe plug = 150W

      So with the PCIe slot and 3×8-pin plugs, you get a max rated power consumption of 3×150W + 75W = 525W.
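      That arithmetic is simple enough to sketch in code. A minimal Python example (the helper is hypothetical, just illustrating the spec limits listed above):

      [code]
      PCIE_SLOT_W = 75    # 60W @ 12V + 15W @ 3.3V from the slot itself
      SIX_PIN_W   = 75    # per 6-pin auxiliary plug
      EIGHT_PIN_W = 150   # per 8-pin auxiliary plug

      def max_rated_power(six_pin=0, eight_pin=0):
          # Max rated board power per the PCIe spec for a given connector loadout.
          return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

      print(max_rated_power(eight_pin=3))             # 7970 X2: 525
      print(max_rated_power(six_pin=1, eight_pin=1))  # HD 7970: 300
      print(max_rated_power(eight_pin=2))             # GTX 690: 375
      [/code]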

      Extra power handling is great if you like to overclock, as power requirements really start to ramp up at higher clocks. Perhaps PowerColor is just going for a more enthusiast-oriented design (as with the MSI Lightning HD 7970, which has 2×8-pin plugs – OMG! 375W!!). Personally, I don’t see why they even bother to put 6-pin plugs on high-end, enthusiast-oriented cards (other than to save costs on other power circuitry components).

      • Goty
      • 8 years ago

      [quote]If this was NVIDIA’s card...[/quote] I guess it’s a good thing it’s not coming from AMD, then, eh?

      • rrr
      • 8 years ago

      To be fair, the 525W figure comes from the limits of the PCIe slot and connectors (75W for the slot, 3×150W for the 3×8-pin connectors). We can’t be certain it’s the actual power consumption, and it probably isn’t, since makers usually add extra connectors when a card is close to the limit (see the 5850/6870: both are very close to 150W and theoretically could get by with 1×6-pin, but a second was added for headroom).

        • Silus
        • 8 years ago

        I know that! But that fact is always forgotten for NVIDIA cards. NVIDIA cards with high TDP ratings are always slated for consuming that much power, when they don’t.

          • Krogoth
          • 8 years ago

          There were a number of GPUs that were bashed because of their high power consumption when fully loaded.

          2900XT, 480, FX 5800, HD 4890, Voodoo 5 5500 etc.

          The 480 is the most recent example of that lot. The main problem with the 480 was that it was delayed for over six months, Nvidia overhyped the crap out of it, and when it finally came out it was underwhelming, to say the least. The 480 mirrored the 2900XT in so many ways.

            • rrr
            • 8 years ago

            The 2900XT also failed to compete with the 8800 GTX, barely fighting the 8800 GTS 320/640.

            The 480 at least outperformed the 5870 pretty consistently, even if not by an astounding margin – certainly not large enough to compensate for the power/temp/noise woes for many.

            The HD 4890 was bashed for power consumption? Never heard of it. Load power consumption at stock was about on par with the competing GTX 275. The problem was high IDLE power consumption – and all GDDR5-based HD 4xxx cards had this problem, not just the 4890. And even then I have not heard that much wailing over it, even though it was clearly not a desirable situation.

      • Krogoth
      • 8 years ago

      Citation Needed*

        • Meadows
        • 8 years ago

        Look who’s talking.

      • pogsnet
      • 8 years ago
        • Silus
        • 8 years ago

        Too bad that that fact is always forgotten when talking about NVIDIA cards that have high TDP ratings…

      • Fighterpilot
      • 8 years ago

      Yes, people should rush out and buy the GTX 690, after all it’s affordable… cough… only $1049 at Newegg… oh wait… that’s right… there aren’t any anyway.
      More Nvidia vaporware.
      [url]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709+600315498&IsNodeId=1&name=GeForce+GTX+600+series&Page=2[/url]

    • danny e.
    • 8 years ago

    shoulda designed it with just one fan.

      • HighTech4US2
      • 8 years ago

      525 Watts

        • Bensam123
        • 8 years ago

        moar whats = moar fanz!

        • Krogoth
        • 8 years ago

        Correction: the power circuitry on this “7970 X2” can theoretically pull up to 525W. I doubt the actual card draws that much at stock. It is closer to what two 7970 graphics cards would draw.

        I suspect PowerColor threw in another PCIe power connector to balance the power draw and give extreme overclockers headroom.

      • chµck
      • 8 years ago

      A 140mm fan ought to do it!

      • Arclight
      • 8 years ago

      I’m actually wondering how it would look with a fanless cooler. How big would that heatsink have to be? 🙂

    • HighTech4US2
    • 8 years ago

    Some clarifications:

    AMD is NOT working on its own dually as stated; if it were, it would be called the 7990.

    AMD’s partners decided that since AMD is dragging its feet (in not producing the 7990), they would implement their own dually (the 7970 X2).

    Now, the X2 card itself (triple-slot, 525 watts) kind of shows why AMD isn’t making a 7990.

      • cynan
      • 8 years ago

      [i]Now, the X2 card itself (triple-slot, 525 watts) kind of shows why AMD isn’t making a 7990[/i] Not necessarily. AMD could be working on a more modestly clocked, lower-power HD 7990. This just shows that the PowerColor prototype has a more balls-to-the-wall enthusiast implementation than an AMD reference version would probably have.

        • Waco
        • 8 years ago

        They’ve managed it for 3 generations already…they’ll make a dual 7970 even if it’s got “low” clocks.

          • HighTech4US2
          • 8 years ago

          And since it would lose to the GTX 690, who would buy it?

          And if you think AMD would price it lower than the GTX 690, then that would make either SLI or CF a better choice.

          I really don’t see a market for a low-clocked 7990, and neither does AMD.

      • Goty
      • 8 years ago

      [quote]Now, the X2 card itself (triple-slot, 525 watts) kind of shows why AMD isn’t making a 7990.[/quote] Yeah, because AMD can’t just produce an underclocked Tahiti to use in a 7990. Also, there are rumblings that the 7990 will be announced during the AFDS (unlikely, I think) or shortly thereafter (more likely). Careful, HighTech, your fanboy is showing.

        • HighTech4US2
        • 8 years ago

        [quote]Yeah, because AMD can’t just produce an underclocked Tahiti to use in a 7990[/quote] So you believe that AMD will willingly make a 7990 that will absolutely LOSE to the GTX 690. And as for fanboyism, you do seem to be full of it.

          • Goty
          • 8 years ago

          The 7990 might lose to the GTX 690, but I couldn’t say without seeing benchmarks: I’m not clairvoyant.

          Also, how am I a fanboy for pointing out the fact that you’re wrong?

    • mcnabney
    • 8 years ago

    The makers of these uber-cards really need to rethink their cooling.

    For the price and performance these are pushing, I think they should shelve the two- and three-slot coolers and just ship with a liquid cooling solution. Anyone paying this kind of money isn’t going to mind paying a tiny bit more, and the more efficient cooling should allow better stock clocks anyway.

    I mean seriously, this thing can make some heat considering the 500W+ power intake. I just don’t see how fans can still be the best solution. It doesn’t even look like much air is getting pumped out the back, even though it has three slots to do it in.

      • willmore
      • 8 years ago

      And PowerColor is known for their water-cooled versions of cards, too. So, it’s not like they don’t have the expertise to do so. And for whoever thumbed you down, I don’t know if you think it’s because water cooling is difficult or expensive or what, but it’s not. It can be a lot less expensive than high-end air coolers, and it works a lot better. The only water cooling I’ve seen that doesn’t work better than air are some of those all-in-one bargain coolers.

        • Deanjo
        • 8 years ago

        [quote]It can be a lot less expensive than high-end air coolers[/quote] Gotta call BS on that. The waterblock for a GPU alone costs more than pretty much every air cooler out there, and that doesn’t even include the rad, pump, and fans.

          • willmore
          • 8 years ago

          I walked out of Fry’s with a complete liquid cooling system made by Danger Den. 240×120 radiator, fans, pump, water block, tubing, etc. for $80. I’ve seen high end air coolers go for that much.

            • Deanjo
            • 8 years ago

            I’ve also picked up high-end air for less than $20 when a promo is on. An equivalent water block for a dual GPU is still going to set you back around $100. DD’s water block for the GTX 590 is $110 on sale. Also, on a card like this, you are definitely going to want a larger rad than a 240×120 to do a full loop system.

            Just look at the cards that do offer water cooling out of the box. None of them are remotely close to what the fan-based cards are in terms of price. In fact, it is usually cheaper to get a fan-based card and add your own water block than to buy one preassembled.

            • willmore
            • 8 years ago

            Yes, it usually is. Why is that so? Is there much more expense in making a water block than an HSF? Probably not. Both require custom machining. One requires more internal work, while the other requires either brazed or extruded fins. One has a fan, and the other has to deal with being watertight. The only real increased costs are twofold. One is the plumbing, pump, and radiator. The other is volume. They just don’t run enough volume on the water-cooled units to bring the NRE down.

            But when you’re talking about what is likely to be a $1000 card, WTF not go all out? Come on, you’re going to throw down $1K for a GPU but you’re going to balk at $100 for a radiator and pump?

            I’m not saying we should expect a flood (sorry) of water-cooled GPUs, but I am going to say that at the $1K bracket, air cooling starts to look silly.

            • Draphius
            • 8 years ago

            Unfortunately, those kits aren’t the best quality and usually don’t last long, sometimes taking more expensive components with them in the process. If you can spend a grand on these, you can afford a proper water loop. Still, I do agree they should be coming with some sort of sealed unit from the start. It would probably cost less than the copper and aluminum in the air-cooled version, and it would be a one- or two-slot card depending on connections and layout.

        • superjawes
        • 8 years ago

        The thumbs-down might have come because it would be introducing liquid into a computer… I know it’s fine if you’re careful and take care of it, but it’s still water in the system, and if that fails, it could take most of your system with it.

        At least that’s one of the frequent arguments I’ve heard against liquid cooling. I never buy anything that needs to draw 525 Watts, so cooling isn’t an issue for me.

          • Kurotetsu
          • 8 years ago

          It’d be nice if someone like Corsair did for GPUs what they accomplished with CPU liquid coolers: a totally self-contained liquid cooling system that just works right out of the box. You still get the water hazard, but some of the risk is taken off your shoulders since the loop, the rad, and everything else is pre-assembled.

            • Deanjo
            • 8 years ago

            PNY had a GTX 580 with self-contained water cooling. CoolIT’s OMNI tried to accomplish this for the GTX 590 as well.

            • Washer
            • 8 years ago

            Corsair rebrands those kits, nothing more.

      • UberGerbil
      • 8 years ago

      Well, if you’re already going triple-slot width anyway, a couple of squirrel-cage style coolers (inlet and exhaust) might start to be a more efficient design. But the traditional fans are much easier (and cheaper) to source, so….

      • d0g_p00p
      • 8 years ago

      I agree. It would be really nice to get a video card with a sealed water cooler attached, just like the ones you can get from Corsair and others for CPUs. I would gladly pay extra for that option.

      • Bensam123
      • 8 years ago

      Water cooling, or just a better-designed cooler. I’ve mentioned something similar in other video card posts before.

    • vascaino
    • 8 years ago

    Holy mother of Jesus… That thing is gigantic.

      • dashbarron
      • 8 years ago

      Amen.

      • Alchemist07
      • 8 years ago

      That’s what she said.

      • gamoniac
      • 8 years ago

      It’s so comically huge it brought a chuckle.
