Dual Tahiti GPUs team up on PowerColor Devil 13

Although AMD has yet to answer Nvidia’s GeForce GTX 690 with a comparable dual-GPU Radeon, that hasn’t stopped PowerColor from coming up with one of its own. The graphics card maker has unveiled the Devil 13 HD 7990, which squeezes a pair of Radeon HD 7970 chips onto a single board. Each GPU is clocked at 925MHz, the default speed of the 7970 before it got the GHz Edition treatment. However, there’s a switch in the port array that pushes the GPU clock speeds to an even gigahertz.

The switch doesn’t appear to affect the memory transfer rate, which is 5.5 GT/s, just like the standard 7970. Each GPU is equipped with 3GB of memory, giving the card a total of 6GB onboard. Since the GPUs are running in CrossFire, the effective memory is still 3GB.

PowerColor encourages folks to overclock the card further, and it has equipped the thing with 16 power phases, digital PWMs, and fancy electrical components to help with that mission. The cooler is also a monster; it’s a triple-slot design with 10 heatpipes and the ability to dissipate up to 550W, according to PowerColor.

As is fashionable for ultra-high-end cards, the Devil 13 comes with a couple of accessories. There’s a purportedly premium toolkit with a bunch of different screwdriver blades. PowerColor also includes something called a PowerJack to help bear some of the card’s heft. PowerColor doesn’t say how much the thing weighs, but it does reveal that the card is 12.4" long. The product page also notes that three eight-pin PCIe power connectors are required to keep the Devil 13 juiced.

There’s no word on pricing or availability, but you can bet the dually won’t be cheap. GeForce GTX 690 cards sell for $1000 and up.

Comments closed
    • grantmeaname
    • 7 years ago

    It only costs infinity dollars!

      • BestJinjo
      • 7 years ago

      It’s too late and too power hungry for gamers compared to the GTX690. However, bitcoin miners will be all over this like white on rice.

    • ALiLPinkMonster
    • 7 years ago

    Holy microstuttering, Batman!

    I’ll take three.

      • shank15217
      • 7 years ago

      Microstuttering isn’t as big of a deal as you make it this generation; go read the TR GTX 690 review. Adding a second card in SLI/CrossFire makes a noticeable difference in frame times and frame rates at all response time caps.

    • MadManOriginal
    • 7 years ago

    [quote<]also includes something called a [u<]PowerJack[/u<] [/quote<] Thusly named because that's what it will do to your wallet.

    • Chrispy_
    • 7 years ago

    This, the 690 and all cards of this ilk confuse the crap out of me.

    Why would you buy one of these when it’s better/cheaper/easier to buy two of the base cards this is assembled from.

    The only point at which this becomes necessary is if you’re using an MATX board yet [i<]still[/i<] want quad SLI or Crossfire. At this point, it would be [i<]almost-illegally[/i<] negligent of me not to point and mock you for being unbelievably silly in your choice of motherboard.

      • dpaus
      • 7 years ago

      Ask any girl if she’d rather have one 8 incher or two 4 inchers……

        • stupido
        • 7 years ago

        LOLz! 😀

        • My Johnson
        • 7 years ago

        You just won the Internets today. Congratulations!

        • Byte Storm
        • 7 years ago

        This is the greatest reply I have seen on this site!!!

        • DancinJack
        • 7 years ago

        I guess it is Friday, isn’t it?

        • anotherengineer
        • 7 years ago

        Well with two 4 inchers you could put one in the fridge and save it for later………..

        • Beelzebubba9
        • 7 years ago

        I think mine would demand three 6 inchers…

        What would that be? SLI with a triplet of GeForce 660 Ti?

          • dpaus
          • 7 years ago

          [quote<]three 6 inchers... What would that be?[/quote<] Very difficult to get set up properly, and in the end, rather disappointing. I bet she'd be unhappy with the level of driver support. And she'd have to deal with constant conflicts between them, interrupt clashes, power issues, clearance problems, it would be noisy as hell, and the performance would never quite live up to expectations. And let's not even start on the ever-present 'inside the second' issues. Yep, it's Friday.

        • Dposcorp
        • 7 years ago

        Reminds me of the time this woman said she would not sleep with me unless I had 4 inches to offer her.

        I said, “Beotch, I don’t fold my stuff in half for no woman,” and then I left. 😉

        • highlandr
        • 7 years ago

        An interesting comment about a product that is obviously designed for people who need to overcompensate for something…

          • Chrispy_
          • 7 years ago

          Well, if you have a tiny [i<]motherboard[/i<] I can understand being jealous of the guy who has easy access to four PCI Express slots. <Beavis> Heh, hehe: "slots" heheh >:) </Beavis>

      • derFunkenstein
      • 7 years ago

      Well, if you get two of these and run them in Crossfi–

      nevermind. You’re right. This is kind of retarded. 😆

      • RobotHamster
      • 7 years ago

      Generally, two cards use more power than the equivalent dual-GPU card, so overall you save money. It will just take a while, and the newer cards will likely have been released by then. But it will still save money.

        • Chrispy_
        • 7 years ago

        True to some extent – a 690 uses 50W less under load than two 680s – but I don’t think someone buying a $999 graphics card is worried about saving money.

        The depreciation alone would, without any doubt, completely outweigh any cost savings from using marginally less power anyway 😉
