EVGA offers a sneak peek at Nvidia’s next dual-GPU monster

CES — What’s this? We dropped by EVGA’s suite here at CES today to see what they’re up to, and among the various video cards on display was this rather intriguing number:

EVGA wouldn’t give us many details about the card in question, but they did say it’s a future dual-GPU product. Given that EVGA is an exclusive Nvidia partner, that essentially confirms something we’ve half expected for some time now: a dual-GPU, SLI-on-a-stick video card must be on the way, presumably as a member of the GeForce GTX 500 series.

As you can see, there are three DVI outputs on the card, made possible by the presence of two GPU display engines. This card should be capable of driving three monitors and allowing games to run across them all at once via Surround Gaming, Nvidia’s answer to AMD’s Eyefinity.

This is a decidedly high-end product. The card is relatively long and sports a pair of eight-pin auxiliary power connectors, suggesting that a beefy PSU will be needed just to run one of these babies.
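
For reference, here's the back-of-the-envelope power budget those connectors imply; a minimal sketch using the PCI Express spec's per-connector ceilings (75W from the slot, 150W per eight-pin plug), not any measured TDP:

```python
# PCIe power budget implied by the connector layout.
# These are spec ceilings, not a measured TDP for the card.
SLOT_W = 75          # PCIe x16 slot, per the PCIe spec
EIGHT_PIN_W = 150    # per 8-pin auxiliary connector, per the PCIe spec

budget_w = SLOT_W + 2 * EIGHT_PIN_W
print(f"Maximum in-spec board power: {budget_w} W")  # 375 W
```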

Judging by our mystery card’s apparent power and cooling requirements, we’d expect it to house a pair of GF110 GPUs, perhaps de-tuned a bit from the fastest single-GPU GF110 implementation, the GeForce GTX 580, simply so that two chips can be powered and cooled on a single card. The presence of eight Samsung K4G10325F one-gigabit GDDR5 DRAM chips on the back of the board, four per GPU, means there’s at least 1GB of total memory on the card. One possibility is that there are eight matching DRAM chips on the other side of the board, for 1GB per GPU. If true, that would mean only four of each GF110’s six memory controller/ROP partition units are enabled. It’s also possible there are 12 DRAM chips on the other side of the card, with five memory/ROP units enabled per GPU, making this card more like a pair of GTX 570s. The latter configuration may be more likely.
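
For the curious, here's the arithmetic behind those two scenarios; a minimal sketch assuming GF110's standard layout of six 64-bit memory/ROP partitions per GPU, each fed by two 32-bit, one-gigabit GDDR5 chips (the per-card chip counts are, again, our speculation):

```python
# Speculative memory configurations for a dual-GF110 card, assuming the
# standard GF110 layout: six 64-bit memory/ROP partitions per GPU, with
# two 32-bit GDDR5 chips per partition.
MB_PER_CHIP = 1024 // 8  # one-gigabit chip (e.g., Samsung K4G10325F) = 128 MB

def gpu_config(chips_per_gpu: int):
    partitions = chips_per_gpu // 2  # two chips per 64-bit partition
    return partitions, partitions * 64, chips_per_gpu * MB_PER_CHIP

for chips in (8, 10):  # 16 or 20 chips total, split evenly across the two GPUs
    parts, bus_bits, mb = gpu_config(chips)
    print(f"{chips} chips/GPU: {parts}/6 partitions, {bus_bits}-bit bus, {mb} MB")
# 8 chips/GPU: 4/6 partitions, 256-bit bus, 1024 MB
# 10 chips/GPU: 5/6 partitions, 320-bit bus, 1280 MB (GTX 570-like)
```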

The presence of an SLI connector also suggests the possibility of a four-way SLI configuration based on a pair of these video cards running in tandem.

That nifty triple-fan cooler is a custom job from EVGA. The company was showing off a couple of cards with coolers designed around this motif, including a smaller and rather handsome GeForce GTX 570.

We already know AMD has plans for a video card—code-named Antilles and expected to be sold as the Radeon HD 6990—based on a pair of the Cayman GPUs that power the Radeon HD 6950 and 6970. Now it would seem Nvidia has a rather direct answer to Antilles in the works, as does its close partner EVGA. We don’t have any word yet on the timing of this monster’s availability, but judging by its presence here with a custom cooler attached, it surely can’t be too far from release.

Comments closed
    • beck2448
    • 9 years ago

    This card is sick!

    • franzius
    • 9 years ago

    “Now it would seem Nvidia has a rather direct answer to Antilles in the works, as does its close partner EVGA”
    I wonder what an indirect response would look like? Maybe a wooden mockup with screws?

      • dpaus
      • 9 years ago

      Nah, it would be one of their internal cartoons (like they did to Intel)

        • Meadows
        • 9 years ago

        Tell me about it, those were bad.

    • d0g_p00p
    • 9 years ago

    It’s too expensive (should be under $99); who needs a card like this? Only if you want e-peen bragging rights. Too much idle power; my 7800GT plays every game maxed out with all details at 2Kx2K; only a fool with too much money and not enough sense would buy this; it does not support Linux = fail; another “gimmick” (Krogoth); etc. etc. etc.

    edit: can it run Crysis?

    I remember when people used to post about how hyped they were to see things like this on TR. Now, whenever a high-end product gets a review, or a blurb about top-of-the-line hardware is posted, all I read in the comments are the same standard complaints.

    Same with high-end video cards, motherboards, power supplies over 500W, CPUs: you name it, and the TR commentators will quickly post to tell you what a waste of time and money that product is.

      • macca007
      • 9 years ago

      Cards like this are indeed needed, not for e-peen status, but just to play the latest games at full HD with max settings.
      No way in hell does a 7800 card run every game maxed out with all details. I have dual GTX 480 cards running in SLI, an overclocked i7-950, and a Vertex 2 SSD, and I’m lucky to hit 25 fps in some DirectX 11 games. Metro 2033 at 1920×1200 with all settings on max will bring most enthusiasts’ PCs to a crawl!
      Two things I would say are getting lame are the CPU and HDD departments. All the fuss over multiple cores, yet most software is still stuck on a single core, or dual if we’re lucky. There’s no need for 4 or 6 cores in games at present; I just want more GHz. I have to overclock my CPU just so it can keep up and feed my dual graphics cards some numbers to crunch; otherwise it can bottleneck the system.
      HDDs, same applies: we want speed; capacity isn’t really needed. There’s only so much porn one can store and look at, hahahaha.
      Also, the higher the capacity, the lazier people get. They just store more stuff on there, and then when the drive fails they lose everything, instead of investing in 2 or 3 smaller, cheaper drives as backup!
      It’s a vicious cycle in the tech world, and it can be an expensive hobby if you want to be above the general masses. 😉

      • kamikaziechameleon
      • 9 years ago

      Tech is a waste.

    • destroy.all.monsters
    • 9 years ago

    I like the design. It has a cool, muted look, and it’s in black. It doesn’t have LEDs all over it, and it isn’t some irritating color, or worse, a conglomeration of ugly or irritating colors. There are no garish stickers with anime-inspired girls on it.

    I suspect the “prison bars” around the fans are partly to differentiate the card from Arctic Cooling’s designs and the like, as well as from the vendors that sell those things pre-attached. “Sports car-ish” might be a good term for it. Not that that’d pass Strunk and White et al.

    • r00t61
    • 9 years ago

    I remember when my Radeon 9700 Pro had a heatsink the size of a post-it note and a tiny little 40 mm fan.

    At the rate video cards are growing in size, The Powers That Be will need to re-jigger the ATX spec. I wonder when we’ll start seeing mainstream video cards in a triple-slot form factor…

    Is this double Fermi card bigger than the old Voodoo5 6000? Now there was a card that made me stand up and say, “Hello!”

      • travbrad
      • 9 years ago

      You could go back further to my first card, which had no heatsinks or fan (and, judging from that picture, had BLOOM inside the case). 🙂
      [url<]http://www.ixbt.com/video2/images/retro2003-2/riva128.jpg[/url<]

      Not sure about it beating the Voodoo5 6000; I think the 3 fans just make it seem bigger. [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814127532[/url<] looks longer to me (if you compare shots where the PCI-E connection is visible). Of course, the difference is that the Voodoo5 6000 wasn't released (except in very limited quantities), while these cards are. That MSI card is only 1-2" short of the 6000 (which was around 14"). It was massive for its time, but cards nowadays aren't too far off.

    • JokerCPoC
    • 9 years ago

    Even with water cooling, it’ll never get down to a single slot, so the single-PCB GTX 295 is still a safer bet (all because this card has a third DVI output), albeit older. Make it two mini HDMI ports and one DVI, put all three in a line, and then it might be worth water cooling.

    • danny e.
    • 9 years ago

    Engineer: “Here’s the sample!”

    Marketing: “Uh, it’s wide open! Someone could put their fingers right in the fan. Put some bars across the front to prevent that.”

    Engineer: “That will restrict airflow too much and this thing will catch fire.”

    Marketing: “Well, it doesn’t matter if the gaps are big… we just want the bars there so it ‘appears’ safer.”

    Engineer: “But then they could still put their fingers in the fan.”

    Marketing: “Yeah, but it’s a bit less likely, and anyway it looks safer. Appearances are what matters.”

    </airport security farce>

      • sigher
      • 9 years ago

      I don’t think it’s there to protect fingers; it’s more to protect against things stopping or breaking the fans, and to keep some clearance and keep cables out a bit. It’s in a metal computer case, and you don’t touch it unless the fans aren’t spinning because the power is off. And those fans aren’t airplane propellers; they’re plastic fans.

    • donkeycrock
    • 9 years ago

    Wouldn’t this card have better cooling if it didn’t have the bezel around the fans?

    The first thing I would do is take that thing off.

    • Krogoth
    • 9 years ago

    I don’t see the point.

    The GTX 580 is already fast enough to handle 4-megapixel gaming, minus AA/AF, on current titles. It effortlessly handles 2-megapixel gaming with healthy doses of AA/AF.

      • Meadows
      • 9 years ago

      Exactly, you made the point yourself. [i<]Minus AA/AF.[/i<]

      • sigher
      • 9 years ago

      Tried Crysis Warhead? That does 33 fps fully maxed with 4xAA/16xAF at plain HD resolution, and in intensive areas probably much lower.
      And playing without any AF is of course not something you do; that’s just silly. You can go as low as 4x, but not lower.
      And missing out on any AA is pretty noticeable and annoying, really.

      But it’s true that there comes a point where you should just wait for the next gen instead of spending a ton of money on 1500W PSUs and $1200 graphics cards.
      And most games have a medium setting that’s fine; often the super-high settings are more of a show-off feature, not really part of the designed game, just spruced up to test hardware.

    • kamikaziechameleon
    • 9 years ago

    WOW, I had the 9800 GX2, and it had so many heat issues that I’m very wary of acquiring another dual-GPU solution.

    • can-a-tuna
    • 9 years ago

    Looks like a prototype or halo product: something that can’t be manufactured under a 300W TDP. I’d like to see Nvidia’s roadmap for this.

    • flip-mode
    • 9 years ago

    It’s a good thing the backplate is vented.

    • LiamC
    • 9 years ago

    2 x 8-pin? Outside the PCI-e spec then. Warranty issues will be intriguing…

    Or is it just one of those halo “engineering sample” products that never see the light of day?

    Smells like a marketing exercise to me.

    • ub3r
    • 9 years ago

    This card will fail after 2 weeks of dust buildup.

    • Meadows
    • 9 years ago

    The first thing on my mind was that this card is a crime, what with its being behind bars.

    Also quite curious why it has just 3 [i<]DVI[/i<] ports, of all things.

      • Deanjo
      • 9 years ago

      It has a mini HDMI port as well.

      • flip-mode
      • 9 years ago

      T’was worth a chuckle.

    • willyolio
    • 9 years ago

    Only three outputs with two GPUs? AMD’s been managing six out of a single GPU for a while. I mean, if anyone’s got a grand to burn on a ridiculous, over-the-top video card, three monitors is far too ordinary a setup to justify these levels of insanity.

      • Silus
      • 9 years ago

      Too ordinary? So you think it’s normal to see anyone with more than 3 monitors, especially to play games?

      It’s not even common to see people with 3 monitors… much less with more than 3…

    • ClickClick5
    • 9 years ago

    So… what? A 600+ watt draw?

    This is almost a double-edged sword: an EXPENSIVE card AND a high power bill for the months you use it! :p

      • FuturePastNow
      • 9 years ago

      It’s 150W per 8-pin connector plus 75W from the PCIe slot, for a total of 375W.

      • sigher
      • 9 years ago

      In many European places, power is up to 5 times (no joke) more expensive than in the US, so count your blessings.

      • continuum
      • 9 years ago

      I’m going to buy one to cook eggs on. Or maybe steak. 😉

    • nunifigasebefamilia
    • 9 years ago

    Meh, at some point I stopped being excited about something twice as powerful but at the same time twice as big, twice as expensive, and twice as power-hungry. I’m much more excited about something several times smaller and several times more power-efficient that doesn’t give up too much performance. Weird; I didn’t expect to come to such a radical shift in my view of what is cool.
    Why did I decide to share this with all of you? Don’t know 🙂 Maybe it’s just time to go to bed…

      • derFunkenstein
      • 9 years ago

      If you think it will only cost twice what one of those would cost, you’re crazy.

        • nunifigasebefamilia
        • 9 years ago

        You took my words too literally.

      • Chrispy_
      • 9 years ago

      Well, until developers stop writing cross-platform titles that must run on anemic consoles, we won’t have many games that stress these monster graphics cards.

      As a result, saving money and making things quiet seem to be the temporary new hotness, especially when less than 200 bucks buys you a decent gaming experience at 1080p resolution.

    • ThorAxe
    • 9 years ago

    Looks like Antilles will have a short-lived reign (if it even gets released first) as the fastest single video card.

      • Goty
      • 9 years ago

      That honestly depends on whether this card really uses two GF110 chips or something more along the lines of fully enabled GF104s.

        • ThorAxe
        • 9 years ago

        I don’t see the point of Nvidia bothering if this card turns out slower than Antilles.

        • JustAnEngineer
        • 9 years ago

        A pair of defect-free GF104 GPUs would make a very powerful card, and it would push the limits of the PCIe power specification. A pair of GF100b/110 GPUs would use an amazing amount of power for a single card.

        • Silus
        • 9 years ago

        Only on that? I would say it depends on what Antilles is as well. Most expect it to be two fully enabled Cayman chips, underclocked. But Cayman is more power-hungry than Cypress, and Hemlock, with two fully enabled Cypress chips, is only able to meet the PCI-e spec (294W) with much lower clock speeds. If Antilles does indeed use two fully enabled Cayman chips, it won’t be clocked as high as even the HD 6950; otherwise it too will be over the PCI-e spec, and thus pretty much the same as this dual-GPU eVGA model.

          • Krogoth
          • 9 years ago

          Just a nitpick: Cayman uses slightly less power than its Cypress predecessors.

          Even so, I doubt AMD can sandwich anything more than 2×6970 chips together.

            • flip-mode
            • 9 years ago

            Also, the Cayman cards already have 2GB of memory. It seems unlikely that Antilles would need more than that, and if my assumption there is correct, that means a less substantial power hike going from Cayman to Antilles, since you’re not talking about doubling up the RAM.

            • Silus
            • 9 years ago

            What?

            [url<]https://techreport.com/articles.x/20126/15[/url<]
            Larger chip on the same process. There are no miracles…

            • Chrispy_
            • 9 years ago

            Methinks Krogoth is looking at idle power by mistake.

            • Silus
            • 9 years ago

            Probably, but even then he should’ve said “Cayman consumes slightly more power than Cypress / Cayman matches the power draw of Cypress, at idle”

          • Goty
          • 9 years ago

          That’s definitely going to be a factor as well, but I think we can all agree that AMD has been able to squeeze more performance out of a given power envelope recently. Cayman has slightly more room to work in than GF110 when it comes to having to downclock or cripple chips in order to squeeze under that 300W ceiling, and we also have to take into account the fact that Cayman is roughly on par with, or slightly faster than, GF110 once you get above 1920×1200, which is the sort of regime these cards are meant to work in.

    • quasi_accurate
    • 9 years ago

    You can club someone to death with that thing…

    • l33t-g4m3r
    • 9 years ago

    This card is laughably ugly.

      • Silus
      • 9 years ago

      Yeah, games will surely not want to be played on it, because that thing inside a case is extremely ugly…

        • travbrad
        • 9 years ago

        Apart from that, I’ve seen much uglier cards. At least it doesn’t have lame stickers of generic RPG characters or anime. Or… FLAMES (well, not until you power it up :D)
