EVGA doubles down on GTX 560 Ti

EVGA has taken the wraps off its GeForce GTX 560 Ti 2Win, a new graphics card with dual GF114 GPUs running in SLI. The GPUs are clocked at 850MHz, slightly higher than the GTX 560 Ti’s base frequency of 822MHz. EVGA hasn’t deviated from the 560 Ti’s 4 GT/s memory speed or the amount of RAM typically associated with the GPU. Each graphics chip has access to 1GB of its own memory.

To fit everything onto the card, EVGA uses a lengthy 11.5" circuit board that might be difficult to cram into smaller enclosures. Getting the 2Win to play nicely with adjacent expansion cards may also prove difficult: although the card’s expansion plate covers only two slots, the card’s side profile suggests the cooler’s shroud could encroach on a third.

The outputs include a Mini HDMI connector and three DVI outputs. That DVI trio gives the card out-of-the-box support for Nvidia’s 3D Vision Surround tech, which spreads glasses-assisted 3D over three displays.

EVGA says you’ll need a 700W PSU with at least 50A of 12V power to run the 2Win. You’ll also need to scrounge up $520 to buy the card at Newegg, which is a little bit more than the cost of two comparably clocked 560 Tis. Dropping over $500 on a graphics setup that effectively has only 1GB of memory seems a little perilous to me. Let’s hope EVGA is cooking up one of these puppies with 2GB of RAM per GPU. After all, a pair of GeForce GTX 560 Ti 2GB cards will only set you back $540.
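
If you’re wondering where that 50A figure leaves you, a quick back-of-the-envelope check helps. The sketch below assumes roughly 170W per GF114, a number borrowed from the single-card 560 Ti’s TDP rather than any official 2Win spec:

```python
# Back-of-the-envelope check of EVGA's PSU recommendation.
# The 170 W per-GPU figure is an assumption borrowed from the
# single-card GTX 560 Ti's TDP, not an official 2Win spec.
RAIL_VOLTAGE = 12.0      # volts
MIN_RAIL_CURRENT = 50.0  # amps, EVGA's stated minimum

rail_watts = RAIL_VOLTAGE * MIN_RAIL_CURRENT
print(f"Required 12V capacity: {rail_watts:.0f} W")  # 600 W of the 700 W total

assumed_gpu_tdp = 170                 # watts per GF114 (assumed)
card_estimate = 2 * assumed_gpu_tdp   # ~340 W for the card
print(f"Rough card draw: {card_estimate} W")
print(f"12V headroom left over: {rail_watts - card_estimate:.0f} W")  # CPU, drives, fans
```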

Comments closed
    • Aveon
    • 8 years ago

    This card seems worth every penny.
    I wish TR could do a review if they get their hands on a sample card.

    Until then, here’s a review:

    http://www.guru3d.com/article/evga-geforce-gtx-560-ti-2win-review/

    It looks like a promising, more affordable solution that’s faster than the GTX 580. BTW, you cannot SLI ’em :(

    • squeeb
    • 8 years ago

    Board length reminds me of my old Voodoo 5, hehe.

    • Aveon
    • 8 years ago

    Good one, EVGA: you pulled a ZOTAC stunt.

    https://techreport.com/discussions.x/21724

    • gmskking
    • 8 years ago

    What exactly is the point of releasing this card?

      • crabjokeman
      • 8 years ago

      Getting rid of GF114 chips twice as fast…
      I’m thinking a lot of folks bought GTX 460 1GBs or the Radeon equivalent, and there’s not a lot of demand right now. I could be wrong, though…

    • ModernPrimitive
    • 8 years ago

    For that cash I’d rather have a 580 (a poorer performer in most cases) or put up with microstuttering on two 560 Tis in SLI. Not to mention the 500 series is soon to be replaced…

      • kuraegomon
      • 8 years ago

      Where “$soon” >= “6 months”? I suspect that most of the target audience for this card flips cards yearly. The real problem is the VRAM. I bought a 3 GB 580; if this card had been available at the time I purchased (< a month ago), I’d have seriously considered buying it, provided it came with 2 GB per GPU.

      With 1 GB of VRAM per GPU, this is a completely pointless card. If you’re gaming at 1920×1080/1200, you don’t need a multi-GPU config except for a couple of titles. If you’re gaming at 4 MP, then you need 2 GB+ even if you’re only going to keep the card for a year. Releasing the card in this form is a bean-counter’s call.

        • Deanjo
        • 8 years ago

        “Where ‘$soon’ >= ‘6 months’?”

        Actually, where “$soon” <= “5 months”: first quarter of 2012.

          • JustAnEngineer
          • 8 years ago

          Remember that Nvidia’s first quarter runs from February through the first week of May; their fiscal year doesn’t match the calendar year.

      • dashbarron
      • 8 years ago

      Kind of with you. I have dual 570s for a bit more.

      • Aveon
      • 8 years ago

      http://www.guru3d.com/news/nvidia-geforce-gtx-560-ti-to-be-upgraded-to-448-cuda-cores/

      Check this out: it’s an upgraded version of the GTX 560 Ti.

    • swaaye
    • 8 years ago

    These dualies seem to start showing up right at the tail end of a generation. A lot of R&D for something that must not sell all that well.

    • can-a-tuna
    • 8 years ago

    Turd is a turd even when doubled.

      • geekl33tgamer
      • 8 years ago

      IIRC, a single 560 Ti performs quite well for the price range it sits in. The price for this card, though, is another issue entirely (and don’t even get me started!)…

      • swaaye
      • 8 years ago

      What’s wrong with the 560Ti? It outperforms a 6950 in some games. Price is probably a bit too high but that’s it. I own both and I have no complaints about it. In fact it has simply worked better with some games, like Rage.

        • paulWTAMU
        • 8 years ago

        My brother went with one (it got him a free copy of Arkham City, and he wanted the game, so…). He’s had no complaints since getting it.

      • crabjokeman
      • 8 years ago

      Ignorance is still ignorance, even when egregious…

    • derFunkenstein
    • 8 years ago

    Now you can get your micro-stuttering with ONLY ONE CARD! Sign me up!

    • albundy
    • 8 years ago

    Will this be available to the 99%?

    • RtFusion
    • 8 years ago

    I have a general question about dual-GPU card configs. Why can’t they share the entire memory pool? Instead of having x GB for GPU 1 and x GB for GPU 2, why not make x GB + x GB available to both GPUs?

    Or is there no benefit to sharing memory, or is there some limitation in memory controller design?

      • TurtlePerson2
      • 8 years ago

      It’s a computer architecture nightmare. How do you tell whether memory is being used by one chip before you modify it on another chip? If both chips want to write to the same address, which one gets priority?

      It’s not impossible (CPUs share some caches), but it’s extremely difficult. GPU makers have probably decided that the performance speedup isn’t worth the added complexity.
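
To make that write-conflict question concrete, here is a minimal sketch (plain Python, purely illustrative, not how any real GPU works) of two chips’ write streams landing on the same address with no coherence protocol between them; run it twice and the “winner” can change:

```python
# Two GPUs' command streams touch the same address with no arbitration:
# the final contents depend entirely on which write happens to land last.
import random

def run(commands_gpu0, commands_gpu1):
    mem = {}
    # Interleave the two streams in an arbitrary order, as two independent
    # chips with no coherence protocol effectively would.
    merged = commands_gpu0 + commands_gpu1
    random.shuffle(merged)
    for addr, value in merged:
        mem[addr] = value
    return mem

# Both GPUs write address 0x10; the result can differ from run to run.
print(run([(0x10, 0xAA)], [(0x10, 0xBB)]))
print(run([(0x10, 0xAA)], [(0x10, 0xBB)]))
```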

        • Voldenuit
        • 8 years ago

        Not to mention it would halve the available bandwidth to each GPU. Memory bandwidth is probably more important for performance than having 4 GB of addressable VRAM…

      • Forge
      • 8 years ago

      Because each GPU needs to draw the screen: GPU #0 draws frames 1, 3, 5, 7, 9; GPU #1 draws frames 2, 4, 6, 8, 10.

      Each GPU draws independently. For them to share memory, they’d both have to be working on each frame.

      Instead of two 1GB GPUs in SLI, you’d end up with a single 2GB GPU and a very expensive, hot external memory controller.
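
For the curious, here is a minimal sketch (Python, illustrative only) of the alternate-frame-rendering split Forge describes, and of why mirrored memory means two 1GB GPUs still behave like a 1GB card:

```python
# Alternate-frame rendering (AFR): frames ping-pong between the GPUs.
# Frames are numbered from 0 here; the split is the same idea Forge describes.
NUM_GPUS = 2

def gpu_for_frame(frame_number: int) -> int:
    # GPU 0 draws frames 0, 2, 4...; GPU 1 draws frames 1, 3, 5...
    return frame_number % NUM_GPUS

# Every GPU must keep the complete working set (textures, buffers) in its
# own VRAM, so effective capacity is the per-GPU amount, not the sum.
per_gpu_vram_gb = 1
effective_vram_gb = per_gpu_vram_gb  # NOT per_gpu_vram_gb * NUM_GPUS

for frame in range(6):
    print(f"frame {frame} -> GPU {gpu_for_frame(frame)}")
print(f"effective VRAM: {effective_vram_gb} GB")
```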

        • RtFusion
        • 8 years ago

        Ah, alright. Thanks for the responses; I really appreciate them. I hope you guys don’t mind if I throw out another question about dual GPUs.

        Is there some technical, design, or engineering reason (or some combination) why the GPU dies themselves can’t sit in the same package? Something like AMD’s Magny-Cours Opterons.

      • JMccovery
      • 8 years ago

      The best way to make such a system work is to have an external memory controller that every GPU can connect to, similar to the old chipset-bound memory controllers of the FSB age.

      The controller would make sure that one processor doesn’t write to a memory address that another processor is reading from.

      A small FPGA/ASIC similar to the Lucid Hydra, plus a GPU form of NUMA, could make ‘shared’ memory possible on video cards.
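
As a rough software analogy (hypothetical, not a real hardware design), that arbiter might look like the sketch below. The lock stands in for hardware arbitration, and it also illustrates the catch Voldenuit raises above: every access funnels through one shared path, costing bandwidth:

```python
# Toy model of an external memory controller that arbitrates access
# to one shared pool of VRAM. Purely hypothetical.
import threading

class SharedMemoryController:
    def __init__(self, size: int):
        self.mem = bytearray(size)
        self.lock = threading.Lock()  # stand-in for hardware arbitration

    def read(self, addr: int) -> int:
        with self.lock:  # no GPU reads an address mid-write
            return self.mem[addr]

    def write(self, addr: int, value: int) -> None:
        with self.lock:
            self.mem[addr] = value

# Both "GPUs" share the same pool through the one controller, which is
# also the bottleneck: every access contends for the same path.
ctrl = SharedMemoryController(1024)
ctrl.write(0x10, 0xAA)        # GPU 0 writes
print(hex(ctrl.read(0x10)))   # GPU 1 reads the same pool: 0xaa
```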

    • DeadOfKnight
    • 8 years ago

    What is this garbage? Nevermind, I don’t want to know.

    • kamikaziechameleon
    • 8 years ago

    Such a pointless product at that price.

      • sweatshopking
      • 8 years ago

      I wouldn’t say it’s pointless; it handily beats a 580, but I wouldn’t buy either.
