By the way, Surround Gaming is delayed until July

I suppose we should have posted this bit of news separately before now, but in case you didn’t read to the very end of my Eyefinity write-up earlier this week, here’s the relevant bit of reporting from it:

Nvidia still intends to counter with its own version of Eyefinity, dubbed Surround Gaming, and a depth-enhanced variant that works with funny glasses called 3D Vision Surround. The green team’s original timeline for delivery of drivers with support for those features was mid-April, as the firm told us upon the introduction of its GeForce GTX 470 and 480 graphics cards. That deadline came and went, and we heard nothing but the sound of crickets chirping, so last week, I started agitating on Twitter about the driver release date. Shortly thereafter, we found out Nvidia’s Tom Petersen had posted a blog update with a new timeline: the end of June. That’s not too far off in the grand scheme of things, but I feel sorry for any die-hard Nvidia fans who bought dual GF100 cards expecting to be gaming across multiple displays in April. Nvidia may have a good thing with its 3D-enhanced version of multi-monitor gaming eventually, but one wonders how long it will take to reach the level of refinement Eyefinity has—which isn’t really a final destination by any stretch.

Of course, “the end of June” is just a softer way of saying “not ’til July.” So there you have it.

Comments closed
    • SiliconSlick
    • 10 years ago

    Yes, well, for those who no doubt will doubt:
    http://hardforum.com/showthread.php?t=1520653 Try not to be the skeptic like the one poster there with the ironic sig. ROFL

    • SiliconSlick
    • 10 years ago

    DON’T WORRY or feel sorry: “…but I feel sorry for any die-hard Nvidia fans who bought dual GF100 cards expecting to be gaming across multiple displays in April.”
    THEY ARE ALREADY.
    The driver has “leaked.” Granted, it has some issues and not all games are supported, but it’s out there in the wild, and has been for a while. No deprivation.

    • yogibbear
    • 10 years ago

    I couldn’t wait… so I bought a GTX 260 just the other day for super cheap to tide me over… bye bye, 8800 GT, it was nice knowing you.

    Now I will pick the real winner of second-gen DX11, thanks.

    • SomeOtherGeek
    • 10 years ago

    “funny glasses”? Who wants that kind of technology?

      • Meadows
      • 10 years ago

      Everyone. No holographics yet.

    • stmok
    • 10 years ago

    I’m still curious what Nvidia is going to do once Intel’s CPUs and Larrabee technology merge to become one processor (and likewise AMD’s CPUs and Radeons with their “Fusion” concept).

    (Granted, we won’t see something serious until 2015, when AMD is going to combine GPU elements into the processor core itself with its second-generation Fusion.)

    …Where does this leave Nvidia in the long-term scheme of things?

      • Krogoth
      • 10 years ago

      Fermi architecture is your answer. 😉

      • jdaven
      • 10 years ago

      Nvidia has the Tesla (big iron servers), Quadro (workstation graphics which probably will not be integrated into a CPU anytime soon) and Tegra (cellphones) lines. If they are even remotely successful with these product lines, they will be okay.

      Don’t think too much about your little desktop used for checking email and playing some games. Nvidia has bigger plans than you! 🙂

    • MadManOriginal
    • 10 years ago

    By the way, NV is trying *really* hard to keep high powered graphics relevant in this console port era.

      • Krogoth
      • 10 years ago

      AMD is having the same difficulties.

      Outside of Eyefinity mode, it is difficult to justify getting anything more than a 5850.

        • MadManOriginal
        • 10 years ago

        Yes but this article was about NV so I limited it to them.

      • Cyco-Dude
      • 10 years ago

      Well, with LCDs getting larger and larger (or more specifically, cheaper and cheaper, thus making the previously too-expensive large models more affordable), you’ll still need more horsepower to drive them, especially with online FPS games (120 fps or bust!). Plus, you’d want a buffer for future games that take advantage of newer, resource-hogging features and eye candy. I think there will always be room for the high end, or at least the bang-for-the-buck upper-middle end.

        • MadManOriginal
        • 10 years ago

        Compared to ‘HD’ TVs? 720p resolution was passed long ago for monitors, and 1080p or 1920×1200 has been affordable for a good while, too. It’s not ‘size,’ it’s pixels that count.

        • OneArmedScissor
        • 10 years ago

        Resolution has not gone up in years. In fact, it has been decreasing, at least slightly.

        It’s not as if there’s a legitimate reason for everyone and their dog to have a 2000p 24″ monitor, so why overpay for a “buffer?” “Future proofing” is throwing money at a non-existent problem.

        • Farting Bob
        • 10 years ago

        Size for a given price is better than ever on PC monitors, but it’s still hard to find more than 1920×1200 (or even 1080p these days) on a monitor of any size without going to extremes. Really, multi-monitor setups will be the only thing that challenges future generations of discrete GPUs.
        They can add 3D, 120 Hz, and so on, but at the end of the day, if a GPU only has to push x number of pixels a second, and that number isn’t increasing nearly as fast as GPU processing power is, then future $200 cards will be more than enough no matter how big your single screen is.

    • Damage
    • 10 years ago

    Note to self: July comes immediately after June.

      • bthylafh
      • 10 years ago

      History may show that you were right the first time. 🙂

    • eitje
    • 10 years ago

    I am vindicated in my Radeon purchase!
