Nvidia’s GRID VCA is an end-to-end GPU virtualization system

Nvidia is invested in remote GPU virtualization on several fronts, including its GeForce GRID cloud gaming servers and its Project Shield handheld device. At its GPU Technology Conference today, the firm unveiled another addition to the mix: GRID VCA, which is billed as the "world’s first visual computing appliance." During his GTC keynote address, Nvidia CEO Jen-Hsun Huang was adamant that GRID VCA is a full system rather than just a server. The hardware resides in a rack-friendly 4U chassis, and it comes with software that manages the host and client sides of the equation. 

On the hardware side, the appliance pairs dual Xeon processors with up to 384GB of system memory. As many as eight GeForce GRID compute cards can be stuffed into the chassis, and each of those cards has dual Kepler-based GPUs. With 16 GPUs in total, a single GRID VCA box is capable of fueling 16 simultaneous user sessions, each of which offers full GPU acceleration. Those sessions can be controlled by PCs, Macs, or ARM-based systems.

Nvidia provides the client software for those systems, and it’s also supplying the hypervisor for the server. Client machines can connect to multiple sessions at once, a capability demoed during the keynote using a single MacBook to manage three independent workspaces. Interestingly, the software license supports an unlimited number of client devices.

GRID VCA is currently in beta, and it looks like two models are planned. The base config will sell for $24,900 and offer 16 CPU threads, eight GPUs, and 192GB of RAM. This puppy should support eight simultaneous sessions and will cost $2,400 per year for software licensing. The top model doubles up on everything in the base config, including the software licensing cost, but is priced at a bit of a discount: $39,900.
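
For a rough sense of the per-seat economics, here's a back-of-the-envelope sketch. The prices, session counts, and license fees come from Nvidia's announcement; the three-year amortization window is our own assumption, picked purely for illustration.

```python
# Rough per-seat cost math for the two announced GRID VCA configs.
# Hardware prices, session counts, and license fees are from the announcement;
# the three-year amortization window is an assumption for illustration only.
configs = {
    "base ($24,900)": {"price": 24_900, "license_per_year": 2_400, "sessions": 8},
    "top ($39,900)":  {"price": 39_900, "license_per_year": 4_800, "sessions": 16},
}

YEARS = 3  # assumed useful life of the appliance

for name, cfg in configs.items():
    total = cfg["price"] + YEARS * cfg["license_per_year"]
    print(f"{name}: ~${total / cfg['sessions']:,.0f} per session over {YEARS} years")
```

By that yardstick, the larger box works out somewhat cheaper per seat, which is presumably the point of its discounted price.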

Comments closed
    • spigzone
    • 7 years ago

    Looks like no Nvidia in Steam Box, or much of anything else online gaming oriented … pivot to the professional market.

      • HisDivineOrder
      • 7 years ago

      Yeah, I remember the day they introduced the Quadro and then the Tesla. Suddenly, PC gaming GPUs were taken off the market. There was a firesale. nVidia announced that all TWIMTBP would go away in favor of a new server initiative called, “This is the Way It’s Meant to Be.”

      EVGA went out of business. Drivers stopped coming. People with GeForce cards turned on their PCs only to have a message pop up where a driver update alert should be that said, “Peace out!” before the driver automatically uninstalled itself and deleted all copies of the installation file.

      Soon, there was panic in the streets. Dogs and cats, living together! Total chaos. Even President Obama announced a public day of mourning for the loss of his beloved 580 SLI system he’d called, “Sweetness.”

      Then a giant monster made of marshmallow stomped around NYC. Or… wait. Did I see some of that in a movie? Did absolutely NONE of the above actually happen?

    • webkido13
    • 7 years ago

    We have been waiting for something like this to run our OpenGL scientific applications here at our university for at least six months. I would order one TODAY if I could. I’ve been waiting for NVIDIA to finally bring something to market and stop talking about how it’s just around the corner. NVIDIA, take my money. I have it (well, my university does). Finally give me a product I can buy.

      • kcarlile
      • 7 years ago

      See my other posts. There are cheaper ways to do this–you can get that chassis full of GTXes for <$20K (maybe a LOT less than $20K) from places like Penguin Computing and Thinkmate. If you use FOSS on it, then all you’ve got is the hardware cost.

    • shank15217
    • 7 years ago

    This solution looks like a bunch of shortcuts. Real GPU virtualization should be able to pool all the GPUs together and share them among users, with some getting more resources and others less. A true solution should allow admins to carve out GPU compute and RAM.

      • kcarlile
      • 7 years ago

      Well, you can’t give one user MORE than one GPU, but with other solutions (posted elsewhere), you can certainly carve out partial bits of GPU to hand out. The latest version of Univa Grid Engine is supposed to be able to schedule directly to GPUs, but we haven’t tested that part yet.

      • cygnus1
      • 7 years ago

      Wait one or two generations on the video cards. Once they have unified system memory access, I see the virtualization abilities increasing significantly.

    • OU812
    • 7 years ago

    Are all the posters here as clueless as they seem? Five of the current nine main posts talk about games or gaming.

    This is not a game server.

    “The NVIDIA GRID™ Visual Computing Appliance (VCA) is a powerful GPU-based system which runs complex applications such as those from Adobe®, Autodesk and Dassault Systemes, and sends their graphics output over the network to be displayed on a client computer. This remote GPU acceleration gives users the same rich graphics experience they would get from an expensive, dedicated PC under their desk.”

    http://www.nvidia.com/object/visual-computing-appliance.html

      • Thrashdog
      • 7 years ago

      This. As a sysadmin for an architectural firm that likes its end user systems light and portable, I’m really excited about this stuff. Virtualizing graphics hardware like this may ultimately be a great way to get around the necessarily weak mobile GPUs that we have to deal with otherwise. That said, at 25 grand up front plus 2.5k in annual licenses, I would have to skip a system upgrade cycle to buy in. Perhaps once prices come down and there’s a midrange product further down the ladder…

        • kcarlile
        • 7 years ago

        There are other ways to get at it. I keep posting it, but in case you haven’t seen it, look at virtualgl.org and nice-software.com.
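
For anyone curious what the VirtualGL route mentioned above looks like in practice, here is a minimal sketch. vglrun is VirtualGL’s standard launcher; the Python wrapper and the glxspheres64 demo target are purely illustrative, and the snippet assumes the VirtualGL server packages are already installed on a GPU-equipped Linux host.

```python
# Minimal sketch: launching an OpenGL app under VirtualGL from a Python wrapper.
# Assumes VirtualGL is installed on the server and the client connected with
# vglconnect (or an equivalent VGL-aware SSH session). "glxspheres64" is the
# demo binary that ships with VirtualGL; any GLX application works the same way.
import subprocess

def run_with_virtualgl(app, *args):
    """Run an app under vglrun so its OpenGL rendering happens on the server
    GPU while the finished frames are forwarded to the remote client."""
    return subprocess.run(["vglrun", app, *args], check=True)

if __name__ == "__main__":
    run_with_virtualgl("glxspheres64")
```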

          • Thrashdog
          • 7 years ago

          Unfortunately Autodesk has transitioned to Direct3D-only for hardware graphics acceleration. OpenGL-based solutions aren’t much good in an AEC environment any more.

            • kcarlile
            • 7 years ago

            Good point.

          • Laykun
          • 7 years ago

          Hey, thanks for posting these! I just ran VirtualGL and it works amazingly well, streaming a game I’m developing on my Linux box at a full 60 fps over to my MacBook Air :D. The game is quite intensive and uses a deferred lighting model with PCF shadows.

        • My Johnson
        • 7 years ago

        OK. Product development. Makes sense.

        • shank15217
        • 7 years ago

        How light can you possibly go? Also, you may want to look into RemoteFX from Microsoft, which seems to be a better value for what it does.

          • kcarlile
          • 7 years ago

          Pretty damn light. I haven’t tried my software and rig against a netbook, but I have run it on a C2D with really cheap graphics (integrated?), and it runs like a charm.

          Ha. Just remembered that one of the ways I test is using a VMware View VM (which is probably worse than an integrated GPU) and it works very well. Especially considering I use the VM from home.

      • Bensam123
      • 7 years ago

      I don’t know, maybe it’s because the entire time this has been mainly marketed as a system for remote gaming. The whole GRID/Shield thing.

      • beck2448
      • 7 years ago

      Clueless is right.

      • shank15217
      • 7 years ago

      Games that use the GPU are no different from a 3D content creation application as far as the hardware is concerned, so get off your high horse. I am sure Crysis 3 can push the GPU just as hard as anything out there.

    • spigzone
    • 7 years ago

    AMD’s CES presentation indicated that during GDC they would be revealing a great deal more on its alliance with the (now) SeaMicro-powered CiiNow … probably accompanied by an announcement of a slew of partners of various sorts: cable providers, game developers, telecoms, online gaming companies, and so on.

    Just saying.

    … and a lot more on how games are being optimized for AMD hardware.

    • kcarlile
    • 7 years ago

    I don’t think this is really for remote gaming. This is for remote visualization, oil/gas, bio, physics, that sort of stuff. I mean, it would probably work for gaming, but I’m betting that’s not really their focus.

    Ironically, since I’ve been working on that kind of thing, I recognize that chassis. It’s an older Tyan box with a couple of Westmeres and PCIe 2.0 bridge chips in it. It also has a rather unfortunate flaw in the onboard Ethernet (it uses a defective Intel chipset), so be ready to sacrifice one of those precious video card slots for an Ethernet card. Otherwise, it’s a decent box, as long as you’re not interested in changing out the GPUs frequently (like I do), since they’re all screwed in rather than held by a plastic retention clip or something like that.

    I’ll probably check this thing out anyway, particularly if it can run RHEL with RHEL clients. That licensing model isn’t too bad at all, and the price may well be OK, although the question is which GPU is in there.

    • shank15217
    • 7 years ago

    16-player remote Quake, anyone?

    • smilingcrow
    • 7 years ago

    I knew Jen-Hsun Huang was small but judging from the photo he’s built on a smaller node than Tegra 3.

    • kc77
    • 7 years ago

    No wonder OnLive went bankrupt. One 4U only allows 16 users?

      • axeman
      • 7 years ago

      Shouldn’t someone have figured out how to run two simultaneous sessions on one GPU by now, even if it means limiting it to 720p?

        • kcarlile
        • 7 years ago

        OnLive was limited to 720p….

      • l33t-g4m3r
      • 7 years ago

      Exactly, which makes you wonder where the funds are coming from, and more importantly: WHY?

      Yeah, this system might not be what OnLive was built from, or what it’s intended for, but OnLive probably used something close, and of similar value. I don’t think for a second that cloud gaming can EVER be profitable; the only purpose it would serve is complete control of your software, prioritized over expense. Even at that, the expense is too great to be viable long term, nor would most users tolerate the loss of owning a game disc combined with horrible latency. Cloud gaming is merely the brainchild of psychotic corporate control freaks who don’t have a grasp on reality, much like EA with Sim City.

      Perhaps the people running cloud services even know this, and are merely taking advantage of funding available for inventing new DRM scams. It sometimes seems like this is the case, because of the propaganda associated with such services geared toward a buyout from larger companies. I think Sony recently bought out one of these companies too. Quite sad, really.

    • axeman
    • 7 years ago

    This is interesting, but I can’t imagine how cloud-based gaming could ever really take off. First of all, it’s just going to move the bottleneck to the network interface, which on mobile devices could be a brutal trade-off. That might be more of a short-term problem as ultra-high bandwidth connections become more commonplace. Second, how on earth are the controls not going to suffer badly from increased latency, that pesky speed of light and all? This isn’t going to be like hard core gamers using wired mice and TN panels to reduce lag, this is going to be very noticeable for anything that involves real time control of what’s on the screen. From what I’ve read, the latency of the network connection isn’t even half of the issue currently – encoding the video frames using whatever codec (H.264) introduces even more latency. Seems like an overhyped gimmick so far. Especially considering who’s hyping it, long history of overselling things….

    Also, the pricing is hilarious. It’s going to cost over $1,500 per user session before software licensing? This makes no sense.

    Edit: whoops, math. Looks like 16 users is only with the $39K box; that’s even worse for the pricing!
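
To put the latency concern raised above in concrete terms, here is a rough streaming-latency budget. Every figure below is an illustrative assumption, not a measurement of any particular service.

```python
# Illustrative input-to-photon latency budget for remote game streaming.
# All numbers are assumptions chosen for the example, not measurements.
budget_ms = {
    "capture rendered frame on server": 5,
    "H.264 encode": 15,
    "network transit (one way)": 30,
    "client decode + display": 10,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<34} {ms:>3} ms")
print(f"{'added lag over local rendering':<34} {total:>3} ms")
```

Even under those friendly assumptions, the added delay amounts to a few frames at 60Hz, which is the encode-plus-network overhead the comment describes.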

    • Neutronbeam
    • 7 years ago

    But can it play Crysis?

      • nanoflower
      • 7 years ago

      I don’t think the hardware that can play Crysis at the highest settings has been invented yet.

        • ColeLT1
        • 7 years ago

        Not exactly true. I played Crysis 1 and then Warhead on max settings at 1920×1200 on my old PC a couple of years back. I never saw my fps drop below 50, and most of the time it was vsync-locked at 60. But I am picky about my games; I don’t play them until I can max them out and have smooth gameplay. I changed the AA from 16x to 8x and my fps never dropped below 60.

        Edit: on a 1st-gen i7-950 @ 4.1GHz max turbo and 2x GTX 460 1GB.

    • heinsj24
    • 7 years ago

    Cool. Now instead of buying a physical video card I can license one for much, much more… and still have to purchase a video card.

    • MadManOriginal
    • 7 years ago

    Sweet! Up until now I could only use my computers by intuition.

      • Celess
      • 7 years ago

        Cool… I have to get one of these as a second computer.

        • Celess
        • 7 years ago

        …as a server so I can finally play the Sims.
