Nvidia VR Funhouse launches alongside new GeForce drivers

Nvidia's latest Game Ready drivers are out this morning, and version 368.81 is all about VR. This update is optimized for Raw Data, Obduction, Everest VR, and The Assembly. The fun doesn't end there, though. Nvidia is also releasing its VR Funhouse demo today to show off what's possible with its VRWorks APIs.

VR Funhouse, which is powered by Unreal Engine 4, takes players inside a carnival midway filled with mini-games. Those vignettes show off Nvidia FleX, Flow, HairWorks, PhysX Destruction, and Multi-Res Shading. Players can whack moles, hose clowns with paint, pop confetti-filled balloons, and shoot flaming arrows to see how VRWorks affects the experience. This demo works exclusively with HTC's Vive and its hand controllers for now.
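Multi-Res Shading is the easiest of those features to reason about numerically: it splits the viewport into a grid and shades the peripheral cells at reduced resolution, since the headset's lens distortion compresses those pixels anyway. As a rough back-of-the-envelope sketch (the grid split and scale factors below are illustrative assumptions, not Nvidia's actual parameters), the pixel-shading savings work out like so:

```python
# Toy model of Multi-Res Shading: a 3x3 viewport grid where the center
# cell stays at full resolution and the peripheral cells are shaded at
# a reduced scale. All numbers here are illustrative assumptions.

def multires_pixel_fraction(center_frac=0.5, edge_scale=0.5):
    """Fraction of pixels shaded relative to full-resolution rendering.

    center_frac: width/height fraction covered by the full-res center cell
    edge_scale:  per-axis resolution scale applied to the peripheral cells
    """
    center_area = center_frac ** 2     # full-resolution region
    edge_area = 1.0 - center_area      # peripheral region
    return center_area + edge_area * edge_scale ** 2

# With a 50%-wide center and half-resolution edges, only ~44% of the
# pixels get shaded, cutting pixel-shading work by more than half.
fraction = multires_pixel_fraction()
```

The real savings depend on how aggressively a title scales its periphery; the split is tunable per game, so the figure above is only a ballpark.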

Unlike most VR titles, however, VR Funhouse demands resources far beyond the general baseline of an Intel Core i5-4590 CPU and a GeForce GTX 970 or Radeon R9 290 graphics card. To run the title at low settings, Nvidia says gamers will need a GeForce GTX 980 Ti, GTX 1060, or GTX 1070, plus a Core i7-4790. Medium quality steps those requirements up to a single GTX 1080 and a six-core Core i7-5930K.

Maxing out the title also appears to work with a Core i7-5930K and a single GTX 1080, but Nvidia suggests gamers without its most powerful Pascal card will need two GTX 1070s, GTX 980 Tis, or Titan Xes in SLI for maximum fidelity. Folks who want to play with VR Funhouse's GPU PhysX will need to dedicate a GTX 980 Ti or better card to the task.

Nvidia is offering VR Funhouse free of charge, and it says it'll open-source the software later this summer so developers can learn from its example. The title should be available for download on Steam today, though the game's store page wasn't yet updated at the time of this writing.

Comments closed
    • TheMonkeyKing
    • 4 years ago

    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me
    Can’t Sleep, Clown’ll Eat Me

    • flip-mode
    • 4 years ago

    Geez, getting VR worthy computer components at “mainstream” prices (e.g. a capable video card for $200) is going to take at least one more process node shrink. That could take quite a while if we’re stuck at 14nm for as long as we were stuck on 28nm. Five years or thereabouts. That is kind of discouraging. It’s going to be gosh darn 2021 by then. I’ll be 45 gosh darn years old.

      • TheMonkeyKing
      • 4 years ago

      And then you’ll be ready for “VR Resthome”?

    • Mikael33
    • 4 years ago

    FYI, these drivers disable the use of Nvidia Profile Inspector to enable that fancy new v-sync option on Maxwell 2. If you have it enabled, it's now the same as having v-sync off.
    Edit: Nice downvote for pointing out something the new driver disabled. I thought I might let fellow 9XX-series users know about it.

    • DoomGuy64
    • 4 years ago

    Full play-through:
    [url<]https://www.youtube.com/watch?v=y_XLELCOfzg[/url<]

    Looks like just a short tech demo for VR. Cool if you already have the hardware, but not an excuse to buy into it either.

    • derFunkenstein
    • 4 years ago

    If you’ve already bought into VR what’s another $1300 for a pair of GTX 1080s so you can do PhysX plus max detail?

      • chuckula
      • 4 years ago

      This looks like a developers’ toy and developers were supposed to pay $1500 for the Radeon Pro Duo, so… $1300 is a discount?

        • derFunkenstein
        • 4 years ago

        Heh, that’s a fair point.

        The open-source release is going to be the real key. If I were just getting started in UE4 and wanted to do something for VR, having Nvidia’s code samples handy would be huge.

    • DPete27
    • 4 years ago

    Apparently even nVidia doesn’t enable SMP in their own game.

      • tipoo
      • 4 years ago

      Curious. Are they still working on the drivers or something?

    • Billstevens
    • 4 years ago

    Looks cool, but it's too locked down for even most VR owners to try. Sad that the majority of us with VR gear will probably never get to run this..

    • shank15217
    • 4 years ago

    Actually, a tech demo is exactly the place you would want a GameWorks implementation, not an actual game. A tech demo "demonstrates" the capability of hardware and what optimized software can do on that hardware. It's like a concept-car show of sorts for GPU makers. I am no fan of GameWorks, but this is where it belongs.

    Err, that was a response to xeridea..

      • xeridea
      • 4 years ago

      GameWorks doesn't show the capability of what hardware and optimized software can do. GameWorks is not optimized; it is brute-forced tessellation for fake-looking effects. They are handicapping themselves by using it. What is the point of demoing it if it is junk? A concept car at a car show would show what something could be; GameWorks shows how NOT to make your game. All they are demonstrating is how badly GameWorks runs, even on high-end hardware. It doesn't even look that great, mostly basic-looking props with some physics or fake-looking fire here and there.

      Or is the demo to show what would be possible on half the hardware if it was actually optimized? Seems kind of a silly thing to show off. I am all for tech demos showing off future possibilities, and generally like to mess around with them, but anything with Gameworks is just a waste of time, or $850.

        • stefem
        • 4 years ago

        You keep insisting on the "brute-forced tessellation" cliché, which doesn't match reality, since tessellation, where present, is used to reduce computation time while keeping quality intact.
        NVIDIA explains the techniques it adopts in detail, and the source code has been freely available for some months now. You might take a look before commenting again, so you can better appreciate the kind of work that lies behind every GameWorks effect or simulation.
        If you don't like how they look, that's an entirely different matter; facts and personal opinions should never be mixed.

    • WhatMeWorry
    • 4 years ago

    Those requirements are not Fun.

      • Concupiscence
      • 4 years ago

      Oh, c’mon, everybody has that kind of stuff just laying around.

      • Aquilino
      • 4 years ago

      But as Lionel Hutz would say:
      There are “the requirements” (shakes head) and “the requirements” (smiles wide)

      • Pzenarch
      • 4 years ago

      … aren’t some of these newfangled features supposed to help with VR rendering efficiency? If so, I’d say they’re not working too well 😛

    • xeridea
    • 4 years ago

    With those requirements, sounds like the Gameworks implementations are as bad as ever.

      • DoomGuy64
      • 4 years ago

      You know, considering how PhysX, GameWorks, and VR mix a lot of compute with graphics, I wonder how much of a performance boost proper async would give Nvidia here? Most likely a new architecture with true async capabilities would completely obsolete the 1080 SLI in this demo, considering the mixed workload. I bet that would cheese off a lot of SLI/VR users, at least temporarily. Long-term memory isn't a strength of Nvidia fans, who usually go for instant gratification over future-proofing. Oooh, shiny! :p
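      To make the async-compute intuition concrete, here is a toy timing model (the millisecond figures are made up for illustration, not measurements of any real GPU): without async compute, per-frame graphics and compute work run back to back; with it, some fraction of the compute cost hides under the graphics work.

```python
def frame_time_serial(graphics_ms, compute_ms):
    # Without async compute: graphics and compute run back to back.
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms, compute_ms, overlap=1.0):
    # With async compute: a fraction `overlap` of the compute work
    # executes concurrently with graphics, hiding its cost.
    hidden = min(compute_ms * overlap, graphics_ms)
    return graphics_ms + compute_ms - hidden

# Illustrative frame: 9 ms of graphics plus 3 ms of compute per frame.
serial = frame_time_serial(9.0, 3.0)      # 12 ms per frame
overlapped = frame_time_async(9.0, 3.0)   # 9 ms per frame, 25% faster
```

      How much of the compute actually overlaps (the `overlap` parameter here) depends on the architecture and the workload, which is the whole dispute in this thread.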

        • Waco
        • 4 years ago

        I think only fanboys like yourself think of “Nvidia fans” versus “AMD fans”. The rest of the world with common sense buys the best product available at the price point they’re willing to pay.

          • DoomGuy64
          • 4 years ago

          No, you don't; I do that. The vast majority of people have historically been buying Nvidia exclusively because they fall for the propaganda. Before the 390, I owned a 780 and a 470, because they were unquestionably better cards. I bought the 390 because it was also the best card in its class, while the rest of you were buying 970s. Anyone who objectively looked at the capabilities of both cards should have known not to buy the 970. So why do people do it? Fanboyism, and instant gratification, because Nvidia might have better initial performance.

          Blind brand loyalty has existed since even the 9800 Pro days. I bought a 9800 Pro because I knew it was unquestionably the best card in its class. Meanwhile, people around me were buying the FX. That logic escapes me, and there is no justification other than fanboyism.

          Nvidia doesn't support async, so AMD has an edge in dx12/Vulkan, as well as better business practices and value. Instead of admitting that truth, Nvidia fans misdirect with excuses and cover the shortcomings with examples of how well Nvidia runs its walled-garden ecosystem. Never mind whether any of those things are consumer-friendly, including the Founders Edition cards. Nvidia has basically relegated itself to the snobs of PC gaming: people who buy SLI and rub their e-peen all over everyone else in the forums. It's disgusting, and I think people are getting increasingly disenfranchised by it, both by Nvidia's business model and by the fanboys who buy into it.

          I have no problem buying Nvidia products when they offer something of value. Problem is, Nvidia hasn't been offering decent value for some time, and they cater more to the high-end, yearly-purchase crowd. Cards like the 970 were designed for planned obsolescence. Even their high-end cards suffer from it now, always lacking some feature that makes them quickly obsolete in a year's time. This gen it was VR, and next it will be async. So I really have a hard time justifying higher prices for a product designed to be junk in a year. Meanwhile, AMD cards are getting a huge 20-30% boost from async, so I'm glad I didn't buy into all that green-team walled-garden fanboy hype.

          If you can afford a 1080 Founders Edition card, great. Just don't pretend Nvidia is offering decent cards in my price range. Even if the 1060 were out today and fared well against the 480, I'd still question the longevity of its performance and value. Anyone who buys something like that is most likely purchasing a card from brand loyalty, not objective reasoning.

          Anyways, just because I disagree with some of Nvidia's business practices doesn't mean I'm exclusively an "AMD fan". I still buy what is objectively the best product at my price point, which includes looking at how future-proof a card's architecture is. Day-one dx11 driver performance isn't everything, especially when we know what dx12/Vulkan can do. That said, I do see a reason to go green if you are heavily invested in VR, given that Nvidia is so strongly focused on it, but other than that I can't justify it for my own use.

            • Waco
            • 4 years ago

            TL;DR, except:
            [quote<]Nvidia doesn't support async[/quote<]

            This is wrong. 100% wrong. The entire premise of your argument is incorrect.

            Planned obsolescence... because reasons? Can you name any that actually have any real merit? There's a reason Nvidia has been outselling AMD these past few years by leaps and bounds, and it isn't fanboyism. They've simply had the better product. You can't bitch and moan about pricing either, since Nvidia and AMD both push prices as high as they can. Fury X? Yeah, no. Radeon Pro Duo? Ha! Titan Z? I think you get my point.

            Keep in mind I was a day-one owner of both Bulldozer and the awesomely terrible 2900 XT. Both products that AMD hyped to hell and back, with promises of future performance beyond launch-day performance. Neither materialized. I don't trust something I can't measure; I'm unsure why you would.

            • DoomGuy64
            • 4 years ago

            Which is why I read reviews. Nvidia currently “supports” async, but their method isn’t as efficient as AMD’s. Maxwell is what doesn’t support it at all.

            Nvidia doesn’t generally have “the better product” either. They have better marketing, and a better halo card. Doesn’t mean AMD’s mid-range products are without merit. Pretty sure my 390 trumps the 970, so you had to be a fanboy to buy a 970, especially when it cost more.

            When it comes to pricing, it's pretty obvious halo cards have halo pricing. The Pro Duo is not a mainstream product, and neither is the Titan. Both are geared toward the prosumer, although I feel the Titan could have lost some of its appeal after Nvidia dropped double precision.

            [quote<]2900XT[/quote<]

            I had an X1900, and that was more than enough to tide me over until the 4800. This is what I mean about people with disposable income randomly buying any new hyped-up card for the e-peen. I buy cards based on whether or not the architecture can last, and I don't upgrade until I need to. I rather dislike products that become obsolete too quickly, or in certain cases aren't any good from the beginning.

            AMD had some flops, and so did Nvidia, but I've never gotten the feeling from AMD that their cards were designed to become obsolete in a set time period like Nvidia's. People who bought GCN 1.0 are better off than Kepler owners, and the VLIW4 cards have aged a little better than Fermi due to having more reasonable amounts of RAM. Don't get me wrong, I loved my 470, but Nvidia should have given us 2GB instead of 1. I understand there were a few models with more RAM, but they were either underpowered or prohibitively expensive and rare. Kepler was pretty decent too, up until Maxwell came out and Nvidia stopped optimizing for its old cards. It just feels like Nvidia wants you to upgrade every generation, whereas AMD keeps its cards supported as long as the architecture is viable.

            That said, I think the Fury is another 2900XT, outside of dx12/Vulkan. It never was worth the premium, but neither was the 980 Ti. I don't think either of those halo products was catering to pragmatists. The 1070 is probably the best card out today, but it's also way overpriced, making the 480 the only viable choice for the midrange market. The 1060 could potentially be a good card too, but when you look at the specs it looks like Nvidia balanced it right on the edge yet again, meaning the 480 will age far more gracefully.

            • Waco
            • 4 years ago

            Oh, so support only means it’s supported if it’s as good as the other guy? Ugh, arguing with you is pointless.

            Your post is filled with random crap that you've used hindsight to hone, not logic. Your sweeping generalizations are not true, regardless of how many times you repeat them. Continue your little diatribe if you wish, but as someone whose job it is to buy exactly what is needed to be most efficient and long-legged, I find your reasoning amazingly bad.

            • DoomGuy64
            • 4 years ago

            You wanted clarification, I said Maxwell. You’re the one still coming up with your own interpretation, and I don’t really care how you distort it. The simple fact that you are “interpreting” what I said is proof of bias, and invalidates any point you’re trying to make.

            Anyways, people can see for themselves what Async does for AMD, and it’s a lot.

            PS. You admittedly bought a 2900XT and bulldozer because of hype. Pretty obvious you don’t have good judgement when it comes to making hardware choices. Maybe you should stop buying into marketing so much, and look at what the hardware is actually capable of. In hindsight, the 1080 might just be the card for you if you want to have your cake and eat it, but those cards are not in my price range, nor is the hype going to affect my purchases of mid-range products.

            • Waco
            • 4 years ago

            Sigh. You’ve been harping this whole time about Nvidia not supporting asynchronous compute. Only one time (this chain) did you clarify that you’re talking about the past. Don’t pretend to be unbiased with this crap.

            I bought a 2900 XT because it was supposed to be future-looking, and I still had some trust in ATI at the time. You know, like you claim to do with your magic crystal ball of hindsight. This was also a decade ago, so I'll admit to being a bit more impulsive back then.

            I bought a Bulldozer at launch not for hype, but because software devs at the time were supposed to be optimizing for multi core CPUs and because I stupidly thought AMD wouldn’t flat out lie to the public. Neither materialized, but thankfully I got rid of the thing in less than a month for what I paid for it once I realized my mistake.

            Anyway, you don't argue logically. I don't know why I even bother replying, since you always fall back on a previously unstated assumption or technicality that no reasonable person would have run with when reading your prior posts. Also, for proof that you're just being a pain in the ass:
            [quote<]Most likely a new architecture with true async capabilities would completely obsolete the 1080[/quote<]

            Unless you believe Maxwell to be driving the 1080, you're just full of it. 🙂

            • DoomGuy64
            • 4 years ago

            [quote<]Unless you believe Maxwell to be driving the 1080[/quote<]

            No, but multiple reviewers have stated that Pascal does not support async as well as GCN, and the benchmarks show that. Nvidia clearly does not get the same level of performance increase. Hell, async brought the Fury up to 1070 levels, which is pretty good considering how poorly it has done against the 980 Ti.

            • Waco
            • 4 years ago

            So, back to my previous point, the facts seem to only matter when they reinforce your bias.

            • DoomGuy64
            • 4 years ago

            Whatever helps you sleep at night.

            PS. I’m not 100% on this, but it looks like you just recently bought a gold subscription to get the 3 extra votes. Lol, how immature.

            • DoomGuy64
            • 4 years ago

            1060 benchmarks are out. The 480 cleans its clock in Doom. So much for that, although it was pretty obvious Nvidia is selling the card on day-one dx11 driver performance, not specs, given its low shader count and narrow memory bus.

            [url<]http://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founders_edition_review/4#.V45q7I6XPi8[/url<]

            • Waco
            • 4 years ago

            You're clearly a genius. Why don't you go work for Nvidia to help them fix their idiotic ways?

            Also, no, I bought gold status to support TR months ago.

            • DoomGuy64
            • 4 years ago

            So you’re offering me a job working with you then?

            • Waco
            • 4 years ago

            Yes, I work for Nvidia. You figured me out. /sarcasm

    • Stochastic
    • 4 years ago

    Looks impressive. Too bad it’s using so much proprietary Gameworks tech. This is especially odd since the engine they are using, Unreal 4, has open access to its source code if I’m not mistaken.

    The last thing the nascent VR market needs is more fragmentation.

      • stefem
      • 4 years ago

      GameWorks source code is available on GitHub; the same applies to Unreal Engine 4.
