Asynchronous Spacewarp lets cheaper PCs drive Oculus Rifts

The PC gaming community has seen more than its fair share of discussion regarding VR's technical requirements and the price tags they imply. Oculus' specs call on PCs to deliver a constant 90 FPS to its Rift headset (lest users throw up), which is a high bar to clear for the hardware in most people's homes. Those requirements may become more relaxed soon, however, thanks to Oculus' Asynchronous Spacewarp technology.

Asynchronous Spacewarp purports to offer a 90-FPS-equivalent VR experience with only 45 FPS from the source hardware. The company says a machine fitted with an Intel Core i3-6100 or AMD FX 4350 CPU plus a GeForce GTX 960-class card should be up to that task. That lowers the price of a VR-ready PC to under $700—like our $530 Budget Box. In case it's not clear, that's Kind of a Big Deal for folks who couldn't afford the Core i5-4590 and GeForce GTX 970 duo that Oculus recommended from the start. A PC with those specs ran about $1000 or more.

Technically-minded readers might remember Oculus' Asynchronous Timewarp algorithm, launched last March in the Rift SDK. This existing feature allows the host PC to compensate for dips in application frame rates by distorting and reprojecting a frame as the user turns their head. Oculus calls Timewarp its "framerate insurance," but the company also points out that it doesn't help matters much when the user's head actually changes position in the physical space. Enter Asynchronous Spacewarp.
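(For the curious, here's a rough idea of what Timewarp-style rotational reprojection amounts to, as a toy Python sketch rather than Oculus' actual code. The pinhole camera model, the yaw-only head pose, and all the names are simplifying assumptions; the key point is that the last rendered frame gets resampled against a head orientation read just before the image goes out to the display.)

    import numpy as np

    def yaw(theta):
        # Rotation about the vertical axis, standing in for a full head pose.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[  c, 0.0,   s],
                         [0.0, 1.0, 0.0],
                         [ -s, 0.0,   c]])

    def timewarp(frame, R_render, R_display, f):
        # Resample `frame` (H x W x 3), rendered for orientation R_render,
        # so it reads correctly for the newer orientation R_display.
        h, w = frame.shape[:2]
        cx, cy = w / 2.0, h / 2.0
        out = np.zeros_like(frame)
        R_delta = R_render.T @ R_display   # maps display-frame rays into the render frame
        for v in range(h):
            for u in range(w):
                ray = R_delta @ np.array([(u - cx) / f, (v - cy) / f, 1.0])
                if ray[2] <= 0:
                    continue               # ray points away from the rendered image plane
                us = int(round(f * ray[0] / ray[2] + cx))
                vs = int(round(f * ray[1] / ray[2] + cy))
                if 0 <= us < w and 0 <= vs < h:
                    out[v, u] = frame[vs, us]
                # Pixels that fall outside the source stay black; a real renderer
                # hides this by rendering a slightly wider field of view.
        return out

    # The head turned ~2 degrees between render time and display time.
    frame = np.random.randint(0, 255, (90, 120, 3), dtype=np.uint8)
    warped = timewarp(frame, yaw(0.0), yaw(np.radians(2.0)), f=100.0)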

Asynchronous Spacewarp aims to sort out "positional judder," the barf-inducing stutter that occurs when the frame rate dips during a sizable head movement. The algorithm takes two frames, examines the differences between them, and generates a synthetic frame for the headset. Spacewarp then delivers one frame of each type in sequence: one "real," then one "spacewarped." This allows the Rift to display a scene at 90 FPS while only getting 45 FPS from the source app. That's a much more palatable requirement for the world's GPUs to handle, mobile chips included. The experience may not match native 90 FPS, but it could make VR more accessible, at least.
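In the same spirit as the sketch above, the toy code below illustrates only the extrapolation step, not Oculus' algorithm. It assumes a per-pixel motion field has already been estimated by comparing the two most recent real frames; the estimation itself, and the artifact clean-up a real implementation needs, are left out.

    import numpy as np

    def spacewarp_frame(latest_frame, flow, step=0.5):
        # Push each pixel of the newest real frame forward along the motion
        # observed between the last two real frames. The synthetic frame sits
        # half a source-frame interval later, hence the 0.5 step.
        h, w = latest_frame.shape[:2]
        out = np.zeros_like(latest_frame)
        for y in range(h):
            for x in range(w):
                dx, dy = flow[y, x]
                xs, ys = int(round(x + dx * step)), int(round(y + dy * step))
                if 0 <= xs < w and 0 <= ys < h:
                    out[ys, xs] = latest_frame[y, x]
        # Holes left where motion diverges are one source of the visual
        # artifacts people notice; a real implementation must fill them in.
        return out

    # flow[y, x] = (dx, dy) motion per pixel, estimated elsewhere.
    frame = np.zeros((90, 120, 3), dtype=np.uint8)
    flow = np.full((90, 120, 2), (4.0, 0.0))   # everything drifting to the right
    synthetic = spacewarp_frame(frame, flow)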

Comments closed
    • dmitriylm
    • 3 years ago

    This really is quite amazing. I have a GTX 970 setup, and Project Cars struggled with the Rift at higher settings. With Asynchronous Spacewarp enabled via the registry, the game is suddenly playable at much higher settings. Games that previously ran well can now be run with higher levels of DSR, etc. Magical stuff.

      • TheMonkeyKing
      • 3 years ago

      How about motion asynchronous behaviors? That is, when you are being shown motion, like going forward on a roller coaster, and you physically move your head in the opposite or a non-linear way (like looking backwards or left when the motion is going right)?

      Do you get judder or non-intuitive motion?

    • DeadOfKnight
    • 3 years ago

    Getting on the VR bandwagon now is about the equivalent of being an early adopter of an outrageously priced 4K monitor that needed multi-stream transport to work at 60Hz and had no variable-refresh compatibility. If you all just wait for the next iteration, you will get a vastly superior product at a lower price.

    • Joerdgs
    • 3 years ago

    It seems to be quite magical. People can already enable it in the current public Oculus runtime with a registry tweak. Players have reported that they can now smoothly run titles like Elite Dangerous and Project Cars on ultra settings with a GTX 970, something that was impossible before even with a 980. That's pretty damn amazing. It's not perfect; there are visual artifacts if you look for them, but it's completely playable thanks to smooth headtracking. Good enough to lower the entire min-spec, apparently.

    • wingless
    • 3 years ago

    Oculus really does have a clean implementation of VR. Vive has the better room-scale and game implementation, though. If only we could get the best of both worlds in a single product…

    • tipoo
    • 3 years ago

    IIRC this is also how they're managing VR on the PS4 and its ~Radeon 7850 plus Jaguar cores: this plus reprojection (and a lower resolution and a few other tricks).

    • f0d
    • 3 years ago

    Until the price for the VR equipment comes down, there really isn't much point saving a couple of bucks on a lower graphics card for "budget" VR.

    • meerkt
    • 3 years ago

    Is this different from the motion interpolation TVs do?

      • GrimDanfango
      • 3 years ago

      Considerably different… this updates the image based on positional data, rather than analysed 2d motion-vectors. It’s more like a 2.5d reconstruction than interpolation.

      • kn00tcn
      • 3 years ago

      it’s reverse video-stabilization, it’s not making new frames, it’s adjusting the position of the existing frame

        • psuedonymous
        • 3 years ago

        Not quite, it’s warping the frame by-element rather than globally. It is also a reprojection (forward in time from past frames) rather than a TV’s interpolation (using the past and future frames to create a ‘between’ frame, by delaying the future frame).
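        (Toy numbers to make that distinction concrete, not anyone's actual code: the TV has to hold back a frame to interpolate, while forward reprojection predicts from the past and risks guessing wrong instead.)

          # Horizontal position of one object in three consecutive real frames.
          positions = [0.0, 10.0, 20.0]

          # TV-style interpolation: needs frame N+1 in hand, so output is delayed
          # by a frame, but the in-between value is exact for steady motion.
          interpolated = (positions[1] + positions[2]) / 2       # 15.0

          # Reprojection/extrapolation: uses only frames N-1 and N, so nothing is
          # delayed, but the guess is wrong if the motion changes after frame N.
          velocity = positions[1] - positions[0]
          extrapolated = positions[1] + velocity * 0.5           # 15.0, predicted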

    • DPete27
    • 3 years ago

    Great! Pair this with Nvidia's SMP and we've essentially cut the computing power needed for VR to 25% of its original requirement.

      • Andrew Lauritzen
      • 3 years ago

      45Hz is still not a great experience, as you notice the jitteriness of animation. This solution removes the nausea-inducing effects of low frame rates, but it doesn't bring the experience up to par with an application that is actually keeping up with 90Hz.

      To your original sentiment though, we can definitely go a lot further to cut the GPU processing power required for VR. SMP is one step, but it can go a lot further than that… there’s still large amounts of wasted and duplicated work being done rendering VR frames.

        • Laykun
        • 3 years ago

        It actually does. This technique compensates for missing frames in object animation and is basically designed to make a 45hz experience feel like a 90hz experience. This technique is actually usable in the current Oculus runtime and just requires a simple regedit change to make it work. It produces some pretty fantastic results. This is nothing like rendering at a solid 45fps, and you don’t notice any jitter (what it is designed to combat).

        Games that call for ASW are actually forced to render at 45fps and every second frame is interpolated, I believe; I don't believe this is a safety-net feature like ATW was.

    • xeridea
    • 3 years ago

    You can modify the budget build and get VR-ready for $750, not $1000. This would still make it better, though.

    • Waco
    • 3 years ago

    I'd be happy to have my GPU/monitor be able to do this to smooth out juddering as well…

      • Andrew Lauritzen
      • 3 years ago

      Problem is it smooths out head/camera motion judder but not animation judder. Usually the latter is the bigger issue on a monitor, while the former is the “barf zone” for VR.

        • Waco
        • 3 years ago

        True, after thinking more it might not be as useful (especially given it doesn’t really have the positional information a headset would).

        Frame interpolation works quite well on certain game types, though. Old-school PlayStation 2 games like Champions of Norrath benefit enormously from frame interpolation on modern TV sets. The slight additional lag isn't too big of a deal, but the massive increase in apparent framerate is.

    • neverthehero
    • 3 years ago

    Does this mean by chance that better systems will run better as well?

      • DeadOfKnight
      • 3 years ago

      180 FPS!!! I dunno, probably. I mean, you will still be limited by the refresh rate, but it would definitely help in more intense scenes to have stronger hardware to prevent huge frame-time spikes from crossing that threshold. I'm still not sold on this yet, though; this article is quite vague on how this works. I guess it's a trick that only works when displaying 2 images simultaneously, possibly related to Nvidia's Simultaneous Multi-Projection feature?

      Could it be we were again fooled into believing that Nvidia had some exclusive capability as we were with variable refresh monitors? I figured AMD would have called them on it by now. What this does seem to imply is that we were getting 45 fps for each eye all along and they found a way to not cut the performance in half like stereo 3D does. These features may be able to improve stereo 3D as well. I think both people using it would be very pleased.

      • Laykun
      • 3 years ago

      It does, you can crank settings way higher now on more powerful systems. A common thing to do is increase super sampling for noticeably better visuals.

      • psuedonymous
      • 3 years ago

      Yes, you can push settings (e.g. supersampling) higher because you now have double the render time to work with. The downside is that while head motion, and to some extent object motion, will be extrapolated to 90Hz, the game world will still be running at 45Hz, which may have some issues with very fast interactions if game physics logic is tied to the visual update rate.
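      (For what it's worth, the usual fix for that last problem is a fixed-timestep loop that decouples the simulation from the render rate. The generic Python sketch below illustrates the pattern; it is not anything specific to the Oculus runtime.)

        import time

        def run_loop(step_physics, render_frame, sim_hz=120.0, duration_s=1.0):
            # Physics always advances in constant steps, so dropping from 90 to
            # 45 rendered FPS changes smoothness, not simulation behaviour.
            dt = 1.0 / sim_hz
            accumulator, previous = 0.0, time.perf_counter()
            deadline = previous + duration_s
            while time.perf_counter() < deadline:
                now = time.perf_counter()
                accumulator += now - previous
                previous = now
                while accumulator >= dt:
                    step_physics(dt)                 # fixed-size simulation step
                    accumulator -= dt
                render_frame(accumulator / dt)       # blend factor between sim states

        # Trivial stand-ins for a real game's physics step and renderer.
        state = {"x": 0.0}
        run_loop(lambda dt: state.update(x=state["x"] + 1.0 * dt),
                 lambda alpha: None)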

        • Laykun
        • 3 years ago

        If the physics logic is tied to the visual update rate, you're playing a game made by crappy programmers (yes, Skyrim's/Fallout's engine is a pile of garbage).

    • Chrispy_
    • 3 years ago

    Uh, wait.

    Wasn't it the latency that caused barfing, not the actual framerate? I thought the 90fps thing was merely the easiest way to ensure low latency. I imagine this to be true because people don't barf all over the cinema at only 24fps.

    In reality, having to wait for two 45fps frames to be rendered and then spending some time computing some fake pseudo-frame is going to send the latency skyrocketing.

    And yes, I DISTINCTLY REMEMBER Carmack giving this huge lecture about latency with the Oculus Rift. I guess that was all just hot air, was it?

      • stefem
      • 3 years ago

      It probably doesn't wait for two frames; it just works on the last two frames rendered (and sent to the display).

      • tenfour04
      • 3 years ago

      Looking at the second diagram, where ASW frames are colored green and “real” frames are colored gray, I’d say the first diagram is showing that an ASW frame is extrapolated from the previous two “real” frames, using up-to-date position. The ASW frame would have the same near-zero latency that a timewarped real frame has.

      • DPete27
      • 3 years ago

      I’m thinking the same thing. Essentially they’re saying that you only need to track head positioning at 45Hz whereas before it was 90Hz.

      [Add] Unless….would it be accurate to say you can copy the previous frame for “free” and while maintaining a 90Hz polling rate for positional tracking use the intermediate polls to just shift the previous image ever so slightly in the direction of movement? That may work well for slow-ish movements, but if your viewing direction changes drastically, “spacewarp” would incur image distortion. Either way, I don’t see how spacewarp could ever be as good as having the GPU deliver every frame at 90Hz.

      • GrimDanfango
      • 3 years ago

      The point of positional timewarp is that even if the image data is two frames old, it is warped into an almost-correct position based on tracking data sampled at that instant, so while it might introduce slightly perceptible ghosting artefacts around any parallax movement, it'll still feel "right" to your eyes.
      Actually, it functions well enough to not feel nauseating as low as 5-10fps, provided the scene is mostly static and there aren't animated characters around… 45fps should be almost imperceptible outside of very specific situations – such as rapid sideways head motion past very-near objects that contrast strongly with the background.

      The point is, while the image you see might not be exactly correct, the apparent position of it relative to your head will be almost latency-free.

        • GrimDanfango
        • 3 years ago

        Hmm, that said, I was presuming this was obviously an evolution of the same Positional Timewarp that Carmack had been working on ages ago, rebranded to sound cooler… but reading the details about it, on the surface at least it sounds like they’ve gone back to a more conventional motion-prediction approach.
        I’m not sure if they’re just presenting it that way to make it easier to explain to people, but it would seem odd if they’d just abandoned the true positional reprojection effect they already had working at some point, even before the CV1 release… it needed some refinement to minimize edge artefacts, but otherwise it basically worked perfectly, even down to surprisingly low framerates.

      • Andrew Lauritzen
      • 3 years ago

      It’s the latency between moving your *head* and seeing the screen update. With ATW/ASW, the key point is that they grab the latest head position right before doing the warp for each frame that they are inserting. That’s why it’s a “warp” of the previous frame(s) in the first place.

      i.e. what you’ll notice on the screen when ATW/ASW kick in is that animation seems a bit jittery, but your view of the virtual world is still buttery smooth. It really does work exceptionally well and is a key differentiating feature vs Vive. This just adds a bit more robustness to an already “pretty good” solution (ATW).

        • Chrispy_
        • 3 years ago

        I get ATW, I think it’s a great hack to remove perceived latency and it’s almost flawless when the camera is static relative to its environment.

        Where ASW confuses me is how it’s supposed to work differently to ATW. Let’s say you’re playing a game seated in a moving car and you look out of the side window at speed. The act of turning your head updates the interior of the car 90 times a second by taking that 45 fps car interior scene and interpolating it every other frame. It works well with ATW because the interior of the car is mostly static relative to your head position.

        When you're looking out of the window at ninety degrees to the direction of fast travel, it's entirely likely that nothing from the previous frame will be usable whatsoever – i.e. everything in view has moved out of view in that 1/90th of a second. This is an extreme example for emphasis, obviously, but the same problem will occur at slower speeds when just a quarter of the view out of the window is relevant to the next frame.

        How does any kind of warp generate entirely new geometry from that? Surely if it’s generating new geometry it is rendering a new frame and it’s 90fps after all?

          • Andrew Lauritzen
          • 3 years ago

          Well, to short-circuit this reply: if nothing in your view is the same from one 1/90th of a second to the next, you won't actually be able to perceive anything useful – it'll just look like a big blur, so it doesn't even matter. In reality, for any consistent viewpoint even at "high speeds," lots of the frame is the same or reusable. Remember that there are effectively some "gutters" on what is rendered and presented to the VR API beyond what you see in an individual frame. Incidentally, some Oculus folks I know say that PCars is one game that a) tends to drop a lot of frames in practice and b) looks near-indistinguishable when running at 45Hz w/ ASW vs. 90Hz.

          45Hz is still not necessarily going to be “as good” an experience in every game, but depending on the nature of the content of the game it may or may not be noticeable. Clearly the Oculus folks – who have a pretty high bar compared to others on this stuff – have deemed it an experience worthy of their min spec bar, so I’m inclined to trust them.

            • Chrispy_
            • 3 years ago

            Okay, the gutters thing makes sense to help with (re)projection, but that opens up a whole new line of questions for my curiosity – like what would the performance improvement be with less wastage on the original gutters, and are the gutters necessary regardless of ASW, or do they exist primarily for ASW?

            I haven't used a CV1; my experience is DK1, DK2, and Vive. So if ASW is good enough to make 45fps a VR-friendly framerate, then I will give it the benefit of the doubt until I try it for myself.

            Certainly with G-Sync/Freesync I believe that a steady 45fps is enough animation fluidity for most people in most scenarios.

    • bittermann
    • 3 years ago

    It’s magical algorithms!

      • Neutronbeam
      • 3 years ago

      Let’s do the Timewarp again!

    • chuckula
    • 3 years ago

    "The company says a machine fitted with an Intel Core i3-6100 or AMD FX 4350 CPU plus a GeForce GTX 960-class card should be up to that task."

    WRONG! AMD invented the word Asynchronous and only GCN can do it.

      • Jigar
      • 3 years ago

      Nice try, Nvidia fanboy.

      • Jigar
      • 3 years ago

      Used all your Gold subscriber power? Still doesn't change the fact that you are an Nvidia fanboy.

        • chuckula
        • 3 years ago

        Downvoting me for saying what you actually believe to be true?

        Looks like we have the first confirmation that Vega isn’t all it’s cracked up to be.
