You’ll need a powerful GPU to drive the Oculus Rift

The Oculus Rift is coming in the first quarter of 2016, and we now know what class of system it'll take to deliver a smooth VR experience with the headset. Oculus posted the recommended system specs on its blog today. To drive the headset, you'll need a PC with a Core i5-4590-class CPU, at least 8GB of RAM, and a GeForce GTX 970 or Radeon R9 290-class graphics card or better.

Here's Oculus' complete list of recommendations:

  • NVIDIA GTX 970 / AMD 290 equivalent or greater
  • Intel i5-4590 equivalent or greater
  • 8GB+ RAM
  • Compatible HDMI 1.3 video output
  • 2x USB 3.0 ports
  • Windows 7 SP1 or newer

In a companion post, Oculus chief architect Atman Binstock explains that the recommended components were chosen based on three factors: the raw pixel-delivery demands of the Rift's twin displays, the need for essentially real-time rendering, and the need for low-latency frame delivery. Considerable GPU headroom is required in order to maintain the illusion of presence in virtual worlds.
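
For a rough sense of those pixel-delivery demands, here's a back-of-the-envelope sketch (an editor's illustration; the 2160×1200, 90Hz panel figures come from the discussion below rather than from Oculus' post itself):

```python
# Raw pixel throughput needed just to feed the panel every refresh.
rift = 2160 * 1200 * 90        # Rift-class panel at 90Hz
desktop = 1920 * 1080 * 60     # a typical 1080p60 monitor, for comparison

print(f"Rift panel: {rift / 1e6:.0f} Mpix/s")       # ~233 Mpix/s
print(f"1080p60:    {desktop / 1e6:.0f} Mpix/s")    # ~124 Mpix/s
print(f"ratio: ~{rift / desktop:.1f}x")             # ~1.9x, before render-target oversizing
```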

Oculus says these system recommendations will remain constant over the Rift's lifetime, so that developers can target a single reference platform and consumers can take advantage of decreasing component costs over time.

Mac and Linux users will find some bad news buried in Binstock's post. Rift support for those platforms is being "paused" for now. Oculus cites the need to deliver "a high quality consumer-level VR experience at launch across hardware, software, and content" for Windows users when the Rift launches.

Comments closed
    • UnfriendlyFire
    • 4 years ago

    Wait, you mean I can’t use a GT 610 for this? Or run it on a laptop with an Intel GMA IGP (pre-sandybridge)?

      • Meadows
      • 4 years ago

      Of course you can, but it’s not recommended.

        • jmke
        • 4 years ago

        it just won’t run well or give an enjoyable experience; VR presence comes from tricking your mind into believing you are looking at another world. Any delay in movement, any frame drops, any screen tearing will take away from that, and in most cases will cause motion sickness.

          • Meadows
          • 4 years ago

          That doesn’t mean you can’t use a sub-par GPU.

    • Vrock
    • 4 years ago

    File this under “who cares?”. Six, maybe seven people will buy this.

      • Meadows
      • 4 years ago

      I actually think more people will buy this than, say, the Titan GPUs, and those seemed to sell pretty well themselves.

        • VincentHanna
        • 4 years ago

        If it came out 2 years ago, definitely…

        But now it’s going to be competing with several other VR products, which means they will have to justify themselves, and can’t simply depend on novelty to entice wealthy gamers.

          • Meadows
          • 4 years ago

          Competing? Several other VR products?
          Name just two others please, because off the top of my head I can’t name a single one that’s on the market and competes for the same audience.

      • Firestarter
      • 4 years ago

      Are you sure? I’ve seen its magic and I’ll be buying it, along with a wheel+pedals and something HOTASy

    • Khali
    • 4 years ago

    This is going to be like everything else. High requirements and a high price until some competition gets going. I’ll give it a few years before jumping into the VR scene.

    • kamikaziechameleon
    • 4 years ago

    This is all well and good, a GPU upgrade and I’m ready to go.

    But truth be told I don’t want VR gaming, or motion controls for that matter. I like the static, stationary interaction I have with my computer. Not that VR won’t be cool, because I still love Wii Sports, but honestly there is something that happens with games: they are already so immersive and so addicting that I’m afraid of my own ability to have a casual game session for 30 mins and move on as we pioneer these new interfaces. Sure it’s good for gaming, but it’s crap for the rest of my life, lol.

    Not even saying the Wii or Oculus are necessarily achieving that, but I feel they are moving in that sort of direction, and I already struggle to self-regulate games at their current level of addictiveness.

    And to many people the “immersion” factor outside of narratively driven games is pretty destructive. MMOs and such have had people die playing them, simply because they kept playing. Just something to ponder; it’s such a powerful medium.

      • GrimDanfango
      • 4 years ago

      I would say that it’s not the “immersion factor” of MMOs that has caused people to die playing them… it’s the weaponised level of addiction-forming psychology that goes into making them.

      A mind-blowingly immersive experience isn’t going to keep you hooked for months and months on the same thing – a game that conditions you to depend on a steady stream of minor rewards for repetitive actions is. Well, it is if you’re susceptible to it anyway. Luckily I tend to just get bored stiff by such games.

    • mcnabney
    • 4 years ago

    1080p = fail. Way way way way too large an effective image for that resolution. Even 1440p is too low. 4k or higher – which would explain a beefy GPU requirement.

      • Krogoth
      • 4 years ago

      4K cannot be done on VR goggles yet.

      • sparkman
      • 4 years ago

      Everyone wants higher res VR but the screen tech isn’t there yet and, for VR, speed is even more important than resolution.

    • alrey
    • 4 years ago

    Why don’t they just say the overhead? Like — “your game will be 15-30% slower when using the device”.

    An announcement like that would even jump-start hardware upgrades if the Oculus is any good.

    • xeridea
    • 4 years ago

    If they are worried about smooth frame delivery they should just require the use of OGLN or DX12.

    • puppetworx
    • 4 years ago

    Not that I didn’t expect this, but with all the *talk* about ‘mobile VR’ I thought they might be a bit lower. Still, these are *recommended* specs for a very new consumer product, it’s not a huge surprise.

    • Sam125
    • 4 years ago

    Those specs seem pretty tame to me. In a year or so, 970/290 level of hardware won’t be priced out of range for most people who would be interested in an Oculus so it’s smart to require hardware that’s a bit of a stretch today but will be mainstream tomorrow.

    I’m guessing the goal is to achieve 1080p resolution with fluid level framerates. With that in mind the requirements make sense.

    • albundy
    • 4 years ago

    soooo… any decent games worth mentioning for this? Hopefully it won’t end up like the Virtual Boy, with like 10 games ever made for it.

      • l33t-g4m3r
      • 4 years ago

      Yup. The first competitor to get decent support wins. Best case scenario is direct support from Nvidia, which would at least let you use the headset for 3d in standard titles.

    • l33t-g4m3r
    • 4 years ago

    I call bull on the requirements. It’s a low res screen. People have been gaming with 3d vision (and even head tracking) for years with older cards.

    The real problem is probably that newer console games are eating vram, so they’re setting the bar high to compensate for that. The 970 has enough horsepower and memory to be somewhat future proof, so that makes sense. But I bet any of the higher spec 7 series with 4+GB ram would do fine too. As for AMD? Yeah, the 290 is probably a requirement. Simply from drivers alone.

      • K-L-Waster
      • 4 years ago

      The thing about something like this is that you don’t just need good frame rates – you need absolutely predictable consistent frame delivery across both displays with no lag when the user turns etc. Any lag or mismatch will produce horrible motion sickness as the user’s visual experience diverges from what their inner ear is telling them.

        • l33t-g4m3r
        • 4 years ago

        That’s just keeping your framerate up, which can easily be done on a 780 / 780 Ti, which performs pretty closely to the 970. Also, you can always turn down the effects level to get a higher framerate. That’s what having a PC is all about, right? HAVING CHOICES? Maybe the 970 has some special hardware voodoo for VR, but performance-wise, it’s not necessary at all.

          • Sam125
          • 4 years ago

          I don’t think you understood K-L. Absolute framerates are completely unimportant with VR. What is important is the response time between head tracking and what you see on screen. If you start experiencing input lag then that induces motion sickness which completely ruins the experience.

          Which is why I have zero interest in VR. Even the best setups make me feel pretty sick because the body is saying everything is fake but the brain is being led to believe what you’re experiencing is real.

            • l33t-g4m3r
            • 4 years ago

            https://en.wikipedia.org/wiki/FreeTrack

            > responsiveness is fundamentally limited to the frame rate; a 30 frame/s webcam has a maximum response delay of 33.3 milliseconds compared with 8.33 milliseconds for a 120 frame/s camera. To put this into perspective, a human’s reaction time to visual stimulus (finger reflex) is typically around 200ms

            Head tracking response time is dependent on the VR hardware, not your video card. Unless the video card has some special sauce hardware that accelerates VR. Either way, VR tracking has been done before, while not being dependent on your video card. Oculus Rift is the *ONLY* VR headset that “requires” a 970/290, which just seems way too shady to be true.
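
A quick check of the FreeTrack arithmetic quoted above (an editor's sketch; the worst-case delay is simply one full frame interval):

```python
# Worst-case tracker response delay when limited by camera frame rate:
# one full frame interval, expressed in milliseconds.
def max_response_delay_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 120):
    print(f"{fps} frame/s -> {max_response_delay_ms(fps):.2f} ms worst case")
# 30 frame/s  -> 33.33 ms worst case
# 120 frame/s -> 8.33 ms worst case
```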

            • Sam125
            • 4 years ago

            Think of it this way: you’ll want the fastest refresh rates possible to reduce the likelihood of vertigo, while you’d also want decent hardware to make games look better.

            A smartphone or low-end hardware isn’t powerful enough for, say, a flying or racing sim, and those are obviously the genres where VR is going to be the most graphically demanding.

            • l33t-g4m3r
            • 4 years ago

            BULL. I think I might have actually seen some youtube videos of smartphones being used for VR.

            None of you know what you’re talking about. Complete RDF zone here. VR has been on PC since DOS, as I remember DESCENT ONE supported some VR headsets. VR has NOTHING to do with your video card. It’s just a MONITOR that’s mounted on your head, combined with motion trackers. The motion tracking hardware is FAR more important than your video card.

            Any decent mid-range dx11 card should run the Rift, and these requirements DID NOT EXIST FOR THE DEVELOPMENT KITS. It’s COMPLETELY ARBITRARY.

            Yes, a better video card will give you a better gaming experience. BUT IT’S NOT A REAL REQUIREMENT FOR A VR HEADSET.

            I agree having a baseline helps developers, but that’s all it is. There is no REAL requirement here. It’s only a recommendation that’s being set to make VR development easier.

            Also, if the competition comes out and says they’re not going to have these requirements, then that’s the end of the Oculus Rift RIGHT THERE. VR isn’t dependent on video cards. It hasn’t been since EVER, and it still isn’t. The requirements are totally fake and arbitrary, and only serve to simplify Rift development.

            • Sam125
            • 4 years ago

            I’ll say this in a way you should understand: The Oculus is too high-end for you. There, does that make sense to you now? 😉

            • l33t-g4m3r
            • 4 years ago

            Nope. You’re just going full narcissist epeen all over yourself.

            I plan on eventually getting a new video card and maybe the Rift depending on its support down the road, but whether or not I buy them has nothing to do with arbitrary requirements. Idiot.

            Anyways, I’m not touching the 970 with a 10 ft pole, because it’s not any faster than my overclocked 780, and new games will need closer to 6GB of VRAM. I’m waiting until those cards come out before I buy anything new, not NV’s gimped short bus. Especially when they’re selling it for more than it’s actually worth. The only way I’d ever consider a 970 is if somebody like EVGA would put 8GB on it and OC the hell out of it.

            • sweatshopking
            • 4 years ago

            That’s a good point. If Google is using cell phones for vr, which they are, how does any of that make sense?

            • GrimDanfango
            • 4 years ago

            Oh good lord, for crying out loud, of course ANY graphics card is capable of running VR. No, of course the Oculus doesn’t REQUIRE a 970, and VR in general doesn’t REQUIRE anything more than a smartphone.

            But presumably, if you bought an Oculus Rift, you would implicitly be expecting to be able to play some decent, modern PC games on it, at something more than absolute minimum graphics settings?

            If you want to do that, then you’re going to require more than a smartphone GPU, you’re going to require more than a middling PC GPU. You’re going to need a 970, bare minimum, (or yes, probably a 780ti would do just as well) just to run typical modern PC games at medium settings on a Rift, or any other VR device.

            This has absolutely nothing to do with some shady performance issue on the Rift, or some Rift-specific requirement in any way. They’re just telling you what any decent VR solution is going to tell you. There is no way you’ll get a satisfactory experience on anything less than a 970-class GPU.

            I think the Vive is going to have a 2560×1440 screen, and at least 90hz, which means it’s going to require even MORE powerful hardware to push it. It has nothing to do with what each company claims to “recommend” or “require”, and everything to do with the plain simple fact that to run VR well, you need to kick out an absolute crap-ton of very high resolution frames per second, very consistently.

            Sure, VR existed 20 years ago, but you know why we haven’t heard anything about it in those intervening 20 years? Because it was HORRIBLE! Because it was so laggy and slow and jarring that nobody could possibly enjoy using it.
            The new wave of modern VR kit can only solve that long-standing problem if it can rely on people having enough raw computing horsepower to drive it. It won’t be any different on any other VR platform, no other PC-based competition is going to claim to have much lower requirements, because it would quite simply be a flat-out lie if they did.

            Ultimately, of course they aren’t hard-and-fast “requirements”… you’re free to attempt to run it on any card that has a direct HDMI output. You’ll have an awful experience though.

            • l33t-g4m3r
            • 4 years ago

            > Oh good lord, for crying out loud, of course ANY graphics card is capable of running VR. No, of course the Oculus doesn’t REQUIRE a 970, and VR in general doesn’t REQUIRE anything more than a smartphone.

            Exactly. That’s ALL I’m saying. I’m not saying it’s IDEAL, I’m saying it’s POSSIBLE. The problem here isn’t facts, it’s elitist snobbery. People who are going PCMR FULL RETARD, and that pisses me off.

            > There is no way you’ll get a satisfactory experience on anything less than a 970-class GPU.

            Now you’re completely discounting last-gen Titans, and things like 680/7xx SLI, and the 780 Ti, which is bull. Those configs can beat a 970, and it’s not like you HAVE to max out the settings. Hell, you probably WON’T be able to max out the settings on a 970 anyway. This is just super arrogant PC elitist snobbery, where you’re missing the forest for the trees.

            VR doesn’t need to hit over 60Hz either. 90Hz is just icing. It’s nice to go higher than 60, but a lot of games aren’t designed for it, and start to have all kinds of weird issues going over 100, while other games are completely locked to 60.

            Also, the REAL ISSUE with VR has NEVER been the video card. It’s ALWAYS been the VR hardware itself. If the VR hardware can keep up, then it really doesn’t matter what your framerate is. Especially if these headsets would support adaptive sync. What’s important is that the VR tracking is in sync with the rest of your hardware, and not behind it. That’s where the motion-sickness-inducing lag is, not your video card. You can run a game at 144 fps and still cause motion sickness if the VR hardware isn’t keeping up with the framerate.

            Claiming that this is ALL dependent on buying a 970 is completely disingenuous, and a bald-faced lie. The VR hardware is more important than the video card, so the 970 requirement is just there to simplify development and create a graphics baseline. It’s not actually necessary.

            • GrimDanfango
            • 4 years ago

            Firstly, last-gen Titans are “970-class”… anything that is of comparable performance to a 970 is definitively in the same class as a 970, and that’s all these recommended specs suggest. They’re not mandating a latest-generation card, just a fast one.

            Secondly, have you actually tried VR yet? You ABSOLUTELY do need more than 60fps. This doesn’t work the same way as a normal monitor. There’s no elitism here talking about 90fps being some sort of “the only right way to play”… hitting less than the refresh rate of the VR device, even for a moment, will cause it to stutter horrendously, and will quite literally make you feel sick, and make any game unplayable in that state. 90fps isn’t snobbery… for VR it is quite literally the bare minimum requirement. It will outright ruin the experience if you dip below it.

            Stop assuming this is people going on about “PC Master Race” horsecrap, and accept that this is a legitimate concern. It’s simply not a case of “VR will be so much better on an expensive rig”… it’s a case of “VR can ONLY function on an expensive rig, and people really should be prepared for that FACT”.

            • l33t-g4m3r
            • 4 years ago

            > hitting less than the refresh rate of the VR device, even for a moment, will cause it to stutter horrendously,

            Which is exactly what happens anytime you use vsync at high refresh rates. Your framerate gets cut in half, so of course you’re going to see it stutter, especially when it’s mounted directly on your face. DUH. SO, SIMPLY LOWER THE REFRESH RATE, then you won’t have that problem. That, or enable adaptive refresh.

            Oh, and yes, this absolutely is PCMR horsecrap. None of you are putting any effort into thinking outside the RDF box and troubleshooting these issues.

            > it’s a case of “VR can ONLY function on an expensive rig,

            Yeah, you’re sure not going PCMR full retard all right.

            > that FACT”.

            The cake is a lie. Both VR and 3D have been around for a long time, and both are viable on current gaming rigs. Framerate is dependent on the game and your settings, while VR is only dependent on itself.
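
For reference, a small editor's sketch of the vsync behavior both posters are arguing about: with traditional vsync, a frame that misses the refresh deadline waits for the next vblank, so delivery quantizes to whole multiples of the refresh interval (an illustration, not Rift-specific code):

```python
import math

# With traditional vsync, a frame that misses the refresh deadline waits
# for the next vblank, so delivery quantizes to multiples of the interval.
def delivered_interval_ms(render_ms: float, refresh_hz: float) -> float:
    period = 1000.0 / refresh_hz
    return math.ceil(render_ms / period) * period

for render_ms in (10.0, 12.0, 25.0):
    out = delivered_interval_ms(render_ms, 90)
    print(f"{render_ms:.0f} ms render -> {out:.2f} ms delivered (~{1000 / out:.0f} fps)")
# 10 ms -> 11.11 ms (~90 fps); 12 ms -> 22.22 ms (~45 fps); 25 ms -> 33.33 ms (~30 fps)
```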

            • GrimDanfango
            • 4 years ago

            Okay, well go right ahead, buy a Rift, or whatever other VR device you care to purchase, and attempt to enjoy it on a mid-range rig. Just don’t go crying foul to the manufacturers of the device when the experience gives you a migraine and makes you feel like throwing up.

            I have a prototype VR device. I can tell you from experience that a game running at less than the native 75hz of the device renders it unplayable. Stutters in VR aren’t just mildly inconvenient dropped frames, they’re an entirely game-breaking issue.

            Even when running at a smooth 75hz, it *still* makes most people feel ill, and that’s because even 75hz isn’t quite enough for a smooth nausea-free experience, hence they’ve cranked it up to 90hz for release. No, you can’t “simply” lower the refresh rate. If you think you can, then you have absolutely no idea how VR functions, and you’re talking out of your ass.

            VR devices are not completely separate from the rendering pipeline and dependent only on themselves; they can ONLY function correctly if your GPU is powerful enough to feed them fast enough. They are absolutely and intrinsically linked to the GPU, and its ability to faultlessly synchronise with them.

            • l33t-g4m3r
            • 4 years ago

            > Stutters in VR aren’t just mildly inconvenient dropped frames, they’re an entirely game-breaking issue.

            I already told you what that was: vsync, and there are ways to fix it. Perhaps you’re one of those more-money-than-brains types that can’t understand simple concepts, so you just throw money at the problem. Me? I change my own brakes on the car. I have no need to pay $150/hr labor at the local dealership because I’m too incompetent to do something myself. Unlike some people, I put forth the effort to understand the underlying concept of how to do things.

            > Even when running at a smooth 75hz, it *still* makes most people feel ill, and that’s because even 75hz isn’t quite enough for a smooth nausea-free experience,

            You have no idea how VR works, do you? The monitor’s refresh rate (and it IS a monitor) is NOT directly correlated to your head tracking refresh rate. THIS is where you get the nausea: when the tracking breaks sync with the video and you get input lag. ALSO, I bet you can get some input lag simply from using vsync, because I can recall back in the day of single-core processors, vsync (at least on ATI) would cause horrendous mouse lag in twitch shooters like Quake 3, which is why I avoided using it. Having ANY kind of input lag will induce vertigo nausea, and you’re introducing an extra factor from the VR hardware. Higher refresh rates do indeed reduce input lag, but they are also harder to hit consistently with vsync. Therefore, lowering the refresh rate while keeping everything else in sync would be simpler. Also, using adaptive refresh would help a lot, meaning vsync’s adaptive mode, not G-Sync. G-Sync would be ideal though.

            > VR devices are not completely separate from the rendering pipeline

            OHOHOHO, YES THEY ARE. You need to read the FreeTrack wiki. The head tracking hardware IS COMPLETELY INDEPENDENT from the rest of the system. NV might have included hardware in the 970 that accelerates head-track processing, but it’s still INDEPENDENT from the rest of the system, and is the BIGGEST lag factor. Head tracking MUST be as low latency as possible, or you’ll get nothing but lag.

            • spaceship
            • 4 years ago

            You are correct in that the key to preventing sim sickness is minimizing the turnaround time between head movement and displaying an updated frame. It does not matter how good your head tracking is if your display is not displaying the updates fast enough. The established minimum rate for displaying updated frames fast enough that sim sickness is kept acceptably low is 90hz.

            You seem to be arguing that HMD developers should lower their standards for reducing sim sickness in their final products, just so that dropping below their established minimum does not drop as far. Clearly they have determined that to not be a good enough user experience. Palmer Luckey expands on the standards that Oculus is shooting for with their consumer product: http://www.engadget.com/2014/05/20/palmer-luckey-svvr/ Notably, they are worried that setting the bar too low will put out a product that will sour people on VR, if they allowed for too much sim sickness, for example.

            You can also read this presentation from a Valve engineer that touches on many aspects of rendering for HMDs: http://alex.vlachos.com/graphics/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf

            • l33t-g4m3r
            • 4 years ago

            FPS != VR lag. It’s a factor, but so is everything else. What I’m saying is that what matters most is reducing the input latency of the VR hardware itself, not your actual framerate.

            If the VR input latency is imperceptible, then it doesn’t matter if your refresh rate is 60 or 90. That just helps smooth out motion, which of course is a big plus. FPS isn’t what causes the lag though; that’s all on your peripheral hardware.

            • spaceship
            • 4 years ago

            Registering input and updating game state has zero impact on the user experience until the updated game state is rendered and displayed on the device. Input latency can be actual zero and outpace display updates by half a dozen orders of magnitude, and it will amount to nothing if it is bottlenecked by the display’s refresh rate. The minimum rate at which display updates are required in order to update frames with new input quickly enough to avoid sim sickness has been repeatedly found to be right around 90hz. That is the minimum for the display refresh rate to not drag down the entire experience. No amount of perfection elsewhere can make up for that.

            If you really believe that you have some understanding of VR that has been overlooked by literally every single person actually working on VR today, the engineer that put together the presentation I linked you included his email at the end. I’m sure that Valve and HTC would love to hear about why they don’t actually have to shoot for 90fps, seeing as how a major hurdle with their product right now is the bandwidth required for that much pixel data.

            But really, it would clarify so many things for you if you took a look at that presentation. 90hz refresh rate means 11.11ms per frame. Positional updates from predictive algorithms are queried 2ms before a frame begins in order to get a result at exactly the right time. 3 frames to go from a positional check to a display update – 11.11ms to process game state, 11.11ms to render the new game state, 11.11ms to transfer the frame to the display, then the display illuminates for 2ms. 33.33ms to go from a position update to a new frame displayed on the device. Positional updates with the Vive, which is still a dev kit, are sub-20ms. Without considering the accuracy of their positional prediction, that’s ~50ms turnaround from the headset moving to a frame reflecting the movement, with the bulk of that being limited by framerate even at 90fps.

            It also goes over image quality requirements and why a lot of things that look fine on a desktop monitor will look like garbage in VR.

            edit: Looking back over how Vive does tracking, 20ms is not a realistic number for tracking latency. 6-7ms sounds like a reasonable guess.
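
A minimal sketch of the motion-to-photon budget spaceship lays out above (an editor's illustration; the tracking figure is the 6-7ms guess from the edit, and the pipeline stages follow the numbers quoted from the Valve presentation):

```python
# Motion-to-photon budget at 90Hz, per the three-frame pipeline described above.
REFRESH_HZ = 90
frame_ms = 1000.0 / REFRESH_HZ        # ~11.11 ms per displayed frame

tracking_ms = 6.5                     # guessed tracking latency (see the edit above)
pipeline_ms = 3 * frame_ms            # game state + render + transfer to display
illumination_ms = 2.0                 # low-persistence display pulse

total = tracking_ms + pipeline_ms + illumination_ms
print(f"frame interval: {frame_ms:.2f} ms")
print(f"motion-to-photon: ~{total:.0f} ms")   # ~42 ms with these assumptions
```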

            • GrimDanfango
            • 4 years ago

            Head tracking refresh rate is specifically synchronised to the “monitor” refresh rate on the Rift. There’s actually a dedicated cable for that single task! It’s called the sync-cable! One end plugs into the tracker, the other plugs into the Rift.

            Even if for some reason it wasn’t… take it to the logical extreme: if you had a head-tracker that operated at 1000hz, what good would it do you if that positional data is being passed to a game that is only managing to render, say, 40 frames per second? The Rift is still only going to show those 40 frames in a second, and each of those frames is going to take a bare minimum of 25ms to render, plus the rest of the overhead. In the meantime, a totally unconnected 1000hz tracker has sent it another 25 positional updates, but the best it can do is throw those updates away until it finishes rendering the current frame, and request the latest position just before starting the next.

            Why wouldn’t they directly link the display refresh, tracking refresh and GPU rendering, when the only useful situation comes when they’re all in direct synchronisation with each other? The Rift *could* just be a couple of disconnected peripherals attached to a system that is unaware of them, but they tried that, and it sucked… the Rift as it stands now is *not* just a monitor. They’ve spent the last couple of years directly working with GPU vendors to integrate everything as much as possible, because getting it to work right *requires* direct cooperation between the screen, the tracker and the GPU.
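
Here is that thought experiment as arithmetic (an editor's sketch):

```python
# A 1000Hz tracker feeding a renderer that only manages 40 fps:
tracker_hz, render_fps = 1000, 40
frame_ms = 1000.0 / render_fps             # 25 ms spent rendering each frame
updates_per_frame = tracker_hz // render_fps
print(f"{frame_ms:.0f} ms/frame; {updates_per_frame} tracker updates arrive per frame,")
print("but only the one sampled just before the next frame starts is ever used.")
```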

            There are various sources of lag throughout the pipeline, but that’s specifically what they’ve been working to iron out for the last few years. The rift does not suffer from needless input-processing induced lag the way monitors do – that was pretty much the first thing they threw out designing the thing, and on top of that, they’re working with the GPU vendors to include VR-specific v-sync modes that don’t pre-buffer frames, and thus introduce needless 1, 2, 3-frame latency on top of the base rendering latency.
            The lag from positional-tracking-input-to-rendering, and from finished-render-to-display is about as thin as they can make it… the one part of the latency they can’t iron out at all is the latency required to wait for a frame to actually render. The only way you can get around that is to throw a hugely powerful GPU at the problem.

            You seem to have made the very flawed assumption that the problems and solutions for peripheral head-tracking devices to relate viewpoint information for display on a static monitor ala FreeTrack are the exact same problems and solutions facing VR.
            There’s a fundamental difference. VR has a screen attached to your face, a monitor sits on your desk and doesn’t move. Disconnect between the motion that your brain expects to see and what actually gets shown is massively amplified on a VR device.

            Yes, I will grant that you could specifically set a lower refresh rate and have everything sync to that lower refresh rate, but that comes into direct conflict with the requirement for low-persistence display. Without it, VR turns into a horrendous, disconnecting, blurry mess every time you turn your head… with it, it strobes the display. Think back to the days of CRTs… what it was like looking at a CRT set to the default 60hz. It would give most people headaches or eyestrain or both. It was pretty commonly accepted back then that ~75hz was the bare minimum you could refresh a screen like that and not mess up your head looking at it, 85hz was more comfortable.
            If anything, it’s even more significant with the rift, as low-persistence requires as short a pulse length as possible. Also, well, imagine spending any amount of time with that old 60hz CRT stuck an inch from your eyeballs and covering your entire field of view!
            75hz is the bare minimum you can get away with in low-persistence mode, and with it switched off, the lower you take the refresh rate the worse the blurring will be… and it would be unplayably bad even at 90… hence the need for it.

            So… I say again, from someone who actually HAS this equipment, has read up extensively about it, and has extensive personal experience. 75hz is the absolute bare minimum for a passable VR experience. Not even an especially good one!

            You can keep raging on people convinced that this is snobbery and elitism, but when it comes down to it, if and when you try it for yourself, you will find yourself surprised by just how poor the effect can be if you don’t throw some serious hardware behind it.

            But if your brake-changing expertise has truly given you insight into these issues that Oculus’ hardware and software teams haven’t yet managed to find any better solution to, you should probably get in touch with them… there would likely be a very high-paying job waiting for you!

            • l33t-g4m3r
            • 4 years ago

            Well, this is by far the most sensible post you have made, for the most part.

            With strobing, yes, 75Hz is about the minimum you want to have set. That still doesn’t mean you have to use full vsync though. You can still enable adaptive vsync, which would remove stuttering but allow tearing. That would be more acceptable than having stutter, IMO. Personally, I’d rather have the option to set 75Hz instead of 90, specifically because of vsync stutter. 90Hz would be a bit overkill, and hard to hit even with a 970 in next-gen titles. Unless you turn down settings, which is a possibility.

            Either way, refresh rate doesn’t cause “lag”. Why? Because you can SEE input lag. If you can see the lag, then your video card is outpacing the VR hardware/input, and the two are NOT being synced properly. This sounds more like a vsync lag issue, where enabling triple buffering may or may not help. I have experienced this issue before, where aiming with the mouse was several milliseconds behind the screen refresh, and it was completely disorienting. It’s not the video card or your refresh rate; it’s an issue with vsync, and maybe even the general performance of the rest of your computer, which you can troubleshoot.

            These problems btw, are PRECISELY why gaming monitors have moved towards using Gsync/Freesync. Strobing, even at 144hz doesn’t fix Vsync issues. The only way to really fix Vsync is to just get rid of it. VR would probably be better off supporting adaptive sync instead of Vsync+strobing. If so, the hardware requirements wouldn’t be nearly as bad.

            I probably won’t be buying a Rift now, and will instead go with a Gsync monitor. VR can wait until they make something that supports adaptive refresh. Not supporting it appears to be causing more problems than it’s worth.

            • sparkman
            • 4 years ago

            > VR has been on PC since DOS

            And it has always been awful, until now.

            • Meadows
            • 4 years ago

            Frame rate can be variable.

            • Meadows
            • 4 years ago

            It’s not required, you dummy. It’s recommended.

            • l33t-g4m3r
            • 4 years ago

            Which was my point.

            • Meadows
            • 4 years ago

            No it wasn’t, duh. You repeatedly mentioned a perceived requirement, in all caps in at least one case. There’s no such thing here.

      • aspect
      • 4 years ago

      No, it’s about right. Most modern games get about 70-90 fps at 1080p on close-to-max settings on the 970. The Oculus takes a 30-50% fps hit, and on top of that it needs to maintain a constant 75+ fps. The 970 or better is the card you need if you don’t want to play on medium or lower graphics.
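
aspect's arithmetic, written out (an editor's sketch of the numbers in the comment above):

```python
# Typical 970 framerates at 1080p, minus the quoted VR hit, against the 75 fps floor.
FLOOR = 75
for base_fps in (70, 90):
    for hit in (0.30, 0.50):
        eff = base_fps * (1 - hit)
        verdict = "meets floor" if eff >= FLOOR else "below floor"
        print(f"{base_fps} fps with a {hit:.0%} hit -> {eff:.0f} fps ({verdict})")
# Every combination lands below 75 fps, hence the push to turn settings
# down or buy a faster card.
```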

      • GrimDanfango
      • 4 years ago

      There are three issues with the Rift / VR in general that mean these specs are entirely reasonable, and in fact probably even optimistic.

      Even on a 2160×1200 screen, you don’t render at that resolution – the offscreen render target is quite a bit larger; the wider the FOV of the headset, the larger it needs to be, but it’s at least 25-30% bigger. That’s so that when the lens shader is applied, there is enough source image to warp into the curved image needed to fill the screen.

      You need to maintain an ABSOLUTE minimum of 90 fps… (75 on the DK2), otherwise the VR effect just falls apart as it starts juddering as you move your head. 90 fps is an immense *minimum* frame rate to ensure in almost any game. We’re not talking average here… in most typical games, you’d need to be running an average well over 120 fps to ensure a guaranteed minimum of 90.

      You’re rendering two different viewpoints per frame. It doesn’t double render time, but it certainly increases it.

      So… could your GTX 970 drive two roughly-1500×1500 resolution monitors simultaneously, and never drop below 90 fps while doing so?
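
Putting numbers on that question (an editor's sketch, using the "roughly 1500×1500 per eye" figure above):

```python
# Two oversized per-eye render targets at the 90 fps floor, vs. a 1080p60 monitor.
eye_w = eye_h = 1500                  # ~1080x1200 per eye, inflated ~25-30% for the lens warp
vr_rate = 2 * eye_w * eye_h * 90      # both eyes, never dropping below 90 fps
desktop_rate = 1920 * 1080 * 60

print(f"VR target: {vr_rate / 1e6:.0f} Mpix/s")         # ~405 Mpix/s
print(f"1080p60:   {desktop_rate / 1e6:.0f} Mpix/s")    # ~124 Mpix/s
print(f"ratio: ~{vr_rate / desktop_rate:.1f}x")         # ~3.3x
```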

      Really, to drive the likes of most modern games at anything approaching high settings, you’re going to need a GTX 980 per-eye, bare minimum. My single-GTX-980 can juuust about push 75fps minimum to my 1920×1080 DK2 in most compatible games/demos running at medium-to-high settings. …just.

      I think people are really going to be utterly shocked at how much raw power VR is going to require to work well, especially as they move towards wider FOVs and 4k screens.
      It’s the one thing I’m actually concerned could spell disaster for the whole VR movement… it’s going to be a sub-standard experience on anything less than a $3000+ rig, and so a lot of people are going to simply write it off as a sub-standard experience.

      I’m honestly not really sure how Sony intends to drive Morpheus using only the wimpy capabilities of a next-gen console… it’s so far short of the power required to do it well, I just can’t see it working in a way that will convince anyone it’s worth bothering. Most devs can’t manage to push games past a 30fps cap… how the hell are they going to manage a 90+ minimum?

        • l33t-g4m3r
        • 4 years ago

        > My single-GTX-980 can juuust about push 75fps minimum to my 1920x1080 DK2 in most compatible games/demos running at medium-to-high settings. ...just.

        What in the pigs flying HELL are you playing, and at WHAT settings?! 3D Vision and 144Hz monitors were around before the 970 and the Rift, and people gamed perfectly fine on them. I have a 3D monitor myself, so I know quite well people are full of it when I see these claims. You’re forgetting to mention that you’re playing some next-gen DX11 title with the settings maxed, because older titles ran perfectly fine in 3D.

        If you really need that kind of hardware for VR, then that’s on whatever game you’re playing. It’s not a VR requirement, it’s a *game requirement*.

      • Meadows
      • 4 years ago

      I call bull on your speculation.

      • sparkman
      • 4 years ago

      You don’t understand VR. If your VR helmet hits a minuscule rendering glitch even once per second where it drops below 75 fps, it can ruin the experience.

      The high GPU requirements are a result of the helmet needing to guarantee real-time quality.

      • Ryszard
      • 4 years ago

      If 2160×1200 is low resolution as far as you’re concerned, I’d like a go on your time machine.

    • jjj
    • 4 years ago

    2160×1200 at 90Hz is lower than expected. TR should do some benchmarks quickly at that res; they should kinda get 60FPS and up, for the most part, on a 970.
    The claim that “these system recommendations will remain constant over the Rift’s lifetime” would be really stupid, and the statement is rather misleading.
    It would be idiotic to limit games like on console, and I really hope developers don’t do something that crazy and ruin PC gaming for the sake of Oculus. Then, lifetime is not exactly defined in any way; it can be 1 year or 10. They really shouldn’t make those claims or try to enforce them. PC gamers know how things work.
    Dual USB means they are shipping with an external sensor :(.
    We’ll see how it goes, but I would rather see a lot more hardware from different people. Not trusting Oculus to do things right anymore.

      • MEATLOAF2
      • 4 years ago

      Keeping the requirements the same is a good thing, it means you won’t have a moving target when building a system for VR.

      Also I’m pretty sure the requirements don’t have anything to do with the games specifically, it’s more to do with making sure you have enough throughput to run at high frames at greater than 1080p resolution, and keeping minimum frame rate nice and high. There is no reason to hold games back, just because a game has an ultra setting doesn’t mean you are compelled to use it.

        • jjj
        • 4 years ago

        Look, Oculus is just a display, that’s all. Would you want all games to get stuck for a decade at 1080p and the same GPU needs if it was on a monitor? Or even 6 months? No, because we are not console folks; in PC land things move forward, software and hardware, every day. A freeze on GPU demands means that new games would evolve very little, and that would be a horrible, horrible thing to do to PC gaming.
        You also don’t want Oculus to be the only hardware provider and all the games to be for it. There is no reason not to have this kind of hardware (2160×1200) from 100 different companies at $100-200, 1080p headsets as low as $50, or 4K at $350.

        Edit: Have i mentioned horrible? Any PC gamer knows that games get more demanding at the same res and the GPU will get outdated fast, it’s not a difficult concept.

          • l33t-g4m3r
          • 4 years ago

          New games ARE going to evolve very little, because they’re mostly console ports. Also, nobody wants to upgrade a high end video card every 6mo, because that’s just asinine and a waste of money.

          The 970 should be more than enough to keep up with ports, and that’s the point. You will still be able to buy high end cards to enable ultra detail modes, so don’t be all retarded about it. It isn’t really an issue, but you’re making it one.

            • jjj
            • 4 years ago

            Your unjustified use of inappropriate language fully discredits you (and your parents).
            So good luck with that.

    • cmrcmk
    • 4 years ago

    These requirements don’t make sense to me. Aren’t CPU and GPU requirements a function of the scene complexity? Without knowing the software in use and thus the amount of data being worked on, this seems fairly arbitrary.

    Maybe I’m missing something?

      • cobalt
      • 4 years ago

      You’re thinking about it in the wrong direction — it’s the target that developers tune their scene complexity (etc.) against. That way consumers know that if they buy these specs, they will get good performance.

    • HisDivineOrder
    • 4 years ago

    Those hardware requirements are pathetic, given the resolution and VR essence we’re talking about.

    I’m actually kinda disappointed by what that’ll provide. That better be the barebones REQUIRED-ments that lead to a bad experience.

      • Concupiscence
      • 4 years ago

      You understand that by advocating the restriction of the Rift’s hardware support list to items currently on a short list of parts reserved for the girthiest of e-peens, you’ve just isolated Oculus’ customer base to a few thousand people. I’m sure the members of that pittance of humanity would each be willing to pay as much for a VR headset as most people would cough up for a down payment on a car or a house, because that’s how economics works, right?

    • Krogoth
    • 4 years ago

    This pretty much makes the Oculus little more than an interesting peripheral that caters towards a niche.

    If Oculus wanted to get their tech to grow outside said niche, they need a killer app and more importantly make it “playable” under mainstream-tier hardware.

      • MEATLOAF2
      • 4 years ago

      If they maintain the requirements, in a few years you will probably have that kind of performance in mainstream hardware, I think that’s the plan.

        • Krogoth
        • 4 years ago

        That’s difficult to say when the cost/benefits of die-shrink silicon are diminishing while GPUs continue to escalate in size/R&D costs.

          • MEATLOAF2
          • 4 years ago

          I’m sure it’s usable on lower end hardware, as long as it has enough pixel pushing power and you turn in game settings down. I have no doubt it will be cheaper to match the requirements in the future, but just how cheap is anyone’s guess.

        • wierdo
        • 4 years ago

        Nicely timed with TSMC and gang finally coming out with major process shrinks as well, I wouldn’t be too surprised if these requirements were close to mainstream the following year (2017).

      • jjj
      • 4 years ago

      It’s better than expected. With 2160×1200 it seems they are targeting 60FPS, but expectations were 2560×1440 and 90FPS, and that would have been a lot more demanding.
      We’ll see how it goes with 2160×1200; min FPS might be rather low in some games, and likely the ideal experience requires more GPU power.
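
For scale, comparing the announced panel against the expected one (editor's sketch, both at 90Hz):

```python
announced = 2160 * 1200 * 90     # announced Rift panel
expected = 2560 * 1440 * 90      # the rumored 1440p panel
print(f"expected/announced pixel rate: ~{expected / announced:.2f}x")   # ~1.42x
```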

        • RdVi
        • 4 years ago

        I wouldn’t worry too much about the res. The DK2 was PenTile; this is more than likely RGB stripe, like Morpheus and (probably) the HTC Vive, so the difference in eliminating the screen-door effect will be more than the numbers alone suggest.

        I think pentile is great for tablets and phones btw, but anything where the fillrate is the major limitation and frame rates have to be high needs pixel pitch to be as high as possible for the given resolution so as not to need to over-inflate the resolution just to compensate for a low pixel pitch.

      • derFunkenstein
      • 4 years ago

      Was it ever anything other than an “interesting peripheral that caters towards [sic] a niche”?

      • Tumbleweed
      • 4 years ago

      I think the whole “wear this gigantic thing on your face while you game” was the first clue that Oculus was something that would cater to a niche.

      • Billstevens
      • 4 years ago

      I think that is probably one reason they got pushed so hard to do Gear VR.

      Anyone who has looked at the tech knew these would be the requirements from the DK1 days. Yeah, it’s gonna be niche for this whole generation.

      But that’s probably a good thing because content has a long way to go before these devices are friendly enough for a broader audience.

    • tootercomputer
    • 4 years ago

    FYI. That first hot link “in the first quarter” says Page Not Found.

      • Jeff Kampman
      • 4 years ago

      Fixed, thanks.

    • Bensam123
    • 4 years ago

    “Oculus says these system recommendations will remain constant over the Rift’s lifetime, so that developers can target a single reference platform and consumers can take advantage of decreasing component costs over time.”

    Yay, consolitis on the PC with bleeding edge gear…

      • Meadows
      • 4 years ago

      Why? What does this fact of fixed recommendations matter?

        • Bensam123
        • 4 years ago

        640k of memory should be enough for anyone.

          • Meadows
          • 4 years ago

          It does not appear like you even understand the topic at hand.

            • Bensam123
            • 4 years ago

            Or I do and there will be a time at which better VR requires more horsepower.

            • Meadows
            • 4 years ago

            Then you buy a stronger GPU for that particular title and be done with it.

            • Bensam123
            • 4 years ago

            Then it kinda makes a baseline pointless. And that’s a game requirement, not a hardware requirement for the Rift.

      • homerdog
      • 4 years ago

      This is probably a good thing. VR will never catch on if it always requires a $300+ 160+ Watt GPU.

        • Bensam123
        • 4 years ago

        The limits are currently set artificially high, not representative of what the device currently needs, so future devices have headroom. But that doesn’t take into account where we will be when we exceed those expectations.

      • RdVi
      • 4 years ago

      Maybe they will release an “elite” version of the rift in a year or two for enthusiasts while the current model is revised and refined to become cheaper and more accessible without a true spec bump.

      The elite version could boast a 50-100% resolution bump. Let’s be honest, this is what a lot of people expected to happen, and if it did, the result would be similar to the current situation anyhow; with the fidelity and complexity of the games remaining stagnant because of the ever increasing VR requirements (resolution, FOV, refresh rate).

      • kamikaziechameleon
      • 4 years ago

      Given the fixed nature of the Oculus hardware it shouldn’t be much of an issue. I mean, this is gen 1, so they’re going to iron out a lot of stuff, and locking down the requirements is just smart. They’ll probably open that up with different SKUs and such in future gens.

        • Bensam123
        • 4 years ago

        What happens when they need faster hardware to make a more in depth simulation? Say double 4k screens.

    • Concupiscence
    • 4 years ago

    There’s no surprise about Mac users being asked to wait; outside of the Mac Pro there just isn’t hardware that can cope with those recommended requirements. It’s an entire ecosystem of functional but performance-deficient machines, and most of them function more as lifestyle appliances than number crunchers, let alone gaming boxes…

    Linux not getting support quickly isn’t surprising either, sadly. Outside of the Nvidia binary drivers the driver ecosystem’s just not reliably up to the job even if hardware support ostensibly exists.

    Not that I really care about VR. I’m just contemplating the state of things.

      • ImSpartacus
      • 4 years ago

      I think the high end iMacs have pretty potent gpus (for an equally pretty penny), but I understand where you’re going. The Mac Pro is the only performance-minded desktop in Apple’s stable.

        • Concupiscence
        • 4 years ago

        I admit that I’d forgotten about those. But looking at the specs all but the highest-end models are still rocking Intel GPUs or Kepler-era Nvidia mobile parts with a gig of discrete memory. It’s enough to push HD++ resolutions at the desktop, but asking new games to run with that kind of horsepower seems a little cruel.

        • derFunkenstein
        • 4 years ago

        And even then, the base Mac Pro ships with dual Pitcairn. If Crossfire existed for the Mac (it doesn’t), then maybe the dual GPUs would be enough.

        http://architosh.com/2013/10/the-mac-pro-so-whats-a-d300-d500-and-d700-anyway-we-have-answers/

          • the
          • 4 years ago

          True, CrossFire doesn’t exist on the Mac side, but the Mac drivers do enable developers to make use of multiple GPUs simultaneously. This is similar to how Mantle is used to do some multi-GPU scaling. And like Mantle, it requires special coding from developers.

          • d0g_p00p
          • 4 years ago

            Wait, what? Why is there a pair of AMD GPUs in the Mac Pros if they cannot run Crossfire? I mean, what’s the point, is it only for more displays or something?

            • derFunkenstein
            • 4 years ago

            They don’t need Crossfire to spread compute load. OS X since Snow Leopard has had a pretty efficient OpenCL scheduler that will spread the load among qualified candidates (each CPU core; each GPU; etc.). You won’t be gaming with multiple GPUs, but if you’re doing heavy video renders, all of the GPU horsepower is available.

      • wimpishsundew
      • 4 years ago

      I don’t understand why this even needs to be mentioned. Everyone knows Apple computers are not meant for gaming (except some light games). Apple hardly even tries to incentivize developers to port games over. It’s no question that VR is the least of their concerns right now. If Apple decides to go VR, it will be when they can implement it as a common feature on their machines rather than a high-end solution. Their Mac Pros were designed without gaming in mind at all. Else, the default card wouldn’t be a FirePro board.

      You won’t see anything for Linux until after developers get it figured out on Windows. Then hope some fellow netizens will port it over to Linux.

        • Concupiscence
        • 4 years ago

          It needs to be mentioned because some Mac users still get excited at any prospect of playing games on Macs. “I paid a lot for this thing, and I want to play video games too” is a point of view that I find a little sympathetic. But the hardware hasn’t got the grunt - Apple’s prioritized aesthetics and a specific vision of computing to the point that it’s not even possible to beg, borrow, cheat, or steal a solution that can make 90+% of Macs *capable* of gaming. Hell, OS X is still stuck on OpenGL 4.1, and before that it was on GL 3.3 for *years*. I own a Macbook Air and it’s a lovely little thing, but it is not going to run GTA V at the level a sane human would find acceptable. The Mac Pros have embraced Quadros or FireGLs over consumer-level parts for a long time now - it’s right in the name.

          I’d expect Valve’s solution to get Linux support quickly, so that’s not something I’d worry about too much in the short to medium term.

          • wimpishsundew
          • 4 years ago

          I don’t know what you’re talking about, but it seems like just your biased assumptions. As far as I know, the Macbook is very functional and its aesthetic is very conservatively simple. Out of all the HP, Dell, Lenovo, IBM, and Apple laptops I’ve used over the last 15 years, Macbook Pros made the most sense, followed by IBM. It has no extra frills, the best track pad I’ve ever used by far, a fantastic screen, compact form, long battery life, and a highly functional keyboard w/ backlight. It’s only now that other laptops have started catching up in function, form, and quality. About the only things I can ever complain about in a Macbook design are the Apple logo, OS X, and the old Mac Pro’s heavy weight.

          The best laptop I’ve ever used is a Haswell Macbook Pro 15 running Windows. And I still use it today.

          Your complaints are weird. An MB Air can’t play GTA V? wtf, who buys an ultra-thin computer made for portability and expects serious 3D gaming? That’s just stupid, and there’s no Windows-based version of any brand that can do it. You might as well complain that a PS4 doesn’t run MS Office and lacks a keyboard out of the box.

          Mac Pros are for professionals and not for gamers. They don’t even try to say it’s for gaming. All their marketing is about running professional apps.

          And honestly, as far as hardware goes, the average Macbook is much more capable of gaming than the average Windows laptop. Yes, it also costs more, but we already know that. Even with that capability, Apple still isn’t trying to sell you on their computers being designed to run 3D games, even though they can do most games fine.

          Where they do market their products for gaming are their mobile products. Ipads and iPhones.

          If your complaint is that it’s expensive, then I 100% agree with you. An iPod for $300? Yeah, that’s crazy no matter how you look at it, and that’s one of their cheapest products. But as far as how it functions and performs as Apple stated, it’s one of the most refined products overall you can find.

            • Concupiscence
            • 4 years ago

            Good Lord, you’re really going to jump all over me because I assert that Macs aren’t built to be die-hard gaming machines? Hell, a buddy of mine uses a MacBook Pro for his development work, but bought a little dedicated i7 box to run distcc so his compile times would be reasonable. They’re fine machines for doing a lot of work, and many tasks requiring computational oomph can be offloaded if you’re clever, but for God’s sake: *my point that they’re not a serious gaming platform stands.*

            This is not some unique observation. Professional developers I talk with bemoan the fact on a pretty regular basis, and loathe telling Mac users that as much as they enjoy the platforms they are *not* a basis for great game-playing experiences in 2015. I mentioned the Air because, with the possible exception of the GPU, they aren’t very different from most of the non-Pro line in terms of overall performance expectations.

            I’m not going to feed this line of conversation any more. I have the feeling more umbrage will be taken because I’m saying (dun dun dun) that Macs don’t have the stones to play games well. It’s frankly no secret that Apple’s priorities lie elsewhere: the products are an effective marriage of general capability and aesthetics, but they aren’t built for die-hard gaming, and at this point in time a product like the Oculus Rift would probably only work properly on a Hackintosh. That’s just life.

            • wimpishsundew
            • 4 years ago

            > my point that they’re not a serious gaming platform stands.

            Actually, that’s what I said. Then you responded with how your Macbook Air doesn’t play GTA V. You don’t see the flaw in your argument? You might as well say your Prius did poorly on a racetrack.

            And my point is, why on earth’s green grass would you ever buy into a platform that clearly does not cater to games? Most titles don’t make it on there, and the ones that do usually come 6 months to 3+ years after their launch for Windows. This is not about defending Apple or any platform; it’s about your bad choices and illogical complaints.

            I don’t understand your argument about their prioritizing aesthetics over function. Besides the logo, everything was about function. The only reason it looked good is because of its simplicity and aluminum exterior. Other laptops are plagued with weird curves, shapes, angles, and extra buttons for stupid proprietary software that slows down your computer.

            I know most people here hate Apple and will automatically vote me down. But at least I support what I’m saying. Your complaints are more a product of your lack of understanding. And no, I don’t care if you say Macs don’t have the stones for gaming. Why would I care? Apple doesn’t pay me. All I’m saying is your reasoning is retarded. Do you buy a luxury sedan and complain that it doesn’t tow like a truck? But it costs like it does, right?

            • _ppi
            • 4 years ago

            Well, the question is rather what Macs have available GPU with performance in line with 290/970 or better.

            And the answer is: only the Mac Pro. And nobody is going to play games on a professional workstation.

            Of course, only very limited number of PCs in the world have the required GPU horsepower. But you can at least get it, and this is what people interested in gaming do. And so it makes no sense for Oculus Rift to be available for Macs.

            Macs will surely get the support in time, when GPUs with sufficient performance trickle to iMac/Macbook Pro lines.

            Otherwise, I would disagree that “as far as hardware goes, the average Macbook is much more capable of gaming than the average Windows laptop.” Apple has access to the exact same Intel, AMD, and Nvidia chips as anybody else. So it is more about which model you pick. Intel’s graphics are inadequate for almost anything anyway.

            • the
            • 4 years ago

            A slight correction: Mac Pros made *before* December 2013. The newer Mac Pros top out at dual Radeon 7970/R9 280X GPUs running at lower clock speeds. Having two GPUs might be helpful for VR (one per eye), but they’re way behind in single-card GPU tech. A single Titan X, which can go into a 2010 Mac Pro, would be faster.

          • Billstevens
          • 4 years ago

          Even with the right hardware, Apple has never really had any kind of game-friendly OS. Not because it couldn’t handle it, but because there hasn’t been enough market demand for triple-A games on it to force developers to make their games work on OS X as well as Windows.

          That is how it’s mostly always been on the PC side of things. Funny that it’s the exact opposite on mobile devices.

      • the
      • 4 years ago

      Or you could be a Mac user like me with a 2010 that can actually be upgraded to a GTX 970.

        • Concupiscence
        • 4 years ago

        How well does it work in OS X? I honestly haven’t heard of anyone doing it, but I’m glad it’s possible.

      • Vrock
      • 4 years ago

      Mac Users? What’s a Mac User?

    • brucethemoose
    • 4 years ago

    Will the Rift support any adaptive sync methods?

    For a device that’s so sensitive to input latency (and, god forbid, screen tearing in different eyes), an adaptive refresh rate seems essential.

      • orik
      • 4 years ago

      Adaptive sync is fundamentally incompatible with backlight strobing currently, and backlight strobing is critical for reducing sim sickness.

        • brucethemoose
        • 4 years ago

        The consumer Rift doesn’t even have a backlight, right?

        I suppose you could strobe an OLED, but its pixel response times are way faster than an LCD’s.

      • DPete27
      • 4 years ago

      I was just going to ask the same question!

    • K-L-Waster
    • 4 years ago

    As an aside – does anyone know if there are mitigation strategies for far-sighted users? i.e. if you need reading glasses, isn’t a VR headset basically a recipe for blurriness + eyestrain hell?

      • Concupiscence
      • 4 years ago

      Beyond contact lenses, I dunno. I’ve been rockin’ corrective eyewear since I was nine months old and know exactly what you mean.

      • just brew it!
      • 4 years ago

      I’ve not really paid much attention to the Oculus Rift to date, but I used to deal with AR (transparent) head-mounted displays a lot at my former day job. The trick is to design the optics such that the image is focused at infinity. So you’re not trying to focus your eyes on the screen (which most people wouldn’t be able to do anyway unless they were horrendously near-sighted); you really do focus your eyes “out there”, *past* the display.

      Actually, the reverse problem is more likely. I’m very near-sighted. The AR displays I worked with were all very out of focus to me unless I wore my glasses, and unfortunately some of the displays sat close enough to my eyes that wearing glasses was problematic. Contact lenses would’ve made things a lot easier…

      Far-sighted users should be fine. But near-sighted users are going to be SOL if their glasses don’t fit inside the headset and they don’t/can’t wear contacts.

    • Meadows
    • 4 years ago

    A few months ago you called the GTX 980 “mid-range”, now the GTX 970 is suddenly “powerful”.

    I’ll say.

    Edit: not “you” specifically, Jeff.

      • ImSpartacus
      • 4 years ago

      Yeah, I’d call a 970 a performance-mid-range card. It’s about as high as you can reasonably go before the perf/$ just dies.

      These requirements don’t seem unreasonable at all. It’s a high-res, high-frame-rate setup. Why is it so surprising that you can’t use a <$200 GPU?

      Gaming at 1080p60 is relatively cheap now, but that’s not true when you step up to more pixels more often.

        • Terra_Nocuus
        • 4 years ago

        Not to mention that the Oculus is a dual-screen setup. The v2 Maxwell cards (apparently) have some VR special sauce, but with the GM206 (GTX 960) having only half the ROPs / shaders / etc. of a GM204, it may not be up to snuff.

      • Ninjitsu
      • 4 years ago

      The GPU is a mid-range GPU from the Nvidia engineering team’s point of view, and from a cost perspective. The reason GM204 and GK104 ended up with GTX x80 numbers is AMD not being competitive enough in terms of performance for a given die size.

      Otherwise the GPU would have been in a x60 Ti card. So yes, it’s mid-range, but it’s also powerful – same way an i5-4690K can stand toe-to-toe with an i7-980.

        • Meadows
        • 4 years ago

        I disagree with that use of the word.
