Tobii makes a compelling case for more natural and immersive VR with eye tracking

We've heard murmurs about the benefits of eye tracking in VR headsets for quite some time now, but even with the number of press days and trade shows we attend over the course of a year, I'd never had the opportunity to give the tech a spin. That changed at CES this year, where Tobii, probably the leading company in eye-tracking technology, invited us in for a private showing of its most recent round of VR eye-tracking hardware. The company had a prototype HTC Vive headset on hand with its eye trackers baked in for me to kick the tires on, and I came away convinced that eye tracking is an essential technology for the best VR experiences.

Tobii's prototype HTC Vive

Tobii's demo took us through a few potential uses of eye tracking in VR. The most immediate benefit came in setting interpupillary distance (IPD), an essential step in achieving the sharpest and clearest images with a VR headset. With today's headsets, one might need to make a best guess at the correct IPD using an error-prone reference image, but the Tobii tech gave me immediate, empirical feedback when I achieved the correct setting.
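
To illustrate why this works so well: an eye tracker reports the position of each pupil, so the correct IPD can be measured directly rather than guessed. Here's a minimal sketch of that feedback loop; the coordinate values and tracker output format are hypothetical, not Tobii's actual API.

```python
import math

def measure_ipd_mm(left_pupil, right_pupil):
    """Estimate interpupillary distance from tracked 3D pupil positions
    (hypothetical tracker output, in millimeters, headset coordinate space)."""
    return math.dist(left_pupil, right_pupil)

# Hypothetical pupil samples from the tracker.
ipd = measure_ipd_mm((-31.5, 0.4, 12.0), (31.7, 0.2, 11.8))
lens_separation = 60.0  # current mechanical IPD setting of the headset, mm

# Empirical feedback instead of a best guess from a reference image.
print(f"Measured IPD: {ipd:.1f} mm")
if abs(ipd - lens_separation) < 0.5:
    print("IPD dialed in")
else:
    print("Widen lenses" if ipd > lens_separation else "Narrow lenses")
```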

Next, the demo pulled up a virtual mirror that let me see how the eyes of my avatar moved in response to eye-tracking input. While this avatar wasn't particularly detailed, it was clear that the eye-tracking sensors inside the headset could translate where I was looking into virtual space with impressive precision and low latency.

I was then transported to a courtyard-like environment where a pair of robots could tell when I was and wasn't looking at them, causing one to pop up a speech bubble when I made eye contact. That cute and rather binary demo hints at a future where VR avatars could make eye contact with one another, a huge part of natural interaction in the real world that's missing from most human-to-human (or human-to-robot) contact in VR today.
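
The check driving that speech bubble is conceptually simple: compare the tracked gaze ray against the direction to the character and declare eye contact when the angle between them is small. A hedged sketch of that idea follows; the cone threshold and scene values are illustrative, not Tobii's implementation.

```python
import math

def is_eye_contact(gaze_origin, gaze_dir, target_pos, max_angle_deg=5.0):
    """True if the (unit-length) gaze ray points at the target within a small cone."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    norm = math.sqrt(sum(c * c for c in to_target))
    to_target = [c / norm for c in to_target]
    cos_angle = sum(g * t for g, t in zip(gaze_dir, to_target))
    return cos_angle >= math.cos(math.radians(max_angle_deg))

# Robot at (0.1, 1.6, 4.0); wearer's eye at (0, 1.7, 0) looking straight ahead.
looking = is_eye_contact((0, 1.7, 0), (0, 0, 1), (0.1, 1.6, 4.0))
print("Speech bubble!" if looking else "...")
```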

After that close encounter, I was transported to a simulated home theater where I was asked to perform tasks like dimming a light, adjusting the volume of the media being played, and selecting titles to watch. With eye tracking on, I had only to look at those objects or menus with my head mostly still to manipulate them with the Vive's trackpads, whereas without it I had to move my entire head, much as one would have to do with most of today's VR HMDs. It was less tiresome and more natural to simply move my eyeballs to perform that work as opposed to engaging my entire neck.
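
Under the hood, this kind of interaction is gaze-directed picking: cast a ray from the eye along the tracked gaze direction, find the object it hits, and route the trackpad input there. A minimal sketch, with a toy scene of bounding spheres standing in for a real hit test:

```python
def gaze_pick(origin, direction, objects):
    """Return the nearest object whose bounding sphere the (unit) gaze ray hits."""
    best, best_t = None, float("inf")
    for obj in objects:
        # Ray-sphere test: project the center onto the ray, check the gap.
        oc = [c - o for c, o in zip(obj["center"], origin)]
        t = sum(a * b for a, b in zip(oc, direction))  # distance along the ray
        closest = [o + t * d for o, d in zip(origin, direction)]
        gap_sq = sum((a - b) ** 2 for a, b in zip(closest, obj["center"]))
        if 0 < t < best_t and gap_sq <= obj["radius"] ** 2:
            best, best_t = obj, t
    return best

scene = [{"name": "lamp",   "center": (2, 2, 5),  "radius": 0.5},
         {"name": "volume", "center": (-1, 1, 3), "radius": 0.3}]
target = gaze_pick((0, 1.7, 0), (0.371, 0.056, 0.927), scene)
if target:
    print(f"Trackpad now adjusts: {target['name']}")  # -> lamp
```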

Another, more interactive demo involved picking up a rock with the Vive's controller and throwing it at strategically placed bottles scattered around a farmyard. With eye tracking off, I was acutely aware that I was moving a controller around in the real world to direct the simulated rock at a bottle. The motion didn't feel particularly natural or coordinated, and I'd call it typical of tracked hand controllers in VR today.

With eye tracking on, however, I felt as though I had suddenly been gifted with elite hand-eye coordination. The eye-tracking-enhanced rock simply went where I was looking when I gave it a toss, and my aim became far more reliable. I wouldn't say the software went so far as to correct wildly off-course throws, but it was somehow using the eye-tracking data to smooth some of the disconnect between real-world motion and its effects in VR. The experience with eye tracking on simply felt more immersive.
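
My guess at what's happening here is a form of gaze-assisted aiming: the software blends the physical throw velocity toward a ballistic velocity that would land on the gazed-at point, nudging rather than overriding the player's motion. A speculative sketch of that blend; the weighting, physics, and numbers are my assumptions, not Tobii's algorithm.

```python
import math

def assisted_throw(throw_vel, hand_pos, gaze_target, assist=0.4, g=9.81):
    """Blend the real throw velocity toward one that would land on the gaze target."""
    speed = math.sqrt(sum(v * v for v in throw_vel))
    dx = gaze_target[0] - hand_pos[0]
    dz = gaze_target[2] - hand_pos[2]
    # Rough flight time at the thrown speed (guarded against zero).
    t = max(math.hypot(dx, dz) / max(speed, 1e-6), 1e-3)
    ideal = (dx / t,
             (gaze_target[1] - hand_pos[1]) / t + 0.5 * g * t,  # loft to offset gravity
             dz / t)
    # Keep most of the player's motion; nudge it toward the ideal vector.
    return tuple((1 - assist) * v + assist * i for v, i in zip(throw_vel, ideal))

vel = assisted_throw((4.0, 3.0, 9.0), (0, 1.5, 0), (1.0, 0.8, 10.0))
print("Corrected velocity:", [round(v, 2) for v in vel])
```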

Another interactive demo simulated a kind of AR game in which a military installation on Mars was poised to fire at UFOs invading Earth. With eye tracking off, I had to point and click with the controller to adjust the various elements of the scene. With my gaze tracked, I simply had to look at the celestial body I wanted to adjust and move my finger across the touchpad to manipulate it, rather than selecting each planet directly with the controller. This experience wasn't as revelatory as the rock toss, but it was more inviting and natural to simply look at the object I wanted to manipulate before doing so.

The final demo dropped me into a sci-fi setting where I could toggle a number of switches and send an interplanetary message. Without eye tracking, this demo worked the way pressing buttons typically does in VR right now: by reaching out with the Vive controller and selecting the various controls with the trigger. With eye tracking on, however, I had only to look at those closely spaced buttons and pull the trigger to select them, no reaching or direct manipulation required.

A simulation of how a foveated frame might look in practice. Source: Tobii

The big surprise from this experience was that Tobii had been using a form of foveated rendering throughout the demos I was allowed to try out. For the unfamiliar, foveated rendering devotes fewer processing resources to portions of the VR frame that fall into the user's peripheral vision. Early efforts at foveation relied on fixed, lens-dependent regions, but eye-tracked HMDs can dynamically change the area of best resolution depending on the direction of the wearer's gaze. The Tobii-equipped VR system was invisibly putting pixels where the user was looking while saving rendering effort in parts of the frame where it wasn't needed.
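
As a rough sketch of where the savings come from, consider a per-pixel shading rate that falls off with angular distance from the gaze point. The zone sizes, rates, and buffer dimensions below are illustrative guesses, not Tobii's parameters, but they show why most of the frame becomes cheap:

```python
import math

def shading_rate(px, py, gaze_px, gaze_py, ppd, inner_deg=5, outer_deg=15):
    """Per-pixel shading rate for dynamic foveation: full rate inside the
    foveal circle, quarter rate in a transition band, 1/16 rate beyond it.
    ppd = pixels per degree (assumed uniform across the lens for simplicity)."""
    ecc = math.hypot(px - gaze_px, py - gaze_py) / ppd  # eccentricity, degrees
    if ecc < inner_deg:
        return 1.0
    return 0.25 if ecc < outer_deg else 1 / 16

# Estimate the shading work left on a hypothetical 2160x2160 eye buffer
# when the wearer looks at the center of the frame.
w = h = 2160
ppd = w / 110  # assuming ~110 degrees of field of view per eye
work = sum(shading_rate(x, y, w / 2, h / 2, ppd)
           for y in range(0, h, 8) for x in range(0, w, 8))  # sparse sample
full = (h // 8) * (w // 8)
print(f"Shading cost vs. naively shading every pixel: {work / full:.0%}")
```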

Indeed, the company remarked that nobody had noticed the foveation in action until it pointed out the feature and allowed folks to see an A-B test, and I certainly didn't notice it until it was revealed to me through that test (though the relatively low resolution and considerable edge aberrations of today's VR HMDs might have concealed the effects of foveation on some parts of the frame). Still, if foveation is as natural on future hardware as Tobii made it feel on today's headsets, higher-quality VR might be easier to achieve without the major increases in graphics-hardware power that would be required to naively shade every pixel.

All told, Tobii's demos proved incredibly compelling, and I was elated to finally experience the technology in action after hearing so much about it. The catch is that getting eye-tracking-equipped headsets onto the heads of VR pioneers is going to require all-new hardware: the company says its sensors need a companion ASIC to process the eye-tracking data and communicate it to the host system, so the tech can't simply be retrofitted to existing HMDs. Asking early adopters to dump their existing hardware for a smoother and more immersive experience might prove to be an uphill climb. Keep an eye out for Tobii tech in future HMDs, though; it makes for a much more natural and immersive VR experience.

Comments closed
    • Biggins
    • 2 years ago

    I play FPS games (mostly Arma 3). The foveated rendering would be nice when I’m trying to focus on a target 800 meters away and my scope is focused on the grass that’s right in front of my face.

    I would like to see AR goggles that project the screen on a whiteboard while I'm sitting at my desk. I want to see the keyboard and mouse on the desk in front of me instead of a virtual keyboard or hands floating in the air. Also, it would be nice to see the drink sitting on my desk and the volume knob on my speakers. Having to purchase a separate set of controllers and sit in the middle of the room is a deal breaker for AR and VR, not to mention the cost.

    • Anovoca
    • 2 years ago

    Noob question for Jeff: does foveated rendering require async to pull off? I would imagine you'd want the frames to refresh as soon as eye movement is detected, not on a scheduled refresh, to eliminate latency or screen-tearing issues every time you move your eyes too fast.

    • puppetworx
    • 2 years ago

    Foveated rendering, better and easier setup, more natural controls: this is the way.

    • Laykun
    • 2 years ago

    All hyperbole and no actual numbers. The eye-tracking demos have been done before; that's no big deal. The elephant in the room is foveated rendering, something that will make VR viable for a much larger swath of users. We need numbers on response times and render time saved.

    • JAMF
    • 2 years ago

    Did Tobii not mention how much performance improvement could be had from foveated rendering? Like 1080 performance on a 2k screen, or some similar analogy?

    • Pville_Piper
    • 2 years ago

    For me it would only be a matter of how much it costs. I've used the monitor-mounted Tobii and I loved it, so I know how good their products are. I now run an Oculus Rift, and I would get an eye-tracking model in a heartbeat if I could afford it. Awesome stuff here…

      • SgorageJar
      • 2 years ago

      At CES Tobii told nordichardware.se it would cost 200 SEK = $25.

      https://www.nordichardware.se/massor/ces-2018/tobii-eye-tracker-vr-headsets.html

    • godforsaken
    • 2 years ago

    You mean you have to use your hands? That's like a baby's toy.

      • Liron
      • 2 years ago

      Don’t worry. The tongue-operated controllers will appear in Japan at any moment now.

    • Kretschmer
    • 2 years ago

    Wow, that makes every other VR implementation look like an alpha version.

    Also, foveation is needed on the desktop, STAT.

      • DoomGuy64
      • 2 years ago

      That’s because they are. VR is nowhere near mainstream status in hardware or software yet. Only now are we getting a glimpse of what good VR should be. I think it’ll take a least another year before VR will be close to being good. Combine this with wireless, and you will finally have a decent product.

      Everyone who already bought into VR has just been beta testing with overpriced hardware.

      • brucethemoose
      • 2 years ago

      Are LCDs even fast enough to pull foveation off?

      If not, the revolution is as far away as mass market emissive monitors. Which is, unfortunately, pretty far away.

        • UberGerbil
        • 2 years ago

        OLEDs are certainly fast enough, and they're already in some HMDs (e.g., the Samsung Odyssey). And in case you haven't noticed, LG has been selling OLED TVs for a while now. They're expensive, but not outrageously so.

      • UberGerbil
      • 2 years ago

      Foveation is much harder to do on the desktop. Measurement of the angle of your glance has to be much more precise, because any error is multiplied by the distance from your eyes to the display (and that’s compounded by the added distance from the sensors to your eyes).
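
      A quick back-of-the-envelope illustration of that scaling (the distances are rough assumptions): the same one-degree tracking error lands under a millimeter off inside a headset but more than a centimeter off on a monitor across a desk.

      ```python
      import math

      def gaze_error_mm(angle_err_deg, distance_mm):
          """On-screen miss distance for a given angular gaze-tracking error."""
          return distance_mm * math.tan(math.radians(angle_err_deg))

      for distance_mm, label in [(50, "HMD (eye to panel)"), (650, "desktop monitor")]:
          print(f"{label}: 1 deg error -> {gaze_error_mm(1.0, distance_mm):.1f} mm")
      ```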

        • Heiwashin
        • 2 years ago

        I don’t think that’s the problem with the tech. I’m pretty sure the same company makes a device that was used to track eye movement while playing piano perfectly and they just used some no frame eyeglasses to do it.

          • UberGerbil
          • 2 years ago

          I was assuming "desktop" meant "nothing on your face," i.e., built into a monitor or sitting on your desk like a Kinect, etc.

      • Kevsteele
      • 2 years ago

      Fixed foveation is supported in at least one PC game, Shadow Warrior 2 (which also had the first HDR support). It’s not as useful as you’d think with a fixed position monitor, but I can see it being fantastic with eye-tracking in VR.

    • Firestarter
    • 2 years ago

    "Indeed, the company remarked that nobody had noticed the foveation in action until it pointed out the feature and allowed folks to see an A-B test, and I certainly didn't notice it until it was revealed to me through that test"

    This is huge. I knew the principle of foveated rendering was one of the obvious ways VR performance could be improved, but I've never heard of it being implemented well enough to be essentially transparent to the user.

      • Chrispy_
      • 2 years ago

      I’m surprised this article hasn’t garnered more comments.

      Foveated rendering is the best tool we have in the battle between higher resolutions and limited GPU performance. If Tobii genuinely tricked Jeff into thinking that the whole image was high resolution, that is incredible.

      It’s obviously easier on an HMD where the your eyeballs are in a static position relative to the Tobii sensor, but if they could make this happen on a desktop, we’re potentially looking at 4K gaming at 144fps, ultra details and a 1050Ti.

      Edit:
      I just looked it up: human macular vision (focused vision, rather than blurry peripheral vision) covers only a 5-6 degree arc. At a typical monitor distance of 25″, a 32″ 4K monitor only needs to render a circular zone of about 750 pixels in diameter; that's not even half a megapixel, so it's easier to render than 800×600. The rest of the screen can be rendered at 640×480 and, thanks to the magic of shared resources when rendering two views from the same perspective, voila! You have a total 4K image that is 12x easier to render than full 4K resolution.

      In terms of GPU requirements, if your GPU can normally hack it at 720p (0.92 MP), you'll get higher performance with this example of foveated rendering at 4K, because it's about 20% fewer megapixels to render than full-resolution 720p!
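
      A quick script to sanity-check the arithmetic above, taking the 5-6 degree figure as a half-angle (which is what the ~750-pixel diameter implies):

      ```python
      import math

      dist_in, diag_in = 25, 32                  # viewing distance, monitor diagonal
      w_px, h_px = 3840, 2160                    # 4K panel
      w_in = diag_in * 16 / math.hypot(16, 9)    # width of a 16:9 panel
      ppi = w_px / w_in                          # ~137.7 pixels per inch

      half_angle = 6                             # focused-vision half-angle, degrees
      fovea_in = 2 * dist_in * math.tan(math.radians(half_angle))
      fovea_px = fovea_in * ppi                  # diameter of the full-res circle
      fovea_mp = math.pi * (fovea_px / 2) ** 2 / 1e6

      periphery_mp = 640 * 480 / 1e6             # rest of the frame at 640x480
      full_mp = w_px * h_px / 1e6
      print(f"Foveal circle: {fovea_px:.0f} px diameter, {fovea_mp:.2f} MP")
      print(f"Total foveated work: {fovea_mp + periphery_mp:.2f} MP "
            f"vs {full_mp:.1f} MP full res "
            f"({full_mp / (fovea_mp + periphery_mp):.0f}x less)")
      ```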

        • Wirko
        • 2 years ago

        Peripheral vision is not only blurry, it’s colour blind too. (Probably a useless fact because monochrome rendering could save little or no GPU resources).

        https://en.wikipedia.org/wiki/Peripheral_vision
        "Color perception is strong at 20° but weak at 40°. 30° is thus taken as the dividing line between adequate and poor color perception."

          • ludi
          • 2 years ago

          No, not color blind, just reduced perception, meaning that nuances of shade and texture disappear. You’re still generally aware of some color characteristics almost to the edges of your peripheral vision.

        • Pville_Piper
        • 2 years ago

        Something I noticed when checking out the Windows Mixed Reality headsets was the very small sweet spot of the lenses. I hated it; it was awful. But once I settled into a task, such as a game, I adjusted to it without noticing, and the small sweet spot wasn't an issue.

        Foveated rendering, powered by Tobii, would be an awesome combination.

        • Mr Bill
        • 2 years ago

        But can Foveation keep up with Data’s reading speed? Or does he even move his pupils? I’d like to try this as a VR book reading application.

      • dodozoid
      • 2 years ago

      This tech is indeed the breakthrough needed to enable real lifelike VR. And even relatively simple hardware like some future Snapdragon could drive ultrahigh-res screens without breaking a sweat.

    • RdVi
    • 2 years ago

    This truly looks like a must have feature. Very promising from the demos described/shown. So much for pulling the trigger on one of the upcoming Gen 2 VR headsets…

      • tritonus
      • 2 years ago

      My wallet offers its sincere thanks for this compelling demonstration. The VR leap is postponed again.
