Tobii makes a compelling case for more natural and immersive VR with eye tracking


We've heard murmurs about the benefits of eye tracking in VR headsets for quite some time now, but even with the number of press days and trade shows we attend over the course of a year, I'd never had the opportunity to give the tech a spin. That changed at CES this year, when Tobii, probably the leading company in eye-tracking technology, invited us in for a private showing of its most recent round of VR eye-tracking hardware. The company had a prototype HTC Vive on hand with its eye trackers baked in for me to kick the tires with, and I came away convinced that eye tracking is an essential technology for the best VR experiences.


Tobii's prototype HTC Vive

Tobii's demo took us through a few potential uses of eye tracking in VR. The most immediate benefit came in setting interpupillary distance (IPD), an essential step in achieving the sharpest and clearest images with a VR headset. With today's headsets, one might need to make a best guess at the correct IPD using an error-prone reference image, but the Tobii tech gave me immediate, empirical feedback when I achieved the correct setting.
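To make that concrete, here's a minimal sketch (in Python, with an invented GazeSample structure and millimeter units; this is not Tobii's actual API) of how a runtime could estimate IPD directly from the per-eye pupil positions an eye tracker reports:

```python
import math
from dataclasses import dataclass

# Hypothetical per-eye sample; real eye-tracking SDKs expose something
# similar, but the exact fields and units here are assumptions.
@dataclass
class GazeSample:
    left_origin: tuple[float, float, float]   # left pupil position, millimeters
    right_origin: tuple[float, float, float]  # right pupil position, millimeters

def estimate_ipd(samples: list[GazeSample]) -> float:
    """Average the left/right pupil separation over many samples
    to smooth out per-frame tracking noise."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sum(dist(s.left_origin, s.right_origin) for s in samples) / len(samples)

# With a live estimate in hand, the headset can tell the wearer which way
# to turn the lens-separation dial until it matches the measured IPD.
samples = [GazeSample((-31.8, 0.0, 0.0), (32.1, 0.0, 0.0)),
           GazeSample((-31.9, 0.1, 0.0), (31.9, -0.1, 0.0))]
print(f"Estimated IPD: {estimate_ipd(samples):.1f} mm")  # roughly 63.9 mm
```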

Next, the demo pulled up a virtual mirror that allowed me to see how the eyes of my avatar could move in response to eye-tracking inputs. While this avatar wasn't particularly detailed, it was clear that the eye-tracking sensors inside the headset could translate where I was looking into virtual space with an impressive degree of precision and with low latency.

I was then transported to a kind of courtyard environment where a pair of robots could tell when I was and wasn't looking at them, causing one to pop up a speech bubble when I did make eye contact. That cute and rather binary demo hints at a future where VR avatars could make eye contact with one another, a huge part of natural interaction in the real world that's missing from most human-to-human (or human-to-robot) contact in VR today.

After that close encounter, I was transported to a simulated home theater where I was asked to perform tasks like dimming a light, adjusting the volume of the media being played, and selecting titles to watch. With eye tracking on, I had only to look at those objects or menus with my head mostly still to manipulate them with the Vive's trackpads, whereas without it I had to move my entire head, much as one would have to do with most of today's VR HMDs. It was less tiresome and more natural to simply move my eyeballs to perform that work as opposed to engaging my entire neck.
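For the curious, gaze-directed manipulation like this typically boils down to raycasting along the tracked gaze direction and routing controller input to whatever the ray hits. Here's a rough sketch under that assumption; the scene objects and spherical hit volumes are invented for illustration, and a real engine would use its own raycast queries:

```python
import math

# Hypothetical scene objects with spherical hit volumes.
scene = [
    {"name": "lamp",   "center": (-1.0, 1.5, 2.0), "radius": 0.3},
    {"name": "screen", "center": ( 0.0, 1.2, 3.0), "radius": 0.8},
]

def gaze_pick(origin, direction, objects):
    """Return the nearest object whose bounding sphere the gaze ray hits.
    Assumes direction is (approximately) unit length."""
    best, best_t = None, math.inf
    for obj in objects:
        oc = [c - o for c, o in zip(obj["center"], origin)]
        t = sum(a * b for a, b in zip(oc, direction))   # projection onto the ray
        if t < 0:
            continue  # object is behind the viewer
        closest = [o + t * d for o, d in zip(origin, direction)]
        miss = math.dist(closest, obj["center"])
        if miss <= obj["radius"] and t < best_t:
            best, best_t = obj, t
    return best

# Look roughly toward the screen; trackpad swipes would then be routed
# to whatever gaze_pick returns (a volume slider, say).
target = gaze_pick((0.0, 1.6, 0.0), (0.0, -0.13, 0.99), scene)
print(target["name"] if target else "nothing in focus")
```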

Another, more interactive demo involved picking up a rock with the Vive's controller and throwing it at strategically placed bottles scattered around a farmyard. With eye tracking off, I was acutely aware that I was moving around a controller in the real world to direct the simulated rock at a bottle. This motion didn't feel particularly natural or coordinated, and I'd call it typical of tracked hand controllers in VR today.

With eye tracking on, however, I felt as though I'd suddenly been gifted with elite hand-eye coordination. The eye-tracking-enhanced rock simply went where I was looking when I gave it a toss, and my aim became far more reliable. I wouldn't say the software went so far as to correct wildly off-course throws, but it was somehow using the eye-tracking data to smooth some of the disconnect between real-world motion and its effects in VR. The experience with eye tracking on simply felt more immersive.
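Tobii didn't detail how its assist works, but one plausible mechanism is blending the controller's release direction toward the gaze ray before handing the projectile to the physics engine. Here's a hedged sketch of that idea; the blend strength is an invented tuning value:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def assisted_throw(controller_dir, gaze_dir, strength=0.4):
    """Blend the controller's release direction toward the gaze direction.

    strength=0 leaves the throw untouched; strength=1 sends the projectile
    exactly where the user is looking. The 0.4 default is invented; a real
    implementation would tune it, and likely cap the correction angle so
    wildly off-course throws stay off-course.
    """
    blended = tuple(
        (1 - strength) * c + strength * g
        for c, g in zip(normalize(controller_dir), normalize(gaze_dir))
    )
    return normalize(blended)

# A throw that drifts a little to the right gets nudged toward the
# bottle the player is actually looking at.
print(assisted_throw(controller_dir=(0.2, 0.3, 0.95), gaze_dir=(0.0, 0.25, 0.97)))
```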

Another interactive demo simulated a kind of AR game where a military installation on Mars was poised to fire at UFOs invading Earth. With eye tracking off, I had to point and click with the controller to adjust the various elements of the scene. When my gaze was tracked, I simply had to look at the celestial body I wanted to adjust and move my finger across the touchpad to move it, rather than selecting each planet directly with the controller. This experience wasn't as revelatory as the rock toss, but it was more inviting and natural to simply look at the object I wanted to manipulate in the environment before doing so.

The final demo dropped me into a sci-fi setting where I could toggle a number of switches and send an interplanetary message. With eye tracking off, this demo worked the way pressing buttons typically does in VR right now: by reaching out with the Vive controller and selecting the various controls with the trigger. With eye tracking on, however, I had only to look at those closely spaced buttons and pull the trigger to select them, with no reaching or direct manipulation required.


A simulation of how a foveated frame might look in practice. Source: Tobii

The big surprise from this experience was that Tobii had been using a form of foveated rendering throughout the demos I was allowed to try out. For the unfamiliar, foveated rendering devotes fewer processing resources to portions of the VR frame that fall into the user's peripheral vision. Early efforts at foveation relied on fixed, lens-dependent regions, but eye-tracked HMDs can dynamically change the area of best resolution depending on the direction of the wearer's gaze. The Tobii-equipped VR system was invisibly putting pixels where the user was looking while saving rendering effort in parts of the frame where it wasn't needed.
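As a greatly simplified illustration of the technique (not Tobii's implementation), a renderer can choose a coarser shading rate for screen tiles the farther they fall from the gaze point. The rate tiers and eccentricity thresholds below are invented; real systems map such tiers onto GPU variable-rate-shading or multi-resolution features:

```python
import math

def shading_rate(tile_center, gaze_point, fov_degrees=110.0, resolution=(2160, 1200)):
    """Pick a coarser shading rate for screen tiles far from the gaze point.

    Pixel distance is converted to an approximate visual angle. The
    1x1/2x2/4x4 tiers and the degree thresholds are invented for
    illustration; the FOV and resolution defaults match the Vive's specs.
    """
    px = math.dist(tile_center, gaze_point)
    degrees_per_pixel = fov_degrees / resolution[0]  # crude linear approximation
    eccentricity = px * degrees_per_pixel            # angle from gaze, in degrees
    if eccentricity < 10.0:
        return "1x1"   # full resolution where the fovea is pointed
    elif eccentricity < 25.0:
        return "2x2"   # one shade per 2x2 pixel block
    else:
        return "4x4"   # cheapest shading in the far periphery

gaze = (1080, 600)  # wearer looking at the center of the frame
for tile in [(1100, 620), (1500, 600), (2000, 1100)]:
    print(tile, "->", shading_rate(tile, gaze))
```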

Indeed, the company remarked that nobody had noticed the foveation in action until it pointed out the feature and allowed folks to see an A-B test, and I certainly didn't notice it until it was revealed to me through that test (though the relatively low resolution and considerable edge aberrations of today's VR HMDs might have concealed the effects of foveation on some parts of the frame). Still, if foveation is as natural on future hardware as Tobii made it feel on today's headsets, higher-quality VR might be easier to achieve without the major increases in graphics-hardware power that would be required to naively shade every pixel.

All told, Tobii's demos proved incredibly compelling, and I was elated to finally experience the technology in action after hearing so much about it. The catch is that getting eye-tracking-equipped headsets onto the heads of VR pioneers is going to require all-new hardware: the company says its sensors require a companion ASIC to process and communicate the eye-tracking data to the host system, so the tech can't simply be retrofitted to existing HMDs. Asking early adopters to dump their existing hardware for a smoother and more immersive experience might prove to be an uphill climb. Keep an eye out for Tobii tech in future HMDs, though; it makes for a much more natural and immersive VR experience.
