This eye-tracking tech will change the way you experience VR
Like most first-generation products, consumer-grade VR headsets such as the Oculus Rift and HTC Vive have had their share of growing pains, from shipment delays to high barriers to entry in cost and hardware requirements. The PlayStation VR, which had sold over 915,000 units as of February, has no shortage of games lined up for the rest of 2017. That doesn’t change the fact that triple-A VR titles are few and far between, however. A quick browse of the VR game library on Steam turns up an astounding number of ‘bite-sized’ games, many of which are still in Early Access. Things are a little brighter on the PS VR front, with notable exclusives like Farpoint and Gran Turismo Sport to tide over early adopters.
More importantly, anyone who has used these headsets will have noticed how often the user must tilt his or her head and wrestle with button presses to interact with the VR environment. During my time at the 2017 Game Developers Conference (GDC), it was clear that the major players in the VR industry were ready to take immersion to the next level.
For a start, Tobii – the Swedish company behind the eye-tracking technology found in today’s gaming notebooks and monitors – intends to bring that same disruption to VR, having raised US$50 million in investments to spur the development of eye-tracking in smartphones and VR headsets. The hands-on demonstration involved an HTC Vive retrofitted with the Tobii EyeChip, a proprietary ASIC that does most of the data-processing heavy lifting to reduce bandwidth load and power consumption on the headset (and, subsequently, the host device). The eye sensors sit behind the lenses, while a ring of infrared illuminators surrounds the front of each lens. This combination allows for eye-tracking at up to 90Hz.
Upon donning the Tobii-integrated Vive headset, I was immediately greeted by a floating avatar reflected in two mirrors. The avatar in the left mirror tracked my head movement well, but it was the mirror on the right that truly impressed. With eye-tracking enabled, the virtual me mimicked my eye movements with startling accuracy, down to a side eye and a little wink. This is quite the breakthrough, as earlier-generation Tobii-equipped gaming notebooks sometimes had difficulty picking up the direction of my gaze due to the special coating on my glasses.
Eye-tracking in VR lets in-game avatars exhibit more lifelike facial expressions, much of which comes down to eye movements.
I was then transported to a poolside patio, where the gaze-detection mechanic had one of two robots turn towards me, depending on which robot I was focusing on at the time. One let me withdraw cash, which I then handed over to the other robot in exchange for a beach ball. Other interactions with the pair of Vive controllers included letting the sun in by opening the roof, changing channels on the TV screen, and throwing the aforementioned beach ball into the pool.
This was followed by two mini-games: the first had me picking up rocks from the ground and throwing them in the hopes of knocking down bottles at varying heights and distances. Despite my best efforts, I missed every attempt without eye-tracking. Once it was enabled, it was simply a matter of looking at a bottle and exerting the necessary force to get the job done. The second mini-game took things further by having me fling projectiles back at an increasing number of flying enemies. Using just my eyes, it became second nature to hit one target after another. I'll admit, it was exhilarating to react instinctively without second-guessing a single action. What I also appreciated was that any sense of disconnect I had felt in previous VR experiences was put to rest in this session.
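Gaze-assisted aiming of this kind can be approximated by blending the controller's throw direction toward whatever the eyes have locked onto. Here's a minimal Python sketch of the idea; the function, parameters, and blend weight are my own illustrative assumptions, not Tobii's actual implementation:

```python
import math

def gaze_assisted_throw(controller_dir, gaze_target, hand_pos, assist_strength=0.8):
    """Blend the controller's raw throw direction toward the gazed-at target.

    controller_dir:  normalized (x, y, z) direction from the controller swing.
    gaze_target:     (x, y, z) world position the player is looking at.
    hand_pos:        (x, y, z) position the projectile is released from.
    assist_strength: 0.0 = raw throw, 1.0 = snap fully to the gaze target.
    """
    # Direction from the hand to the target the eyes picked out.
    to_target = tuple(t - h for t, h in zip(gaze_target, hand_pos))
    norm = math.sqrt(sum(c * c for c in to_target)) or 1.0
    to_target = tuple(c / norm for c in to_target)

    # Linear blend between the two directions, then renormalize.
    blended = tuple((1 - assist_strength) * c + assist_strength * g
                    for c, g in zip(controller_dir, to_target))
    norm = math.sqrt(sum(c * c for c in blended)) or 1.0
    return tuple(c / norm for c in blended)
```

With a high assist strength, an imprecise swing still flies toward the bottle you were staring at, which matches how effortless the demo felt.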
Upon completing the demo, I was shown an elevated view of each VR environment, complete with a visual representation of my eye movements in the form of spread-out lines of sight. By studying these lines, VR game developers can use the feedback to iterate on gameplay or the virtual environment where needed.
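One simple way developers could aggregate this kind of gaze data is to bin the points a player looked at into grid cells, revealing attention hotspots in a scene. A hypothetical Python sketch (the function name and input format are my own, not part of Tobii's SDK):

```python
from collections import Counter

def gaze_heatmap(samples, cell_size=0.5):
    """Bin raw gaze hit points into grid cells to show which parts of a
    scene draw the most attention.

    samples:   list of (x, y) gaze intersection points on a surface.
    cell_size: width of each grid cell in the same units as the samples.
    Returns a Counter mapping (col, row) cells to hit counts.
    """
    counts = Counter()
    for x, y in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts
```

Cells with high counts would correspond to the dense clusters of sight lines in the overview I was shown.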
Here's an overview of NVIDIA's foveated rendering technique.
Another advantage of integrating eye-tracking technology into VR headsets is a wonderful feature known as foveated rendering. When we look at our surroundings in real life, whatever we focus on appears sharp, while everything in our peripheral vision takes on a blurry quality. Following this principle, the image processing technique reduces GPU rendering cost by lowering visual quality for everything outside your point of focus. When implemented well, the effect is more or less invisible to the user, which was definitely the case for me. To make his point, the Tobii representative bumped up the quality-reduction settings, which resulted in some noticeable artifacts in the peripheral area. Although it's still early days, foveated rendering will be a crucial component moving forward, seeing as key players like NVIDIA, ARM, and SMI are equally invested in elevating the VR experience without sacrificing visual quality or performance. It'll be interesting to see whether developers embrace Tobii or other eye-tracking solutions in their VR creations.
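At its simplest, foveated rendering maps each screen region's angular distance from the gaze point to a render-quality level. Here's a rough Python sketch of such a falloff; the ring thresholds and quality values are illustrative assumptions, not NVIDIA's or Tobii's actual parameters:

```python
def shading_quality(angle_deg, inner_deg=10.0, outer_deg=30.0):
    """Return a render-quality factor (1.0 = full resolution, down to
    0.25) based on how far a screen region sits from the gaze point,
    measured in degrees of visual angle.
    """
    if angle_deg <= inner_deg:
        return 1.0    # foveal region: full resolution
    if angle_deg >= outer_deg:
        return 0.25   # far periphery: quarter resolution
    # Smooth linear falloff between the inner and outer rings.
    t = (angle_deg - inner_deg) / (outer_deg - inner_deg)
    return 1.0 - 0.75 * t
```

Cranking up the quality reduction, as the Tobii representative did, is akin to shrinking the inner ring or dropping the peripheral factor until the seams become visible.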
Michael Low / Editor
A self-confessed geek with a penchant for mobile devices and the occasional video game (when time permits). During his off time, he loves nothing more than lazing on the couch with his iPad. Just don’t call him an Apple fanboy.