The next wave in virtual reality is eye tracking. This technology enables more immersive experiences and reduces the load on your graphics card. Accessories for existing head-mounted displays (HMDs) are launching now, and the next generation of HMDs will have eye tracking built in. Let's look at how eye tracking works.
How Eye Tracking Works
The prevalent implementation of eye tracking in virtual reality systems uses infrared emitters and cameras. The tracker projects imperceptible infrared light into your eyes, and the camera records the reflection that light produces on your cornea. By measuring the vector between the center of your pupil and that corneal reflection, the tracker computes your point of regard. Modern systems measure your gaze angle to an accuracy of better than one degree.
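The pupil-to-reflection math can be sketched in a few lines. This is an illustration of the idea rather than a real tracker: the coordinates are pixel positions from the eye camera, and `gain_x`/`gain_y` stand in for per-user calibration constants that a real system would learn by having you fixate on known targets.

```python
def estimate_gaze_degrees(pupil_center, glint, gain_x, gain_y):
    """Map the pupil-to-corneal-reflection offset (pixels) to a gaze
    angle (degrees).

    pupil_center, glint: (x, y) pixel coordinates from the IR eye camera.
    gain_x, gain_y: hypothetical calibration constants (deg per pixel).
    """
    dx = pupil_center[0] - glint[0]
    dy = pupil_center[1] - glint[1]
    return (gain_x * dx, gain_y * dy)

# With a gain of 0.5 deg/pixel, a 4-pixel horizontal offset between the
# pupil center and the corneal reflection reads as a 2-degree gaze angle.
print(estimate_gaze_degrees((104, 60), (100, 60), 0.5, 0.5))  # (2.0, 0.0)
```

Real trackers fit a more elaborate calibration model than two linear gains, but the core signal is exactly this offset.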
Immersion is Everything
Virtual reality applications, and gaming in general, depend on immersion to draw users in and make them suspend disbelief. Software developers use your gaze direction to create more immersive environments. Imagine wildlife that scampers away when you make eye contact or non-player characters who engage you in conversation if you look at them. Your virtual waifu might even ask “Are you listening to me?” if you avert your gaze for too long.
Eye tracking also helps with user interface design. Point-and-click menus are not very useful when your input devices are hand-tracked controllers. Eye tracking gives designers an extra dimension of input that lets them build contextual menus for the object you are looking at. Current virtual tourism applications like Realities and Destinations allow you to point at a landmark and read or hear more information about it. With eye tracking, you could let your gaze linger on an object to access that information instead.
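A gaze-dwell trigger like that boils down to a small per-frame accumulator. The sketch below is a hedged example, not any shipping SDK's API; the one-second threshold and the object IDs are illustrative assumptions.

```python
class DwellSelector:
    """Fire a selection once gaze has rested on one object long enough."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds  # assumed threshold
        self.current = None                 # object currently under gaze
        self.elapsed = 0.0                  # seconds spent on it so far

    def update(self, gazed_object, dt):
        """Call once per frame with the object under the gaze (or None)
        and the frame time dt in seconds. Returns the object when the
        dwell threshold is crossed, otherwise None."""
        if gazed_object != self.current:
            # Gaze moved to something else: restart the timer.
            self.current = gazed_object
            self.elapsed = 0.0
            return None
        if gazed_object is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_seconds:
            self.elapsed = 0.0  # re-arm so the info panel is not spammed
            return gazed_object
        return None
```

A tourism app would call `update` every frame with the landmark the gaze ray currently hits, and show the info card whenever it returns a non-`None` value.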
Rendering on a Potato PC
Your eyes naturally focus at the center of your gaze, and objects outside that area are slightly out of focus. Virtual reality applications can combine this phenomenon with eye tracking to reduce rendering quality for the parts of the scene in your peripheral vision. This technique, called foveated rendering, trades a small increase in CPU load for a large decrease in GPU load. By using eye tracking, developers can lower the minimum system requirements for their applications.
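At its core, foveated rendering is a mapping from eccentricity (the angle between a pixel's view ray and your gaze ray) to a shading resolution. A minimal sketch follows; the zone boundaries of 5 and 15 degrees are made-up values, as real systems tune them per headset and per eye tracker.

```python
import math

def eccentricity_deg(gaze_dir, pixel_dir):
    """Angle in degrees between the gaze ray and a pixel's view ray.
    Both inputs are unit-length 3D direction vectors."""
    dot = sum(g * p for g, p in zip(gaze_dir, pixel_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def shading_scale(ecc_deg):
    """Fraction of full resolution to render at a given eccentricity.
    Zone boundaries here are illustrative assumptions."""
    if ecc_deg < 5.0:
        return 1.0   # fovea: full resolution
    if ecc_deg < 15.0:
        return 0.5   # near periphery: half resolution
    return 0.25      # far periphery: quarter resolution
```

The GPU savings come from the outer zones covering most of the screen area while rendering at a quarter of the pixel count.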
Your eyes are the windows to your soul. That adage has a practical use in VR game design: in general, when your gaze moves toward an object or location, you are about to interact with that part of the scene. Designers can use this assumption to preload data from your hard drive or the network. Implemented properly, this technique makes interaction with the virtual world feel smoother.
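One way to act on that assumption is to preload assets for anything inside a small cone around the gaze ray. This is a sketch under invented parameters (a 10-degree cone and a simple name-to-direction map), not a description of any particular engine's loader.

```python
import math

def preload_candidates(gaze_dir, objects, cone_deg=10.0):
    """Return names of objects whose direction from the viewer lies
    within cone_deg of the gaze ray, so their assets can be fetched early.

    gaze_dir: unit 3D gaze direction.
    objects: mapping of object name -> unit 3D direction to the object.
    cone_deg: assumed half-angle of the preload cone.
    """
    cos_limit = math.cos(math.radians(cone_deg))
    return [name for name, d in objects.items()
            if sum(g * c for g, c in zip(gaze_dir, d)) >= cos_limit]
```

A background thread could run this each time the gaze settles and queue disk or network reads for whatever comes back.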
Beware the Dark Side
There are marketing implications of eye tracking that have the potential for abuse. By observing how long your gaze lingers on an object, companies can begin to profile your likes and dislikes. Marketing firms could gather a wealth of information just by placing a few props in the game world. Do we want our games analyzing whether we prefer breasts to thighs or Quafe to Quafe Zero?
I Need It Now
The Fove Kickstarter campaign launched in May 2015, and you can buy one now at GetFove.com for about $600. The Fove uses IR-based head tracking like the Oculus Rift and features one 120 FPS eye-tracking camera per eye. An eye-tracking accessory for the HTC Vive, the 7invensun aGlass, releases later this month for around $220. The aGlass snaps into your HTC Vive headset and can deliver a large performance boost in titles that support foveated rendering.
If you are willing to wait a little, the next generation of high-end head mounted displays will come with eye tracking built in. Foveated rendering reduces the system requirements for virtual reality, which opens the market to more users. That alone is reason enough for hardware manufacturers to rally behind eye tracking.