Kinematics is the study of bodies in motion without regard to the masses or forces involved. Forward kinematics equations compute the position of the end of a jointed system given the positions of its joints. Inverse kinematics equations work in reverse: they compute the joint positions needed to move the end of the system to a desired point. Many video games use inverse kinematics to animate the skeleton of a 3D model, for example, to place a foot on uneven terrain. Virtual reality applications that want to render realistic representations of tracked humans face some interesting challenges.
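To make the distinction concrete, here is a minimal sketch of forward and inverse kinematics for a two-link planar arm. The function names and parameters are illustrative, not from any game engine; the IK solution is the standard law-of-cosines derivation and returns one of the two possible elbow configurations.

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector position given link lengths and joint angles (radians)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(l1, l2, x, y):
    """Joint angles that place the end effector at (x, y).

    Returns the elbow-down configuration; raises ValueError if the
    target is out of reach.
    """
    cos_theta2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_theta2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(cos_theta2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

Note that even this toy problem has two solutions (elbow up or elbow down) for most reachable targets; a full human arm has many more, which is exactly the ambiguity the rest of this article is about.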
Immersion is Everything
EVE: Valkyrie renders a full-body avatar in the cockpit of a fighter. This draws you into the character and makes you feel like you are sitting in that cockpit. That is, until you move your legs or arms and notice that the avatar does not move. At that point, your immersion breaks, reminding you that you are in a simulation. In virtual reality parlance, presence means deep immersion—that feeling that you are in the virtual world—and it is key to a great game.
There is a reason every virtual reality application that uses hand tracking renders your hands floating in space. It is better to render no arms than to render arms that do not match the user's actual arm position. Immobile arms and legs break immersion, but inaccurate ones can make you disoriented and sometimes nauseous. Your brain knows where it told your arms to move, and when your eyes tell it differently, the game breaks your body's feedback loop.
Inverse Kinematics to the Rescue
Inverse kinematics works well in games that do not track your hands. When your avatar raises a rifle or throws a punch, the game smoothly animates the movement by constraining the problem: by limiting the range of motion of each joint, developers guide the calculations into known solution sets. With tracked hand controllers, however, the problem is unconstrained and becomes much more difficult.
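As a sketch of what "constraining the problem" can look like in practice, here is a toy Cyclic Coordinate Descent (CCD) solver for a planar joint chain. CCD is one common iterative IK technique, not necessarily what any particular game uses; the joint-limit clamping is the constraint that steers the solver toward a plausible pose instead of an arbitrary one.

```python
import math

def ccd_ik(lengths, angles, limits, target, iterations=50):
    """Cyclic Coordinate Descent IK on a planar chain rooted at the origin.

    lengths: link lengths; angles: joint angles (radians, each relative to
    the previous link); limits: (lo, hi) bounds per joint. Each pass rotates
    one joint to swing the end effector toward the target, then clamps the
    angle to its allowed range.
    """
    n = len(lengths)
    for _ in range(iterations):
        for i in reversed(range(n)):
            # Recompute all joint positions under the current angles.
            pts, a = [(0.0, 0.0)], 0.0
            for l, th in zip(lengths, angles):
                a += th
                x, y = pts[-1]
                pts.append((x + l * math.cos(a), y + l * math.sin(a)))
            end, pivot = pts[-1], pts[i]
            # Rotate joint i so the effector heading matches the target heading.
            cur = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
            want = math.atan2(target[1] - pivot[1], target[0] - pivot[0])
            angles[i] += want - cur
            lo, hi = limits[i]
            angles[i] = max(lo, min(hi, angles[i]))  # the joint constraint
    return angles
```

Tighten the limits (say, forbid an elbow from bending backward) and the solver can no longer land in anatomically impossible configurations, which is the essence of how animation systems keep IK results believable.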
How good is good enough? We have established that incorrect movement of your avatar’s limbs breaks immersion. What about your friend’s avatar? Is it important that your client renders them chicken winging—sticking their elbow out while shooting from cover? Perhaps if you want to role play telling them not to get their elbow blown off. What if they are not your friend? You would likely want to shoot them in the elbow.
Most, if not all, AAA games employ motion capture to create smooth animations of avatars moving through various physical activities—running, jumping, dancing, etc. Actors on the motion capture stage wear suits covered in reflective dots called fiducials. The position of the dots enables tracking of the various bones and joints on the actor. If we can track hands, why can’t we track other parts of the body? The answer is: you can.
There are some options that let you track more points on the user. The Vive Tracker is a puck that attaches to a variety of objects and tracks their position with the same accuracy as the hand controllers—less than one centimeter. The Sixense STEM system is available only to developers and supports up to five trackers and two hand controllers. The Vive Trackers cost $100 each, and the Sixense system costs $5,250 for two hand controllers and three trackers. The Sixense system works with any VR headset, but the Vive Tracker works only with the Vive.
Theoretically, you could add cheap tracking points to the Rift by sewing infrared LEDs into clothing. There are no reports of anyone doing this yet.
Given the cost of extra trackers, it is prohibitive to add a lot of these for a full body tracking system. What is the minimum number we can add to achieve acceptable inverse kinematics rendering of a human? HTC thinks the answer is three.
By adding a tracker at the lower back and one on each foot, they can achieve high-quality rendering of the user. The standard Vive system knows where the floor is, and it tracks your hands and head. With the belt tracker, the software knows where your waist is, which, combined with your head position, allows a good approximation of your torso's movement. The trackers on your feet and the one on your waist do a good job with lower body movement. HTC recently released the source code for their inverse kinematics demo on GitHub.
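A rough illustration of why the waist tracker matters, assuming hypothetical head and waist positions in meters with y as the up axis: head position alone cannot distinguish leaning from crouching, but head plus waist gives the spine direction directly.

```python
import math

def lean_angle(head, waist):
    """Angle of the head-to-waist spine vector from vertical, in degrees.

    head and waist are (x, y, z) positions with y up; illustrative values,
    not tied to any particular tracking API.
    """
    dx, dy, dz = (h - w for h, w in zip(head, waist))
    horizontal = math.hypot(dx, dz)      # how far the head sits off-axis
    return math.degrees(math.atan2(horizontal, dy))

waist = (0.0, 1.0, 0.0)
upright = lean_angle((0.0, 1.7, 0.0), waist)   # standing straight: ~0 degrees
leaning = lean_angle((0.3, 1.6, 0.0), waist)   # head forward: a clear lean
```

With only a headset, the `leaning` case above is ambiguous: the head dropped and moved forward, which could equally be a crouch with a step. The waist position resolves it in one subtraction.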
Before bringing this article to a close, take a moment to enjoy People Gun. It is damn funny.
People Gun uses head and hand tracking only. While it does a decent job rendering shoulder and arm movement, there are apparent flaws. Adding more tracked points improves the simulation dramatically, but unfortunately, there are not any good videos of it to share.
All of this is largely academic. Most of the tools to do inverse kinematics with multiple tracked points are in the hands of developers. However, it will be a year or more before this technology becomes mainstream. As the technology advances and the cost drops, it will bring full body motion capture to your living room.