Meta’s ‘Body Tracking’ API For Quest Is Just A Legless Estimate

Quest's 'Body Tracking' API isn't what it seems.

On Thursday, Meta added a Body Tracking API to its Movement SDK, which already includes the Eye Tracking API and Face Tracking API for Quest Pro.

The release was announced on the official Oculus Developers Twitter account alongside an image from the documentation depicting a user's entire body pose being captured. As the post spread, many people mistakenly concluded that Quest now supports body tracking, but both the graphic and the API's name are misleading.

Meta's Hand Tracking API provides the actual positions of your hands and fingers as detected by the headset's outward-facing cameras. Quest Pro's Eye Tracking API and Face Tracking API provide your real gaze direction and facial muscle movements, as detected by its internal cameras. The "Body Tracking" API, however, only offers a "simulated upper-body skeleton" derived from your head and hand positions, a Meta representative told UploadVR. It excludes your legs and isn't a real tracking system at all.

Body Pose Estimation might be a more appropriate name for the API. According to the spokesperson, the technique combines inverse kinematics (IK) with machine learning (ML). IK is a set of equations used to estimate the unknown positions of parts of a skeleton (or robot) from the known positions of others. These equations drive all current full-body VR avatars in apps. Because IK is built into game engines like Unity and Unreal, developers don't need to implement it themselves (or even understand it), and popular packages like Final IK offer fully developed implementations for around $100.
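To make the idea concrete, here is a minimal two-bone IK solver in 2D, the kind of law-of-cosines calculation an engine applies per limb. This is an illustrative sketch, not Meta's or any engine's actual implementation, and all names in it are hypothetical:

```python
import math

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Place the elbow of a two-segment arm in 2D using the law of cosines.

    shoulder, target: (x, y) positions; upper_len, fore_len: segment lengths.
    Returns the elbow (x, y), choosing one of the two mirror solutions.
    """
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp the distance so the target is reachable (arm neither
    # over-stretched nor folded past its limits).
    dist = max(abs(upper_len - fore_len) + 1e-6,
               min(upper_len + fore_len - 1e-6, dist))
    # Law of cosines: interior angle at the shoulder between the
    # shoulder->target line and the upper arm.
    cos_a = (upper_len ** 2 + dist ** 2 - fore_len ** 2) / (2 * upper_len * dist)
    angle_offset = math.acos(max(-1.0, min(1.0, cos_a)))
    base_angle = math.atan2(dy, dx)
    elbow_angle = base_angle + angle_offset  # the "- angle_offset" mirror pose is equally valid
    return (shoulder[0] + upper_len * math.cos(elbow_angle),
            shoulder[1] + upper_len * math.sin(elbow_angle))
```

Note the `+ angle_offset` / `- angle_offset` choice: even this tiny solver has two equally valid answers for one target. Full-body IK faces that ambiguity at every joint, which is why head and hand positions alone underdetermine a body pose.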

The problem is that for any given pair of head and hand positions there are many plausible body poses, so IK for VR tends to be inaccurate unless you're using body tracking hardware such as HTC's Vive Trackers. Meta's pitch is that its machine learning model can produce a more accurate body pose for free. The demonstration video appears to back up that claim, though most developers are unlikely to take the offer, since the API only supports Quest headsets and omits the lower body.


Meta's research, and hints it gave at its Connect 2022 event, suggest legs may eventually be added.

During the keynote, Mark Zuckerberg announced that Meta Avatars are getting legs, alongside a demonstration that was equally misleading. Legs will debut in Horizon later this year and arrive in the SDK for other apps next year. Saxena confirmed that the Body Tracking API uses the same underlying technology that powers Meta Avatars, which suggests the API will eventually get legs too.

You might wonder how the Body Tracking API could possibly include legs when it's merely an estimate derived from head and hand positions. Last month Meta demonstrated research on exactly this problem, leveraging recent advances in machine learning. But that system isn't fully accurate, and it has 160ms of latency, which is more than 11 frames at 72Hz. That's too slow, and the output too rough, for you to look down and see your own legs where you expect them to be. Instead, Meta may use technology like this to show legs on other people's avatars, judging by comments from Meta's CTO:
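The frame count follows directly from the refresh rate; a quick back-of-the-envelope check (plain arithmetic, not figures from the paper beyond the reported 160ms):

```python
refresh_hz = 72      # Quest's default refresh rate
latency_s = 0.160    # reported delay of the research system

frames_behind = latency_s * refresh_hz  # ~11.5, i.e. more than 11 frames
frame_time_ms = 1000 / refresh_hz       # ~13.9ms per displayed frame
```

In other words, by the time an estimated leg pose reaches the display, your real body is more than eleven rendered frames ahead of it.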

"Having legs that don't match your real legs on your avatar might be highly unsettling for others. You can see that we can, however, attach legs to other individuals, and it doesn't upset you in the least.

"We are thus working on legs that appear natural to a spectator since they are unaware of how your actual legs are positioned, but it's likely that you won't notice anything when you look at your own legs. That is the plan we now have."

However, as we noted at the time, the shipping solution may not be as good as this research. Machine learning papers often run on powerful PC GPUs at low framerates, and the paper didn't report the system's runtime performance.
