The Quest 3 now supports full body tracking and MR occlusion! But why should you care?
We mention the Quest 3 a lot, probably even daily. But that isn’t just because it’s one of the best VR headsets on the market, or even because Meta hosts the biggest library of standalone XR apps and games.
It’s because even though the headset is already out, it’s still getting new features, which makes it a really exciting, evolving piece of technology.
At first, it sounded a bit odd when Meta flaunted features that the Quest 3 was only destined to get later. In fact, even months after the headset’s release, this is still the case, with native MR features expected to drop sometime next year.
But the great news is that we’re getting goodies along the way! And by we, I mean developers. Which still includes us, but at a later point. Look, what I mean to say is that apps and games on the Quest 3 can now get even cooler!
What are the new features on the Quest 3 and who are they for?
If that last bit sounded confusing, that’s because it was a completely intentional attempt at comedy. So let’s get serious and explain what’s going on. Basically, the v60 SDK for the Quest 3 has just been released, and it allows developers to utilize the following Quest 3 features:
- Inside-out body tracking (IOBT)
- Generative Legs
- Occlusion for MR via the Depth API
Now, the interesting bit about this update is that it’s actually three updates. That’s because the Quest platform supports native development, as well as Unity and, lastly, Unreal Engine.
So why is that important? Well, basically because the Unreal Engine update includes nothing beyond access to the Depth API. Will that change in the future? Maybe, probably even, but nothing is certain for the time being.
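To make that split a bit more concrete, here’s a tiny, purely illustrative Python sketch: a lookup table of which SDK flavor gets which v60 feature. The table simply mirrors what’s described above and assumes the native and Unity SDKs expose all three features; none of the names here are Meta’s actual API.

```python
# Illustrative only: a feature-availability table mirroring the v60 split described
# above. These are not real SDK calls, just a way to visualize who gets what.

V60_FEATURES = {
    "native": {"IOBT", "Generative Legs", "Depth API"},  # assumed: full feature set
    "unity": {"IOBT", "Generative Legs", "Depth API"},   # assumed: full feature set
    "unreal": {"Depth API"},                             # per the article: Depth API only
}

def can_use(sdk_flavor: str, feature: str) -> bool:
    """Return True if the given feature shipped for that SDK flavor in v60."""
    return feature in V60_FEATURES.get(sdk_flavor, set())

print(can_use("unreal", "Depth API"))  # True
print(can_use("unreal", "IOBT"))       # False
```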
Here’s the thing though: this isn’t just a software update that will allow your Quest 3 to suddenly do more on its own. These features are now available for developers to implement, either as updates to existing apps and games or in entirely new projects.
What do the new Quest 3 features mean for fans and VR gamers?
Inside-out body tracking (or IOBT, which I love saying in front of strangers at parties) basically allows the Quest 3 to get a more accurate idea of what you’re doing with your upper body, extending all the way to detailed movements of the shoulders and wrists.
So basically, this means that your Meta avatars and in-game characters can become a lot more expressive. But wait, there’s more! They can even get legs now, thanks to Generative Legs, which have been teased numerous times in the past.
Well, they’re here: legs! Jokes aside, this feature is way more cutting-edge than it sounds, but what can you do with it? Well, basically the Quest 3 can use hyper-advanced AI to think up legs for you, even though it isn’t actually tracking them.
Now, it’s not perfect: knee movements, for example, are hard for the system to estimate convincingly, but it’s still impressive. And do you know what this means?
It means that Meta has successfully established full body tracking on the Quest 3 without the need for any external trackers, all thanks to Generative Legs and my favorite word of the day: IOBT. Sweet!
The company has even come up with a term for that: Full Body Synthesis (which isn’t officially shortened to FBS yet, but let’s do that for them).
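If you want to picture how those two pieces fit together, here’s a rough, engine-agnostic Python sketch of the general idea: take the upper-body joints the headset can actually track and fill in plausible leg joints it can’t see. Every name and the naive leg "guess" below are made up for illustration; Meta’s real Full Body Synthesis relies on a trained model inside its SDKs, not a hard-coded heuristic.

```python
# Conceptual sketch of "full body synthesis": tracked upper body + generated legs.
# All joint names and the simple leg heuristic are hypothetical; the real feature
# uses a learned model inside the Meta SDKs, not anything this crude.

from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float
    y: float  # height above the floor, in meters
    z: float

def track_upper_body() -> list[Joint]:
    """Stand-in for IOBT: in reality these poses come from the headset's cameras."""
    return [
        Joint("head", 0.0, 1.70, 0.0),
        Joint("left_wrist", -0.40, 1.10, 0.25),
        Joint("right_wrist", 0.40, 1.15, 0.20),
        Joint("hips", 0.0, 0.95, 0.0),
    ]

def synthesize_legs(hips: Joint) -> list[Joint]:
    """Stand-in for Generative Legs: invent plausible leg joints below the hips."""
    knee_y = hips.y * 0.55
    return [
        Joint("left_knee", hips.x - 0.10, knee_y, hips.z),
        Joint("right_knee", hips.x + 0.10, knee_y, hips.z),
        Joint("left_foot", hips.x - 0.10, 0.0, hips.z),
        Joint("right_foot", hips.x + 0.10, 0.0, hips.z),
    ]

def full_body_pose() -> list[Joint]:
    """Combine tracked upper-body joints with synthesized legs into one skeleton."""
    upper = track_upper_body()
    hips = next(j for j in upper if j.name == "hips")
    return upper + synthesize_legs(hips)

if __name__ == "__main__":
    for joint in full_body_pose():
        print(f"{joint.name:>11}: ({joint.x:+.2f}, {joint.y:.2f}, {joint.z:+.2f})")
```

The real system obviously produces something far more natural than fixed offsets below the hips, but the division of labor is the same: IOBT supplies the tracked half of the skeleton, and Generative Legs fills in the rest.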
Now, the final new feature from this latest release is the Depth API. You might’ve seen the word Occlusion in your game settings before, usually as ambient occlusion, which uses shading to enhance the sense of depth in a 3D scene.
So what could it mean when applied to mixed reality? Well, simple: it allows rendered objects to be correctly hidden behind things in the real world, instead of always being drawn on top of them. And that sounds abstract, so let’s give an example:
Let’s say that there’s a rendered 3D object of a cat placed in the real world and then you move a table in front of it. Thanks to this nifty Depth API, the Quest 3 will be able to block out the parts of the cat that the table should be preventing you from seeing.
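For the curious, here’s a tiny, self-contained Python sketch of the core idea behind depth-based occlusion: for every pixel, compare the depth of the real world (which the headset measures) with the depth of the virtual object, and only draw the virtual pixel if nothing real is closer. The numbers below are invented for illustration; in practice the Depth API feeds a real depth map to the GPU through the engine.

```python
# Minimal illustration of depth-based occlusion: per pixel, the virtual object is
# only drawn where it is closer to the viewer than the real-world surface there.
# The depth values are invented; a headset would supply a real depth map.

# Depth in meters for a 1x6 strip of pixels.
real_depth = [3.0, 3.0, 1.2, 1.2, 3.0, 3.0]     # a table edge occupies the middle
virtual_depth = [2.0, 2.0, 2.0, 2.0, 2.0, 2.0]  # the rendered cat sits 2 m away

def occlude(real: list[float], virtual: list[float]) -> list[str]:
    """Return, per pixel, whether the virtual object or the real world 'wins'."""
    result = []
    for real_d, virt_d in zip(real, virtual):
        # Draw the virtual pixel only if no real surface is closer to the viewer.
        result.append("cat" if virt_d < real_d else "table")
    return result

print(occlude(real_depth, virtual_depth))
# ['cat', 'cat', 'table', 'table', 'cat', 'cat'] -> the table hides part of the cat
```

That per-pixel "who is closer" test is really all occlusion is; the hard part is the headset producing a good real-world depth map in the first place, which is exactly what the Depth API provides.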
In other words: these updates can really boost immersion when implemented in apps and games.
Are the new Quest 3 features available in any games or apps yet?
So, these tools have now become available for developers to check out. From what I can tell, there aren’t any apps or games taking advantage of these just yet. But Meta has given us a teaser regarding where we can expect them first:
- The VR fitness app Supernatural
- The slashy game Swordsman
- The comedic adventure Drunkn Bar Fight
These experiences are set to get an update with Full Body Synthesis (or FBS) support rather soon, so if you have any of them in your Meta Quest library, stay on the lookout for new updates.
While there’s no word on where the Depth API will be applied just yet, our list of the best Quest 3 MR games may give you a solid idea of where you can expect to see it in action.
The Quest 3 has a very bright future ahead of it, not only because of the introduction of these new features, but also because we’re expecting even more down the line. Have you noticed these going live for any of the apps or games that you own? Let me know in the comments!