Using The Force on Quest 3
Interaction SDK
I played with Meta’s Interaction SDK a few years ago, when the ability to control the device with your hands had just been released, but I found it a bit glitchy at times.
Now that we have full colour passthrough Mixed Reality, I thought it was due a revisit, and I was amazed how stable it is.
It is quite straightforward to set up your own custom hand gestures and use them to trigger functions.
It struck me that, at age 45, I could now be a Jedi! So I set to work…
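The wiring for that is roughly as follows. This is a minimal sketch assuming a custom hand pose (e.g. from a Shape Recognizer) driving an `ActiveStateSelector` from Meta's Interaction SDK; `ForcePush`, `_poseSelector` and `_rock` are hypothetical names I've made up for illustration.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Hypothetical example: an ActiveStateSelector (driven by a custom
// hand pose) fires callbacks that we use to shove a rock around,
// Jedi style.
public class ForcePush : MonoBehaviour
{
    [SerializeField] private ActiveStateSelector _poseSelector; // the custom gesture
    [SerializeField] private Rigidbody _rock;                   // virtual object to push
    [SerializeField] private float _force = 10f;

    private void OnEnable() => _poseSelector.WhenSelected += Push;
    private void OnDisable() => _poseSelector.WhenSelected -= Push;

    // Push the rock away from the headset when the pose is recognised.
    private void Push()
    {
        Vector3 dir = (_rock.position - Camera.main.transform.position).normalized;
        _rock.AddForce(dir * _force, ForceMode.Impulse);
    }
}
```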
Depth API
I used the Depth API here just to get the layering right between the real and virtual content.
You can see the edges are all a bit rough and ready, but it really helps the sense of immersion when my real hand passes in front of a virtual object that I’m interacting with.
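Enabling that occlusion is a one-liner once the depth textures are flowing. A sketch based on Meta's Unity-DepthAPI sample project; the class and enum names come from that sample and may differ between SDK versions, so treat them as assumptions.

```csharp
using Meta.XR.Depth;
using UnityEngine;

// Sketch based on Meta's Unity-DepthAPI sample: the occlusion
// controller swaps shader variants so virtual objects are hidden
// behind anything physically closer to the headset.
public class OcclusionToggle : MonoBehaviour
{
    [SerializeField] private EnvironmentDepthOcclusionController _occlusion;

    private void Start()
    {
        // Soft occlusion feathers the edges, which softens some of
        // the rough-and-ready borders around occluded objects.
        _occlusion.EnableOcclusionType(OcclusionType.SoftOcclusion);
    }
}
```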
I tried using it to have real-life trees occlude my digital rocks, but the level of glitchiness was a bit too much to really use in a production setting. Hopefully the Quest 4 will have a higher-resolution depth sensor, or some mad AI algorithms that can clean out the noise. I wonder if the Apple Vision Pro (I’m writing this in Jan ’24, before it has been released) could do a better job: seeing as Apple have had a LiDAR scanner on their phones for four years now, they have had more time to tune the hardware and software.
Scene API
I also used the Scene API, i.e. the ability to scan in your real-life space and understand it in ways that are useful for your app.
Here I used it to generate a 3D mesh that matched my office, which the app could use to bounce virtual content off. It works super well.
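The nice part is that once the scanned room mesh is in the scene with a `MeshCollider` on it (e.g. via the Global Mesh prefab that Meta's scene tooling can spawn), plain Unity physics does the rest. A minimal sketch; `BouncyRock` and the bounciness value are made up for this example.

```csharp
using UnityEngine;

// Minimal sketch: give a virtual rock a bouncy physic material so it
// rebounds off the Scene API's room mesh (which is assumed to already
// carry a MeshCollider). Ordinary Unity physics handles the collisions.
[RequireComponent(typeof(Rigidbody), typeof(Collider))]
public class BouncyRock : MonoBehaviour
{
    private void Awake()
    {
        var mat = new PhysicMaterial
        {
            bounciness = 0.8f,                       // lively rebound off real walls
            bounceCombine = PhysicMaterialCombine.Maximum
        };
        GetComponent<Collider>().material = mat;
    }
}
```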
Outdoor pursuits
For the outdoor scene, I did have some trouble, as Meta don’t support using your Quest 3 outside.
I had to turn off the Depth API and Scene API because of various problems I had; you can read more in my blog post: The Hacker’s Guide to Using Your Quest 3 Outdoors.
I’ll get you next time!
One thing that I would like to play with, is the ability for real and virtual objects to influence each other’s lighting.
I think it should be quite straightforward for virtual objects to cast basic shadows onto the real world using the brand-new Mixed Reality Utility Kit’s Passthrough Relighting feature.
I’m thinking we are probably some years away from having something comparable to the type of Global Illumination that we get in traditional CGI, where subtle light-bouncing effects pass back and forth between the real and virtual dimensions.
