
Using Quest 3 outdoors
The hacker’s guide
Don’t use your Quest 3 outdoors!
There are 2 good reasons why you shouldn’t use your Quest 3 outdoors:
1. You will look like a TOTAL muppet!
2. Meta don’t support this use case at all
WARNING: This could invalidate your warranty or damage your device, please be careful and use your discretion
Use your Quest 3 outdoors!
But of course, we are hackers and like to do the opposite of what we are told! There will come a day when the technology is fully ready for outdoor use and we want to be ahead of the game.
Life is fun on the bleeding edge of technology where everything breaks and your elderly neighbours wonder what the hell you are doing waving your arms around with the weird plastic thing on your head.
Arrghhh the light!
The critical things to keep out of the sun are the pancake lenses that you look through. These act like magnifying glasses, and if strong sunlight hits them it can burn the display.
If you keep the headset on your head when it’s not stowed away in a bag, that should minimise this type of problem. At the time of writing it has just passed the winter solstice here in Scotland, so the light is very low, hopefully as safe as it gets for testing these devices outdoors.
Another thing is that there are infra-red sensors on the controllers, and strong sunlight may interfere with these. I haven’t tested this enough yet.
If you want a belt and braces approach, you could use it in a forest area, in Scotland, in winter, for the ultimate in low risk!
If you go on YouTube you will see a bunch of people who live in hot places using theirs out in the sun, saying they had no problems. Maybe they own Quest repair companies?


Get a ‘room’
At the time of writing, 4th Jan 2024, we are on Quest OS V60. For many applications we need the ‘spaces setup’ if we want to generate a 3D mesh of our environment to use as a collision mesh and/or for occlusion. The Quest generally demands that it is used in ‘rooms’.
One of the assumptions Meta have about a ‘room’ is that it will have a flat floor and ceiling. I had major problems in the hilly forest where I first tried this: I wanted a hilly floor as a collision mesh, but it only seems to accept mostly flat floors, with the odd irregular object on them, like you would have in a ‘room’.
Also the maximum ‘room’ size is 10m x 10m, so you might need to factor that in too; if you have plans for a larger collision mesh, that won’t be possible.
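If you’re heading outdoors with a collision mesh in mind, it’s worth sanity-checking your scan against those ‘room’ rules before you build anything on top of it. Here’s a rough sketch in C++ of the kind of check I mean. The SceneVert type, the flatness tolerance and the checks themselves are my own guesses at how Meta judge a ‘room’, not anything from their SDK:

```cpp
// Rough sanity check of a scanned floor against the 'room' rules above.
// SceneVert, the tolerance and the checks are assumptions, not Meta's API.
#include <algorithm>
#include <cstdio>
#include <vector>

struct SceneVert { float x, y, z; };   // metres, y is up, app space

struct RoomRuleResult {
    bool floorFlatEnough;   // hilly floors seem to get rejected
    bool withinRoomLimit;   // 'rooms' max out at roughly 10m x 10m
};

RoomRuleResult CheckScan(const std::vector<SceneVert>& floorVerts,
                         float flatnessToleranceM = 0.2f,   // guessed tolerance
                         float maxRoomSizeM = 10.0f) {
    RoomRuleResult r{false, false};
    if (floorVerts.empty()) return r;

    float minX = floorVerts[0].x, maxX = minX;
    float minY = floorVerts[0].y, maxY = minY;
    float minZ = floorVerts[0].z, maxZ = minZ;
    for (const auto& v : floorVerts) {
        minX = std::min(minX, v.x); maxX = std::max(maxX, v.x);
        minY = std::min(minY, v.y); maxY = std::max(maxY, v.y);
        minZ = std::min(minZ, v.z); maxZ = std::max(maxZ, v.z);
    }
    // Height variation across the floor: a hillside fails, a living room passes.
    r.floorFlatEnough = (maxY - minY) <= flatnessToleranceM;
    // Horizontal extent of the scan against the 10m x 10m 'room' limit.
    r.withinRoomLimit = (maxX - minX) <= maxRoomSizeM &&
                        (maxZ - minZ) <= maxRoomSizeM;
    return r;
}

int main() {
    // A 12m stretch of forest floor that rises ~1.5m: fails both rules.
    std::vector<SceneVert> forest = {{0, 0, 0}, {6, 0.7f, 3}, {12, 1.5f, 6}};
    RoomRuleResult r = CheckScan(forest);
    std::printf("flat enough: %d, within 10m limit: %d\n",
                r.floorFlatEnough, r.withinRoomLimit);
}
```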
Another brick in the wall
Quest update V60 is also adamant that we have some walls in our ‘room’. ‘Room’ scanning takes way longer than usual because it won’t complete until it finds walls; eventually it gives up, throws a ‘Walls not found’ message and tells you to place them manually. At this point you just have to position them around the outer bounds of the ‘room’.
I’m wondering if Meta are relying on walls as a guaranteed, assumed part of any mixed reality experience on their platform at the moment, so that if your experience has gameplay elements needing walls, you can be sure they will be available?
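If your own experience leans on that assumption, it might still be worth checking at runtime. A minimal sketch, assuming you’ve already pulled the room elements and their semantic labels out of the scene model into a simple list; the RoomElement type and the “WALL_FACE” label are how I’d expect the data to be tagged, not something lifted from the docs:

```cpp
// Minimal sketch: verify the scene model actually contains walls before
// enabling wall-dependent gameplay. RoomElement and the "WALL_FACE" label
// are assumptions about how the room data comes back, not Meta's API.
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

struct RoomElement {
    std::string semanticLabel;   // e.g. "FLOOR", "CEILING", "WALL_FACE"
};

bool HasWalls(const std::vector<RoomElement>& roomElements) {
    return std::any_of(roomElements.begin(), roomElements.end(),
                       [](const RoomElement& e) {
                           return e.semanticLabel == "WALL_FACE";
                       });
}

int main() {
    // Outdoors you may only have a floor plus whatever walls you were
    // forced to place manually.
    std::vector<RoomElement> outdoorRoom = {{"FLOOR"}, {"WALL_FACE"}};
    std::printf("walls present: %d\n", HasWalls(outdoorRoom));
}
```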
Push the Boundaries
In Mixed Reality, I’d argue that ‘boundaries’, i.e. the safety system Oculus used for years in VR to stop you banging into real walls you couldn’t see, just aren’t needed any more. But of course Meta’s lawyers probably think otherwise. For now, in Mixed Reality they do nothing but get in the way when you are outside and want to step into your ‘room’.
As of V60 of the software, as a ‘developer’ (anyone can be one) you can turn boundaries off, but then you also lose the ability to record mixed reality: the real world goes black in the recording, even though I could see the mixed reality whilst I was playing. So turning off the boundaries kills one of the major features I need as a blogger. For now I simply set the boundaries as far away as possible.
Depth API beef
If you are using the Depth API (an experimental feature as of V60), i.e. you want to hide (occlude) virtual objects when real-world objects cover them up, there are problems here too.
Basically, as you are in a ‘room’, the walls of this imagined room will occlude (hide) your objects if they go beyond the walls of the room.
You can use the occlusion depth bias shader to cheat things a bit, but I couldn’t find a way to remove the walls from the calculation while keeping everything else. As far as I can tell this really limits the Depth API for outdoor use?
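To make the problem concrete, depth-based occlusion boils down to a per-pixel comparison something like the sketch below. This is my own simplification of the idea, not Meta’s shader code, and the sign convention on the bias is an assumption:

```cpp
// Rough sketch of the per-pixel test that depth-based occlusion boils down
// to, including a bias term like the one the occlusion depth bias shader
// exposes. My own simplification, not Meta's implementation.
#include <cstdio>

// Returns true when the virtual fragment should be hidden because the
// sensed real world (including the imagined 'room' walls) is closer.
bool FragmentOccluded(float virtualDepthM,      // depth of your virtual object
                      float environmentDepthM,  // depth of the real world / room walls
                      float depthBiasM)         // assumed: positive bias makes occlusion less likely
{
    return virtualDepthM > environmentDepthM + depthBiasM;
}

int main() {
    // A virtual tree placed 15m away, behind a 'room' wall at 5m:
    std::printf("occluded: %d\n", FragmentOccluded(15.0f, 5.0f, 0.0f));  // hidden by the wall
    // Even a generous 1m bias doesn't help at that distance:
    std::printf("occluded: %d\n", FragmentOccluded(15.0f, 5.0f, 1.0f));  // still hidden
}
```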
The docs are here and are worth a read; they seem to be getting updated regularly.
The gloves are off
So I knew this probably wouldn’t work, but I thought as it’s so cold here at the moment I should try it!
I have to give credit to the guys who did the computer vision stuff for the hands interaction SDK: it ALMOST works when you have gloves on. It can recognise gloved hands, but when you do a gesture like a finger poke it loses the tracking. Maybe thinner gloves would work; mine are pretty extreme ski gloves.
Anyone from Meta reading this?
Here’s my wishlist…. 🙂
I’d love it if some kind of provision could be put in place to allow Quest 3 use that isn’t in a ‘room’. Perhaps they would need to segment MR into two categories, indoor and outdoor, so that you can design experiences that make some assumptions about the usage context.
Can we turn off boundaries without any caveats (like losing MR video capture)? Maybe we could sign away our legal rights to sue them and be allowed to take some calculated risks ourselves, stumbling around the real world in Mixed Reality.