Today I sat down with Victor, who handled the implementation of hand tracking in Richie’s Plank Experience. Check out what he has to say about bringing controller-free mode to our Oculus Quest build, and his thoughts on the future of VR…

What was the basic process to implement hand tracking?

First, we had to download the Oculus SDK, update it in the project and, of course, read through all the documentation before we started implementing the changes. Step 1 is to detect the user’s hands, and step 2 is to detect whether the user is using hands or controllers and write the handling for that. For example, in RPE when you pick up the controllers, we need to show the controllers in-game so the user can see they are using controllers; when they put them down, we need to make the controller models disappear and the hands appear. The hands themselves are handled by the Oculus SDK, so we don’t have to worry about that. The next step is to give the hands a good look. Originally they were blue and we changed them to white, but based on feedback we are looking at making them a little bluer.
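To give a rough idea of what that hands-versus-controllers handling looks like, here is a minimal Unity sketch (not RPE’s actual code). It assumes the Oculus Integration’s OVRInput API; the model objects are hypothetical placeholders.

```csharp
using UnityEngine;

// Minimal sketch: swap between controller and hand visuals based on which
// input device the Oculus runtime reports as active.
public class InputModelSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject controllerModels; // hypothetical: controller meshes
    [SerializeField] private GameObject handModels;       // hypothetical: OVRHand-driven hand meshes

    void Update()
    {
        // OVRInput.GetActiveController() reports Controller.Hands once the user
        // has put the Touch controllers down and the runtime is tracking hands.
        bool usingHands = OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        controllerModels.SetActive(!usingHands);
        handModels.SetActive(usingHands);
    }
}
```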

Next, we had to adapt all the interactions in the game to work with hands. For example, the lasers in the white room need to work with hand pinches instead of trigger presses. The good thing there is that Oculus has made it very easy for us to get the direction each hand is pointing, so the laser aims where you expect.
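A pinch-driven laser could be wired up along these lines: a minimal sketch assuming the Oculus Integration’s OVRHand pointer pose and pinch APIs; the laser visual and the OnLaserSelect message are hypothetical.

```csharp
using UnityEngine;

// Minimal sketch: drive a selection laser from the pointer pose the runtime
// exposes for each tracked hand, and treat an index-finger pinch as the
// "trigger press".
public class HandLaser : MonoBehaviour
{
    [SerializeField] private OVRHand hand;        // the tracked hand
    [SerializeField] private LineRenderer laser;  // hypothetical laser visual
    [SerializeField] private float maxDistance = 10f;

    void Update()
    {
        if (!hand.IsPointerPoseValid)
        {
            laser.enabled = false;
            return;
        }

        // PointerPose is the ray the runtime recommends for far-field pointing.
        Transform pointer = hand.PointerPose;
        laser.enabled = true;
        laser.SetPosition(0, pointer.position);
        laser.SetPosition(1, pointer.position + pointer.forward * maxDistance);

        // Index-finger pinch stands in for the controller trigger.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index) &&
            Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit, maxDistance))
        {
            hit.collider.SendMessage("OnLaserSelect", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```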

After that, we implemented the grabbing – in RPE we can grab things like the cake and doughnuts by pressing the triggers on the controllers. With hands it’s definitely more intuitive, but we are still handling it with pinching: when you pinch an object it gets picked up, and when you let go it drops. We are also looking at implementing grip detection, so when you close your grip around an object it gets grabbed more like it would in the real world.
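Here is a rough sketch of pinch-based grabbing in Unity, not RPE’s implementation: when the index pinch starts near a grabbable rigidbody we attach it to the hand, and when the pinch ends we release it. It assumes the Oculus Integration’s OVRHand API; the grab anchor and the “Grabbable” tag are hypothetical.

```csharp
using UnityEngine;

// Minimal sketch of pinch-based grab and release.
public class PinchGrabber : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private Transform grabAnchor;   // hypothetical point near the fingertips
    [SerializeField] private float grabRadius = 0.07f;

    private Rigidbody held;
    private bool wasPinching;

    void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && !wasPinching && held == null)
            TryGrab();
        else if (!pinching && wasPinching && held != null)
            Release();

        wasPinching = pinching;
    }

    void TryGrab()
    {
        // Pick up the first grabbable rigidbody within reach of the pinch.
        foreach (Collider c in Physics.OverlapSphere(grabAnchor.position, grabRadius))
        {
            if (c.CompareTag("Grabbable") && c.attachedRigidbody != null)
            {
                held = c.attachedRigidbody;
                held.isKinematic = true;                  // follow the hand while held
                held.transform.SetParent(grabAnchor, true);
                break;
            }
        }
    }

    void Release()
    {
        held.transform.SetParent(null, true);
        held.isKinematic = false;                         // let it drop
        held = null;
    }
}
```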

Finally, we ironed out all the other aspects of the game, for example disabling hand tracking for Nightmare mode and flying missions. We are exploring the possibilities of hand tracking for flying, but there’s no telling yet whether it’s feasible, because hand tracking is currently a little unstable. We’re still experimenting and hopefully we’ll have answers soon.

So after that we did some user testing, and then we were able to roll it out once we were happy with it. 

How long do you think implementation took from start to finish (including the work we did on the proof of concept when hand tracking was first announced in December)?

It took us about 2-3 weeks altogether. 

What were the challenges you faced along the way, or what was easy?

No major roadblocks, really – Oculus has made it really easy. The documentation is easy to understand and the code is clear and easy to read, so the main challenge is the user experience: how can we take what Oculus has provided and translate it into RPE so players know immediately what to do without reading any text or instructions?

One of the good things about hand tracking is that RPE involves a lot of elevator button pressing, which is about as intuitive as it gets. It’s a really good match, so we are really excited about it and we think users are going to like it.

How did we decide how hand tracking would be used in the game? We know pointing and pressing buttons is obvious, and picking up and grabbing is obvious; maybe it’s a process of testing when it comes to implementing it for flying?

We’ve had a try at it and decided it’s not ready; potentially we will come back to it when more gestures are included.

The flying is definitely less intuitive than grabbing and pressing, but there are potentially some workarounds. For example, using the pinch as thrust is one of the things we are experimenting with. We can build a good, convincing UI or model around the hand to let the player know what to do, basically using design language to tell the player “hey, use pinch here to add thrust”, something like that. There may also be other interaction methods we haven’t thought of yet.
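As a purely experimental sketch of the pinch-as-thrust idea (the force model, thrust values and flying rig are assumptions, not RPE’s code), assuming the Oculus Integration’s pinch-strength API:

```csharp
using UnityEngine;

// Experimental sketch: average pinch strength of both hands drives a throttle,
// and thrust is applied along the average pointing direction of the hands.
public class PinchThrust : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;
    [SerializeField] private OVRHand rightHand;
    [SerializeField] private Rigidbody playerBody;   // hypothetical flying rig
    [SerializeField] private float maxThrust = 20f;

    void FixedUpdate()
    {
        // GetFingerPinchStrength returns 0..1, so the average acts as a throttle.
        float throttle = 0.5f * (
            leftHand.GetFingerPinchStrength(OVRHand.HandFinger.Index) +
            rightHand.GetFingerPinchStrength(OVRHand.HandFinger.Index));

        // Thrust along the average direction the hands are pointing.
        Vector3 direction = (leftHand.PointerPose.forward + rightHand.PointerPose.forward).normalized;
        playerBody.AddForce(direction * throttle * maxThrust);
    }
}
```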

Hand tracking is a new technology and all the interactions are being explored right now, so best practices will surely develop over time, like in Elixir, where you use two hands to aim at a point on the floor to highlight it and then pinch with both hands to teleport there. That sort of interaction could become a new standard, and there will be many more. As it stands, people are already used to the laser pointer and pinch for interacting with the Home menu, since it’s been there since the release of the Oculus hand tracking beta in December, so using lasers in the white room in RPE shouldn’t be a problem for users. There might be some problems for people using a VR headset for the first time, who have never used hand tracking before and don’t know what to do in the white (setup) room in RPE, but given how easy the lasers are to use in Oculus Home, we think first-time users will pick it up in no time.

What are your general thoughts on the state of hand tracking now? Is it stable?

The tracking itself is a big aspect of it. Right now the tracking algorithm from Oculus is pretty solid: they use AI to predict your hand gestures and various techniques to eliminate uncertainty when your hands are in a position that’s difficult to track. Still, because it’s tracking from the headset, there are bound to be moments where it can’t see your hands. For example, if you cover one hand with the other, both will disappear; or when you are pressing buttons and your fingers are pointing completely away from the headset, it’s likely to lose tracking. Things like this can affect the user’s experience, so we’ve actually added our own logic to account for it. We know that when the fingers are pointing away, the user is most likely reaching for an elevator button, so we detect which button they want to press and help them press it if their hand loses tracking. Helper methods like that make the user experience better.
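An illustrative sketch of that kind of helper, not the actual implementation: if the hand’s tracking confidence drops while the fingertip was already near an elevator button, assume the player meant to press it. OVRHand.IsDataHighConfidence comes from the Oculus Integration; ElevatorButton and the fingertip transform are hypothetical.

```csharp
using UnityEngine;

// Sketch: press the nearest button on the player's behalf when tracking
// confidence drops mid-reach.
public class ButtonPressAssist : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private Transform fingertip;        // hypothetical index-tip transform
    [SerializeField] private ElevatorButton[] buttons;   // hypothetical button components
    [SerializeField] private float assistRadius = 0.05f;

    private bool assisted;

    void Update()
    {
        if (hand.IsDataHighConfidence)
        {
            assisted = false;          // normal tracking: buttons handle touches themselves
            return;
        }

        if (assisted) return;

        // Tracking just became unreliable: if the last known fingertip position
        // was already close to a button, press that button for the player.
        foreach (ElevatorButton button in buttons)
        {
            if (Vector3.Distance(fingertip.position, button.transform.position) < assistRadius)
            {
                button.Press();        // hypothetical API on the button
                assisted = true;
                break;
            }
        }
    }
}

// Hypothetical stand-in so the sketch is self-contained.
public class ElevatorButton : MonoBehaviour
{
    public void Press() { /* trigger the elevator */ }
}
```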

But in terms of stability, also with flying in mind: when you make rapid hand movements it’s really hard for the headset to pick them up every frame. Sometimes when you move really quickly or suddenly it can lose tracking, and when you are flying that has a big impact, because suddenly there’s jitter in your thrust and your brain will notice it. Stability issues like that are what we are trying to work out.
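One common way to hide that kind of jitter is to freeze the input while tracking confidence is low and smooth it the rest of the time. This is only an assumption about how it could be handled, not how RPE does it:

```csharp
using UnityEngine;

// Sketch: hold the last good throttle value during tracking dropouts and
// exponentially smooth it otherwise. The filter constant is an assumption.
public class SmoothedThrottle
{
    private float smoothed;

    // rawThrottle: 0..1 pinch-derived throttle; highConfidence: e.g. OVRHand.IsDataHighConfidence
    public float Update(float rawThrottle, bool highConfidence, float deltaTime, float smoothingPerSecond = 8f)
    {
        if (!highConfidence)
            return smoothed;   // freeze on tracking dropouts instead of jittering

        float t = 1f - Mathf.Exp(-smoothingPerSecond * deltaTime);
        smoothed = Mathf.Lerp(smoothed, rawThrottle, t);
        return smoothed;
    }
}
```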

I’ve had a few fans asking us whether hand tracking now opens us up to using the controllers for body tracking. Can this work?

No, because hand tracking can’t work simultaneously with controllers yet. Maybe soon!

What about body tracking with just the onboard cameras? Is this possible?

The upper body can be worked out fairly easily with an IK system, because we know where your hands are. We can use IK maths to work backwards to where your arms are likely to be, so that’s not a problem. The problem is your lower body, because we have no way of knowing where your feet are. In terms of supporting it with external cameras, I don’t know if Quest supports any of that. I think Facebook is working on a solution, but until we hear more about that (which is exciting!) we don’t have any plans to include full-body tracking at the moment. When Facebook releases an SDK for that, as well as face tracking, we can look into it.
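For illustration, a minimal two-bone IK solve (law of cosines) shows how an elbow position can be worked backwards from a tracked hand. This is not RPE’s avatar code; the bone lengths and bend hint are assumptions.

```csharp
using UnityEngine;

// Sketch: place an elbow between a known shoulder and a tracked hand target
// using the law of cosines, bending toward a supplied hint direction.
public static class TwoBoneIK
{
    public static Vector3 SolveElbow(Vector3 shoulder, Vector3 handTarget,
                                     float upperArmLength, float forearmLength,
                                     Vector3 bendHint)
    {
        Vector3 toTarget = handTarget - shoulder;
        // Clamp the reach so a solution always exists.
        float d = Mathf.Clamp(toTarget.magnitude,
                              Mathf.Abs(upperArmLength - forearmLength) + 1e-4f,
                              upperArmLength + forearmLength - 1e-4f);

        // Law of cosines: how far along the shoulder->hand axis the elbow sits,
        // and how far it lifts off that axis.
        float along = (d * d + upperArmLength * upperArmLength - forearmLength * forearmLength) / (2f * d);
        float lift = Mathf.Sqrt(Mathf.Max(upperArmLength * upperArmLength - along * along, 0f));

        Vector3 axis = toTarget / d;
        // Push the elbow off the axis toward the hint (e.g. down and behind the body).
        Vector3 bendDir = Vector3.ProjectOnPlane(bendHint, axis).normalized;

        return shoulder + axis * along + bendDir * lift;
    }
}
```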

Face tracking in VR? What is so cool about it?

It’s cool because you can completely reconstruct your face in a virtual space and talk face to face like you would in the real world. Right now one of the major hindrances to using VR socially is that even though you can have a cool avatar, it lacks facial expressions. With facial tracking we will be able to see subtle movements and facial expressions in VR, which is very powerful. People will be able to meet face to face and talk to anyone, anywhere, the same way they normally would. That could be a huge game-changer, because people could be anywhere in the world and still collaborate on their work.

Where do you think hand tracking and VR will go in the future?

One of the things I think is really cool, and want to see more of, is productivity tools, things like GravitySketch on Quest. Tilt Brush and Google Blocks are really cool experiences and are precursors of bigger things to come. I think it could mean major advancements in the way we produce things in the future, for example assets, content and so on. One of the signs that the Industrial Revolution was complete back in the 1800s is that people started building machines with machines – once we start building VR with VR, it will kick things up to the next level.

I saw a Minority Report-style demo of a person programming or working in VR, typing and using gestures to drag and drop their windows. Could you see yourself working like this in VR?

Yes, definitely – but I think the future of programming is not so much typing up code as something more like the Blueprint system in Unreal, where you drag and drop different functions to form your program. That’s much more doable in VR, because typing on a virtual keyboard isn’t quite there yet. Visual scripting is definitely feasible in VR, and probably actually easier: you can sort the nodes, pin them to a board, and so on.

What are the next steps for RPE with hand tracking?

Polish! We are going to continue polishing the user experience so that players feel good about using hand tracking. It does come with some limitations – there are no haptics, and the suite of interactions is totally different to traditional controllers. So there is a lot to learn and work out, but we’ll keep working towards players being able to use hand tracking intuitively and getting excited about it. And of course, the flying will be really cool, but until we can say for sure that we have something magical happening, rather than the user having to work it out, we can’t be sure where that will go yet. Stay tuned!