
VR Hand Tracking with Oculus Quest and Oculus Link


Madhawa Perera

Hand gestures are one of the most intuitive interaction modes in human-computer interaction (HCI) and human-device interaction (HDI), especially with the latest trends introduced by extended reality (XR). (That said, gestural interaction has its own challenges and shortcomings, which remain an active research area.) This article aims to give you a quick hands-on experience with gestural interaction in Virtual Reality (VR).

A few months ago, Oculus started to support hand tracking on PC via Unity when using Oculus Quest + Oculus Link. This significantly improves iteration time for Oculus Quest developers: you no longer have to build and deploy your VR app just to test and debug it. This is a feature many VR developers had been waiting for. This brief article is a guide to getting started with hand tracking using Oculus Quest + Oculus Link. No coding skills are required to follow along; however, if you want to interact with objects and bring physics into your app, you will have to write your own scripts.
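As a taste of what such a script can look like, here is a minimal sketch of pinch detection using the `OVRHand` component from the Oculus Integration package. The script name and log message are my own; attach it to an OVRHandPrefab object (which we add later in this article):

```csharp
using UnityEngine;

// Hypothetical example script: attach to a GameObject that has an
// OVRHand component (e.g. the OVRHandPrefab used later in this article).
public class PinchLogger : MonoBehaviour
{
    private OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        // Only trust gesture data while the hand is actually tracked.
        if (hand == null || !hand.IsTracked)
            return;

        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength: {strength:F2}");
        }
    }
}
```

With Oculus Link connected, hitting Play and pinching your index finger and thumb together should print the pinch strength to the Unity Console.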

Self Rock-Paper-Scissors 😊

Before we get started let’s see the prerequisites,

  1. Oculus Quest VR Headset
  2. Oculus Link cable (optional but recommended)
  3. Unity Version 2019.3.X
  4. Follow my article on How to set up Unity for Oculus Quest Development and set up the Unity editor.

First, make sure you have installed the Oculus software on your PC (if not, download and install it). The installation instructions will guide you through connecting your Oculus Quest to your PC with the Oculus Link cable. If everything goes well, you should see a screen similar to Figure 1 below.

Figure 1

Make sure to enable ‘Unknown Sources’ in the Oculus software under Settings → General. See Figure 2 below.

Figure 2

N.B. Make sure to enable the ‘Auto Enable Hands or Controllers’ feature on the Oculus Quest itself (not in the Oculus software installed on your PC). On the Quest, go to Settings → See All → Device → Hands and Controllers and enable both the Hand Tracking and Auto Enable Hands or Controllers features by sliding the toggle buttons.


Spend some time completing these steps and then come back to the article. Once this is done, keep the Oculus software running on your PC (don’t close it), and let’s get into Unity.

As in the prerequisites, I assume that you have read my How to set up Unity for Oculus Quest Development article and have already set up Unity.

In Unity, go to the Scenes folder in the Project panel, right-click, then go to Create → Scene and name it as you like. I will name it HandTrackingScene.

Then delete the default camera and add the OVRCameraRig prefab with its position set to X: 0, Y: 0, and Z: 0. (All these steps are explained in detail in my article on How to set up Unity for Oculus Quest Development.) Once finished, your current window should look like Figure 3 below.

Figure 3

One thing you need to make sure of is that under OVRCameraRig you have set Hand Tracking Support to Controllers And Hands or Hands Only. Refer to Figure 4 below.

Figure 4
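If you want an extra check that the headset actually has hand tracking available at runtime, a small sketch like the following can help. It assumes the Oculus Integration package; `OVRPlugin.GetHandTrackingEnabled()` reports the device-side setting discussed earlier, and the script name is my own:

```csharp
using UnityEngine;

// Hypothetical example script: logs a warning at startup if the headset
// reports that hand tracking is disabled (e.g. the device toggle is off).
public class HandTrackingCheck : MonoBehaviour
{
    void Start()
    {
        if (!OVRPlugin.GetHandTrackingEnabled())
        {
            Debug.LogWarning("Hand tracking is not enabled on this device. " +
                             "Check Settings → Device → Hands and Controllers on the Quest.");
        }
    }
}
```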

Next, let’s create a plane to make things look a little better. (This step is not necessary, so you can skip it and jump to the sub-topic ‘Add OVRHandPrefab and configure’.) Refer to Figure 5 if you want to create a plane.

Figure 5

Create a material and add it to the plane. This step is again optional if you decided not to create a plane. Refer to the Figure 6 GIF if you want to follow this step.

Figure 6

Add OVRHandPrefab and configure

Next, search for OVRHandPrefab in the Project panel and drag it onto both the LeftHandAnchor and RightHandAnchor objects. Refer to Figure 7 below, which shows these steps in a GIF.

Figure 7

After that, change the Hand Type, Skeleton Type, and Mesh Type on the OVRHandPrefab that was added under the RightHandAnchor. These usually default to the left hand, so there is no need to change them on the OVRHandPrefab under the LeftHandAnchor; you can double-check if you want. Refer to Figure 8 below.

Figure 8

After that, I changed the OVRHandPrefab material to make it red 🙂 (just for fun). To do that, select both OVRHandPrefab objects, then in the Inspector panel go to Element 0 and change it to any material you like. I already had a red material, so I selected that. Refer to Figure 9 below.

Figure 9

Once you reach this point, you are all set to see hand tracking on your Oculus Quest connected with a Link cable. While your Oculus Quest is connected to your PC through the Link cable, hit the Play button. Refer to Figure 10.

Figure 10

Congratulations! You should be able to see your hands in real time. This saves a lot of time, since you no longer need to build and deploy your app to the Oculus Quest to test it.
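While in Play mode, you can also log how confident the runtime is about each hand’s pose. Here is a hedged sketch using `OVRHand`’s `IsTracked` and `HandConfidence` properties from the Oculus Integration package; the script name and field names are my own, and the two OVRHandPrefab instances are assigned in the Inspector:

```csharp
using UnityEngine;

// Hypothetical example script: attach anywhere in the scene and drag the
// left/right OVRHandPrefab objects onto the serialized fields.
public class HandConfidenceLogger : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;
    [SerializeField] private OVRHand rightHand;

    void Update()
    {
        LogHand("Left", leftHand);
        LogHand("Right", rightHand);
    }

    private void LogHand(string label, OVRHand hand)
    {
        // HandConfidence is Low or High; only meaningful while tracked.
        if (hand != null && hand.IsTracked)
            Debug.Log($"{label} hand confidence: {hand.HandConfidence}");
    }
}
```

Watching the confidence drop to Low as your hands leave the headset’s tracking volume is a quick way to build intuition for where gestures will be reliable.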

You can find a similar scene set up inside the Assets/Oculus/VR/Scenes/ folder. Double-click the HandTest scene; it is set up the same way as I described above. This functionality is only supported in the Unity Editor for now, and for me it improved iteration time a lot when developing VR applications for Oculus Quest. I thought it would be helpful to everyone else, hence this brief article.

For further learning, go through the well-known Mini-Train example (HandsInteractionTrainScene) provided by Oculus, which you can find inside the Assets/Oculus/SampleFramework/Usage/ folder. You can find more details in the Oculus documentation.

I hope this article encourages you to explore more of this fascinating area of Human Device Interaction (HDI), where the digital world is starting to understand human gestural intent.

If you like the article and the content, you can buy me a virtual coffee 😊

Cheers!

Source: https://arvrjourney.com/vr-hand-tracking-with-oculus-quest-and-oculus-link-35568eb3d6f4?source=rss—-d01820283d6d—4
