
How to Design User Interfaces for Oculus Quest Hand Tracking Applications

Dejan Gajsek

In late 2019, the introduction of Oculus Quest hand tracking dominated conversation in the virtual reality (VR) industry. While this experimental feature is still very much in its infancy, there is considerable hype among VR app developers, as it may open the door to exciting new possibilities in mixed reality technology.

Oculus Quest hand-tracking support seeks to give VR developers more freedom to create and express themselves through natural gestures and simple interactions. Over time, app developers may be able to leverage this feature to make virtual reality a more accessible, realistic experience for users.

Most AR headsets and more advanced VR headsets, such as the Oculus Quest, strive for so-called six degrees of freedom (6DoF), where you can freely move your hands and body in your space.

The Oculus Quest comes with Touch Controllers whose movements are tracked by sensors in the headset itself; no wires necessary. It's one of the most exciting features of the device, and it's not surprising that both Quest models (64GB and 128GB) have sold out from the Oculus Store. If you've ever thought about getting an Oculus Rift or Rift S, consider getting a Quest first.

Watch the on-demand webinar about UI Design in VR

To experience this groundbreaking feature, try a few of the demos that let you use your real hands.

In this article, we'll explore the impact of Oculus Quest hand tracking, look at current best practices, and consider how you can get a grip on this potentially groundbreaking technology to create a better user interface (UI).

Consider this article practical advice for developing your UX/UI design. Keep in mind that hand tracking is still in its early days, and there's no "one size fits all" approach. Apart from the limitations of the system, the virtual experience of an application differs from user to user. That said, this content is a good starting point.

Attend a free “Designing UI for Hand Tracking” Workshop and learn the principles on actual use cases

Table of Contents

Chapter 1: What Is Hand Tracking?
Chapter 2: New Hardware Demands — New Paradigms?
Chapter 3: The Three Primary Ways of Interacting with Hand Tracking
Direct Manipulation
Hand Rays
Gesture Recognition
Chapter 4: Alternative Hand Tracking Ideas
Chapter 5: How to start designing hand tracking applications
Chapter 6: The Future of Hand Tracking Design (Q&A)

So, the obvious question:

What is Oculus Quest controller-free hand tracking?

Hand tracking is a feature on the Oculus Quest head-mounted display (HMD) that enables people to use their hands as a viable input method when using the VR device. Users can make simple gestures with their hands, such as pinching, holding, or dragging to perform tasks or actions in the VR environment.

Hand tracking is not flawless, though. There are three characteristic types of tracking error:

  1. Jitter is a misalignment between the virtual hands and your actual hands. On a lot of AR platforms, the virtual hands are represented in abstract ways, such as shapes, clouds, or sparkles. This design is intentional: it hides the jitter effect so users don't pick up on any obvious jittering. (A smoothing sketch follows this list.)
  2. Drift is the feeling that you are moving in the virtual world even while you stand still. Objects appear to move around you for no reason because of a constant offset between where the computer thinks your hand is and its real location. Bad lighting, or too much light coming into the headset, is a primary cause of drift issues.
  3. Grotesque Teleports are when a part of the virtual hand, like a finger, randomly appears inside the hand, or even completely disjointed somewhere else in the room. It may only last for a frame; these errors happen when the technology misreads the environment and makes bizarre misjudgments. (It's also a really good name for a symphonic progressive rock band.)
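
Beyond abstract visuals, developers can also mask jitter in code by low-pass filtering the tracked joint positions before rendering them. Here is a minimal sketch in Python, assuming the tracking runtime hands you raw per-frame 3D joint positions as arrays; the `alpha` value and the 21-joint layout are illustrative assumptions, not Oculus specifics:

```python
import numpy as np

class JointSmoother:
    """Exponential moving average that damps frame-to-frame jitter
    in tracked joint positions, at the cost of a little latency."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0 < alpha <= 1; lower = smoother but laggier
        self.state = None    # last filtered positions, shape (num_joints, 3)

    def update(self, raw_positions):
        raw = np.asarray(raw_positions, dtype=float)
        if self.state is None:
            self.state = raw  # initialize on the first frame
        else:
            # Blend the new measurement with the previous filtered value.
            self.state = self.alpha * raw + (1.0 - self.alpha) * self.state
        return self.state

# Usage: feed in the joint positions reported each frame.
smoother = JointSmoother(alpha=0.3)
frame = np.random.rand(21, 3)        # stand-in for 21 tracked hand joints
stable_positions = smoother.update(frame)
```

A fixed alpha trades responsiveness for stability; adaptive filters such as the One Euro filter are a common refinement of this idea.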

The critical aspect is that it is fully articulated tracking: the system can determine where your hands are in virtual space and what every finger and finger joint is doing at any moment. This design facilitates natural interactions, which give the user a heightened sense of presence and a more immersive, engaging experience.

Tracked motion controllers and the HoloLens 1 can tell you roughly where your hands are, but not what your fingers are doing. Similarly, Valve's Knuckles controllers are not hand tracking technology, because they only measure each finger along a single axis of movement.

You can see good examples of fully articulated hand tracking in Leap Motion and HoloLens 2, in smartphone SDKs such as MediaPipe and ManoMotion, and in VR gloves such as Manus Prime and VRgluv.
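
Of the SDKs above, MediaPipe is the easiest to experiment with, since its Python package runs on an ordinary webcam. The sketch below prints the wrist position of each detected hand; the output, 21 3D landmarks per hand, is conceptually similar to the point sets that headset trackers produce:

```python
import cv2
import mediapipe as mp

# Fully articulated tracking: 21 3D landmarks per detected hand.
hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
capture = cv2.VideoCapture(0)  # default webcam

while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            wrist = hand.landmark[0]  # landmark 0 is the wrist
            print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f}, {wrist.z:.2f})")
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

capture.release()
```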

Now, Oculus Quest hand tracking joins that list, which presents excellent opportunities for app developers.

Good things happen when the product and design teams at Facebook Reality Labs put their heads together. What originally started as a research project has resulted in an innovative new paradigm for virtual reality input.

The software behind Oculus Quest hand tracking incorporates deep learning, allowing the computer to determine the position of the user's fingers using only the Quest's native monochrome cameras. The technology creates a group of 3D points that map to the user's hands and fingers, enabling it to represent movement accurately in the VR environment.

While the new Oculus feature is a leap forward for mixed reality, it poses a challenge for developers and users alike. Unexplored technology comes with no established playbook, so it forces designers to think outside the box as they come up with new ways of designing software for this unique hardware.

The last major shift of this magnitude was when the world moved to a mobile-centric reality, swapping their desktop computers for smartphones and tablets. Developers had to forget about the mouse-based environment, and instead, think about touchscreens, as clicking was replaced by swiping.

Now, in the realm of VR development, app designers must figure out how to get the most out of the Oculus Quest technology using hands instead of controllers.

When using Oculus Quest hand tracking, there are three primary ways you can interact with the virtual environment: direct manipulation, hand rays, and gesture recognition.

Let’s take a closer look at each one.

Direct manipulation is a virtual reality input model where the user reaches out with their hands to touch and interact with holograms. Objects behave as they would in reality, so it's a fun and easy way of learning how to control a virtual world.

You can press buttons, pick up objects, scroll windows, and interact with 2D content as if it were a virtual touchscreen. Direct manipulation is a near input model, which means it works best when you want to interact with content within arm's reach.
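
As a concrete illustration of the "press a button" case, a direct-manipulation button usually boils down to a fingertip-versus-surface test. Here is a minimal sketch, assuming you already have fingertip and button positions in meters; the depth and radius thresholds are illustrative guesses, not Oculus-specified values:

```python
import numpy as np

def is_button_pressed(fingertip, button_center, button_normal,
                      press_depth=0.005, radius=0.02):
    """True when the fingertip has pushed past the button face.

    fingertip, button_center: 3D positions in meters.
    button_normal: unit vector pointing out of the button face.
    press_depth: how far (m) the finger must sink before it counts.
    radius: how far off-center (m) a press is still accepted.
    """
    normal = np.asarray(button_normal, dtype=float)
    offset = np.asarray(fingertip, dtype=float) - np.asarray(button_center, dtype=float)
    depth = -np.dot(offset, normal)                    # penetration past the face
    lateral = np.linalg.norm(offset + depth * normal)  # off-axis distance
    return depth >= press_depth and lateral <= radius

# Button facing the user along -z; the fingertip has sunk 7 mm into it.
print(is_button_pressed([0.0, 1.0, 0.507], [0.0, 1.0, 0.5], [0.0, 0.0, -1.0]))  # True
```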

Challenges of Direct Manipulation

  1. Unintentional gesticulations come from subconscious movement or common body language. People moving their hands naturally while talking might cause the software to do something in response.
  2. Intentional operations in your world may infringe on the user's ability to use certain poses for other purposes. For example, if the user can activate a button with a pushing movement but uses a similar movement elsewhere, the system can get confused.
  3. Intentional gestures with inherent meaning, like the OK sign or a thumbs up, already have ingrained meanings in many cultures. Designers must avoid repurposing these in their user interface, because users expect a close one-to-one mapping between the meaning they already understand and the meaning the software will perceive.
  4. Intentional meaningless gestures are specific poses or movements that don't have any preconceived cultural meaning and are only made deliberately. For example, Quest uses an eye-pinch gesture that is unlikely to be confused with any natural body language.

While the technology advances rapidly, the three tracking errors described earlier (jitter, drift, and grotesque teleports) remain persistent issues with direct manipulation.

The Importance of Interaction Resolution

Together, these three problems are the key factors that impact interaction resolution, which is the minimum object scale that users can comfortably engage in the virtual world without encountering any detrimental visual or performance issues.

If you have an externally tracked controller, like a Quest controller, you have incredible accuracy. You may find it possible to comfortably interact with items the size of a pinhead in the virtual environment. However, with hand tracking, the interaction resolution is not as fine because of the issues above. Realistically, any object smaller than an inch will be hard to engage with.

While the quality and user interface of Oculus Quest hand tracking are impressive, the technology still has room for improvement. The quality will get better over time, but for now, there are limitations that designers must accept.

Besides object size, you must also think about arm length when designing a virtual world. With direct manipulation, users can only interact with objects they can reach. More to the point, consider your shortest-armed user and make sure not to overpack your virtual world with elements that may be out of reach.

A hand ray is a virtual reality design concept where the user can “shoot” a beam from their hand at a distant object, and then use gestures to exercise control over that object from afar.

(Image source: microsoft.github.io)

This feature enables you to interact with the user interface in many ways, such as flipping switches, pushing buttons, or picking up items from across the room. You even get some natural tactile feedback as you touch your finger and thumb together to activate a control on an object.

You can anchor hand rays based on your head position, which adds a degree of stability, before adjusting the angle through the broader range of motion of your arms and hands.
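
In code, a hand ray reduces to an origin and a direction, with the anchor point supplying the stability described above. Here is a minimal sketch, assuming the runtime gives you a shoulder position (estimated from the head), a hand position, and fingertip positions; the 1.5 cm pinch threshold is an illustrative guess:

```python
import numpy as np

def hand_ray(shoulder_pos, hand_pos):
    """Aim a ray from the body through the hand: anchoring at the
    shoulder keeps the pointer steadier than using the hand alone."""
    origin = np.asarray(hand_pos, dtype=float)
    direction = origin - np.asarray(shoulder_pos, dtype=float)
    return origin, direction / np.linalg.norm(direction)

def is_pinching(thumb_tip, index_tip, threshold=0.015):
    """Treat thumb and index tips within ~1.5 cm as a 'select' action."""
    gap = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
    return gap < threshold

origin, direction = hand_ray(shoulder_pos=[0.2, 1.4, 0.0], hand_pos=[0.4, 1.2, 0.5])
selecting = is_pinching([0.41, 1.21, 0.51], [0.42, 1.21, 0.50])  # True
```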


Challenges of Hand Rays

Typically, hand rays don't offer the same precision in a virtual environment that a mouse pointer offers on a computer.

Gesture recognition is a virtual reality design concept where the computer analyzes the pose of the user's hand, based on the position and shape of the fingers and palm, and then triggers a corresponding action. For example, the computer may recognize common signs, like the peace sign or the rock 'n' roll sign.

(Image source: Medium / Vincent Mühler)
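
To make this concrete, a simple recognizer can be rule-based: check which fingers are extended and match the set against known signs. The sketch below assumes MediaPipe-style hand landmarks (21 points, wrist at index 0); the curl ratio is an illustrative tuning value, and production systems typically use trained classifiers instead:

```python
import numpy as np

# MediaPipe-style indices: tip of index=8, middle=12, ring=16, pinky=20;
# each finger's base knuckle (MCP joint) sits three indices before its tip.
FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}

def extended_fingers(landmarks, curl_ratio=1.3):
    """A finger counts as extended when its tip is clearly farther
    from the wrist than its base knuckle is."""
    points = np.asarray(landmarks, dtype=float)
    wrist = points[0]
    names = set()
    for name, tip in FINGER_TIPS.items():
        knuckle_dist = np.linalg.norm(points[tip - 3] - wrist)
        tip_dist = np.linalg.norm(points[tip] - wrist)
        if tip_dist > curl_ratio * knuckle_dist:
            names.add(name)
    return names

def classify_gesture(landmarks):
    up = extended_fingers(landmarks)
    if up == {"index", "middle"}:
        return "peace_sign"
    if up == {"index", "pinky"}:
        return "rock_n_roll"
    return "unknown"
```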

Challenges of Gesture Recognition

For designers, gestures can be tricky to deal with, as there is a lot of room for error, so you must think carefully about when a gesture is justified.

“Don’t put the burden on yourself to build out a whole game or build out a whole product. Build many small things, no matter how silly they may seem. You’ll be surprised at just how much you learn.”

When are gestures justified?

Designers must incorporate certain gestures to improve the user experience of their Oculus Quest hand tracking application. A prominent example of an essential gesture is the system escape. Users must have access to this ability at all times.

When the inherent meaning of a gesture matches the outcome, it's worth including the gesture in your design. For example, to take a screenshot, the user can bring the tip of each thumb toward the forefinger of the opposite hand to create a square shape, as if looking through the frame of a camera.

A further example of a justifiable gesture in a hand tracking application is when the gesture is the sole interaction in the interface. For instance, in a virtual darts game, the only interaction the user performs is the motion of throwing a dart.

The latest developments in Oculus Quest hand tracking have got the VR/AR community talking, and many UI designers are chomping at the bit to push the boundaries of this feature.

In doing so, they’ll quickly find the truth:

We need new design paradigms.


Virtual reality remains limited in many ways, as designers have yet to discover the paradigms that will maximize the potential of hand-tracking technology.

Here are just some ideas that you can use when designing Oculus Quest hand tracking applications:

  1. Swipe Keyboard enables you to swipe your hands around a virtual keyboard. The software guesses what you are trying to write using machine learning, and the completed text appears on the screen as you continue swiping.
  2. Flying Hands makes it possible for you to interact with items that are far away. It is like a modified hand ray, where your hands are at the end of the ray, allowing near-field direct manipulation of objects that are at a distance.
  3. Range/Size Amplification is another distance tool where the virtual reality hands are much bigger than the user's actual hands. This allows for large-scale manipulation, so you can control huge objects and entire scenes with relatively minimal movement (see the sketch after this list).
  4. Hand Throwing is a fun concept where you can throw your hand away from you, and then use gestures to crawl the detached hand around the floor or other surfaces to get to objects.

The list goes on and is only limited by the imagination of the designer. To get a feel for the possibilities of new paradigms in VR design, consider the examples below.
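
The range/size amplification idea from the list above has a particularly compact core: scale the hand's displacement around a pivot point. Here is a minimal sketch, where the pivot location and gain are illustrative choices:

```python
import numpy as np

def amplified_hand_position(real_hand, pivot, gain=4.0):
    """Scale hand movement around a pivot (e.g. a point near the chest)
    so small real motions become large virtual ones."""
    real_hand = np.asarray(real_hand, dtype=float)
    pivot = np.asarray(pivot, dtype=float)
    return pivot + gain * (real_hand - pivot)

# A 10 cm reach to the right becomes a 40 cm virtual reach with gain=4.
virtual = amplified_hand_position([0.10, 1.20, 0.30], pivot=[0.0, 1.20, 0.30])
print(virtual)  # [0.4 1.2 0.3]
```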

Some of these ideas may seem novel at best, but Daniel Beauchamp, the head of VR/AR at Shopify, explains that this is an essential road to evolution.

“One of the best ways to unlock new and powerful ideas is to build upon silly ideas,” says Beauchamp.

To get to the groundbreaking ideas in VR/AR, you must first go through a lot of bad ideas. As you do this, think about how you can iterate on hand tracking ideas faster. After you set up the Oculus Quest for development, shift your mind into researcher mode.

Instead of trying to brainstorm a winning product with hand tracking instantly, you should play around with the technology, coming up with fun ideas that users enjoy. In doing this, you can get familiar with the capabilities of the tool, and may stumble upon interactions and concepts that you can apply to a more purposeful project.

Beauchamp claims he wishes more VR developers took this approach, as the little ideas can lead to huge breakthroughs.

If someone walks into your office and sees a pair of fake hands on your desk, they may think you’re taking the silly approach a little too far. However, VR app developers should consider getting themselves some quality physical models.

It’s much easier to formulate ideas and explain your concepts to other team members when you have a set of hands. You can draw some axes on the back of the hands, which can make it easier to demonstrate different motions and suggested gestures with greater accuracy.

It’s vital to discuss design concepts and new ideas. Your vision for a virtual reality environment or application will only come to be if you talk openly about it with your team members.

Make sure you have regular meetings or open platforms and project management tools that facilitate free-flowing discussion so you can bounce ideas around, and get the feedback needed to sculpt a rough brainwave into a polished concept.

Everybody in design knows that the faster you can get your designs visible, the better. Here are a few tools that UI designers can use to hone their skills in designing VR environments:

  1. Vectary takes 2D vectors and gives them depth to make them 3D. You can use it to quickly create a 3D world yourself, without needing a software engineer or high-level coding skills.
  2. Blocks by Google allows you to enter a VR world and use basic building blocks to construct shapes. You can then export your finished models directly into your game engine.
  3. Tilt Brush by Google is a similar tool to Blocks that allows you to create more complex, organic shapes through paint tools.

Avoid the temptation to dive into VR design without a good debugging tool. You need to think about how you can improve the system and user interface, and a good debugging tool will help you identify and eliminate issues quickly.

Time is often a significant constraint in VR design. Even if you’re on top of the debugging issues, you must look for ways to reduce your iteration time.

If you're using Oculus Link, you can use Quest hand tracking on your PC instead of building an app and deploying it to the Quest for testing. Through Oculus Link, your VR app plays live through Unity or Unreal.

An alternative is Leap Motion, a camera that you can place on your desk or mount on a headset, enabling everyone on the team to access hand tracking.

Another idea is to configure ways of simulating hands through mouse or keyboard input, as in the sketch below. Doing this means people on the team can quickly play with simple ideas on the fly, which speeds up the design process.
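
Here is a minimal sketch of that idea, with hypothetical canned poses and a mouse-to-plane mapping; the key bindings and pose names are inventions for illustration, and in practice you would wire this into your engine's input events:

```python
# Hypothetical keyboard/mouse hand simulator for fast desktop iteration.
CANNED_POSES = {
    "o": "open_palm",
    "p": "pinch",
    "i": "index_point",
    "f": "fist",
}

class SimulatedHand:
    def __init__(self):
        self.pose = "open_palm"
        self.position = [0.0, 1.2, 0.4]  # meters; roughly chest height

    def on_key(self, key):
        # Swap to a canned pose; unknown keys leave the pose unchanged.
        self.pose = CANNED_POSES.get(key, self.pose)

    def on_mouse(self, x, y):
        # Map normalized mouse coords (0..1) onto a plane in front of the user.
        self.position = [(x - 0.5) * 0.6, 1.0 + (1.0 - y) * 0.5, 0.4]

hand = SimulatedHand()
hand.on_key("p")         # simulate a pinch without a headset
hand.on_mouse(0.7, 0.3)  # move the simulated hand up and to the right
print(hand.pose, hand.position)
```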

Using VR newbies for beta testing can prove unproductive. Because they have no experience with the hardware, it's hard for them to get past the wow factor. Instead, use people with some VR experience; they can give the negative feedback and constructive criticism that helps you improve your app design.

While experienced testers may have some bias toward certain paradigms or technology, they are more useful for testing and optimization. To combat any bias, ask your testers to rely only on what they see or hear in the virtual environment. By playing dumb and acting solely on what the software tells them to do, testers can identify flaws in the UI.

Oculus Quest hand tracking presents immense opportunities from education to enterprise. As virtual reality becomes ever-more integrated into the modern world, more companies want to tap into the raw potential of this technology.

For now, there are still some kinks in the design. Even so, UI designers and app developers can come together to create incredibly engaging and immersive VR environments for a diverse range of real-world uses.

Communication within design teams is crucial, as is external communication with users, especially when it comes to testing and optimization. Developers must enlist the help of experienced VR users during testing, and leverage advanced debugging tools to reduce iteration times.

Performance matters most, but fast production is also vital as competition heats up to see which company creates VR's equivalent of the smartphone "swipe".

You can learn more about designing for hands from Oculus or practice hand tracking techniques in Microsoft Design Labs Hands Playground for Quest and HoloLens 2.

Source: https://arvrjourney.com/how-to-design-user-interfaces-for-oculus-quest-hand-tracking-applications-586cdc6e0240
