How to Build a VR Video Chat App Using Unity’s XR Framework

Virtual reality (VR) has seen a surge in popularity, and as headset makers work to keep up with hardware demand, developers are working to keep up with users’ appetite for engaging content. VR isn’t the only technology on the rise: in today’s professional world, live video streaming has become the default way to connect and collaborate. This creates an interesting opportunity for developers to combine virtual reality with video streaming, removing the barriers of distance and creating an immersive telepresence experience.

VR developers face two unique problems:

  • How do you make VR more inclusive and allow users to share their POV with people who aren’t using VR headsets?
  • How do you bring non-VR participants into a VR environment?

Most VR headsets allow the user to mirror their POV to a nearby screen using a casting technology (Miracast or Chromecast). This approach requires one physical screen per VR headset, and the headset and screen must be in the same room. That kind of streaming feels very old-school, given that most users today expect the freedom to stream video to others who are located remotely.

In this guide, I’m going to walk through building a Virtual Reality application that allows users to live stream their VR perspective. We’ll also add the ability to have non-VR users live stream themselves into the virtual environment using their web browser.

We’ll build this entire project from within the Unity Editor without writing any code.

For this project, I will use an HTC Vive Focus Plus because it allows me to build using Unity’s XR framework, which makes it relatively easy to set up a VR environment. And because the Vive builds to an Android target, I can use the Agora Video for Unity SDK to add live video streaming to the experience.

This project will consist of three parts. The first part will walk through how to set up the project, installing the Vive packages along with the Agora SDK and the Virtual Camera prefab.

The second part will walk through creating the scene, including setting up the XR Rig with controllers, adding the 3D environment, creating the UI, and implementing the Agora Virtual Camera prefab.

The third section will show how to use a live streaming web app to test the video streaming between VR and non-VR users.

Note: While no Unity or web development knowledge is needed to follow along, certain basic concepts from the prerequisites won’t be explained in detail.

The first part of the project will build a Unity app using the XR Framework, walking through how to add the Vive registry to the project, download and install the Vive plug-ins, install the Agora Video for Unity Plug-in, and implement the Agora plug-in using a drag-and-drop prefab.

Start by creating a Unity project using the 3D Template. For this demo, I’m using Unity 2019.4.18f1 LTS. If you wish to use Unity 2020, then you will need to use the Wave 4.0 SDK, which is currently in beta.

Note: At the time of writing, I had access to the beta SDK for the new HTC headset. I chose to use the Wave 3.0 SDK. This project can be set up in exactly the same way using the Wave 4.0 SDK.

Once the new project has loaded in Unity, open the Project Settings and navigate to the Package Manager tab. In the Scoped Registries list, click the plus sign and add the Vive registry details.
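Under the hood, the Scoped Registries UI writes into your project’s Packages/manifest.json. The sketch below shows the shape of that entry; the exact name, url, and scopes values here are illustrative and should be copied from HTC’s Wave XR documentation:

```json
{
  "scopedRegistries": [
    {
      "name": "VIVE",
      "url": "https://npm-registry.vive.com",
      "scopes": ["com.htc.upm"]
    }
  ]
}
```

Unity merges this block alongside the existing "dependencies" section of manifest.json, so you can also paste it there by hand instead of using the Project Settings UI.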

Once the Vive “scopedRegistries” object and its keys have been added, you’ll see the various loaders importing the files. Next, open Window > Package Manager and select Packages: My Registries. You should see the VIVE Wave packages; if no package is shown, click Refresh at the bottom-left corner.

Click through and install each of the Vive packages. Once the packages finish installing, import the PureUnity and XR samples from the Vive Wave XR Plug-in and the Samples from the Essence Wave XR Plug-in into the project.

After the packages have finished installing and the samples have finished importing, the WaveXRPlayerSettingsConfigDialog window will appear. HTC recommends clicking Accept All to apply the recommended Player Settings.

Next, open the Project Settings, click the XR Plug-in Management section, and make sure Wave XR is selected.

Now that we have the Wave SDK configured, we need to add Unity’s XR Interaction Toolkit package. This will let us use Unity’s XR components to interact with UI inputs and other elements in the scene. In the Package Manager, click the Advanced button (to the left of the search input) and enable the option to “Show preview packages”. Once the preview packages are visible, scroll down to the XR Interaction Toolkit and click Install.

Note: If you are using Unity 2018, you will need to configure the input Manager using the presets provided by HTC. The inputs can be defined manually, or you can download this InputManager preset. Once you’ve downloaded the preset, drag it into your Unity Assets, navigate to the InputManager tab in the Project Settings, and apply the presets.

InputManager preset from HTC Wave Documentation

When working with the Wave XR Plug-in, you can change the quality level by using QualitySettings.SetQualityLevel. HTC recommends setting the Anti Aliasing levels to 4x Multi Sampling in all quality levels. You can download this QualitySettings preset from HTC’s Samples documentation page.
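Although the quality levels themselves are defined in the editor, a small script can switch levels and enforce the recommended anti-aliasing at runtime. This is a minimal sketch using standard Unity APIs; the quality-level index is project-specific and shown here as an assumption:

```csharp
using UnityEngine;

// Attach to any GameObject in the startup scene.
// Switches to a chosen quality level at runtime and enforces
// HTC's recommended 4x MSAA for Wave XR.
public class QualityConfig : MonoBehaviour
{
    [SerializeField]
    private int qualityLevel = 2; // assumption: index into this project's quality levels

    void Start()
    {
        // applyExpensiveChanges = true also reapplies costly settings like AA
        QualitySettings.SetQualityLevel(qualityLevel, applyExpensiveChanges: true);
        QualitySettings.antiAliasing = 4; // 4x Multi Sampling, per HTC's recommendation
    }
}
```

If you instead set 4x Multi Sampling on every quality level in the Quality Settings presets, as HTC’s documentation suggests, the runtime override in the last line becomes unnecessary.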

For more information about Input and Quality Settings, see the HTC Wave Documentation.

Open the Unity Asset store, navigate to the Agora Video SDK for Unity page, and download the plug-in. Once the plug-in is downloaded, import it into your project.

Note: If you are using Unity 2020, the Asset store is accessible through the web browser. Once you import the asset into Unity, it will import through the Package Manager UI.

The last step in the setup process is to download the Agora Virtual Camera Prefab package and import it into the project.

When you import the Agora Virtual Camera Prefab package, you’ll see that it contains a few scripts, a prefab, and a RenderTexture. The two main scripts to note are AgoraInterface.cs (which contains a basic implementation of the Agora Video SDK) and AgoraVirtualCamera.cs (which implements and extends AgoraInterface specifically to handle the virtual camera stream). The tools folder contains a few helper scripts: one for logging, another to request camera/mic permissions, and one to handle token requests.

Also included in the package is AgoraVirtualCamera.prefab, an empty GameObject with AgoraVirtualCamera.cs attached to it. This makes it easy for us to configure the Agora settings directly from the Unity Editor without having to write any code. The last file in the list is AgoraVirtualCamRT.renderTexture, which we’ll use for rendering the virtual camera stream. The helper scripts in the tools folder are Logger.cs, PermissionHelper.cs, and RequetstToken.cs.
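To give a sense of what the prefab is doing for us, here is a heavily simplified sketch of the virtual-camera idea, not the prefab’s actual code: render a Unity Camera into a RenderTexture, read the pixels back each frame, and push them to Agora as an external video source. The class name, App ID placeholder, and field wiring are all assumptions for illustration:

```csharp
using UnityEngine;
using agora_gaming_rtc;

// Simplified sketch (not the shipped AgoraVirtualCamera.cs):
// stream a Unity Camera's view by pushing its RenderTexture
// pixels to Agora as an external video source.
public class VirtualCameraSketch : MonoBehaviour
{
    public Camera vrCamera;        // the camera whose view is streamed
    public RenderTexture targetRT; // e.g. AgoraVirtualCamRT
    private IRtcEngine engine;
    private Texture2D readback;

    void Start()
    {
        vrCamera.targetTexture = targetRT; // camera now renders off-screen
        readback = new Texture2D(targetRT.width, targetRT.height,
                                 TextureFormat.RGBA32, false);
        engine = IRtcEngine.GetEngine("YOUR_APP_ID"); // placeholder App ID
        engine.SetExternalVideoSource(true);          // we supply frames ourselves
    }

    void Update()
    {
        // Copy the RenderTexture into a readable Texture2D
        RenderTexture.active = targetRT;
        readback.ReadPixels(new Rect(0, 0, targetRT.width, targetRT.height), 0, 0);
        RenderTexture.active = null;

        var frame = new ExternalVideoFrame
        {
            type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA,
            // pixel format may need adjusting per platform
            format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA,
            buffer = readback.GetRawTextureData(),
            stride = targetRT.width,
            height = targetRT.height,
            timestamp = System.DateTime.Now.Ticks
        };
        engine.PushVideoFrame(frame);
    }
}
```

In practice you would throttle the push to the stream’s frame rate rather than calling it every Update, which is one of the details the prefab’s implementation handles for you.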

With all of our dependencies installed, we can move on to building our XR video streaming app.

In this section, we will walk through how to set up our 3D environment using a 3D model, create an XR Camera Rig Game Object, create the buttons and other UI elements, and implement the Agora VR Prefab.

Create a scene or open the sample scene and import the environment model. You can download the model from Sketchfab.

Source: https://arvrjourney.com/how-to-build-a-vr-video-chat-app-using-unitys-xr-framework-4e1774bcd53b?source=rss—-d01820283d6d—4
