AR/VR

How to Use Oculus Air Link to Play PC VR Games Wirelessly on Quest 2

Oculus Air Link requires that both your Quest 2 headset and Oculus PC software are running v28 or later. Here’s how to check:

On Quest 2
  • In your Quest 2 headset, open the Settings page (gear icon) on the menu bar
  • On the left side of the Settings page, scroll down to find the About section
  • On the About page, see the Version section, which should read 28.X or higher (it may be a very long version number, like 28.0.0.221.359…)

If your Quest 2 isn’t yet running v28 or later, Oculus Air Link will not work. The v28 update is rolling out gradually to Quest 2 users. If you aren’t on v28 yet, check the Software Update section on the About page for a prompt to update your headset. There’s no way to force the update, but if it says ‘No Updates Available’, try restarting your headset and checking again.

On Your PC
  • On your PC, launch the Oculus PC app (if you don’t already have it installed, you can download it here)
  • On the left side of the app, select Settings then go to the General tab
  • Scroll all the way to the bottom of the General section where you will find the version number which should read 28.X or higher (it may be a very long version number, like 28.0.0.222.459…)

If your Oculus PC software isn’t yet running v28 or later, Oculus Air Link will not work. The v28 update is rolling out gradually. If you aren’t on v28 yet, go to the Library section and then the Updates tab, where you may see an ‘Oculus Software Update’ item in the list. Allow it to update and restart the software if prompted, then check again to see whether you are on v28 or later.

If you still aren’t on v28, go to the Settings section and then the Beta tab. Enable the Public Test Channel option, then return to the Library section and the Updates tab to see if an ‘Oculus Software Update’ appears. Allow it to install and then check your version number again to see if you’re on v28 yet.
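Those long build strings (like 28.0.0.221.359…) only matter up to the first number for this check. As a minimal sketch of the comparison (the function name and behaviour here are illustrative, not part of any Oculus tool):

```python
def meets_minimum_version(version_string: str, minimum_major: int = 28) -> bool:
    """Check whether a build string like '28.0.0.221.359' meets the
    minimum major version required for Oculus Air Link (v28)."""
    # Only the leading number matters; the rest is build metadata.
    major = int(version_string.split(".")[0])
    return major >= minimum_major

print(meets_minimum_version("28.0.0.221.359"))  # True
print(meets_minimum_version("27.0.0.198.112"))  # False
```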

Source: https://vrarnews.com/details/how-to-use-oculus-air-link-to-play-pc-vr-games-wirelessly-on-quest-2-608bb2ef0388565630657acd?s=rss

AR/VR

Smash Drums Hits Oculus Quest in June

When the Oculus Quest App Lab launched back in February, the new distribution method had only a small selection of titles available. One of those was a demo for Smash Drums, a rock-infused rhythm action videogame by indie team PotamWorks. Today, the developer confirmed that the full experience will arrive via App Lab next month.

Smash Drums

While the Smash Drums demo offers one song for players to test their drumming skills on, the official launch in June will see 21 songs from up-and-coming indie rock bands made available. There are seven environments for players to drum away in, including a prison, a crypt, and an office, but one of Smash Drums’ more unique features is that the environments are destructible, so you can literally smash through the songs.

Like several other rhythm action titles currently available, Smash Drums employs 360-degree gameplay: rather than standing in one stationary spot, players will find the drums twisting left and right, encouraging further movement. At the same time, a live global rankings system lets players see their leaderboard position in real time.

Once the launch has taken place, PotamWorks will continue to expand Smash Drums with further content, both paid and free. The studio plans to add free songs and environments on a regular basis, with paid DLC introducing new bands to support development.

Smash Drums

“As a rock fan since I was a teenager in the 90s, it’s been a joy to bring my childhood fun into my real-life job. Thanks to the incredible help of the Discord community, and the incredible feedback the players provided, I am looking forward to releasing the game and supporting it long after it reaches App Lab,” said Mr Potam, founder of PotamWorks SAS, in a statement. “There is so much more to come, and I want you to be sure of one thing: Rock n Roll is not dead: it has never been more ALIVE!”

The Smash Drums demo is available to download ahead of the full game’s launch on 17th June 2021, when it will retail for $19.99 USD. PotamWorks also plans to release a PC VR version later this year. As further updates are released, VRFocus will let you know.

Source: https://www.vrfocus.com/2021/05/smash-drums-hits-oculus-quest-in-june/

AR/VR

Intel Researchers Give ‘GTA V’ Photorealistic Graphics, Similar Techniques Could Do the Same for VR

Researchers from Intel’s Intelligent Systems Lab have revealed a new method for enhancing computer-generated imagery with photorealistic graphics. Demonstrated with GTA V, the approach uses deep-learning to analyze frames generated by the game and then generate new frames from a dataset of real images. While the technique in its research state is too slow for real gameplay today, it could represent a fundamentally new direction for real-time computer graphics of the future.

Despite being released back in 2013, GTA V remains a pretty darn good looking game. Even so, it’s far from what would truly fit the definition of “photorealistic.”

Although we’ve been able to create pre-rendered truly photorealistic imagery for quite some time now, doing so in real-time is still a major challenge. While real-time raytracing takes us another step toward realistic graphics, there’s still a gap between even the best looking games today and true photorealism.

Researchers from Intel’s Intelligent Systems Lab have published research demonstrating a state-of-the-art approach to creating truly photorealistic real-time graphics by layering a deep-learning system on top of GTA V’s existing rendering engine. The results are quite impressive, showing stability that far exceeds similar methods.

In concept, the method is similar to NVIDIA’s Deep Learning Super Sampling (DLSS). But while DLSS is designed to ingest an image and then generate a sharper version of the same image, the method from the Intelligent Systems Lab ingests an image and then enhances its photorealism by drawing from a dataset of real life imagery—specifically a dataset called Cityscapes which features street view imagery from the perspective of a car. The method creates an entirely new frame by extracting features from the dataset which best match what’s shown in the frame originally generated by the GTA V game engine.

An example of a frame from GTA V after being enhanced by the method | Image courtesy Intel ISL

This ‘style transfer’ approach isn’t entirely new, but what is new with this approach is the integration of G-buffer data—created by the game engine—as part of the image synthesis process.

An example of G-buffer data | Image courtesy Intel ISL

A G-buffer is a representation of each game frame which includes information like depth, albedo, normal maps, and object segmentation, all of which is used in the game engine’s normal rendering process. Rather than looking only at the final frame rendered by the game engine, the method from the Intelligent Systems Lab looks at all of the extra data available in the G-buffer to make better guesses about which parts of its photorealistic dataset it should draw from in order to create an accurate representation of the scene.

Image courtesy Intel ISL

This approach is what gives the method its great temporal stability (moving objects look geometrically consistent from one frame to the next) and semantic consistency (objects in the newly generated frame correctly represent what was in the original frame). The researchers compared their method to other approaches, many of which struggled with those two points in particular.
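To make the G-buffer idea concrete, here is a rough sketch (the channel names and layout are assumptions for illustration; this is not the researchers’ code) of how the auxiliary channels can be stacked with the rendered frame into a single conditioning input for the enhancement network:

```python
import numpy as np

H, W = 4, 6  # tiny frame for illustration

# Final rendered frame from the game engine (RGB).
frame = np.zeros((H, W, 3), dtype=np.float32)

# Per-pixel auxiliary channels the engine already produces while rendering.
g_buffer = {
    "depth":        np.zeros((H, W, 1), dtype=np.float32),  # distance to camera
    "albedo":       np.zeros((H, W, 3), dtype=np.float32),  # base surface color
    "normals":      np.zeros((H, W, 3), dtype=np.float32),  # surface orientation
    "segmentation": np.zeros((H, W, 1), dtype=np.float32),  # object class id
}

# The enhancement network sees the frame plus every G-buffer channel,
# concatenated along the channel axis, instead of the frame alone.
conditioning = np.concatenate([frame, *g_buffer.values()], axis=-1)
print(conditioning.shape)  # (4, 6, 11)
```

Conditioning on these extra channels, rather than on the final pixels alone, is what lets the method distinguish, say, a road surface from a car hood even when they look similar in the rendered frame.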

The method currently runs at what the researchers (Stephan R. Richter, Hassan Abu AlHaija, and Vladlen Koltun) call “interactive rates,” but it’s still too slow today for practical use in a videogame, hitting just 2 FPS on an Nvidia RTX 3090 GPU. In the future, however, the researchers believe the method could be optimized to work in tandem with a game engine (instead of on top of it), which could speed the process up to practically useful rates, perhaps one day bringing truly photorealistic graphics to VR.

“Our method integrates learning-based approaches with conventional real-time rendering pipelines. We expect our method to continue to benefit future graphics pipelines and to be compatible with real-time ray tracing,” the researchers conclude. […] “Since G-buffers that are used as input are produced natively on the GPU, our method could be integrated more deeply into game engines, increasing efficiency and possibly further advancing the level of realism.”

The post Intel Researchers Give ‘GTA V’ Photorealistic Graphics, Similar Techniques Could Do the Same for VR appeared first on Road to VR.

Source: https://vrarnews.com/details/intel-researchers-give-gta-v-photorealistic-graphics-similar-techniques-could-do-the-same-for-vr-609cf87e7984ab3c72f598bb?s=rss

AR/VR

Clash of Chefs VR Serves Up a Tasty Meal on Oculus Quest This Summer

Flat Hill Games released its cooking videogame Clash of Chefs VR as a Steam Early Access title for PC VR headsets back in 2018. This summer, Clash of Chefs is set to officially launch, leaving Early Access while natively supporting the Oculus Quest platform.

Clash of Chefs VR

For the last couple of years, Clash of Chefs VR has been serving up a fast and frantic cooking experience where players can tackle an 80-level single-player campaign or go head-to-head in multiplayer. Budding virtual chefs are challenged to prepare American, Italian, and Japanese recipes as fast as possible, juggling various stations as they create ever more complex meals.

From slapping a simple burger together to plating up a bowl of ramen, players aiming for the top of the online leaderboards will need to make sure the right amount of beans goes into the burritos, the shredded cheese doesn’t overwhelm the pasta bowls, and the salad is finely chopped.

It doesn’t all need to be serious work, though. Have some fun along the way by unleashing your inner Gordon Ramsay: smash plates on the floor, throw an onion in the waiter’s face, or grab the condiment bottles to create a fountain of sauce.

Clash of Chefs VR

“Oculus Quest’s catalog is hyper-curated to ensure a high degree of player satisfaction and great return on investment for developers,” says Adrian Djura, CEO and Founder of Flat Hill Games in a statement. “We’re proud that our virtual food frenzy, Clash of Chefs VR, will soon be featured among other quality titles on the headset’s platform.”

With Clash of Chefs VR leaving Early Access, Flat Hill Games will be providing a bunch of new content for current players to enjoy; a new themed restaurant, achievements, and customization options are just some of the new features.

Clash of Chefs VR will be available for Oculus Quest in Summer 2021, offering cross-platform gameplay with the Steam version. For all the latest Oculus Quest content updates, keep reading VRFocus.

Source: https://www.vrfocus.com/2021/05/clash-of-chefs-vr-serves-up-a-tasty-meal-on-oculus-quest-this-summer/
