AR/VR

A Wake Inn Pulls Those Trailer Strings Ahead of a 2021 Release

There were plenty of virtual reality (VR) titles announced during the week that would normally have been the Electronic Entertainment Expo (E3), VR Bros’ A Wake Inn being one of them. Originally slated for release this year, the immersive horror experience has now slipped to early 2021, with the team releasing a new trailer to announce the delay.

A Wake Inn

A Wake Inn is a scary title featuring a classic horror staple, mannequins, those lifeless, dead-eyed entities which work so well at terrifying almost everyone. The twist here is that not only is the art deco hotel where the gameplay is set filled with an army of living dolls, but you also happen to be one. And then there’s the mysterious Doctor Finnegan, the owner of the estate, who talks to you via a shortwave transmitter.

The story revolves around you finding out who you are, how you ended up here, and how to get out whilst avoiding the other not-so-friendly dolls. As VR Bros puts it: “Is it time for the player to take revenge on their maker and set themselves free, or perhaps they’re just a puppet being pulled by its strings?”

In a similar fashion to Last Labyrinth, you’re bound to a wheelchair, making A Wake Inn an entirely seated experience. That’s where the similarity ends, as in this experience you’re given free rein to explore the hotel and figure out its various escape room-style gameplay elements. You operate it just as you would any manual wheelchair, moving the controllers as if pushing the wheels.
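The control scheme described above amounts to driving a differential-drive vehicle with your hands. As an illustrative sketch only (the function and parameters here are assumptions for clarity, not A Wake Inn's actual implementation), pushing each virtual wheel rim by some distance per frame translates into forward travel and a heading change like so:

```python
def wheelchair_step(left_push, right_push, track_width=0.6):
    """Convert per-frame push distances on each wheel rim (meters)
    into the chair's forward travel (meters) and heading change (radians).

    Standard differential-drive kinematics: equal pushes roll the
    chair straight; unequal pushes arc it toward the slower wheel.
    """
    forward = (left_push + right_push) / 2.0
    turn = (right_push - left_push) / track_width
    return forward, turn

# Pushing both wheels equally rolls the chair straight ahead.
print(wheelchair_step(0.1, 0.1))  # (0.1, 0.0)
# Pushing only the right wheel arcs the chair to the left.
print(wheelchair_step(0.0, 0.1))
```

Mapping each motion controller's swept distance to `left_push` and `right_push` is enough to reproduce the "push the wheels" feel from a seated position.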

A Wake Inn

Further thought has been put into the gameplay interactions as well: a flashlight for lighting up the darkness, which does run out of batteries; a radio with custom stations; and a cinema room where you can watch tapes found around the building.

A Wake Inn will support HTC Vive, Oculus Rift and Valve Index when it launches next year. For further updates on this wheeled horror, keep reading VRFocus.

Source: https://www.vrfocus.com/2020/11/a-wake-inn-pulls-those-trailer-strings-ahead-of-a-2021-release/

AR/VR

US Army using Augmented Reality overlays in its research for the detection of roadside explosive hazards


In Augmented Reality News 

January 23, 2021 – The US Army Combat Capabilities Development Command (DEVCOM), Army Research Laboratory (ARL), has recently announced that it is employing augmented reality (AR) overlays in its research for the detection of roadside explosive hazards, such as improvised explosive devices (IEDs), unexploded ordnance and landmines.

Route reconnaissance in support of convoy operations remains a critical function to keep Soldiers safe from such hazards, which continue to threaten operations abroad and continually prove to be an evolving and problematic adversarial tactic. To combat this problem, ARL and other research collaborators were funded by the Defense Threat Reduction Agency, via the ‘Blood Hound Gang Program’, which focuses on a system-of-systems approach to standoff explosive hazard detection.

Kelly Sherbondy, Program Manager at the lab, said “Logically, a system-of-systems approach to standoff explosive hazard detection research is warranted going forward,” adding, “Our collaborative methodology affords implementation of state-of-the-art technology and approaches while rapidly progressing the program with seasoned subject matter experts to meet or exceed military requirements and transition points.”

The program has seven external collaborators from across the country, which include the US Military Academy, The University of Delaware Video/Image Modeling and Synthesis Laboratory, Ideal Innovations Inc., Alion Science and Technology, The Citadel, IMSAR and AUGMNTR.

In Phase I of the program, researchers spent 15 months evaluating mostly high technology-readiness-level (TRL) standoff detection technologies against a variety of explosive hazard emplacements. In addition, a lower-TRL standoff detection sensor, which was focused on the detection of explosive hazard triggering devices, was developed and assessed. According to the Army, the Phase I assessment included probability of detection, false alarm rate and other important information that will ultimately lead to a down-selection of sensors based on best performance for Phase II of the program.
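Probability of detection and false alarm rate, the Phase I metrics mentioned above, are straightforward to compute once detections are matched against ground-truth emplacements. A minimal sketch with hypothetical data (not the program's actual scoring code):

```python
def score_sensor(detections, ground_truth, track_km):
    """Score one sensor's output against known emplacements.

    Returns probability of detection (Pd: fraction of real hazards
    found) and false alarm rate (FAR: spurious detections per km
    of track surveyed).
    """
    hits = detections & ground_truth          # true positives
    false_alarms = detections - ground_truth  # detections with no real hazard
    pd = len(hits) / len(ground_truth)
    far = len(false_alarms) / track_km
    return pd, far

# Hypothetical example: 5 emplacements, the sensor flags 4 spots, 3 are real.
truth = {"e1", "e2", "e3", "e4", "e5"}
dets = {"e1", "e2", "e3", "x9"}
print(score_sensor(dets, truth, track_km=7.0))  # Pd = 0.6, FAR ≈ 0.14/km
```

A down-selection like the one described then reduces to ranking sensors by Pd at an acceptable FAR threshold.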

Researchers use various sensors on Unmanned Aerial Systems equipped with high-definition infrared cameras and navigation to enable standoff detection of explosive hazards using machine learning techniques.

The sensors evaluated during Phase I included an airborne synthetic aperture radar, ground vehicular and small unmanned aerial vehicle LIDAR, high-definition electro-optical cameras, long-wave infrared cameras and a non-linear junction detection radar. Researchers carried out a field test in representative real-world terrain over a 7-kilometer test track, which included a total of 625 emplacements comprising a variety of explosive hazards, simulated clutter and calibration targets. They collected data before and after emplacement to simulate a real-world change between sensor passes.

Terabytes of data were collected across the sensor sets, the volume needed to adequately train artificial intelligence/machine learning (AI/ML) algorithms. The algorithms subsequently performed autonomous automatic target detection for each sensor. The Army stated that this sensor data is pixel-aligned via geo-referencing, and the AI/ML techniques can be applied to some or all of the combined sensor data for a specific area. Furthermore, the detection algorithms are able to provide ‘confidence levels’ for each suspected target, which are displayed to a user as an augmented reality overlay. The detection algorithms were executed with various sensor permutations so that performance results could be aggregated to determine the best course of action moving forward into Phase II.
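Once each sensor's detections are geo-referenced to a common grid, the per-sensor confidence levels for a given location can be fused. One simple fusion rule, shown here purely as an illustration (the article does not say which method the program uses), treats each confidence as an independent probability that the detection is genuine:

```python
def fuse_confidences(per_sensor):
    """Fuse independent per-sensor confidence levels for one grid cell.

    Treats each confidence as the probability that the sensor's
    detection is genuine; the fused value is the probability that
    at least one sensor is right: 1 - prod(1 - c_i).
    """
    miss = 1.0
    for c in per_sensor:
        miss *= (1.0 - c)
    return 1.0 - miss

# Radar is fairly sure, LIDAR weakly agrees, IR weakly agrees:
print(fuse_confidences([0.8, 0.4, 0.3]))  # ≈ 0.916
```

Running such a rule over different sensor permutations, as the Army describes, makes it easy to compare which subsets of sensors actually raise overall confidence.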

“The accomplishments of these efforts are significant to ensuring the safety of the warfighter in the current operation environment,” said Lt. Col. Mike Fuller, US Air Force Explosive Ordnance Disposal and DTRA Program Manager.

The Army noted that future research into the technology will enable real-time automatic target detection displayed with an augmented reality engine. The three-year effort will ultimately culminate with demonstrations at multiple testing facilities to show the technology’s robustness over varying terrain.

“We have side-by-side comparisons of multiple modalities against a wide variety of realistic, relevant target threats, plus an evaluation of the fusion of those sensors’ output to determine the most effective way to maximize probability of detection and minimize false alarms,” Fuller said. “We hope that the Army and the Joint community will both benefit from the data gathered and lessons learned by all involved.”

Image credit: US Army

About the author

Sam Sprigg

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he covers news articles on both the AR and VR industries. He also has an interest in human augmentation technology as a whole, and does not just limit his learning specifically to the visual experience side of things.

Source: https://www.auganix.org/us-army-using-augmented-reality-overlays-in-its-research-for-the-detection-of-roadside-explosive-hazards/

AR/VR

LIV Now Supports Full-body Avatars from ReadyPlayerMe, Making it Easy to Stream VR Without a Green Screen


Many VR streamers use complicated mixed reality setups to show themselves from a third-person perspective inside the virtual world. LIV, a leading tool which makes this possible, now supports free, customizable, full-body avatars from ReadyPlayerMe, making it possible to stream your avatar inside of VR without the need for a green screen.

In addition to true mixed reality streaming, Liv has supported streaming with avatars for some time. However, actually finding a unique avatar for yourself was no simple task. Now, Liv has partnered with avatar maker ReadyPlayerMe to make it as simple as can be.

ReadyPlayerMe allows you to build a free full-body avatar—optionally based on a photo of yourself—in mere minutes. You can use the avatar as the character in select Liv-supported VR games, allowing stream viewers to see your movements in third-person.

Here’s an example of a ReadyPlayerMe avatar in Pistol Whip streamed via Liv:

What Sadie said! They have improved on them, they now are full body and support finger tracking and full body tracking! It’s pretty smooth! pic.twitter.com/J8rY5UwWOo

— AtomBombBody (@AtomBombBody) January 17, 2021

Avatars from ReadyPlayerMe are moderately customizable, and it’s easy enough to get something you’re happy with relatively quickly, though we hope to see more customization options in the future (like height, build, and more control over outfits).

Image courtesy ReadyPlayerMe

You can make your own ReadyPlayerMe avatar to import to Liv right here. If you want to download your avatar for some other use, you can make one here and download it at the end of the process as a .GLB file for use in other applications.
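The .GLB format mentioned above is the binary container for glTF 2.0: a 12-byte header (the magic bytes `glTF`, a version number, and the total file length) followed by JSON and binary chunks. A quick way to sanity-check a downloaded avatar file, sketched from the public glTF specification (nothing Liv- or ReadyPlayerMe-specific):

```python
import struct

def read_glb_header(data):
    """Parse the 12-byte GLB header: magic, version, total byte length.

    Raises ValueError if the bytes are not a GLB container.
    """
    magic, version, length = struct.unpack("<4sII", data[:12])
    if magic != b"glTF":
        raise ValueError("not a GLB file")
    return version, length

# Minimal header for an (otherwise empty) 12-byte glTF 2.0 container:
header = struct.pack("<4sII", b"glTF", 2, 12)
print(read_glb_header(header))  # (2, 12)
```

A version of 2 confirms the file is glTF 2.0, which is what most engines and viewers that accept GLB avatars expect.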

Streamer Atom Bomb Body also has a detailed walkthrough for configuring Liv with your new avatar here:

The post LIV Now Supports Full-body Avatars from ReadyPlayerMe, Making it Easy to Stream VR Without a Green Screen appeared first on Road to VR.

Source: https://vrarnews.com/details/liv-now-supports-full-body-avatars-from-readyplayerme-making-it-easy-to-stream-vr-without-a-green-screen-600b772745b9dcae3e9a590f?s=rss

AR/VR

Pinterest’s new AR feature lets you try on virtual eyeshadow


Shopping online is the primary way people get most of the items they want or need, but there are some downsides: you can’t try on clothes to make sure they’ll fit right and it’s not easy to determine whether a particular makeup color will look good on you. Pinterest has introduced another feature that addresses the latter problem, one that …

Source: https://vrarnews.com/details/pinterests-new-ar-feature-lets-you-try-on-virtual-eyeshadow-600b6e18c1c62e453a615b12?s=rss

AR/VR

Magic Leap announces partnership with Google Cloud to bring Spatial Computing to enterprise and Google Cloud customers


In Augmented Reality and Mixed Reality News

January 22, 2021 – Magic Leap has today announced that it has entered into a multi-phased, multi-year strategic partnership agreement with Google Cloud to deliver spatial computing solutions to businesses and Google Cloud customers.

Through the partnership, Magic Leap will deliver its enterprise solutions on the Google Cloud Marketplace and explore potential new cloud-based, spatial computing solutions running on Google Cloud.

Magic Leap stated that as enterprises have evolved their operations over the past year to meet the needs of the changing business environment, demand for solutions that support business continuity, agility and borderless collaboration has accelerated exponentially. The partnership is therefore designed to meet those demands.

Beginning in 2021, select Magic Leap solutions that provide tools for businesses will be available in the Google Cloud Marketplace, allowing developers who create solutions on the Magic Leap platform to reach global customers via Google’s marketplace. Magic Leap’s own solutions, such as its Communication, Collaboration and Co-presence platform, will also be made available in the Google Cloud Marketplace.

“As we continue to build momentum for spatial computing in the enterprise market, we are very excited to partner with Google Cloud to deliver unique cloud solutions to their customers and ours,” explained Walter Delph, Chief Business Officer, Magic Leap. “Google Cloud offers best in class infrastructure for leading edge solutions designed to provide efficiencies, continuity and innovation to businesses across the globe.”

In the second phase of the partnership, the two companies will jointly explore opportunities to integrate Google Cloud capabilities in artificial intelligence (AI), machine learning, and analytics into Magic Leap’s Communication, Collaboration and Co-presence platform to support co-presence in any enterprise setting globally. According to Magic Leap, potential use cases involve applying cloud capabilities to help capture data and knowledge from experienced technicians in manufacturing settings, enhancing remote-technical support and training using augmented reality (AR), or providing complex or personalized procedure support in the healthcare industry.

Magic Leap added that it is working on the development of an AR Cloud product that will help to “advance the activation of spatially-aware enterprise solutions across multiple industry verticals.” The ‘Magic Leap Augmented Reality Cloud’ will allow enterprises to build applications that are spatially-aware and collaborative. The company also stated that it will explore the optimization of its AR Cloud by working in collaboration with Google Cloud, leveraging its network, content delivery services, and evolving 5G network edge compute services.

“More than ever, organizations are looking for ways to keep teams connected and support employees with innovative solutions in the cloud,” said Joe Miles, Managing Director of Healthcare and Life Sciences at Google Cloud. “We are excited that Magic Leap has selected Google Cloud to expand the availability of its solutions for productivity in the enterprise. We look forward to working together to help Magic Leap scale its cloud-based solutions globally, and to help customers deploy next-generation collaboration and productivity solutions in the workplace.”

For more information on Magic Leap and its augmented and mixed reality solutions for enterprise, please visit the company’s website.

Image credit: Magic Leap


Source: https://www.auganix.org/magic-leap-announces-partnership-with-google-cloud-to-spatial-computing-to-enterprise-and-google-cloud-customers/
