
Tag: mapping

My Cranfield MSc: Robotics MSc – my group project experience

Hello! My name is Angelos Dimakos, and I am from Greece. I am currently studying for an MSc in Robotics at Cranfield University. I am also the Course Representative for this course. In this blog post, I'd like to take you through my group project experience on the Robotics MSc. The group design project element ...

The post My Cranfield MSc: Robotics MSc – my group project experience appeared first on Cranfield University Blogs.

Snapchat Extends Its AR Lenses to Viber Messaging App

Yo dawg, Snap heard you using Snapchat augmented reality Lenses in your messages, so it's putting its AR Lenses in other messaging apps. On Wednesday, Rakuten Viber flipped the switch on an update that brings Snapchat's popular AR Lenses to many users of its Viber messaging and voice communications app. Don't Miss: Snap Releases Lens Studio 4.0 with 3D Body Mesh & More, Upgrades Scan as Fashion Assistant Viber users will now have access to 30 Lenses at launch for augmented video messaging and photos created in the app, but the selection will expand over time, with Rakuten Viber planning to... more

Review of Rivals of Aether In 2021 🤜 Is It Still Worth It?

What is Rivals of Aether? Why has everyone been talking about it? Is it better and more fun than Super Smash Bros.? Keep reading; we'll answer these questions below. RoA Review: What Is Rivals of Aether? Is It Worth It In 2021? Rivals of Aether is a 16-bit platform fighting game based on […]

The post Review of Rivals of Aether In 2021 🤜 Is It Still Worth It? appeared first on Gamer One.

Ayoa Review 2023: Features, Functionality, Pricing, and More

In this Ayoa review, we will look at OpenGenius's powerful mind mapping and online collaboration platform. Ayoa gives its users the speed and consistency to...

Niantic Primes Its Lightship AR Engine for Transformers Game, Global Launch Slated for 2021

With the tagline "More Than Meets the Eye," the Transformers franchise was pretty much preordained to have its own augmented reality game, and AR gaming pioneer Niantic has stepped up to fulfill that destiny. Niantic will now join forces with toymakers Hasbro and Tomy and development studio Very Very Games to build Transformers: Heavy Metal on its Lightship AR cloud platform. Don't Miss: Niantic Bringing Buddy Interactions to AR+ Mode in Pokémon GO, Shared Experiences with Other Trainers to Follow Based on images of gameplay provided by Niantic, Transformers: Heavy Metal follows the formula... more

Chinese Private Constellations and the Art of the Pivot

In one of the most influential entrepreneurship books of the last 10 years, The Lean Startup, Eric Ries describes the need to test, iterate, and pivot until ultimately reaching a product that fits the market. Chinese tech companies have shown us that they excel at this practice (see the history of a company like Meituan [1]), and it increasingly seems like Chinese NewSpace companies, …

A virtual spectacle: bringing video production to life



The houselights go down, your heart is racing, you’ve got goosebumps, everyone in earshot is screaming, and the performance is about to start... The difference, however, is that you’re not in a venue, you’re actually in the comfort of your own home.

As we all know, the coronavirus pandemic brought all live performances to a complete standstill in the first half of the year. Because of this, live video has come to the forefront in many industries, whether for live events or university lectures. This raises the question: how can you achieve an experience that is just as immersive as attending a concert or lecture in person? Extended reality may be the answer.

Extended reality (XR) is an umbrella term in live production that combines augmented (AR), virtual (VR) and mixed reality (MR) elements to extend the reality we experience by blending the virtual and 'real' worlds. According to Visual Capitalist, XR is expected to grow to a market size of more than $209bn (£160bn) by 2022 and is already being used across many industries such as corporate events, education, broadcast, live music, and e-sports, allowing brands, artists and organisations alike to connect with their audiences remotely.

The technology is also thought to take virtual productions to the next level. Indeed, the company 'disguise' has developed an extended reality (xR) platform that allows both creative and technical teams to imagine, create, and deliver spectacular live visual experiences. "It is a hardware and software solution that allows us to map video onto all sorts of creative surfaces and respond to the environment in various ways," says Peter Kirkup, global technical solutions manager at disguise.

So how does it work? xR's virtual set extension places presenters in environments larger than the physical spaces available, creating more compelling content to increase audience engagement. This complete immersion allows for interaction with computer graphics (CG) elements, real lighting, and support for reflective and refractive props. The virtual environment combines camera tracking with real-time content that is visible not only on the screen but also live on set and on camera. This process gives directors and designers more control and faster calibration workflows.

How disguise's xR technology works

Image credit: disguise

Its creators say the xR workflow can also quickly and accurately align the virtual worlds, bringing together the content system, camera tracking system, and the LED screen with pixel-accurate precision. This method, known as spatial calibration, takes place on set and can be done in under 30 minutes. "The system allows us to put structured light patterns onto the LED screen, and then perceive those through the camera," Kirkup explains. "This system also helps us understand where the camera is in reality against what the tracking system is telling us, and calibrate the relationship between the two."
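disguise's calibration pipeline is proprietary, but the core idea described above, matching known points on the LED screen against where the camera actually observes them, can be sketched as a homography estimation. The following is a minimal illustration, not disguise's implementation; all point data and function names are hypothetical:

```python
import numpy as np

def estimate_homography(screen_pts, camera_pts):
    """Estimate the 3x3 homography mapping LED-screen points to
    camera-image points using the standard DLT least-squares method."""
    rows = []
    for (x, y), (u, v) in zip(screen_pts, camera_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # The homography is the null vector of A: the singular vector
    # belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so H[2, 2] == 1

# Hypothetical correspondences: corners of a structured-light marker
# as placed on the LED wall vs. where the camera observed them.
screen = [(0, 0), (1, 0), (1, 1), (0, 1)]
camera = [(10, 20), (110, 25), (105, 130), (8, 125)]
H = estimate_homography(screen, camera)
```

With the homography in hand, any discrepancy between where the tracking system says the camera is and where the pattern actually lands in the image can be measured and corrected, which is the relationship Kirkup describes calibrating.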

As part of the xR workflow, disguise handles the blending of real and virtual worlds through a colour calibration process embedded in the system. This enables the different 'worlds' to appear as one seamless environment. The system is also render-engine agnostic, which allows creatives to select their preferred content engine, such as Notch or the game engines Unreal and Unity, to deliver high-quality visuals for their productions. Furthermore, disguise allows users to synchronise multiple render engines from a single timeline, and latency compensation is built into the workflow to keep delays to a minimum when delivering such experiences.
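The article doesn't detail how disguise implements latency compensation, but the general principle, delaying every faster source so all of them align with the slowest rendering path, can be sketched in a few lines. Engine names and latency figures below are purely hypothetical:

```python
def compensation_delays(latencies_ms):
    """Return the extra delay (ms) to apply to each source so that
    all sources arrive in sync with the slowest path."""
    slowest = max(latencies_ms.values())
    return {name: slowest - lat for name, lat in latencies_ms.items()}

# Hypothetical per-source processing latencies in milliseconds.
engines = {"unreal": 48, "notch": 32, "camera_feed": 16}
delays = compensation_delays(engines)
# The camera feed is held back longest, the slowest engine not at all,
# so the blended frame stays temporally coherent.
```

The trade-off is that the whole composite runs at the latency of the slowest path, which is why keeping per-engine latency low matters for live performance.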

According to the company, artists are one of many groups who can use xR to perform live, using content engines to create and render limitless possibilities for the video narrative in real-time. This transports the viewers to a world beyond the physical LED walls – an experience that has recently been attained by American singer Katy Perry. Here, tech company XR Studios leveraged disguise’s technology to bring what they described as the “future of broadcast” to primetime American TV, where the Grammy nominee performed her latest single ‘Daisies’ on the season finale of American Idol. 

"The seamless extension of the real-world LED screens to the virtual world environments could only be done by disguise's xR camera registration workflow, allowing multi-camera switching between perspectives and the LED content," explains Scott Millar, a disguise xR workflow specialist who provided creative technology support on the project. Using disguise xR spatial mapping, the teams working on the project were able to accurately keyframe the position of real and virtual worlds into one coherent place where choreographed action could take place.

Here is an annotation of the xR set up and components demonstrated during Katy Perry's American Idol performance. It's important to note that the AR elements and set extension aren't there in real life and the camera doesn't 'see' these elements.

Image credit: disguise

Katy Perry has not been the only world-renowned artist to perform virtually using this platform. The technology enabled partner XR Studios to create an environment for the Black Eyed Peas to promote their new album on primetime TV shows around the world. The Grammy award-winning hip-hop group recorded performances of four new songs from their eighth studio album, 'Translation', with XR Studios. These were then distributed in 'promotion packages' to a handful of primetime shows worldwide, including Good Morning America and programmes across Europe.

The platform allowed in-house teams to design and recreate environments, as well as bring in an artist who was unable to make the shoot. The new single 'Ritmo (Bad Boys For Life)' features Colombian reggaeton singer J Balvin, who could not be present in person. To overcome this, disguise xR allowed Silent Partners Studios, who designed the content for the videos, to create a 3D digital version of the singer within the content, performing his verse virtually amongst the original designs of the song's music video.

"Will.i.am and the band have always pushed the boundaries of technology as it crosses over into music production, promotion and consumption, so a disguise xR workflow for their performances really was a perfect fit for them as a group," explains JT Rooney of Silent Partners, who was closely involved in the creation of the content with Will.i.am and the band's management. The band also performed one of their most well-known songs, 'Where Is The Love?', as part of the promotion package. The performance begins on a round platform, with a standout AR element, the red question mark logo from the original music video, floating above the band as the scenery around them slowly evolves. The background displayed the names of recent victims of police brutality – a poignant statement in light of recent global events.

Other performances by the band saw the xR platform 'extending the set', with two dancers on set reacting to the virtual dancers in the scene behind them. The teams also decided to use mirrors in the performance, and combining the mirrors with the LED walls in the virtual space allowed for natural reflections of content across the LED – a technique where green screen falls short. "Green screen technology is a very isolating and difficult environment for presenters and artists to work in," Kirkup agrees. "You don't have a genuine reaction to the environment around you as all you can see is green. But with this extended reality workflow, we overcome that."

Prior to the pandemic, the xR technology was first used in a broadcast that delivered a visual spectacle for viewers watching the HP OMEN Challenge e-sports tournament in September 2019. Here, the disguise gx 2c and gx 1 media servers were used alongside the xR workflow to power the real-time generative Notch content that accompanied the gameplay. The company's media servers also help power audio-visual company White Light's 'SmartStage' system, which has been used to create an immersive virtual classroom solution for an online Masters in Business Administration programme at Michigan University in the US.

The company believes xR technology will change the face of delivery in broadcast, "proving to be a vital lifeline in taking virtual productions to the next level". Tom Rockhill, chief sales executive at disguise, says: "xR enables creative and technical professionals to tell stories in new and innovative ways for the world's leading brands and artists and produces opportunities for collaboration, combining teams and technologies to drive a single creative vision."

Rise of the robot

Robotics, artificial intelligence and machine learning are no longer the stuff of science fiction. It's simply a given that AIEd is becoming ...

Game Changers Deep Dive: Glasgow’s Climate Action Story

Glasgow's Climate Action Story By Gavin Slater, Head of Sustainability, Neighbourhoods & Sustainability, Glasgow City Council The City of Glasgow has experienced constant change and evolution. In 1765, James Watt, while walking on Glasgow Green, conceived of the separate condenser for the steam engine and thus set about an acceleration of the industrial age, inadvertently enabling the acceleration of climate change. In the years that followed, Glasgow became an industrial powerhouse. The ripples from that one moment in time here in Glasgow lapped the shores of the entire world, changing it just as much as it transformed us. Since then we have generated new ways of urban living, but with them has come the generation of the greenhouse gases that have […]

China’s Spaceplane Projects: Past, Present and Future

This article is the second and final part of a two-piece blog post by China Aerospace Blog on Chinese spaceplanes. The first part discussed China’s historical approach to reusability, and more specifically to spaceplanes. This part extends the discussion by reviewing current Chinese spaceplane projects, and provides a map. Mapping Current Chinese Spaceplane Projects Below is a map of all Chinese spaceplane projects, including abandoned …

Chinese Aviation Cities: the Obvious, the Unexpected, and the Discrepant

In this blog post, China Aerospace Blog revisits the “What is the Chinese Aviation Industry Like?” article, which dates back to July 2018. While the former was essentially based on the China Civil Aviation Industry Report 2017 by MIIT, this blog post is built on a different set of data detailed below. The focus is also different: we single out the main aeronautical cities in …
