
Tag: Camera

My Cranfield MSc: Robotics MSc – my group project experience

Hello! My name is Angelos Dimakos, and I am from Greece. I am currently studying for an MSc in Robotics at Cranfield University. I am also the Course Representative for this course. In this blog post, I'd like to take you through my group project experience on the Robotics MSc. The group design project element ...

The post My Cranfield MSc: Robotics MSc – my group project experience appeared first on Cranfield University Blogs.

How to View Art from Your Chromecast in Augmented Reality

The Chromecast TV streaming lineup from Google is one of the more popular products in the category, primarily due to its low price tag and broad app support. With more than 430,000 reviews on the App Store, Chromecast has even found fans among those in the Apple ecosystem. The latest addition to the lineup, Chromecast with Google TV, elevates the platform from a casting dongle to a full-fledged smart TV device and Apple TV alternative. Although Chromecast's obvious strength is in streaming video, its Ambient Mode is one of its low-key great side benefits. Ambient Mode (formerly known as Backdrop... more

PepsiCo’s Doritos Used to Push for Return to Live Concerts with Augmented Reality Experience

With fears over the COVID-19 pandemic subsiding, live concerts are returning. And Doritos wants to send music fans to a few of them via an augmented reality promotion. Dubbed "Make Your Play," the campaign is centered on scannable posters, denoted by a Doritos chip displayed as a play button, found in London, Manchester, Birmingham, Leeds, Glasgow, Liverpool, Bristol, Brighton, Cardiff, and Southampton in the UK. Don't Miss: 8th Wall's Web AR Brings Album Art of Pink Floyd to Life Scanning the QR code opens the Doritos Make Your Play website. The AR experience begins with participants... more

PackageX Mailroom Review 2023: Features, Pricing & More

Today, we will have an in-depth PackageX Mailroom review – one of the best mailroom management solutions available on the market. Mail automation brings many...

JigSpace Puts Together $4.7 Million in Funding to Expand AR Tutorial Technology

The startup JigSpace, which was among the first apps to support ARKit and LiDAR for iPhone augmented reality apps, has capitalized on its early mover status by innovating within the space. On Wednesday, JigSpace closed a $4.7 million Series A round of funding, led by Rampersand with Investible, Vulpes, and Roger Allen AM participating as well. Don't Miss: The 50 Best Augmented Reality Apps for iPhone, iPad & Android Devices "Creating and sharing knowledge in 3D should be simple, useful, and delightful. We're on a mission to unlock the utility of augmented reality at massive scale and bring... more

Snapchat Extends Its AR Lenses to Viber Messaging App

Yo dawg, Snap heard you using Snapchat augmented reality Lenses in your messages, so it's putting its AR Lenses in other messaging apps. On Wednesday, Rakuten Viber flipped the switch on an update that brings Snapchat's popular AR Lenses to many users of its Viber messaging and voice communications app. Don't Miss: Snap Releases Lens Studio 4.0 with 3D Body Mesh & More, Upgrades Scan as Fashion Assistant Viber users will now have access to 30 Lenses at launch for augmented video messaging and photos created in the app, but the selection will expand over time, with Rakuten Viber planning to... more

Review of Rainbow Six Siege in 2021 6️⃣ Is It Still Worth It?

Are you wondering how many people play Rainbow Six Siege? Would you like to find out if it is still popular in 2021, or if it is dying? Let's answer these and more questions down below. Review: What Is Rainbow Six Siege? Is It Worth It In 2021? Rainbow Six Siege is a tactical […]

The post Review of Rainbow Six Siege in 2021 6️⃣ Is It Still Worth It? appeared first on Gamer One.

You Can Now Try-On & Test Samsung Galaxy Smartwatches in AR via Snapchat

Now that Snapchat has extended its virtual try-on powers from the face and feet to the wrist, Samsung is leveraging the new capability, along with an interactive twist, to sell its smartwatches. Samsung Electronics UK partnered with Snapchat to run a virtual try-on for Samsung Galaxy Watch 3 using the app's wrist-tracking technology to project the product onto consumers' wrists. Don't Miss: Snap Releases Lens Studio 4.0 with 3D Body Mesh & More, Upgrades Scan as Fashion Assistant After scanning the Snapcode or selecting the Samsung icon from the Lens Carousel, users can scroll through... more

Immerse yourself in the Olympics this summer



Tokyo will see many firsts when it hosts the Games of the XXXII Olympiad. Not least, the 2020 Games are being held a year later than planned and less than one year before the 2022 Winter Olympics in Beijing.

The worldwide pandemic also means that there will be no international visitors in Tokyo, but Olympic Broadcasting Services (OBS) will build on virtual-reality (VR) technology introduced in Rio in 2016 and PyeongChang’s 2018 Winter Olympics to make TV viewers feel a part of the Games from the comfort of their own homes.

OBS is the host broadcast organisation for the Olympic, Youth Olympic and Paralympic Games. It was created by the IOC in 2001. It provides broadcast content for use by all rights-holding broadcasters (e.g. the BBC, Eurosport) around the world, and it also helps broadcasters prepare for the Games; OBS oversees the fit-out of the International Broadcast Centre (IBC), which is the home of the broadcasting operations of the Games. It also prepares the compounds at each competition venue, where OBS and rights-holding broadcasters’ production and technical facilities are located and from where international television and radio signals will be produced.  

OBS expects to produce 56 live feeds and nearly 9,500 hours of content during the 17-day event, which comprises 339 events across 50 sporting disciplines. It has a team of nearly 8,000 people from more than 70 countries, with specialist skills in broadcasting different sports.

Tokyo will be the first Olympic Games to be natively produced in ultra-high definition (UHD) – or 4K – and high dynamic range (HDR). UHD content has a resolution of 3,840 x 2,160 pixels, four times the number of pixels of full HD, to give more detail. HDR technology improves the contrast between black and white pixels for an accurate picture with more colour shades. At previous Games, UHD coverage was produced in parallel with broadcasters, but this will be the first time the native OBS world feed is produced directly in UHD and HDR across all competitions and ceremonies of the Games.
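The resolution arithmetic above can be checked in a few lines (a minimal sketch; the format names are just labels):

```python
# Pixel counts for the broadcast formats discussed: full HD, UHD (4K) and 8K.
formats = {
    "Full HD": (1920, 1080),
    "UHD/4K": (3840, 2160),
    "8K": (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in formats.items()}

print(pixels["UHD/4K"] // pixels["Full HD"])  # 4: UHD has four times the pixels of full HD
print(pixels["8K"] // pixels["UHD/4K"])       # 4: 8K quadruples UHD again
print(pixels["8K"] // pixels["Full HD"])      # 16: and is 16 times full HD
```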

Until recently there was no universal standard for UHD and the application of HDR was not a foregone conclusion, says Yiannis Exarchos, chief executive officer of OBS. He sets great store by HDR in particular. "It brings, especially in outdoor sport, a level of detail, both in colour and light gradation, which really makes the images considerably more realistic than you get in HD; it's not just about resolution."

OBS has developed a standard which derives the best possible HDR output out of the UHD solution, making it more efficient and sustainable because OBS does not have to double up broadcast resources.

The closing ceremony will be broadcast in 8K to make the most of what are traditionally vibrant spectacles. 8K doubles 4K's resolution in each dimension to 7,680 x 4,320 pixels, giving four times the pixels of 4K and 16 times those of full HD. Other sports (athletics, gymnastics, judo, and some swimming events) will be available in 8K in Japan, although other broadcasters may pick up the feed and experiment with trial transmissions. There are already plans for 8K broadcasts by Chinese broadcasters for the 2022 Winter Olympics in Beijing.

Another Tokyo first will be immersive audio. All venues will deliver an immersive audio feed as discrete channels in a 5.1.4 format: five main channels (front left, centre and right, plus two surround channels to the sides of or behind the listener), one dedicated bass channel, and four height channels above the listener. There will be 85 separate 5.1.4 audio feeds available for national broadcasters.
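As a sketch, the 5.1.4 layout described above can be written out as a simple data structure. The channel labels follow common industry convention and are assumptions for illustration, not OBS's actual naming:

```python
# 5.1.4 immersive audio: 5 ear-level channels, 1 low-frequency channel,
# 4 height channels above the listener. Labels are conventional, illustrative only.
LAYOUT_5_1_4 = {
    "main":   ["FL", "FC", "FR", "SL", "SR"],    # front left/centre/right + two surrounds
    "lfe":    ["LFE"],                           # dedicated bass channel
    "height": ["TpFL", "TpFR", "TpBL", "TpBR"],  # four overhead channels
}

total_channels = sum(len(chs) for chs in LAYOUT_5_1_4.values())
print(total_channels)  # 10 discrete channels per feed
```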

Immersive audio brings a three-dimensional experience to viewers or listeners. Exarchos says: “I am a huge audio fan because I believe that audio, in some sense, is the carrier of emotions... so I am happy that for the first time, across all sports, we will do 5.1.4 immersive audio.” Countries such as Japan, China, the US, Brazil, and most of Europe (via Discovery and Eurosport channels) will be able to experience 5.1.4 audio transmission.  

To bring the 3D quality of the Games to viewers from events held in arenas, OBS has partnered with Intel to bring its True View technology to Tokyo. True View lets directors use images from virtual cameras all around a venue to deliver perspectives that cannot be captured by physical ones. It allows viewers to select the angle from which they want to view the action, with options for three or six degrees of freedom. It uses an array of high-resolution cameras positioned to capture the entire field of play, connected to on-site servers based on Intel's Xeon processors. Data from the cameras is sent to the cloud to be synchronised, analysed, and processed.

In the production suite, engineers can use virtual stationary and tracking cameras to create content focused on particular points in the game, maybe the most exciting action or sequences, for analysis by commentators. Images from the virtual cameras are rendered and converted into compressed digital video in the cloud. Up to 200 terabytes of raw data is processed per event, including height, width, depth, and relative attributes to create high-fidelity 3D video.

True View supports common industry-standard codecs (H.264, H.265 and MPEG for video, AAC for audio) for use on different platforms and devices. The encoded video is converted into bit streams for live streaming. The rights-holding broadcaster decompresses the bit streams into a series of images, which are rendered sequentially and broadcast.

The volumetric output allows viewers to see all perspectives of the game: to follow a particular player, or to see the play from any position on the field – including that of the referees.

In Tokyo, TrueView will be used for basketball games. It is, says Exarchos, one of the sports where it could be really outstanding, because of the three-dimensional nature of the sport. “It is also a very fast sport, that moves in many different directions and there is significant vertical movement.” OBS will produce five to seven volumetric replay clips for each game, which will be made available for directors to integrate into live coverage. “I’m pretty sure that some of these clips will go viral on social media platforms,” he adds.

Tokyo is the first step as broadcasters explore how to create exciting content. It could be someone ‘walking’ into the field of play, turning and watching the athletes around him or her, suggests Exarchos. He expects to see this develop and is looking forward to what will be broadcast at the 2024 Olympics in Paris.

An added benefit is that True View saves costs and carbon footprint by removing the need for a camera crew to travel to the venue.

‘Covid, travel bans and a postponement are challenges that should be used to advance broadcasting.’

Yiannis Exarchos, OBS

While Exarchos is sworn to secrecy and cannot reveal details of the opening ceremony, he is able to confirm that it will be broadcast using multiple cameras for an immersive, virtual-reality experience.

VR is not new for the Olympic Games; it was used in 2016 at Rio and in 2018’s Winter Games in South Korea. For the opening ceremony, there will be six 180° camera systems and one 360° VR system. There will also be experimentation with 5G technology during the ceremony to ensure that content is quickly received at high resolutions and turned around for the worldwide audience.

“I believe that 5G can be a big game-changer for broadcasting,” says Exarchos. “It can help us get rid of a lot of the technical constraints, equipment and regulatory needs, as well as permits for RF transmissions... We need to find a way that we can use all these interesting technologies in a way which is equally effective as we do with tools we have now.” VR demands huge bandwidth, and 5G’s reduced latency could address synchronisation issues around live events. “5G provides ample bandwidth and also defeats the digital problems of latency to a very big extent,” he continues. There will also be opportunities for progress using mobile phone screens and their computational capabilities; “hence our collaboration with Intel,” Exarchos says.

The VR experience will be “significantly upgraded” compared to that used at PyeongChang, with coverage of 47 live events as well as between 50 and 100 pieces of VR highlights. “We have noticed that people tend to prefer to experience highlights on VR,” says Exarchos. “For Beijing, our common goal, with Intel, is to produce an 8K VR product.”

The 2018 Winter Games introduced Intel True VR, with OBS broadcasting 30 events in VR, both live and on demand. In Tokyo, three to six camera viewpoints will record each event, producing content compatible with most commercially available VR headsets. Viewers can watch an immersive ‘VR Cast’ seen from different angles, complete with graphics and picture-in-picture overlays. True VR can also be used for post-event highlights, and production teams can overlay statistics or picture-in-picture content.

The quest for stats by viewers has also led to Intel’s 3D Athlete Tracking (3DAT) being used at this year’s Games. Originally a coaching tool, it uses AI to identify 20 skeletal points on an athlete. When a race is replayed, these points indicate where pressure or stress is being exerted and reveal moments of acceleration and deceleration. In Tokyo it will be used after races, letting viewers analyse a race in detail.

Intel’s director of sports performance technology, Jonathan Lee, explains how much data processing is involved. “We will track all eight or nine sprinters, so that’s nine videos at 60 frames per second for 10 to 12 seconds. It is not just the detection of athletes but tracking skeletal points... We use AI because the athletes are next to each other, so we may not be able to see the arm of the sprinter in lane five, for example,” he says. AI is used to differentiate which sprinter is which at any given moment.
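Lee's figures imply a substantial per-race workload, which a back-of-the-envelope calculation makes concrete (taking the upper end of his estimates; the 20 skeletal points come from the 3DAT description above):

```python
# Rough per-race workload implied by Lee's figures (upper-end estimates).
sprinters = 9      # "all eight or nine sprinters"
fps = 60           # frames per second, per sprinter's video
race_seconds = 12  # a roughly 10-to-12-second sprint
keypoints = 20     # skeletal points identified per athlete per frame

frames = sprinters * fps * race_seconds  # video frames to analyse per race
points = frames * keypoints              # individual skeletal detections to track

print(frames)  # 6480
print(points)  # 129600
```

All of that, per the article, is processed in the cloud and turned around in under a minute.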

Data that is of interest to an athlete and coach may be too detailed for a viewer, but statistics such as top speed, when a sprinter reaches that top speed, and when they decelerate are likely to be shown. 3DAT allows athletes and coaches to review a race and see that they maintain their top speed for, say, 20 or 30m and then start to decelerate at the 70m mark. “To be honest, to be able to digest all that information quickly, you’ll see colour indicators, almost like a heat map,” Lee says. The classic view of the sprinters racing towards the finish will be embellished with colours indicating their speed, moving from yellow to red and darker as they go faster, explains Lee. This in itself is a visual representation, but post-production teams can also overlay when a particular sprinter hits their top speed and what that speed is.
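A colour ramp of the kind Lee describes, yellow shading through red and darker as speed rises, could be sketched like this. The function and its breakpoints are illustrative assumptions, not Intel's actual mapping:

```python
def speed_colour(speed_ms: float, top_speed_ms: float) -> tuple:
    """Map a sprinter's current speed to an RGB colour: yellow when slow,
    pure red at half of top speed, darkening red approaching top speed.
    Breakpoints are illustrative, not Intel's actual implementation."""
    t = max(0.0, min(1.0, speed_ms / top_speed_ms))  # normalise to [0, 1]
    if t < 0.5:
        # fade yellow (255, 255, 0) towards red (255, 0, 0)
        return (255, int(255 * (1 - 2 * t)), 0)
    # darken red (255, 0, 0) towards dark red (128, 0, 0)
    return (int(255 - 127 * (2 * t - 1)), 0, 0)

print(speed_colour(0.0, 12.0))   # (255, 255, 0): stationary, yellow
print(speed_colour(12.0, 12.0))  # (128, 0, 0): top speed, dark red
```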

“We will work with broadcasters to understand how they use this type of content... perhaps some of the things coaches and athletes find interesting, we may find that fans find them compelling too, and we might start to weave them in as well,” suggests Lee. The technology is designed to complement the coverage of rights-holding broadcasters. The 3DAT clips with overlaid ‘heat maps’ can be used by commentators to examine what has happened.

Intel’s graphics partner receives the video from OBS, ingests the 3DAT data and renders the clips. AI processing happens in the cloud on Xeon scalable processors. It is turned around in less than 60 seconds.

“To be able to have volumetric replay, potentially also with analysis, and to be able to use it as a fourth or fifth replay after a race is a big deal,” says Exarchos. “I’m sure it will go faster and faster... and hopefully, we will do more in Beijing.”

Virtual reality

How to be there, without being there

The Tokyo Games are expected to be popular around the world as people emerge from a pandemic hibernation. Exarchos expects viewing figures to exceed Rio’s 3.3 billion worldwide. (The 2012 London Olympics was watched by 3.6 billion people worldwide, topped only by the 2008 Beijing Games’ 4.4 billion viewers).

The absence of foreign visitors in Tokyo will create a different experience for viewers, as much of the Olympic vibe is derived from crowd reactions. Flying the flag and roaring on a favourite are part of many events, and OBS has plans to allow spectators and fans to show their virtual support. It is providing rights-holding broadcasters with digital tools that enable end users to cheer for athletes, countries or the Games in general. These cheers will be shown in the venues.

This is a last-minute development, concedes Exarchos. “We decided to fast-track it, given the situation in Tokyo,” he says. This remote engagement by fans will be a virtual presence – they will be prompted to cheer or post a selfie celebrating a win. “They’re not just passive watchers, they will actually have a virtual presence in the field,” he explains.

“We made a commitment to deliver the same Games [regardless of the Covid pandemic]. We will provide exactly the same level of coverage. Since the [initial] postponement of the Games we have been thinking about ways we can address the situation. You never want to let a challenge go unused,” he adds.

Tokyo’s innovation team is planning a VR booth in a location outside Tokyo to allow fans to don VR headsets and be part of the Games virtually, using 360° projections and in co-operation with local production companies. There are similar plans in China; all, however, are dependent on permission from the local broadcasters.

Google Extends AR Search Lead Over Apple with Landmark Recognition for Google Lens

This week, Apple unveiled its own version of Google Lens in the form of Live Text. In response, Google just hit back with a new feature for its visual search tool called Places, a new search category that can recognize landmarks and return information on them within the camera view – a capability Apple touted for Live Text during its WWDC keynote. Don't Miss: Nintendo Crafts AR Experience via Google Lens for Pokémon Sword & Shield Game Packaging Places for Google Lens, which is available now worldwide, uses image recognition and Google Earth's 3D map assets to identify locations... more

Far Cry 6 Release Date, Game Details, and Much More

Far Cry 6 is the next instalment in the critically-acclaimed Far Cry series. It was announced a couple of years ago, and then during...

Rediscover your city through a new Lens this summer

Director, Product Management, Google Lens and AR

A new tool for rediscovery

Google Lens is now used over three billion times per month by people around the world, and with many ready...
