
Will Visual Search Drive AR Shopping?

AR Insider

Immersive shopping is proving to have experiential impact for consumers, and revenue impact for brands. Related to — but separate from — AR advertising, this is when AR is used as a tool to visualize and contextualize products to engender more informed consumer purchases.

This is a subset of AR that we call camera commerce. It comes in a few flavors, including visualizing products on “spaces and faces.” It also includes visual search — pointing one’s smartphone camera at a given product to get informational, identifying, or transactional overlays.

In each case, AR brings additional context and confidence to product purchases. And this value has been elevated during a pandemic, as AR brings back some of the product dimension and tactile detail that’s been taken away from consumers during retail lockdowns.

Synthesizing these factors, ARtillery Intelligence recently produced a report to dive into the drivers and dynamics of camera commerce. How is the field shaping up? Who’s doing what? And how big is the market opportunity? We’ve excerpted the report below for AR Insider readers.

Though visual search is less prevalent than AR shopping’s other main format — product visualization (a.k.a., “try-before-you-buy”) — it has greater potential due to its high-intent orientation. Visual searches happen when consumers want to actively identify an item visually.

This makes visual search a natural evolutionary step from web search. Indeed, one of the things that’s made web search so lucrative for Google and others is the same “high-intent” orientation where consumers explicitly indicate a specific need. That makes contextual advertising natural.

Visual search takes that principle into the next generation of camera-based experiences and visual media. It won’t replace web search, of course, but it will supplement it with an alternative visual input, one that will resonate with camera-forward millennials and Gen Z.

In fact, these are among the reasons Google is so keen on visual search. Along with voice search, Google sees it as a way to boost search query volume by letting people search from more places and modalities. It’s also a play to future-proof its core search business by leaning into emerging tech.


Beyond sheer numbers, Google Lens’ growth validates its broadening capability. Launched initially with use cases around identifying pets and flowers, the eventual goal — in true Google fashion — is to be a “knowledge layer” for monetizable searches like shoppable products.

This raises the question of what types of products shine in visual search. Early signs point to items with visual complexity and unclear branding. This includes style items (“who makes that dress?”) and in-aisle retail queries, which could position visual search strongly for the post-Covid world.

Another natural use case is local discovery. Visual search could be a fitting tool for finding out more about a new restaurant — or booking a reservation — by pointing your phone at it. The smartphone era has taught us that search intent is high when the subject is in proximity.

In fact, Google has already begun to develop this opportunity with its Live View urban navigation feature. When using it, consumers can see businesses along their route identified visually through AR overlays — the first step towards a visually-driven local search tool.

But these efforts could take a while to materialize — at least the monetization components. Google is in the process of testing visual search, optimizing the UX, and devising interfaces for sponsored content insertion. A key question: what will be the “results page” of visual search?

The challenge — just like with voice search — is that there isn’t a “10 blue links” results page. So monetization will defy the traditional search model. This could involve enhanced results (think: “buy” buttons) when a visual search advertiser is discovered on Google Lens.

Until then, Google can use visual search behavior to optimize web search results. In other words, you won’t see sponsored results in a visual search flow, but you’ll see visual-search-informed results when back on web search — assuming you’re signed in to the same Google account.

To further grease the adoption wheels, Google continues to develop visual search “training wheels.” This includes putting Google Lens front and center in well-traveled places such as Google’s iPhone app. This could reduce friction and help incubate visual search behavior.

We’ll pause there and circle back with more analysis in the next report excerpt. Meanwhile, check out the full report here and the video companion below.


Source: https://arvrjourney.com/will-visual-search-drive-ar-shopping-a354f90ab72c?source=rss—-d01820283d6d—4
