What Amazon's Echo Look Means for Visual Commerce

Specialized selfie-taking cameras and smart mirrors are just the beginning  

Author: Krista Garcia

June 8, 2018

More than a year after Amazon debuted the Echo Look, a $199 Alexa-fueled, freestanding camera to take photos of a user's outfits, the device has moved from the invite-only stage to full US availability. Coupled with Amazon's push into private-label apparel, Echo Look is just one more attempt to penetrate the competitive fashion category.

Echo Look can take full-length selfies—or 6-second videos—triggered by a voice command. With built-in lighting and bokeh-effect photos (what newer iPhone models call "portrait" mode), it's a product suited for the Instagram era. Users can store images in collections sorted by season, color or other self-created categories.

The "Style Check" feature allows users to compare photos of themselves in two outfits side by side, and gives scores to both outfits with reasons why one was more successful—like the colors matching better or the choice of shoes working better. "Better" is a subjective concept, though, and Amazon's version of it is determined by a mysterious combination of "advanced machine learning algorithms and advice from fashion specialists."

With each use, Amazon captures a rich trove of customer data it can mine for personalized recommendations. The average apparel retailer doesn't have granular insights into what styles, brands or colors customers are wearing every day, how often they wear items, or how they choose items depending on season or occasion.

Retailers across sectors, like Neiman Marcus and The Home Depot, have been introducing visual search initiatives for the past few years. Roughly one-third (32%) of retail executives worldwide that use artificial intelligence (AI) to personalize the customer experience said they had enabled visual search, according to a January 2018 Deloitte and Salesforce survey. 

While visual search shows promise, using photos to find products hasn't seen mass adoption yet. In a survey conducted almost a year ago (July 2017) by the National Retail Federation (NRF) and Toluna, 27% of US internet users said they were aware of visual search—more than were familiar with voice search (21%) and smart dressing rooms (13%), two features related to the Echo Look's functionality.

In May 2018, RichRelevance gauged US internet users' interest in using a retailer's mobile app to take photos and get product recommendations. A majority (52.3%) said they would like this feature, while fewer than one-fifth were not interested.   

Echo Look serves a similar purpose to the smart mirror technology being adopted by more retailers. Just last week, H&M began trialing a voice-activated selfie-snapping mirror in its Times Square flagship. The mirror puts users' faces on a virtual magazine cover bearing H&M's logo and offers outfit suggestions, which can be purchased digitally via QR code. According to German trade publication Lebensmittel Zeitung, 86% of people who tried the mirror chose to download their photo, and 10% registered for H&M's newsletter in exchange for a discount and free shipping.

Echo Look's camera also has implications for applications like body scanning. Brooks Brothers has been using depth-sensing cameras to digitally tailor clothing at its Madison Avenue location. In May 2018, Amazon put a call out in New York City for people willing to have their measurements taken over time, which hints at the direction this in-home fashion camera could go. It's not outlandish to imagine a future where Amazon can glean measurements, construct a custom garment and deliver it in the standard Prime two-day window.