
Heels.com addresses mobile merchandising challenges via visual search

The preference to communicate visually rather than verbally is evident in the way selfies, pins and emojis push further into ubiquity. Partnering with Iterate Studio to evolve its ecommerce strategy and remain au courant with online retail technologies, Heels.com is opting for a browse-and-discover search technique, believing that text-based browsing will soon be retired.

“People often struggle to find the words to describe the exact product they want. It’s so much easier to visualize as verbal descriptions can be very subjective,” said Eric McCoy, founder and CEO, Heels.com. “A customer may say they want a ‘hot pink’ shoe, but the exact shade they have in mind could be very different than another person’s interpretation of ‘hot pink’.

“By using visual search, users are able to save time browsing and discovering products they love. We can instantly present our customers with products that meet the visual criteria.”

Visual search plays into the fact that people are visually oriented. Ninety percent of information transmitted to the brain is visual, and this information is processed 60,000 times faster than text, according to HubSpot.

If the shoe fits…
There is no doubt that products can be difficult for shoppers to describe with words, yet easy to visualize. On Heels.com, a customer can now use visuals to describe exactly what they want to find.

For the retailer, visual search accomplishes a previously unattainable milestone: facilitating an interaction with a consumer at her point of inspiration and desire.

“This new feature targets our savvy 18- to 35-year-old female audience, who are widely known to be the most engaged on visual social media platforms like Pinterest and Instagram,” Mr. McCoy said.

The visual search engine analyzes the pixels in an image and then compares those pixels to other images. There are two components to the visual search technology which aid in the discovery of products.

The first piece is the Style Match Image Search, which allows customers to upload a picture or enter an image URL, similar to Google’s “search by image” functionality.

The customer is then presented with all of the products that look similar to that image. Customers can find these sample images however they like, whether they snap their own picture or happen upon a product through social media or fashion blogs.
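
Neither company details how the matching works beyond the pixel-level comparison described above, but a common way to build this kind of search is to map every catalogue image to a feature vector with a pretrained vision model and rank products by how close their vectors sit to the query’s. The sketch below illustrates that idea in Python; the embed helper, the in-memory catalogue dictionary and the function names are illustrative assumptions, not Heels.com’s or Iterate Studio’s actual API.

```python
import numpy as np

def embed(image_bytes: bytes) -> np.ndarray:
    """Placeholder: in practice a pretrained vision model would map the
    image to a fixed-length feature vector; any real encoder could be
    swapped in here."""
    raise NotImplementedError

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity of two feature vectors; higher means more visually alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def style_match(query_image: bytes, catalogue: dict[str, np.ndarray], top_k: int = 10):
    """Rank catalogue products by visual similarity to an image the shopper
    uploads or links to, mirroring the Style Match flow described above."""
    query_vec = embed(query_image)
    scored = [(sku, cosine_similarity(query_vec, vec)) for sku, vec in catalogue.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]
```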

Customers can also find products that look visually similar to the style they are currently viewing, using the Sole Mates Search tool. If a shoe they love is out of stock, with one click they will instantly be presented with shoes that look visually similar, from color to construction to height of the heel.
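
The out-of-stock substitution the retailer describes can be sketched in the same way: treat the stored vector of the shoe currently being viewed as the query and rank only in-stock alternatives. Again, this is a minimal illustration under the same assumptions, not the Sole Mates implementation itself.

```python
import numpy as np

def sole_mates(current_sku: str, catalogue: dict[str, np.ndarray],
               in_stock: set[str], top_k: int = 10):
    """Suggest visually similar, in-stock alternatives to the shoe being viewed,
    ranked by cosine similarity between stored feature vectors."""
    query_vec = catalogue[current_sku]
    query_vec = query_vec / np.linalg.norm(query_vec)
    scored = []
    for sku, vec in catalogue.items():
        if sku == current_sku or sku not in in_stock:
            continue  # skip the viewed shoe itself and anything that cannot be bought
        scored.append((sku, float(np.dot(query_vec, vec / np.linalg.norm(vec)))))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]
```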

“People think and behave using pictures,” said Brian Sathianathan, head of emerging technologies for Iterate Studio. “We want to help Heels.com bring that natural human behavior to Web and mobile-based shopping.”

Show me, don’t tell me
Research from WeSEE, a visual recognition ad targeting system, reveals that 74 percent of consumers say that traditional text-based keyword queries are inefficient in helping them find the right items online, with 15 percent of these shoppers saying they regularly encounter trouble finding what they’re looking for using keyword searches.

Considering that 73 percent of people shop online by entering a search term into a search engine, retailers that rely solely on keyword search miss out on significant business, as 40 percent of shoppers would like their online shopping experience to be more visual, image-based and intuitive.

Recent advancements in the area of visual search are setting the stage for a major shift in how people interact with the world around them and how those selling can better interact with those buying.

German online clothing retailer Zalando announced in March that it is trialing a cloud-based visual search technology, called FindSimilar, that connects customers with the pair of shoes they are looking for.

The technology works by replicating the way the brain processes images and finding similarities to offer a curated shopping experience in-app or online. With the retailer’s product catalogue as the sole context for results, shoppers are able to perform complex visual search actions in any environment.

For consumers, branded apps using visual search have the potential to transcend being just a disposable gimmick and become a valued tool, providing guidance at the point of indecision, efficiency at the point of irritation and access to products at the very moment of inspiration.

A Deloitte study found that 80 percent of branded apps were downloaded fewer than 1,000 times and that the majority of smartphone users had only one or two retailer apps on their phone.

Mobile consumers are impatient and spoiled for choice. There must, therefore, be a highly compelling reason why they would choose to take up precious smartphone real estate with a brand or retailer app.

Visual search technology may present an opportunity to unify the disparate components of retailer operations, and provide the modern consumer with the compelling shopping experience that they are seeking.

“Visual search will absolutely change the way consumers shop,” Mr. McCoy said. “This is the age of the visual web. People express themselves visually and enjoy curating visual content through platforms like Pinterest, Instagram, Polyvore and Wanelo.”

“This innovation allows them to shop using the same behavior. It only makes sense that ecommerce should evolve with our users’ preference.”

Final Take
Michelle is an editorial assistant on Mobile Commerce Daily, New York