The evolution of search
Imitation is said to be the sincerest form of flattery, so when someone asks where you bought your shoes/bag/coat/dress, it’s nice to be able to tell them. But what if you could take a picture on your mobile phone of someone wearing those shoes/bag/coat/dress and do an instant search for them – or something similar – to find out where you can buy them? That’s just the service being offered by Snap Fashion and fashion retailer Zalando, among others.
Oasis, for example, added the enabling visual search technology to its site in September 2014. The ‘Snap for Similar’ button allows its customers to browse online for garments which are similar in colour or texture.
The firm is using technology from Snap Fashion, which, as well as working with retailers, has its own affiliate site enabling people to search not only by colour and texture/pattern but also by garment shape. Shoppers are given the results in three separate lists. While no one list is more popular than the others, Jenny Griffiths, Snap Fashion’s Founder and CEO, does say that its mobile app ColourPop ranks higher in the app store than the full Snap Fashion app. The colour match does seem to be the one with the wow factor, offering instant gratification, she explains.
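Snap Fashion has not published how its colour matching works, but the idea of ranking a catalogue by colour similarity can be illustrated with a minimal sketch. The catalogue, garment names and RGB values below are all invented for illustration; a production system would extract colour histograms from product images rather than use a single hand-tagged colour.

```python
import math

# Hypothetical catalogue: each garment tagged with one dominant RGB colour.
# (Illustrative only -- real systems derive colour features from images.)
catalogue = {
    "red wrap dress":   (190, 30, 40),
    "burgundy midi":    (120, 20, 35),
    "navy shift dress": (25, 35, 90),
    "coral sundress":   (250, 110, 95),
}

def colour_distance(a, b):
    """Euclidean distance in RGB space -- a crude but serviceable proxy."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similar_by_colour(query_rgb, n=3):
    """Return the n catalogue items closest in colour to the query."""
    ranked = sorted(catalogue,
                    key=lambda name: colour_distance(query_rgb, catalogue[name]))
    return ranked[:n]

# A shopper snaps a deep-red garment:
print(similar_by_colour((180, 25, 45)))  # red wrap dress ranks first
```

Plain RGB distance is the simplest possible choice; perceptually uniform colour spaces give results that better match what shoppers actually see as "similar".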
“Visual search democratises shopping,” says Griffiths, “since shoppers are clicking on things they like the look of rather than just visiting the same retailers or brands that they always visit online. In this way, we can see which are the smaller, up and coming retailers, how many 18 year olds are clicking on a particular garment from Marks & Spencer or that a particular size dress is very popular.”
The company’s mobile app enables shoppers to take a picture of what someone is wearing and search for something similar among the products on the Snap Fashion database. Some 35% of searches lead to a visit to retailers’ sites. This is lower than the “more robust” desktop site, which sees 60% of shoppers clicking through. When Snap Fashion is implemented on a retailer’s site, where shoppers already know the brand, “conversion goes through the roof,” according to Griffiths.

SAME OR SIMILAR
In the US, visual search is incorporated into the Neiman Marcus mobile app to help shoppers looking for the right handbag or shoes. The shopper photographs a pair of shoes or a handbag, whether in an advert, in a magazine or on someone wearing them, and the app automatically finds the item on the website, if it’s that season’s stock, or matches the image to something similar among the current inventory, explains the company’s VP of Corporate Comms, Ginger Reeder.
The technology does have its limitations though, with better results from a close-up image with a clean background. The company plans to expand its use to other product categories.
Staff at Neiman Marcus stores are also using the app to help customers. Sometimes, it’s not easy for customers to describe exactly what they are looking for, explains Reeder, so store associates can use the app on an iPhone to help them narrow the search by colour, shape or style. “Customers are pleased with the app as it answers a problem they’re facing then and there,” explains Reeder.
Eric McCoy, founder and CEO of Heels.com, agrees that it can often be difficult for shoppers to find the right words to describe the precise shoe they’re looking for. “For example, shoppers might say they want a red heel, but with almost infinite shades, it’s all subjective,” he says. Heels.com, in partnership with open innovation company Iterate Studio, has implemented pixel image-based technology from PCCSO to help customers find what they want.
It’s the ability to surface something similar, rather than returning no results, that retailers can exploit to make it less likely that a customer leaves a site empty-handed. It’s down to retailers to bring in business logic, since search results can be weighted based on inventory or merchandising rules. For example, if a shopper is searching for handbags, the results can be weighted so they first show handbags that are on sale.
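The vendors quoted here don’t describe how such weighting is implemented, but the pattern is straightforward: visual similarity produces a base score, and merchandising rules then adjust it. The items, scores and rule values below are all assumed for illustration.

```python
# Sketch of merchandising-rule weighting (assumed structure, not a vendor API).
results = [
    # (item, visual_similarity 0..1, on_sale, units_in_stock)
    ("tan tote",      0.92, False, 3),
    ("tan shopper",   0.88, True,  40),
    ("brown satchel", 0.85, True,  2),
]

SALE_BOOST = 0.15          # push sale stock up the list
LOW_STOCK_PENALTY = 0.10   # demote lines about to sell out

def weighted_score(item):
    """Combine visual similarity with simple business rules."""
    name, similarity, on_sale, stock = item
    score = similarity
    if on_sale:
        score += SALE_BOOST
    if stock < 5:
        score -= LOW_STOCK_PENALTY
    return score

for item in sorted(results, key=weighted_score, reverse=True):
    print(item[0], round(weighted_score(item), 2))
```

With these made-up numbers the sale item overtakes the closest visual match, which is exactly the behaviour the article describes: the retailer, not the image matcher alone, decides the final ranking.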
“That’s when the real power comes in,” says John Safa, Chief Technology Officer of Cortexica, the company which powers apps for Macy’s and Zalando. Retail websites now gain an extra shop-assistant function, effectively replicating the shop assistant’s response of ‘we don’t have that one, but how about these, which are similar in colour, pattern or shape’. “It’s instant gratification for the Instagram generation,” he says.

INSIGHT
Visual search, though, isn’t just about matching product with the customer; it also gives retailers insight into what shoppers are looking for. As with any app on a mobile device, retailers are gathering data about the shopper, their device and their location. Search always gives data that goes beyond search, but “visual search gives customers more insight into what’s going on,” says Safa.
Rather than a text search which shows shoppers looking for a black dress, visual search will show what shape of black dress they are seeking.
“It’s always interesting to see what’s breaking the system,” says Griffiths, since searches that return no matches are highlighting potential trends for the next season. “We’re acquiring a lot of data for retailers based on visual product search that will help them to make merchandising decisions,” says Mark Elfenbein, CEO of Slyce – the company behind the Neiman Marcus app. This real-time, regional data which shows what’s trending in different areas in different time zones is what will turn visual search from a shop assistant or conversion tool into a data and analytics platform.

TESTING THE CAMERA
European fashion retailer Zalando is well known for developing its own technology in-house, so it’s interesting that it has been collaborating with Cortexica on visual search functionality for its mobile phone app. The Photo Search feature allows customers to take a picture of an item of clothing or a pattern and search for similar items being sold by the retailer.
However, as Daniel Schneider, Head of Onsite Customer Journey at Zalando explains, visual search is “more than just picture taking”. For him, the Cortexica link-up is just the first step in its journey to find how to make the search for fashion more visual. “We’re trying to understand what the customer is looking for,” explains Schneider, “since they may not actually be looking for the item in the photograph but a recommendation of something to go with it, for example.”
Feedback from customers has been mixed, with Photo Search not being used as often as text searches. But as Schneider points out, the functionality of the app is leading to what he calls “playful” searches. “Some people are taking multiple pictures,” he says, as they see something they like and wonder if Zalando has something similar. Even when shoppers are browsing on a desktop site, he believes that they are searching visually, because one click leads to another as they spot items they like the look of.
More significant, though, is the fact that visual search is the company’s first foray into seeing how it can use the camera on a mobile phone. “Mobile is important,” says Schneider, as mobile phone and tablet make up 43% of all traffic to Zalando, so location and the camera become part of that. What the future holds for the technology is something that Zalando – which sees itself as a technology company rather than a retailer – is investigating. It may be used for fashion advice or for picture sharing, but one thing Schneider is sure of is that it won’t replace text-based search, since in most text searches people are looking for specific brands. “It will be complementary and in addition to that though,” he adds.

FUTURE
Is the future for visual search a ‘Shazam for fashion’? The technology companies agree that it’s close but still early days. “It will be driven by consumers,” says Safa. In terms of where visual search could go, Elfenbein believes that the industry is where Google was with search in 2002. “People are communicating in pictures, searching by images, so quick image-based search will become commonplace in a couple of years,” he says.
However, as the Tesco wine finder app showed back in 2009, visual search technology is not just for fashion.
Amazon’s Flow app tells users more about the items around them which are available to purchase on Amazon.com. By continuously scanning, the app can identify millions of products, including DVDs and books, and packaged household items such as a box of cereal or tissues. The functionality has been extended so users can now scan phone numbers, URLs and business cards and then quickly dial, open or add the information to their contacts.
Visual search is also being used by other retailers to identify items and add them to grocery lists or to create wish lists, add purchases to a loyalty scheme or even to digitise coupons.
The Shopper mobile shopping list application from Purchase Decision Network (PDN) uses visual search as a means for shoppers to add items to their grocery shopping list. Rather than having to scan a barcode, Snap2Add functionality enables more than 1.75 million users of the app to snap photos of grocery products around their home and have them recognised and immediately added to their mobile shopping lists. A catalogue of 20,000 of the most-listed items has been created and more products are being added weekly, explains CEO Sean Flynn. The items don’t even need to be in packaging to be recognised, so along with fruit and vegetables, the visual search will recognise Rice Krispies in a bowl – although labelling them cereal and not by brand and product number. If something isn’t recognised, the app falls back to visual search company Slyce and then to human intervention via Mechanical Turk. “The identification rate is over 90%,” says Flynn.
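The recognise-or-escalate pattern described above can be sketched in a few lines. Everything here is assumed for illustration: the confidence threshold, the stubbed classifier and the function names are not PDN’s or Slyce’s actual code, and the human-review step merely stands in for escalation to a crowdsourcing service such as Mechanical Turk.

```python
# Hypothetical confidence threshold below which a photo goes to a person.
CONFIDENCE_THRESHOLD = 0.8

def recognise(photo):
    """Stand-in for an automated image classifier.

    Returns (label, confidence); stubbed here with canned answers."""
    canned = {"bowl_of_cereal.jpg": ("cereal", 0.93),
              "mystery_item.jpg": ("unknown", 0.31)}
    return canned.get(photo, ("unknown", 0.0))

def queue_for_human_review(photo):
    """Stand-in for escalation to a human-review service."""
    return f"queued:{photo}"

def add_to_list(photo, shopping_list):
    """Add a recognised item automatically, or escalate to a person."""
    label, confidence = recognise(photo)
    if confidence >= CONFIDENCE_THRESHOLD:
        shopping_list.append(label)   # confident automatic match
        return "auto"
    queue_for_human_review(photo)     # fall back to human intervention
    return "escalated"

shopping_list = []
print(add_to_list("bowl_of_cereal.jpg", shopping_list))  # auto
print(add_to_list("mystery_item.jpg", shopping_list))    # escalated
print(shopping_list)                                     # ['cereal']
```

The threshold is the lever behind the “over 90%” identification rate quoted above: set it high and more photos are escalated but fewer are mislabelled, set it low and automation rises at the cost of accuracy.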
Because the Slyce technology, as used by PDN, can understand and digitise the information on a receipt or coupon, the data these hold, such as the date and price, could be used to add the purchase to a loyalty scheme or to notify a shopper that their 20% discount scanned from a coupon will expire in two days’ time as they walk past a shop branch.
This digitising of coupons is something that Slyce will be working on with its acquisition of SnipSnap, the third largest mobile couponing company in the US. SnipSnap was the first company to develop image-recognition technology for scanning and interpreting coupons with a mobile device. The SnipSnap app launched in the US in 2012 and is now used by 4.5 million shoppers who photograph a coupon on their smartphone camera and then search, retrieve and share it with other SnipSnap users. Information such as the barcode and expiration date is captured and digitised so users simply need to present their phone at the in-store checkout to redeem the offer.
SnipSnap provides a white label platform for retailers, including Toys “R” Us, to create and target mobile coupons across email, web, mobile and SMS channels. It was also incorporated with beacons by Lord & Taylor, where “SnipSnap’s location and behavioural targeting mixed with micro-location beacons served as the perfect offline to online marketing bridge for stores,” explains Ryan Craver, former SVP Strategy for Lord & Taylor-owner Hudson’s Bay Company.
Slyce’s vision is to fold 3D visual search, instant couponing and one-click customer checkout into one application. As Slyce’s applications show, the use case for visual search goes way beyond fashion retailing – and beyond just the retail industry. And the future for retailers? Well, while many are still getting their heads around location and using this smartphone functionality, Zalando has moved onto the camera. As a technology company, its ethos is: “if we discover and work on it, we’ll be there in a year or two”. Zalando doesn’t want to be a follower, do you?