Thursday January 11 2018
What are the visual search tools of the future, and how could they hold the key to increasing mcommerce transactions for retailers? Olivier Legris, lead strategist at Future Platforms, explains.
While a number of recent predictions are painting a rosy picture for the future of mcommerce, the goalposts for delivering a successful shopping experience are on the move, leaving no time for retailers to rest on their laurels.
Retail ecommerce sales made via smartphones are set to be worth £16.42 billion in the UK this year, according to a report by eMarketer, which also predicts that smartphone channels will account for 46.5% of the UK’s total retail mcommerce sales.
These findings show that UK consumers are becoming more comfortable using their mobile devices to make retail purchases. eMarketer forecasts that mcommerce figures will continue to climb, reaching £58.5 billion by 2021 and accounting for more than half of the country’s retail ecommerce sales.
As the possibilities for mobile technology expand, retailers that don’t adapt will be left behind, while those that do will convert more sales and deliver a better shopping experience via mobile platforms. Visual search will be a key tool of the future, but there are points retailers should consider if they are going to get it right.
Problem 1: Choice
The balancing act of giving the customer enough choice while providing them with smarter, quicker and more accurate results is becoming increasingly difficult to achieve in today’s always-on world.
In a simple online customer journey, where the customer knows exactly what they want, the transaction is straightforward: the retailer makes a quick sale and the customer gets exactly what they were looking for.
For customers using mobile platforms, the journey until now has been adequate at best, and not without its pain points: retailers can deliver search results matching a customer’s criteria through specific keyword searches and filters, giving them enough insight to close the sale.
The problem at the moment is that keyword-generated search alone, particularly on mobile and especially in ecommerce, has limitations for product searches where inventories are vast.
For retailers with large-scale catalogues, a keyword approach does not necessarily yield the best experience: there’s too much choice for the customer to navigate. Mobile shoppers are often browsing on the move, so when you pair this with the small screen, an overload of options becomes a turn-off rather than an encouragement to buy.
Problem 2: Inspiration
Meeting customer expectations through the delivery of an intuitive retail experience throws up its own set of challenges, which intensifies when the need for inspiration is added into the equation.
Social media platforms such as Instagram and Pinterest are adding to our visual stimulation. They are teeming with accounts that do a great job of showcasing products both in-the-moment and in-the-hand. This inevitably raises the bar of expectation for customers.
There’s a huge opportunity in finding a smart solution to this challenge. Channels like Instagram rarely link images to the products they feature, so it’s hard for the customer to learn more without lengthy and often futile searches. They are inspired, but left frustrated that they cannot connect that source of inspiration to a successful purchase: there’s a gap in their journey.
While current solutions aren’t quite hitting the spot, those retailers that can eventually align themselves with customers at the point of inspiration will have an opportunity to guide that inspiration through the full retail journey and convert it into a sale.
Problem 3: Imagery
Because the current state of technology has limitations when it comes to extracting specific information from images, few companies are making use of imagery as an input.
Those that are dabbling in image recognition technologies are using a simplistic level of categorisation. With studio-quality images, search can identify items at a high level: a dress, a wardrobe, a landmark. But extracting more specific information, for example around style or function, or understanding a customer-generated photo with its unpredictable composition and lighting, is far more complex. This is especially true if the image is blurred, shot from an unusual angle or badly lit.
But it is possible, with the right approach to building improved inventory datasets, something most legacy systems currently lack.
Retailers that can match and recommend products from a phone-generated image will be able to offer a search experience that helps customers identify and purchase closer to the moment when inspiration has captured their ‘buy now’ impulse.
When it comes to overcoming these three challenges, visual search has an important role to play. Computer vision, the technology behind visual search, is the capacity for machines to translate visual assets (pictures and video) into descriptive data that can then be processed by other systems.
On a practical level, this technology can help customers get inspiration and find the products they need using images instead of text as an input.
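As an illustrative sketch only (not any particular vendor’s system), the core of an image-as-query search can be reduced to three steps: embed each catalogue image as a numeric vector using a computer vision model, embed the customer’s photo the same way, then return the catalogue items whose vectors are closest. The catalogue entries and embedding values below are invented placeholders standing in for model output.

```python
import numpy as np

# Hypothetical catalogue: product name -> embedding vector.
# In a real system these vectors would come from a trained
# computer-vision model; here they are hand-picked toy values.
catalogue = {
    "red summer dress":   np.array([0.9, 0.1, 0.0]),
    "oak wardrobe":       np.array([0.1, 0.8, 0.3]),
    "leather ankle boot": np.array([0.0, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def visual_search(query_vec, top_k=2):
    """Rank catalogue items by similarity to the query embedding."""
    scored = [(cosine(query_vec, vec), name) for name, vec in catalogue.items()]
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_k]]

# A customer photo whose (hypothetical) embedding is close to the dress.
query = np.array([0.85, 0.15, 0.05])
print(visual_search(query))  # the dress ranks first
```

The hard part in practice is the embedding model itself; the matching step shown here is comparatively simple once good vectors exist.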
Retailers are already beginning to invest in computer vision technology: 45 per cent of retailers plan to use AI-driven solutions to enhance the customer experience within the next three years, according to BRP Consulting’s 2017 Customer Experience/Unified Commerce Survey.
Leading the way are Pinterest’s Lens and ASOS, though there are a number of startups hot on their heels, particularly in retail fashion, drawing on the potential power of visual recognition.
Why? On one level, it’s to help fashion designers understand current trends and develop new designs. On another, it’s about leveraging the inspiration element of the journey and satisfying customers who were previously frustrated with the lack of route to purchase.
Innovation in visual search is far from restricted to fashion. Last year, Wayfair launched its ‘search with photo’ feature, eBay released its Image Search and Find it On eBay experiences, and Made.com is now using a visual discovery engine from Hullabalook to power a new type of search for homeware products.
The fashion and furniture sectors are natural leaders in this field as imagery is more important to them when convincing customers to purchase a product. The ‘sell’ is focused more on how the item looks rather than how it functions.
Innovators are even making waves in the beauty sector: L’Oreal and digital incubator Founders Factory recently selected an analytics platform and a visual search engine to form part of their beauty technology accelerator programme.
Where are we now?
In this first phase, visual search works by recognising the image it is shown and matching it to a selection of appropriate results. Even where no information about the brand, cost or model is available, a customer can screen-grab an image on their smartphone, and in that scenario visual search works very well.
But there are restrictions in terms of the technical capabilities, especially in relation to customer-generated photos.
The main obstacle is that there are a number of variables that make it difficult to extract or identify an item and/or compare it to a studio-shot product, such as the angle of the shot and partial/blurred visibility of the item.
And so understanding specific attributes of a product that go beyond the higher level of categorisation becomes difficult to achieve.
With visual search poised to play an important role in enhancing the customer experience of the future, proving effective during the inspiration stage of the journey, the question retailers need to ask themselves is no longer ‘should we invest?’ but rather ‘can we facilitate this internally, or do we need to look outside the business to third parties?’
Visualising the road to innovation
If eMarketer’s projections come to fruition, then retailers leveraging visual search will be in with a chance of taking a large share of the mobile retail honeypot.
But where to start? Retailers looking to dip their toes into the visual search pool must first get themselves image ready.
It’s important to remember that AI will not understand the ‘concept’ of an item; rather, it will estimate the ‘probability’ that this is the desired item. But first, to have a working computer vision model, you need to train it.
For the AI and computer vision to understand an item, you must show it hundreds of different images of the product and assign labels to help it distinguish them. This way, the technology will recognise an image when it sees it.
To do this, you need large datasets, labelled input and careful classification for the process to work effectively.
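The distinction between ‘concept’ and ‘probability’ can be made concrete. A trained classifier does not know what a dress is; for a given image it emits a raw score per label, a softmax turns those scores into probabilities, and the highest probability is taken as the answer. The label set and raw scores below are invented for illustration.

```python
import math

# Hypothetical high-level labels a retail model might be trained on.
labels = ["dress", "wardrobe", "landmark"]

def softmax(scores):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Invented raw scores a model might emit for a photo of a dress.
raw_scores = [2.5, 0.3, -1.0]
probs = softmax(raw_scores)

# The model's answer is simply the label with the highest probability.
best_prob, best_label = max(zip(probs, labels))
print(f"{best_label}: {best_prob:.2f}")
```

Note that even the winning label comes with a probability below 1.0: the model is never certain, only more or less confident, which is exactly why the quality of the labelled training data matters so much.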
A change in approach to current product datasets, and improved training across the entire usage of data, will also be important to success. After all, the technologies deployed to enable smarter search will only be as good as the quality of the data they use.
The industry is at the start of building new and improved datasets that will help with this challenge. For example, in fashion, companies like Zalando have started creating image databases like Fashion-MNIST.
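Fashion-MNIST is a useful example of how coarse these labels currently are: it pairs 70,000 small greyscale clothing images with just ten high-level categories. The index-to-name mapping is fixed by the dataset, and a sketch of working with it might look like this (the helper function is ours, not part of the dataset):

```python
# The ten class labels defined by Zalando's Fashion-MNIST dataset,
# indexed 0-9 exactly as they appear in the label files.
FASHION_MNIST_CLASSES = [
    "T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
    "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot",
]

def label_name(index: int) -> str:
    """Map a numeric label from the dataset to its class name."""
    return FASHION_MNIST_CLASSES[index]

print(label_name(3))  # Dress
```

A model trained on labels like these can tell a dress from a boot, but nothing in the label set captures style, fabric or function, which is precisely the gap richer retail datasets would need to fill.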
The question for retailers is who should take responsibility for these enhanced datasets: the product maker, the retailer, or the startup? Talent in this field is currently rare, and the upfront investment needed to achieve an acceptable result is significant. But generating and owning improved datasets is doable in-house, and there is arguable value for retailers in owning that long-term knowledge and insight.
When it comes to the possibilities of visual search technology, the quality of your image assets could also prove to be a deal breaker in the future retail scene.
At the moment, those retailers who have technology that can match and recognise front-facing product shots taken against brilliant white backgrounds with perfect lighting have the upper hand.
But retailers who have invested in technology built to match, recognise and understand complex image assets will hold the real advantage.
A glance into the future
Where the possibilities of visual search become even more exciting is in the second phase: imagine the technology not only understanding your request, but also perceiving characteristics about style, usage, and preferences beyond the definitive description of the product.
When visual search functionality achieves this level of complex understanding, it will not only display products that match the description but also showcase options to inspire you, much like a human shop assistant. This is a glimpse into the future of visual search, but it’s not too far away, as demonstrated by Wayfair and L’Oreal.
But to achieve a more accurate and refined level of visual search, the technology of the future will not only need to understand what a customer does or doesn’t like, it will also have to understand why.
Retailers with extensive catalogues will need to bring in expertise that can work with them to implement technologies that turn what seems impossible into the possible. Those that find a path to homing in on personalised choices, capitalising on inspiration and tantalising with imagery will lead the charge and take the lion’s share of the increased revenues that mobile has to offer.
Tagged as: Future Platforms | visual search | ASOS | Wayfair