At its annual “Search On” event Wednesday, the Mountain View, California-based tech company unleashed several updates for product search and discovery in its mobile app that cut to the heart of the online apparel and shoe business. The company introduced pilot programs that bring 3D product imagery for sneakers into search results, as well as a new “Shop the Look” feature for apparel, personalized results and other updates to help U.S. consumers feel more confident about buying.
Altogether, the company unveiled nine new tests and tools with the stated goal of making shopping more visual and immersive, while helping consumers stay better informed. This is the search engine doing what it does best — slicing and dicing its huge tranche of data in new ways and flexing its considerable machine learning muscle to deploy new features. In other words, this is “peak Google,” and it’s pointing straight at fashion.
One signal of Google’s high priority on shopping in general and fashion in particular is location: The features were designed for the main search engine, not sidelined in the Google Shopping section. That includes a noteworthy new pilot program to populate results with 3D product imagery for sneakers.
Think of it as a follow-up to the platform’s 3D home goods, introduced earlier this year. According to Google, people interacted with those 3D images nearly 50 percent more than still photos, so it’s eager to expand virtual imagery to other areas, starting with the feet.
People will be able to check out more lifelike sneakers, zoom in on details and rotate them to view every angle before buying. The test is limited right now to Vans and a handful of other brands, but later this year, all sneaker companies will be able to add their own assets.
Of course, not all brands or merchants have 3D images or the resources to create them. To remove that barrier, Google developed a new tool that uses machine learning to render “spinnable” images by stitching together standard 2D merchandise pics. Another limited pilot program is getting underway to test this tool through the Google Manufacturer Center, as detailed on a help page in the support section.
But that’s just part of the new experience, according to Google, and it all starts with a simple change in user behavior: In the U.S., consumers begin merely by typing the word “shop” alongside the product name or keywords.
From there, they’ll see results populate with “a visual feed of products, research tools and nearby inventory related to that product,” Lilian Rincon, Google’s senior director of product, shopping, wrote in a blog post. “We’re also expanding the shoppable search experience to all categories and more regions on mobile (and coming soon to desktop).”
As Rincon elaborated in an interview with WWD, the results include a shoppable display featuring the products, lifestyle images, guides and more from a broad array of retailers and brands.
“One of the new tools, which we’re calling Shop the Look, helps people assemble the perfect outfit,” she said, pointing to an example of a bomber jacket search. The results would show photos of different styles for the item itself, as well as complementary pieces and where to buy them directly within search. It’s akin to Google’s version of styling services, except it’s not necessarily based on runways or the talents of human tastemakers. It’s informed by data.
Shop the Look and other features rely on machine learning, specifically Google’s Shopping Graph. The artificial intelligence-powered model ingests data from across the internet and from merchant partners, and over the past year alone its catalogue has ballooned from 24 billion to more than 35 billion product listings, Rincon said.
The listings, plus what and how consumers search, power Shop the Look, as well as a new trending products feature that will launch in the U.S. this fall.
Shop the Look and Trending Now join several other new features: Page Insights, which surfaces more information while visiting a webpage based on the featured products, such as pros and cons or star ratings; a buying guide that drills down into different considerations when evaluating a product; opt-ins for deals or price-drop alerts; and, for consumers who shop with Google, personalized shopping results based on their preferences and shopping habits. They can tweak details like favorite brands and department stores, or shut off the personalization entirely if they don’t want the feature.
Another update brings dynamic whole page shopping filters that change according to trends.
“For example, when shopping for jeans, I may see filters for wide leg and boot cut, because those are the denim sales that are popular right now,” Rincon explained. “And if jeggings ever come back in style, this might be suggested as a filter in the future.”
A new Discover feature in the Google app will also begin suggesting styles based on what the individual and other consumers have searched and shopped. “If you’re into vintage styles, you’ll see a suggested query of popular vintage looks,” she added. “And then you can tap whatever catches your eye and use Lens to see where to buy it.”
The changes look intriguing, but ultimately they won’t amount to much if no one uses them. That’s why their top-line visibility and access on Google’s main search page is important — though it also raises the question of whether there’s any point in a dedicated Google Shopping page. Whatever its fate, it’s obvious that shopping isn’t just a sideshow for the search business. It’s the main attraction, and likely a strategic move to capitalize on recent market trends.
Search dominance is an existential matter for Google, the source of its core revenue, yet it has watched as more consumers start their hunt for products on Amazon. According to e-commerce software developer Jungle Scout, 61 percent of online consumers began their product searches on the e-tail behemoth’s site in the second quarter.
Though impressive on its face, the data actually illustrates a downward slide from the 74 percent noted in the first quarter of 2021. While Amazon saw attrition, the broader search engine category held steady at 49 percent. This could look like an opening for Google to gain ground. By making shopping more visual, it’s building on investments in e-commerce — an area that has been paying off for chief executive officer Sundar Pichai, as he told analysts during parent company Alphabet’s second-quarter earnings call.
“People are shopping across Google more than one billion times each day,” Pichai said. “We see hundreds of millions of shopping searches on Google Images each month.”
So far, the company has seen success with visual shopping in places like Japan and India, according to Rincon. Given the higher engagement over 2D visuals, adding virtual imagery makes sense as a way to accelerate that traction even further. It also lines up with other initiatives across the organization, which cover ads and shopping from YouTube to Google search, embedding experiences like augmented reality makeup try-ons and virtual furniture into mainstream shopping habits. Now it’s keen to do the same with 3D sneakers — and it won’t let obstacles, like a lack of visual assets, get in the way.
“Our new ML model takes just a handful of photos and creates a compelling 3D representation of an object, in this case, the shoe. This new model builds on the neural radiance field, or NeRF, which is a seminal paper that we collaborated [on] with UC Berkeley and UC San Diego,” Rincon said. NeRF is a neural network technique that, in essence, fills in the visual gaps between 2D photos to create 3D imagery.
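For the technically curious, the core NeRF idea can be sketched in a few lines. This toy below is purely illustrative and is not Google’s implementation: it uses a tiny, randomly initialized network (a real NeRF trains its weights against the 2D photos), but it shows the two moving parts — a neural field mapping a 3D point to color and density, and volume rendering that composites samples along a camera ray into a pixel.

```python
import numpy as np

rng = np.random.default_rng(0)

def positional_encoding(x, n_freqs=4):
    """Map 3D coordinates to sin/cos features, as in the NeRF paper."""
    feats = [x]
    for i in range(n_freqs):
        feats.append(np.sin(2.0**i * np.pi * x))
        feats.append(np.cos(2.0**i * np.pi * x))
    return np.concatenate(feats, axis=-1)

# A tiny two-layer MLP standing in for the trained network.
# NOTE: random weights — a real NeRF fits these to a set of 2D photos.
D_IN = 3 * (1 + 2 * 4)            # 3 coords, each with 4 sin/cos pairs
W1 = rng.normal(scale=0.1, size=(D_IN, 32))
W2 = rng.normal(scale=0.1, size=(32, 4))   # outputs (r, g, b, density)

def field(points):
    """The neural field: 3D points -> per-point color and density."""
    h = np.maximum(positional_encoding(points) @ W1, 0.0)  # ReLU layer
    out = h @ W2
    rgb = 1.0 / (1.0 + np.exp(-out[:, :3]))  # sigmoid: colors in [0, 1]
    sigma = np.log1p(np.exp(out[:, 3]))      # softplus: density >= 0
    return rgb, sigma

def render_ray(origin, direction, n_samples=64, near=0.0, far=2.0):
    """Alpha-composite field samples along one camera ray into a pixel."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction
    rgb, sigma = field(points)
    delta = (far - near) / n_samples
    alpha = 1.0 - np.exp(-sigma * delta)     # per-sample opacity
    # Transmittance: light surviving all samples closer to the camera.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)

pixel = render_ray(np.array([0.0, 0.0, -1.0]), np.array([0.0, 0.0, 1.0]))
print(pixel)  # one RGB pixel rendered from the (untrained) field
```

Rendering one such ray per pixel, from many camera angles, and nudging the network’s weights until the rendered pixels match the input photos is what lets the model “fill in the gaps” and produce a spinnable 3D view from a handful of flat images.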
Rincon believes the tech is a game-changer for smaller brands and merchants, and she’s not alone. Forma developed similar tech, which fueled partnerships with players from Bold Metrics to Snapchat, and even Apple dove in with Object Capture, a developer tool announced in 2021 that uses photogrammetry to make easier work of casting 2D images as 3D objects. Amazon, too, supports virtual showrooms and shopping environments, thanks to its partnership with Adobe, alongside AR features for virtualized products in its marketplace.
Although Google-made 3D images can’t be exported or used outside of the platform, at least for now, the effort could go far in cementing virtual shopping as a foundational consumer behavior. Right now, the approach applies to real-world goods, but there are implications beyond the physical world, too. To be clear, this initiative is not exactly a metaverse strategy. But it seems related, perhaps as something adjacent — at least in potential, if not in reality.
“The things that are sister to these types of experiences aren’t just [about] visualizing 3D assets by themselves, but also pivoting to AR, right?” Rincon said. “So something that you could imagine is looking at the street and, you know, a 3D shoe, and then having some way to try it on yourself and see, with your camera, what it looks like on your feet.”
Virtual shoe try-ons have been available for years, but not in the same place where people search for everything else, where they can shape search results and combine with other data to inspire complementary picks and outfits pieced together from across the web.
In whatever reality, physical or virtual, Google’s ambitions apparently hinge on fashion, so much so that it’s even dipping into styling territory now. But as Stitch Fix, Amazon and other tech platforms that employ human stylists can attest, it takes more than just data to style people. The science behind it has been evolving by leaps and bounds, but there’s also an art to it — at least in the right hands — and it’s not at all clear whether Google has the chops for that. Soon, shoppers will be able to judge for themselves.