Google Launches Updated AI Tools for a Personalized Shopping Experience

Google’s newly updated shopping tool uses AI and machine learning to offer users the ability to customize online search results based on their unique wants and needs.

The search engine’s Shopping Graph application, launched several years ago, now features new capabilities that help online shoppers sift through more than 45 billion products to pinpoint items that meet their exact specifications.

The tech juggernaut has started rolling out a feature that gives shoppers access to more personalized search results. Beginning with U.S. mobile and app users, the feature allows shoppers to search for an item—like “straw tote bags,” for example—and access a section labeled “style recommendations.” Users can rate each option with a thumbs up or thumbs down, or swipe left or right to indicate their interest, instantly updating their personalized results.

The recommendations keep coming until the user finds what they’re looking for, all the while amassing insights from the shopper’s inputs. “We’ll remember your preferences for next time, too,” Google said in a statement. “So when you’re looking for, say, men’s polo shirts again, you’ll see personalized style recommendations based on what you liked in the past and products you interacted with.”

Shoppers can also specify their favorite apparel, footwear and accessories brands during the discovery process. When searching for an item, they need only click the three dots above a result and toggle to “About this result” to manage brand preferences, indicating to Google that they would like to see more from that brand.

The company has also developed AI-powered image generation software that lets users search for products by generating visual representations of their text queries. According to Google, about 20 percent of apparel shopping queries contain five or more words, indicating that shoppers have a very specific idea of what they’re looking for when they begin their online search. However, the descriptions they use might vary. “For instance, someone might call something boxy, while another might call it oversized,” the company said.

To address this, the technology allows users to enter a search query, like “colorful quilted spring jacket,” and clicking “generate image” will prompt Google to serve up a number of photorealistic results that could match their vision. When a shopper finds one that closely resembles their desired product, they can click it to see shoppable options from the web. U.S. Google users must opt into Search Generative Experience (SGE) through Search Labs in order to access the feature.

After a user finds a style they like, they can use Google’s virtual try-on tool, launched last year, to get an idea of how it might look. Desktop, mobile and app users can find a product in their shopping search results and click the “try-on” icon for men’s and women’s tops. “You’ll see what that top looks like on a diverse set of real models ranging in size from XXS-4XL—including how it would drape, fold or form wrinkles and shadows on the model,” Google said. According to the company, virtual try-on also drives traffic directly to retailers’ sites for purchase.

Google has been doubling down on its efforts to streamline the shopping process in recent months. In January, it announced a strategic partnership with Victoria’s Secret that will feature an AI-powered conversational chatbot.

The bot will provide Victoria’s Secret shoppers with style recommendations tailored to their lifestyles and preferences, while improving the brand’s knowledge of the customer. Ultimately, the insights gleaned from bot interactions could be used to inform product development, marketing and the overall shopping experience.