Technology

Google’s New AI Tool Is About to Make Online Shopping Even Easier

The generative AI tool lets you “try on” clothes from hundreds of brands by displaying them on models of a wide array of sizes and skin colors.

Since Google I/O in May, the company has heavily promoted its generative text and image AI tools to help people do everything from drafting essays to creating art. However, its core business is selling ads and products. Today the company unveiled a new shopping tool that may help it do exactly that.

Now, customers in the United States can virtually “try on” women’s tops. The company uses images of real models, ranging in size from XXS to 3XL, to display AI-generated versions of clothes from hundreds of brands sold across Google, like Anthropologie, Everlane, and H&M. You can scroll through and select different body types or skin tones to see how clothes might drape on a body like your own. When you find the model who most closely resembles you, you can save them as your default.

Every woman of a certain age longed for Cher’s closet from 1995’s Clueless: a tool that would drape clothing on her own body and show exactly what it would look like when she put it on. Although Google’s new tool is available only for women’s tops, this limited application might be the best use case for AI yet.

Cut and Paste

Online shopping is a nightmare. The fashion industry has shifted since the heroin-chic models of the ’90s, but it’s still uncommon to see models who look like you. A long torso makes dresses hang three inches shorter than advertised; finding jeans that fit legs, thighs, and butt feels like an impossible feat. Not only is it impossible to judge the quality of the fabric, but the models could be posed poorly, or the clothes could be pinned or fastened to look completely different than they do in person.

Google’s AI tool skirts those problems by showing you how the real clothes will drape and mold around a real human form. The company trained the tool using images of real models and its Shopping Graph. During the photo shoot, the company recorded each model’s size according to sizing charts across several brands.

It matched photos of the models wearing shirts in two different poses and generated images of that shirt from other angles. Then it took images of the clothing from the merchant and fused them with images of Google’s models via generative diffusion models to produce multiple, diverse images of the clothing. The result? A wide array of remarkably real-looking images of the clothes you want to buy.
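Google hasn’t published the code behind the feature, but the description above maps onto a garment-conditioned diffusion model: encode the merchant’s product photo into features, then let the denoiser attend to those features while it reconstructs an image of the person. The sketch below is a toy, untrained PyTorch illustration of that structure only; the class names, the tiny architecture, and the crude noise schedule are placeholders for demonstration, not Google’s actual system.

```python
import torch
import torch.nn as nn


class GarmentEncoder(nn.Module):
    """Encodes the merchant's flat garment photo into a small set of feature tokens."""
    def __init__(self, channels=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, channels, 4, stride=4),        # 64x64 -> 16x16
            nn.ReLU(),
            nn.Conv2d(channels, channels, 4, stride=4),  # 16x16 -> 4x4
            nn.ReLU(),
        )

    def forward(self, garment):                      # (B, 3, 64, 64)
        feats = self.conv(garment)                   # (B, C, 4, 4)
        return feats.flatten(2).transpose(1, 2)      # (B, 16, C) garment tokens


class TryOnDenoiser(nn.Module):
    """One denoising step: predicts the noise in the person image while
    cross-attending to the garment tokens, so clothing detail guides the output."""
    def __init__(self, channels=64):
        super().__init__()
        self.person_in = nn.Conv2d(3, channels, 3, padding=1)
        self.attn = nn.MultiheadAttention(channels, num_heads=4, batch_first=True)
        self.out = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, noisy_person, garment_tokens):
        h = self.person_in(noisy_person)                         # (B, C, H, W)
        b, c, height, width = h.shape
        q = h.flatten(2).transpose(1, 2)                         # person pixels as queries
        fused, _ = self.attn(q, garment_tokens, garment_tokens)  # attend to garment
        h = (q + fused).transpose(1, 2).reshape(b, c, height, width)
        return self.out(h)                                       # predicted noise


@torch.no_grad()
def try_on(person, garment, steps=50):
    """Toy reverse-diffusion loop: start from noise and repeatedly subtract the
    predicted noise, conditioning every step on the garment. Weights are random
    here, so the output is meaningless; the point is how the pieces connect."""
    encoder, denoiser = GarmentEncoder(), TryOnDenoiser()
    tokens = encoder(garment)
    x = torch.randn_like(person)                  # pure noise at the final timestep
    for t in reversed(range(steps)):
        alpha = (t + 1) / steps                   # crude stand-in for a noise schedule
        x = x - alpha * denoiser(x, tokens)       # move toward a clean image
    return x


person = torch.randn(1, 3, 64, 64)    # stand-in for a photo of one of the models
garment = torch.randn(1, 3, 64, 64)   # stand-in for the merchant's product shot
print(try_on(person, garment).shape)  # torch.Size([1, 3, 64, 64])
```

In a real system the denoiser would be a full UNet trained on paired photos of models and garments; the toy loop above just shows where the garment conditioning plugs into the denoising process.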

Now, when searching for shirts (maybe you’re in need of a new going-out top?), you’ll see a “Try On” badge next to applicable clothing items. Clicking that opens up a list of models to scroll through. All 40 female models are included for every shirt, so you’ll see multiple models for each size. That’s especially helpful, since two people can wear the same size but be shaped differently, causing clothes to look very different on each of them.

Obstacle Course

The new shopping experience also includes guided refinements. If you’re looking at a shirt on the model you’ve selected but want a more affordable version, or like the shape but want it in a different color or pattern, you can select a few options from dropdown menus and Google will surface similar items.

Of course, you probably shop for clothes directly on a brand’s website, and this feature works only within Google Shopping. If you find an item you like elsewhere, you’ll have to do a Google search for it to see whether the try-on option is available.

Source: Wired
