Google is introducing two new features for its online shopping platform, intended to help consumers search for apparel more thoroughly and better visualize how items will look on different body types.
Users of Google Shopping in the US will now have access to a virtual try-on experience that depicts how a piece of apparel would look on a variety of real human models. These models, which range in size from XXS to 4XL, have different skin tones, ethnicities, hair types, and body types, helping customers visualize how an item of clothing would look on a body type similar to their own.
The virtual try-on experience will initially feature women’s shirts from a variety of retailers, including H&M, Anthropologie, Everlane, and Loft. Men’s tops and “other apparel” will become available later this year, according to Google. The feature is intended to help consumers avoid disappointment by letting them see how a piece of apparel will actually look before they buy it. According to the company’s own shopping data, 59 percent of online shoppers are dissatisfied with a clothing purchase because they expected the item to look different on their bodies, and 42 percent feel that online fashion models don’t accurately represent them.
The new Google Shopping virtual try-on feature uses a diffusion-based generative AI model, which is trained by adding Gaussian noise (essentially, random pixels) to an image, then learning how to remove that noise to generate realistic images. Regardless of the angle or posture the models are in, this process enables Google’s AI model to realistically represent how a piece of clothing might wrinkle, drape, fold, cling, and stretch on them. To be clear, the models for Google Shopping are not created by artificial intelligence; rather, AI is only used to fit the apparel to photographs of these real people.
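The training idea described above can be sketched in a few lines: corrupt an image with Gaussian noise at a random timestep, then train a model to predict the noise that was added. This is a minimal toy illustration, not Google's actual model; the array shapes, step counts, and the trivial "predictor" are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(image, t, num_steps=100):
    """Forward diffusion step: blend the image with Gaussian noise.

    At t=0 the image is untouched; near t=num_steps it is almost pure noise.
    Returns the noisy image and the noise that was mixed in.
    """
    alpha = 1.0 - t / num_steps  # fraction of the original signal kept
    noise = rng.standard_normal(image.shape)
    noisy = np.sqrt(alpha) * image + np.sqrt(1.0 - alpha) * noise
    return noisy, noise

# A stand-in "image": a flattened 8x8 grayscale patch (illustrative only).
image = rng.standard_normal(64)

# During training, the model sees noisy versions at random timesteps and
# learns to predict the noise that was added, so it can later reverse the
# process step by step and produce a clean, realistic image.
t = int(rng.integers(1, 100))
noisy, true_noise = add_noise(image, t)

# A real model would be a large neural network; this zero "prediction"
# just illustrates the training objective: minimize the gap to the noise.
predicted_noise = np.zeros_like(true_noise)
loss = float(np.mean((predicted_noise - true_noise) ** 2))
print(f"timestep={t}, denoising loss={loss:.3f}")
```

Once trained, the same noise-removal model can be run in reverse from pure noise, which is how the feature can render clothing realistically at many angles and poses.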
Today, Google Shopping is also getting new filters intended to help consumers find exactly what they’re looking for, such as a comparable but less expensive shirt or a jacket in a different pattern. Using machine learning and visual matching algorithms, users can narrow results by attributes like colour, style, and pattern across numerous online clothing stores to find the item that most closely matches their needs. The feature is currently only available within Google Shopping product listings and is similarly restricted to tops; Google has not stated a timeframe for when it will be extended to other categories of clothing.
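At its core, this kind of attribute filtering boils down to keeping only the listings whose fields match the shopper's chosen criteria, then ranking what remains (for example, by price). The sketch below is a hypothetical illustration; the product records, field names, and prices are invented and do not reflect Google's actual schema or ranking.

```python
# Hypothetical product listings; names, attributes, and prices are
# illustrative assumptions, not real Google Shopping data.
products = [
    {"name": "Linen Shirt",  "colour": "blue", "style": "casual", "pattern": "solid", "price": 45},
    {"name": "Silk Blouse",  "colour": "blue", "style": "formal", "pattern": "solid", "price": 120},
    {"name": "Denim Jacket", "colour": "blue", "style": "casual", "pattern": "solid", "price": 80},
]

def filter_products(items, **criteria):
    """Keep only items whose attributes match every supplied criterion."""
    return [p for p in items if all(p.get(k) == v for k, v in criteria.items())]

# A shopper looking for a cheaper blue casual alternative:
matches = filter_products(products, colour="blue", style="casual")
cheapest = min(matches, key=lambda p: p["price"])
print(cheapest["name"])
```

The real feature layers visual matching on top of this, so items are compared by how they look, not just by their labeled attributes.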
In a similar vein, Levi’s revealed in March that it was using AI to expand the modelling options for online shopping. Unlike Google, the denim firm said it would explore using AI-generated models rather than photographs of real people, initially stating the move would help “diversify” its shopping experience. After receiving criticism for the statement, Levi’s walked back those remarks, but the company maintained that employing AI-generated models would enable it to “publish more images of our products on a range of body types more quickly.”