Google is expanding its virtual try-on so you can use a single selfie to see how clothes would look on you. Rather than produce a full-body photo, the feature instead creates a lifelike, full-length version of you based on just a shot of your face; it then dresses your virtual body in garments to try out fit and style before making a purchase.
Powered by Nano Banana — Google’s Gemini 2.5 Flash Image model — it generates several renderings and lets you choose a favorite image as your default try-on photo.
You can still upload a full-body image or pick from a variety of preset models.
The rollout begins in the United States on Search, Google Shopping, and Google Images, in a web browser or the Google app on Android, wherever you see the “try it on” icon in an apparel product listing.
How the selfie try-on works to visualize clothing fit
At a high level, Google reads body geometry from a selfie using pose estimation, segmentation, and learned priors on human shape. The system then simulates the fit, texture, and lighting of the garments, producing a photorealistic full-body composite that looks as if you were wearing the selected garment. It’s essentially virtual draping plus photorealistic synthesis, tuned for speed so shoppers can browse and try on in the same flow.
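Google hasn’t published the pipeline’s internals, but the flow described above can be sketched as a chain of stages. In this minimal sketch, every function name and data shape is a hypothetical stand-in, not Google’s actual implementation:

```python
# Illustrative stand-ins only — real systems use trained ML models
# for each stage; here each stage is stubbed to show the data flow.

def estimate_pose(selfie: str) -> dict:
    # Locate face/body keypoints in the input image (stubbed).
    return {"keypoints": [(0.5, 0.3)], "source": selfie}

def infer_body_shape(pose: dict) -> dict:
    # Learned priors map face and pose cues to a plausible full-body shape.
    return {"height_norm": 1.0, "keypoints": pose["keypoints"]}

def drape_garment(body: dict, garment: str) -> dict:
    # Simulate the garment's fit, texture, and lighting on the body.
    return {"body": body, "garment": garment, "fit": "regular"}

def render_composite(draped: dict) -> str:
    # Photorealistic synthesis of the final full-body try-on image.
    return f"render({draped['garment']} on inferred body)"

def try_on(selfie: str, garment: str) -> str:
    """End-to-end flow: selfie -> pose -> body shape -> draping -> render."""
    pose = estimate_pose(selfie)
    body = infer_body_shape(pose)
    draped = drape_garment(body, garment)
    return render_composite(draped)

result = try_on("selfie.jpg", "denim jacket")
```

The point of the structure is that each stage only consumes the previous stage’s output, which is what lets the system cache a chosen rendering and reuse it for later try-ons.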
Shoppers select their typical size, kick off the try-on, and view a few different poses or angles. Tap one image to make it the default, and the system can reuse it for future try-ons so you don’t have to start over. Replacing the full-body upload with a single selfie is a meaningful usability win, especially on mobile, where most apparel discovery now begins.
Why it matters to shoppers and brands in online fashion
Returns continue to be a costly drag on online fashion. The National Retail Federation recently estimated that U.S. retailers experienced a 14.5% return rate — hundreds of billions of dollars in merchandise processed back through the system. According to Narvar’s consumer research, size and fit consistently top the list of reasons for returns and frequently account for about half of apparel send-backs. Anything that increases confidence around fit can cut expensive reverse logistics.
Virtual try-on also tends to lift conversion and reduce hesitation for style-driven items where product photos alone fail to answer “Will this work on me?” Competitors have seen this too: Walmart’s Zeekit-based tool and AR try-ons in Chinese social apps have both helped shoppers buy with more conviction. Google’s strength is distribution — plugging try-on directly into Search and Shopping, where purchase intent is already high, and doing it with a fast, selfie-first flow.
Privacy and accuracy caveats for selfie-based try-ons
Selfies are sensitive data. While Google presents some alternatives — including full-body uploads or selecting from predetermined models — the company will have to explain exactly how images are processed, stored, and deleted, and how the feature treats teen accounts. U.S. privacy regulations like California’s CPRA and Illinois’ BIPA take biometric and face data seriously, so transparency and control are going to be as important as realism.
No try-on system is perfect. Shiny fabrics, busy patterns, unusual silhouettes, and poor lighting can throw off garment draping or color accuracy. Accessories and hair can also cause occlusions. Best results require a well-lit selfie — and it’s still worth checking the size against brand-specific charts or reviews, since virtual try-on is a confidence tool, not a tailor’s tape measure.
Part of a wider commerce push across Google surfaces
The selfie upgrade is part of Google’s broader push to make shopping more visual across its ecosystem. The try-on ties back to apparel in the Shopping Graph across Search, Google Shopping, and Image results, minimizing hops between discovery and decision. Separately, Google’s Doppl app also centers on outfit visualization but now features a shoppable discovery feed with AI-generated videos and direct links to merchants — an experiment in turning TikTok-style inspiration into transactions.
The competitive environment is also heating up. Amazon has introduced virtual try-on for some categories along with fit-guidance features, Snap continues to build out AR Shopping, and retailers are using tech from players like Zeekit and Forma to offer likeness-based dressing rooms on their own sites. Google’s move could make virtual try-on a default expectation across shopping destinations, not a novelty reserved for individual storefronts.
What to watch next as Google scales virtual try-on
Look for rapid iteration: more garment categories, better fabric physics, multiple-item layering, faster rendering, and tighter connections to sizing recommendations. An important question is whether processing can move further on-device over time for privacy and latency. Geographical expansion is also possible as Google fine-tunes accuracy and adds more retailers.
For now, starting from a selfie rather than a full-body photo lowers the barrier to trying before buying. If the experience proves accurate at scale, it could reduce return rates and lift conversion — and, most importantly, give shoppers a better idea of how clothes will actually look on them, the person wearing them, not just a studio model.