Google is turning screenshots and social feeds into instant shopping assistants. The latest update to Circle to Search doesn’t just recognize a single item anymore; it can parse an entire look, break it into components, and let you virtually try on clothing from the search overlay itself. On flagship Android phones, this turns a quick circle into a full outfit breakdown and a fitting-room preview in seconds.
Multi-Item Recognition Comes to Your Screen
Until now, visual search often meant hunting for one piece at a time. The new release changes that rhythm. See a creator’s fit on your feed? Circle the whole ensemble. Circle to Search will identify the jacket, shirt, pants, shoes, and accessories simultaneously, then surface similar options and price tiers for each category—no back-and-forth queries required.
This scene-level understanding also stretches beyond fashion. If you grab a still of a living room, the feature can tag the sofa, coffee table, rug, and lighting in one pass, effectively reverse-engineering the vibe so you can shop by aesthetic instead of guesswork. It’s visual merchandising, but inverted and personalized to whatever you’re looking at.
Virtual Dressing Room Inside Search on Android
The marquee addition is a “Try it on” control placed directly in the Circle to Search overlay. After circling an outfit and tapping “Find the look,” you can jump into a virtual dressing room to see how pieces drape on different body types before you tap buy. The experience borrows from Google’s prior work in virtual try-on for shopping, which used a diverse set of models to reflect real-world sizes, shapes, and skin tones.
This matters because size charts and flat product shots leave big gaps. Retailers and platforms have long chased the holy grail of "confidence to cart." Shopify has reported that listings enhanced with AR or 3D can lift conversions by up to 94%, and virtual try-on consistently reduces uncertainty around fit and style. Embedding that step directly in Circle to Search shortens the leap from inspiration to informed purchase.
Practical example: You spot a streetwear look with layered outerwear and statement sneakers. Circle the photo, skim suggested matches for each layer, then preview the jacket silhouette on a body type close to yours. Instead of ten tabs and a guess, you get a curated shortlist and a realistic fit preview in one flow.
Powered by Gemini for Scene-Level Understanding
Under the hood, Google credits Gemini 3’s agentic planning and multimodal reasoning for the step-change. Rather than a single object match, the model performs a sequence: it segments the image, identifies and crops each item, runs parallel searches, and reconciles results into a coherent panel of options. That orchestration is what enables deconstructing an outfit or mapping a room without manual, item-by-item prompts.
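One way to picture that orchestration is a short sketch. Everything below is an illustrative assumption, not Google's actual API: the function names, the fixed detections, and the toy catalog are placeholders standing in for the model's real segmentation and retrieval stages.

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass
class Item:
    label: str
    bbox: tuple  # (x, y, w, h) crop region within the screenshot


def segment_scene(image):
    # Placeholder: a real system would run multimodal segmentation here.
    # We return fixed detections purely for illustration.
    return [Item("jacket", (10, 20, 200, 300)),
            Item("sneakers", (50, 400, 150, 120))]


def search_for(item):
    # Placeholder product lookup keyed by the detected label;
    # a toy catalog stands in for a live shopping index.
    catalog = {
        "jacket": ["wool bomber", "denim trucker"],
        "sneakers": ["retro runner", "court classic"],
    }
    return {"item": item.label, "matches": catalog.get(item.label, [])}


def find_the_look(image):
    items = segment_scene(image)            # 1. segment and identify items
    with ThreadPoolExecutor() as pool:      # 2. run the per-item searches in parallel
        results = list(pool.map(search_for, items))
    # 3. reconcile results into one panel, keyed by item
    return {r["item"]: r["matches"] for r in results}


panel = find_the_look("screenshot.png")
```

The point of the parallel step is latency: each cropped item can be searched independently, so the overlay can assemble the full panel in roughly the time of a single query rather than one query per garment.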
Critically, the system isn’t just finding “visually similar” products. It weighs attributes like cut, fabric, colorways, and context, then blends them with shopping signals such as availability and merchant quality. The goal is a set of recommendations that feel less like a raw image match and more like a stylist’s pull sheet.
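That blending can be thought of as a weighted score over several signals. The sketch below is a minimal illustration under assumed field names and weights; Google has not published its actual ranking function.

```python
def rank_candidates(candidates, weights):
    """Blend visual similarity with shopping signals into one score.

    All field names and weight values are illustrative assumptions,
    not a documented ranking formula.
    """
    def score(c):
        return sum(weights[k] * c[k] for k in weights)
    return sorted(candidates, key=score, reverse=True)


candidates = [
    {"name": "bomber A", "visual_sim": 0.92, "in_stock": 1.0, "merchant_quality": 0.6},
    {"name": "bomber B", "visual_sim": 0.85, "in_stock": 1.0, "merchant_quality": 0.9},
    {"name": "bomber C", "visual_sim": 0.95, "in_stock": 0.0, "merchant_quality": 0.8},
]
weights = {"visual_sim": 0.5, "in_stock": 0.3, "merchant_quality": 0.2}

ranked = rank_candidates(candidates, weights)
```

Note how the closest visual match ("bomber C") can still lose to a slightly less similar but in-stock item from a stronger merchant, which is exactly the "stylist's pull sheet" behavior the article describes.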
Why It Matters for Shoppers and Brands Right Now
Returns are a costly friction point. The National Retail Federation has reported overall retail return rates in the mid-teens, with apparel typically higher. Better discovery and realistic try-on can reduce sizing errors and style mismatches, directly improving margins for merchants and cutting hassle for buyers.
For creators and retailers, this also tightens the loop from inspiration to conversion. Visual search already handles billions of queries each month, and combining multi-item recognition with try-on makes impulse discovery feel more intentional. Expect brands to optimize product imagery and metadata to surface more reliably in these scene-aware results.
Availability and Early Limitations on Supported Devices
The upgraded Circle to Search experience is rolling out first to select flagship devices, including the Samsung Galaxy S26 series and the latest Pixel 10 phones, with broader Android support to follow. Availability of “Try it on” may vary by region, apparel category, and merchant integrations as the ecosystem ramps.
Expect a hybrid of on-device and cloud processing depending on task complexity. While Google emphasizes privacy and security in Search features, shoppers should still review account settings for activity controls, personalized results, and ad preferences. As with any AI-forward tool, results will improve as the system sees more diverse images, brands, and body representations.
The Bottom Line on Circle to Search’s New Try-On
Circle to Search just evolved from a handy visual finder into a stylist-grade assistant. The ability to parse an entire look, recommend comparable pieces, and offer a built-in try-on closes the gap between seeing something you like and knowing it will work for you. For Android users, it’s one of the most practical blends of AI and shopping to date—and it’s poised to reshape how outfits go from screen to wardrobe.