Amazon has unveiled Lens Live, a real-time visual search feature that turns your phone’s camera into a shopping assistant. Point at a pair of shoes, a coffee grinder, or a floor lamp in the wild, and the Amazon app surfaces similar products in a swipeable carousel, complete with prices, ratings, and quick actions. The experience plugs directly into Rufus, Amazon’s AI shopping assistant, to summarize options and answer follow-up questions before you buy.
Lens Live doesn’t replace Amazon Lens, the company’s longstanding image and barcode search. It layers in instant recognition and conversational context, bridging the gap between discovery in the physical world and checkout in the Amazon ecosystem. The rollout starts on iOS for “tens of millions” of U.S. shoppers, with broader availability to follow.

What the new camera-powered search does
Open the camera in the Amazon Shopping app, aim at an object, and tap the item you care about. Lens Live identifies the product and displays lookalikes and compatible alternatives at the bottom of the screen. If something fits your needs, tap the plus to add it to your cart or the heart to save it for later, with no manual typing and no guesswork on keywords.
Because it’s built for real-world use, the system is optimized for imperfect conditions: angled views, cluttered backgrounds, and partial occlusion. Rufus sits alongside the results to generate quick summaries (“Key differences between these espresso machines?”), offer buying guides, and suggest follow-up questions you might not think to ask.
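Amazon hasn’t published implementation details, but the tap-to-select flow maps naturally onto a crop-and-match pipeline: isolate the region around the user’s tap, then hand that crop to the matcher. A minimal sketch in Python; the 224-pixel crop size and the file name are assumptions for illustration, not published specifics:

```python
from PIL import Image

def crop_around_tap(frame: Image.Image, tap_x: int, tap_y: int,
                    box: int = 224) -> Image.Image:
    """Crop a square region around the user's tap so the matcher sees
    the selected object rather than the cluttered background.
    The 224 px box size is an assumption, not a published detail."""
    half = box // 2
    left = max(0, min(tap_x - half, frame.width - box))
    top = max(0, min(tap_y - half, frame.height - box))
    return frame.crop((left, top, left + box, top + box))

# Example: isolate whatever the user tapped at pixel (640, 360).
frame = Image.open("camera_frame.jpg")
query_image = crop_around_tap(frame, 640, 360)
```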
Under the hood: AWS-scale AI
Lens Live pairs Amazon SageMaker, which Amazon uses to train and deploy vision models at scale, with Amazon OpenSearch Service for fast retrieval across a vast catalog. That combination is designed to match a camera frame, essentially a rich visual query, to relevant items in milliseconds, keeping latency low enough for the results to feel instantaneous.
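Amazon hasn’t shared the retrieval stack’s internals, but the architecture it describes resembles a standard embedding-plus-vector-search pattern: encode the frame into a dense vector, then run a k-nearest-neighbor query against an indexed catalog. A hedged sketch using the opensearch-py client; the host, index name, field names, and the stand-in embedding function are all assumptions:

```python
import numpy as np
from opensearchpy import OpenSearch

def embed(image_path: str) -> list[float]:
    """Stand-in for the vision encoder. In production this would be a
    SageMaker-hosted model mapping a camera frame to a dense vector;
    a random vector keeps the sketch self-contained and runnable."""
    return np.random.rand(512).tolist()

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

frame_vector = embed("camera_frame.jpg")

# k-NN search: rank catalog items by embedding similarity to the frame.
# "products" and "embedding" are hypothetical index and field names.
response = client.search(
    index="products",
    body={
        "size": 10,
        "query": {"knn": {"embedding": {"vector": frame_vector, "k": 10}}},
    },
)
for hit in response["hits"]["hits"]:
    print(hit["_source"]["title"], round(hit["_score"], 3))
```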
The tech challenge is not just recognizing an object; it’s mapping it to commercial inventory with correct attributes, variants, and availability. Amazon’s advantage is the depth of metadata in its catalog, which helps the models understand whether you’re looking at “men’s trail runners, wide fit” versus “lightweight gym trainers,” and retrieve appropriate listings.
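That catalog metadata can be folded into retrieval as a filter, so vector similarity only ranks listings that already satisfy the structured constraints. Continuing the sketch above: OpenSearch’s k-NN clause accepts a filter for exactly this, though the category and fit fields here are hypothetical:

```python
# Filtered k-NN: apply catalog attributes before ranking by visual
# similarity, so "wide-fit trail runners" beats generic lookalikes.
response = client.search(
    index="products",
    body={
        "size": 10,
        "query": {
            "knn": {
                "embedding": {
                    "vector": frame_vector,
                    "k": 10,
                    "filter": {
                        "bool": {
                            "must": [
                                {"term": {"category": "trail-running-shoes"}},
                                {"term": {"fit": "wide"}},
                            ]
                        }
                    },
                }
            }
        },
    },
)
```

Filtering inside the k-NN clause, rather than post-filtering the top matches, matters at catalog scale: a post-filter can empty the result set when the nearest visual neighbors fail the attribute constraints.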
Why this matters for retail
Shoppers increasingly research in-store but buy online, a behavior long dubbed “showrooming.” Industry groups like the National Retail Federation have reported that a majority of consumers use their phones in stores to compare prices and read reviews. Visual search compresses that behavior into seconds: see it, match it, evaluate it, purchase—or save for later.
Visual discovery also reduces the friction of language. Describing “that curved oak lamp with a linen shade” is harder than pointing your camera at it. Platforms from Google Lens to Pinterest have validated the appetite for this behavior; bringing the same capability directly into the Amazon purchase flow could lift conversion and average order value by making intent more actionable.
Real-world examples
Spot a jacket on your commute? Lens Live will surface similar cuts and materials across price points, and Rufus can summarize trade-offs between waterproof and water-resistant fabrics. Testing a blender in a store? Scan it to compare wattage, jar capacity, and warranty terms on Amazon before you commit.
Home and DIY scenarios stand out: aim at a faucet to find matching finishes, or point at a curtain rod bracket to locate the right diameter and hardware. For parents, quickly identify compatible replacement parts—straws, lids, filters—without sifting through product pages.
Impact on sellers and search quality
For marketplace sellers, Lens Live could expand the top of the funnel by capturing spontaneous, real-world intent that keyword search might miss. It also raises the bar on catalog data hygiene: accurate titles, attributes, and imagery help the matching models rank listings correctly. Research from the Baymard Institute has long tied complete product data to higher conversion; visual search amplifies that effect.
Quality and trust are critical. Vision systems must avoid lookalike confusion, and AI summaries need to reflect the underlying product data, not embellish it. Amazon’s choice to anchor results in its catalog and pair them with product-level signals—ratings, verified reviews, fit notes—helps mitigate the risk of AI overreach.
Availability and what to watch
Lens Live is rolling out first on the Amazon Shopping app for iOS to tens of millions of U.S. customers. Amazon has not detailed timing for Android or international markets. The company says the experience will continue to expand and will integrate more deeply with Rufus for richer comparisons and buying guidance.
Two questions will shape its trajectory: how well the system handles edge cases in the wild, and how Amazon balances organic visual matches with sponsored placements over time. If Lens Live proves fast and accurate, it could make pointing your camera at the world as natural a shopping habit as typing a query.