Google is rolling out Search Live worldwide, bringing its camera-and-voice search experience to users in more than 200 countries and territories where AI Mode is available. The expansion moves the feature beyond its initial launch markets and signals a broader shift toward real-time, multimodal search that understands what you’re seeing and hearing in the moment.
Search Live lets people point their phone camera at a scene, ask a question out loud, and hold a natural conversation that leverages the live visual context. It’s powered by Google’s new audio and voice model, Gemini 3.1 Flash Live, which is designed for fluid, back-and-forth dialogue with rapid responses.
What Search Live Does for Real-Time, Multimodal Help
Instead of crafting the perfect keyword string, users can show and tell. Aim your camera at a flat-pack shelving unit, ask where the brackets go, and get step-by-step guidance that references what the camera sees. Point at a humming router and ask which cable to check first. Look at a houseplant and inquire about watering schedules, then follow up with pest treatment tips.
Because the assistant can see the environment, it can disambiguate faster than text-only search—recognizing labels, parts, and context—while still offering links to dive deeper on the web. Google says the experience is built for moments when speed and situational awareness matter, such as troubleshooting, shopping decisions in-store, or understanding signage while traveling.
How to Use Search Live on Android and iOS Devices
On Android or iOS, open the Google app and tap the Live icon under the Search bar. Ask your question out loud to receive an audio response, then continue naturally with follow-ups. If you’re already using Google Lens, you can switch into the same experience by tapping the Live option at the bottom of the screen.
The mode blends voice, vision, and web results in a single flow. You can pause to read on-screen results, tap through to sources, or keep talking while the camera maintains context.
Under the Hood with Gemini 3.1 Flash Live
The global rollout is enabled by Gemini 3.1 Flash Live, Google’s latest real-time audio and voice model. It’s optimized for low-latency turn-taking and for interpreting a continuous stream of frames from the camera. In practice, that means fewer awkward pauses and a conversation that adapts as the scene changes—like when you rotate a product box or step closer to a cable panel.
This builds on Google’s years of visual search work in Lens and multimodal search, which the company says already serves billions of visual queries each month. Search Live marries those capabilities with speech recognition and synthesis so the system can look, listen, and respond in near real time across many languages.
Privacy, Safety, and Controls for Camera-Based Search
Because Search Live uses your camera feed, Google emphasizes user control: you explicitly enable the camera when starting a session, and you can manage activity saving in your account settings. Safety systems also aim to filter sensitive or personally identifying content in line with Google’s long-standing policies for Lens and Search.
In regions covered by the European Union’s platform rules, transparency and control requirements continue to apply to ranking and AI features. While Google hasn’t detailed region-specific changes for this rollout, global availability typically reflects compliance work across major regulatory regimes.
Why It Matters for Users and Businesses Worldwide
Search Live shortens the distance between a question and an answer by removing the need to translate real-world scenes into text. For everyday users, that means faster fixes and more confident in-store decisions. For retailers and manufacturers, it raises the bar on product clarity: legible labels, distinct parts, and accurate structured data help AI systems identify items and route users to the right instructions or offers.
Publishers and brands may also see new entry points for traffic. When the assistant suggests links, sources with strong visuals, schema markup, and concise how-to content are more likely to surface in context-rich moments, not just on traditional results pages.
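For publishers weighing that advice, a minimal schema.org Product snippet illustrates the kind of structured data involved. The product name, SKU, and URLs below are invented for illustration; the field names (`name`, `image`, `description`, `sku`, `brand`) are standard schema.org Product properties.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example 4-Tier Flat-Pack Shelving Unit",
  "image": "https://example.com/images/shelf-front.jpg",
  "description": "Steel-frame shelving unit with labeled brackets and a step-by-step assembly guide.",
  "sku": "EX-SHELF-4T",
  "brand": {
    "@type": "Brand",
    "name": "Example Co"
  }
}
```

Markup like this is typically embedded in a product page inside a `<script type="application/ld+json">` tag, giving search and AI systems an unambiguous description to match against what a camera sees.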
Live Translate Expands Alongside Search Live Rollout
Google is also extending Google Translate’s Live Translate feature to iOS and to additional countries, letting users hear real-time translations through any pair of headphones in more than 70 languages. The pairing is practical: Search Live can help interpret a sign or a menu through the camera, while Live Translate carries the rest of the conversation without swapping apps.
Taken together, these moves make Google’s mobile search more situational and more conversational. As camera-forward experiences go global, the default query language is shifting from typed keywords to the world in front of you—spoken, seen, and answered on the spot.