Google Photos is rolling out interface changes that push generative tools to the forefront and bury classic editing sliders deeper in menus, leaving users who rely on quick adjustments to brightness, contrast, or warmth up in arms.
The controversy centers less on the broader AI-first strategy itself than on everyday edits that used to take only a couple of taps.
Manual sliders get sidelined beneath AI-centered tools
The updated editing screen is fronted by Help Me Edit, an Ask Photos–branded text field that takes prompts and executes AI edits. It sits right below your photo, front and center. Manual controls like black levels, warmth, and tint, by contrast, are now deeper in the stack, usually under Tools and then a second submenu such as Color before you actually reach the sliders.
Practically speaking, what used to require three taps now takes five or more. Cropping and rotation are still just a layer beneath, but the granular adjustments that hobbyists and pros reach for time after time are no longer surfaced first. On recent Pixels and certain other Android phones, Help Me Edit is the first thing you see; manual fine-tuning plays second fiddle.
It’s a small thing on paper, but UI friction adds up. Extra taps and scrolls break the flow, especially when you have to batch-correct a pile of similar shots. For many, the appeal of Photos has been edits that are quick, syncable, and easy enough to do from the couch; that advantage is now being taxed.
An AI-first bet in Photos with a reach that reshapes editing
Google hasn’t been quiet about its plans to weave AI throughout its ecosystem, including Search, Workspace, and Android. Photos is a natural proving ground for that strategy: Magic Eraser, Magic Editor, and Ask Photos were headline features, and Help Me Edit continues the run with text-guided, context-aware adjustments that can remove objects, swap skies, or even alter facial expressions.
Because Photos operates at such scale, the stakes are high. Few apps are used this widely: the app’s Play Store listing has passed 5 billion installs, and Google confirmed years ago that it had surpassed a billion users, so even small UX changes ripple across an enormous base. With analysts at Rise Above Research estimating that well over a trillion photos are taken worldwide each year, streamlining curation and editing has become one of the central battlegrounds in how people manage their visual memories.
Why power users are objecting to buried manual controls
Photographers and enthusiasts often want nothing more than predictable, reversible, fully manual edits rather than algorithmic guesses. A typical case: warming up an image to offset a cooler white balance. Before, it was Edit > Adjust > Warmth. Now it’s Edit > Tools > Color > Warmth. Multiply that by hundreds of shots from a trip and the extra navigation becomes significant.
There’s also a trust factor. AI can produce dramatic results, but many users want transparent control over the image pipeline. Burying sliders by default adds cognitive load and cuts against basic UX tenets: reduce the steps needed for common tasks and respect power-user workflows. Early chatter in Reddit communities and on Google’s own support forums reflects this sentiment: the tools aren’t gone, but they feel demoted.
What to do right now to keep fast manual edits in reach
If you’d rather edit manually, the controls are still there; it just takes a few more taps. Keep using the Crop and Rotate tools for fast fixes, since they remain near the top. For color and exposure, open Tools, then head to Color and Light for warmth, contrast, and blacks. Auto can be a decent starting point before tweaking further.
For more involved work, pair Photos with a dedicated editor. Snapseed and Lightroom Mobile offer direct access to granular controls and batch-friendly features, and you can save results back to Photos for syncing and sharing. It’s not as seamless, but it restores speed on repetitive tasks.
The bigger question for Photos as AI takes center stage
Google’s AI-powered overhaul of Photos isn’t inherently a bad thing: plenty of users will welcome removing an object in one tap or replacing a bland sky with something more interesting. The issue is prioritization. A simple fix could please both camps: a setting that lets users choose whether the editor opens to AI suggestions or manual sliders, or simply keeping Adjust as a top-level tab. A “Pro Defaults” option would save taps without hiding the AI from view.
As AI rolls out across Android and beyond, the Photos team has to balance wow-factor automation against the muscle memory of people who just want to nudge warmth, dial in contrast, and get on with it. For now, Google Photos still does everything; it simply makes the simplest edits a bit more cumbersome to reach.