
Google Photos Values AI Over Manual Tuning

By Gregory Zuckerman
Last updated: November 28, 2025 11:08 am
Technology · 7 Min Read

Google Photos is rolling out interface changes that push generative AI tools to the forefront and bury the classic editing sliders deeper in menus, and users who rely on quick adjustments such as brightness, contrast, or warmth are up in arms.

The change is part of a broader AI-first strategy, but the backlash centers on everyday edits that used to require only a couple of taps.

Table of Contents
  • Manual sliders get sidelined beneath AI-centered tools
  • An AI-first bet in Photos with a reach that reshapes editing
  • Why power users are objecting to buried manual controls
  • What to do right now to keep fast manual edits in reach
  • The bigger question for Photos as AI takes center stage

Manual sliders get sidelined beneath AI-centered tools

The updated editing screen leads with Help Me Edit, an Ask Photos–branded text field that takes prompts and executes AI edits. It sits right below your photo, front and center. Manual controls like black levels, warmth, and tint, meanwhile, are now deeper in the stack, usually under Tools and then a second submenu such as Color before you actually reach the sliders.

Practically speaking, what used to require three taps now takes five or more. Cropping and rotation are still just a layer down, but the granular adjustments that hobbyists and pros reach for again and again are no longer the first thing on screen. On recent Pixels and certain Android phones, Help Me Edit greets you up front; manual fine-tuning plays second fiddle.

It’s a small thing on paper, but UI friction adds up. Extra taps and scrolls interrupt the flow, especially when you need to batch-correct a set of similar photos. For many people, Photos’ appeal has been edits that are quick, synced, and easy enough to do from the couch; that convenience is now being taxed.

An AI-first bet in Photos with a reach that reshapes editing

Google hasn’t been quiet about its plans to weave AI throughout its ecosystem, including Search, Workspace, and Android. Photos is a natural proving ground for that strategy. Magic Eraser, Magic Editor, and Ask Photos were headline features, and Help Me Edit continues the trend with text-guided, context-aware adjustments that can remove objects, swap skies, or even alter facial expressions.

With Photos operating at that scale, the stakes are high. Few apps are used as widely: Google Photos has passed 5 billion installs on the Play Store, and Google confirmed years ago that it had surpassed a billion users, so even small UX changes ripple across an enormous base. With analysts at Rise Above Research putting the number of photos taken worldwide each year well over a trillion, streamlining curation and editing is one of the central battlegrounds in how people manage their visual memories.


Why power users are objecting to buried manual controls

Photographers and enthusiasts often want nothing more than predictable, reversible, fully manual edits rather than algorithmic guesses. A typical case: warming up an image to offset a cooler white balance. Before, it was Edit > Adjust > Warmth. Now it’s Edit > Tools > Color > Warmth. Multiply that by hundreds of shots from a trip and the extra navigation becomes significant.
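As a back-of-the-envelope illustration, here is a minimal sketch of how those extra taps accumulate. The three-tap and five-tap counts come from the comparison above; the 300-photo trip is an assumed figure.

# Back-of-the-envelope: extra taps added by the deeper menu path.
# Tap counts per edit come from the article; the trip size is an assumption.
old_taps_per_edit = 3   # Edit > Adjust > Warmth
new_taps_per_edit = 5   # Edit > Tools > Color > Warmth
photos_in_trip = 300    # hypothetical vacation batch

extra_taps = (new_taps_per_edit - old_taps_per_edit) * photos_in_trip
print(f"Extra taps across {photos_in_trip} photos: {extra_taps}")  # 600

Two extra taps per photo sounds trivial, but over a few hundred images it adds several hundred interactions to a chore that used to be nearly automatic.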

There’s also a trust factor. AI can yield dramatic results, but many users want transparent control over the image pipeline. Burying sliders by default adds cognitive load, and it runs counter to basic UX tenets such as minimizing the steps for common tasks and respecting power-user workflows. Early chatter in Reddit communities and on Google’s own support forums reflects this sentiment: the tools aren’t gone, but they feel demoted.

What to do right now to keep fast manual edits in reach

If you would rather edit manually, there’s still a way to get there, though it takes a few more taps. Crop and Rotate remain up front for quick fixes. For color and exposure, open Tools, then head to Color and Light for warmth, contrast, and blacks. Auto can be a decent starting point before fine-tuning.

For more involved work, pair Photos with a dedicated editor. Snapseed and Lightroom Mobile offer direct access to granular controls and batch-friendly features, and let you save results back to Photos for syncing and sharing. It’s not as seamless, but it restores speed on repetitive tasks.

The bigger question for Photos as AI takes center stage

The AI-powered overhaul Google has given Photos is not fundamentally a bad thing: plenty of users will welcome removing an object in one tap or replacing a bland sky with something more interesting. The issue is prioritization. A simple fix could please both camps: a setting that lets users choose whether the editor opens to AI suggestions or to manual sliders, or simply keeping Adjust as a top-level tab. A “Pro Defaults” option would save taps without hiding the AI tools from view.

As AI rolls out across Android and beyond, the Photos team must balance wow-factor automation against the muscle memory of people who just want to nudge warmth, dial in contrast, and get on with it. For now, Google Photos still does everything; it simply makes the simplest edits a bit more cumbersome to reach.

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.