
Google Tests Reversal Of Photos Editor Redesign

By Gregory Zuckerman
Last updated: March 18, 2026 2:13 pm
Technology · 6 Min Read

Google appears to be walking back parts of last year’s Google Photos editor overhaul, testing a refreshed interface that restores familiar controls and moves AI features out of the spotlight. Early sightings suggest the company is responding to sustained user frustration with the previous redesign, which buried common tools and slowed down quick edits.

What Is Changing In The Google Photos Editor

Test builds show the return of a bottom-aligned carousel with staples like Crop, Adjust, and Filters back in plain view. That means less tapping to reach everyday options, a small but meaningful win for speed. The dedicated Crop menu — removed in the last redesign when its functions were scattered — also appears to be back, restoring a workflow many users relied on.

[Image: Before-and-after comparisons showing Magic Editor enhancing a camping scene and Photo Unblur sharpening a group photo.]

Google is also revamping the AI entry point. The prominent Help Me Edit text field is replaced by a more compact Ask button that opens an input box on demand, freeing screen space for manual tools. Quick-access buttons such as Enhance, Dynamic, and AI Enhance remain available, but they’re less dominant than before.

Visually, action labels now sit inside pill-shaped containers that highlight when selected, an affordance that makes state changes easier to parse at a glance. The test has been spotted on a Pixel 7 running Google Photos version 7.67.0.882706237, though availability appears limited and controlled by server-side flags.
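Server-side flags like these typically work by deterministically bucketing each user or device into a small rollout percentage, so the same account always sees the same variant. The sketch below illustrates the general pattern only; the function, flag name, and percentages are hypothetical, not Google's actual implementation.

```python
# Illustrative sketch of percentage-based server-side flag gating.
# All names and numbers here are assumptions, not Google's real API.
import hashlib

def in_rollout(user_id: str, flag: str, rollout_percent: float) -> bool:
    """Deterministically bucket a user into a percentage rollout.

    Hashing user_id + flag gives a stable bucket, so the same user
    sees the same variant across sessions without storing state.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10_000   # bucket in 0..9999
    return bucket < rollout_percent * 100   # percent -> basis points

# Example: a hypothetical 1% test of the restored editor layout
flag = "photos_editor_classic_carousel"
enabled = in_rollout("user-12345", flag, rollout_percent=1.0)
```

Because the bucket is derived from a hash rather than stored per-user, the server can widen the rollout just by raising the percentage, which matches the gradual, quiet expansion pattern described above.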

Why The Redesign Drew Criticism From Users

Last year’s editor revamp coincided with the app’s tenth anniversary and emphasized AI-guided edits. Power users quickly complained on Reddit and in Play Store reviews that routine tasks took longer, with familiar tools hidden behind a Tools menu next to the AI prompt. For workflows that hinge on muscle memory — crop, straighten, adjust exposure — an extra tap or a relocated control adds friction, especially when repeated across dozens of photos.

This is classic usability debt: prioritizing marquee features can inadvertently penalize frequent, high-intent actions. In photo apps, where edits often happen in bursts, even a one-tap regression compounds into real time lost. Rolling back to a visible, scrollable tool rail brings the app closer to long-standing norms used by professionals and casual editors alike.
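The compounding cost is easy to quantify with back-of-envelope numbers. The figures below are assumptions for illustration, not measurements:

```python
# Rough arithmetic: cost of one extra tap per edit in a burst session.
# Numbers are assumed for illustration, not measured values.
extra_taps_per_edit = 1
seconds_per_tap = 1.5     # assumed: locate relocated menu, tap, wait for redraw
photos_per_session = 40   # "dozens of photos" in one editing burst

extra_seconds = extra_taps_per_edit * seconds_per_tap * photos_per_session
print(f"Extra time per session: {extra_seconds:.0f} s")  # a full minute lost
```

Under these assumptions, a single relocated control costs about a minute per editing session, which is exactly the kind of friction power users were reporting.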

Google Appears To Dial Back AI Prominence

Google Photos has leaned heavily into AI with tools like Magic Eraser, Magic Editor, and Photo Unblur, many of which debuted first on Pixel hardware. Those features aren’t going away, but the new layout suggests a rebalancing: put manual controls front and center, let generative suggestions be additive rather than directive. It’s a pragmatic shift that acknowledges not every edit needs an AI prompt, and that trust grows when users feel in control.

[Image: A smartphone showing the photo editing interface with a picture of a man carrying a child on his shoulders against a backdrop of green ivy.]

This also aligns with broader feedback patterns across consumer software. Users embrace AI when it saves time on complex tasks, but resist when it gets between them and simple, well-understood actions. By trimming the AI footprint and restoring predictable tool placement, Google can satisfy both newcomers and seasoned editors.

Explore Feed Test Inside Google Photos App

Separately, Google is testing an Explore option near Memories on the main screen that surfaces a vertically scrolling, TikTok-like video feed drawn from a user’s own library. Early impressions suggest on-device and cloud intelligence curate clips based on scene content, locations, and people — for example, stitching together “beach days” or “concert moments” across years without manual album-making.

For an app that handles billions of videos, this format could increase rediscovery. But it also raises design questions: how prominently should lean-back viewing sit inside a utility-focused gallery, and will it respect privacy expectations around face clustering and location tags? Implementation details, including opt-outs and curation controls, will matter.
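The curation described above, surfacing clips that share a theme across multiple years, can be sketched as a simple grouping pass. The data shape and labels below are hypothetical; Google's actual pipeline is not public:

```python
# Hypothetical sketch of theme-based clip grouping ("beach days" across
# years). The clip schema and scene labels are assumed for illustration.
from collections import defaultdict

def group_by_theme(clips):
    """clips: list of dicts with 'id', 'scene', and 'year' keys.

    Groups clips sharing a scene label, then keeps only themes that
    span multiple years -- the cross-year span is what gives the feed
    its rediscovery value.
    """
    themes = defaultdict(list)
    for clip in clips:
        themes[clip["scene"]].append(clip)
    return {scene: sorted(cs, key=lambda c: c["year"])
            for scene, cs in themes.items()
            if len({c["year"] for c in cs}) > 1}

clips = [
    {"id": 1, "scene": "beach", "year": 2021},
    {"id": 2, "scene": "beach", "year": 2024},
    {"id": 3, "scene": "concert", "year": 2023},
]
memories = group_by_theme(clips)  # only "beach" spans multiple years
```

In practice the scene, location, and face signals would come from on-device and cloud models, and any real deployment would need the opt-outs and curation controls the privacy questions above demand.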

Rollout And What To Expect From Google Photos

As with many Google features, both the editor tweaks and Explore feed appear in limited A/B tests and may change before a wider release. Reports indicate the updated editor is surfacing for a small cohort of Pixel users on recent app versions, but broader availability will likely depend on server-side enablement. It’s common for Google to iterate quietly for weeks before any announcement.

Why It Matters For Everyday Google Photos Editing

Google Photos serves over a billion users and handles staggering volume — Google has previously cited 28B new photos and videos uploaded each week. At that scale, shaving a tap off core tasks has outsized impact, and UI reversals can be as consequential as new features. If these tests stick, they signal a course correction toward efficiency, familiarity, and a more balanced approach to AI in everyday editing.
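The scale argument can be made concrete with hypothetical numbers. Only the 28B weekly-uploads figure comes from the article; the edit rate and per-tap saving below are pure assumptions:

```python
# Scale illustration using the cited 28B weekly uploads. The edit rate
# and seconds-per-tap values are assumptions, not reported figures.
weekly_uploads = 28_000_000_000
edit_rate = 0.01              # assume 1% of uploads get a manual edit
seconds_saved_per_edit = 1.0  # assume one saved tap is worth ~1 second

total_seconds = weekly_uploads * edit_rate * seconds_saved_per_edit
user_years = total_seconds / (3600 * 24 * 365)
print(f"~{user_years:.0f} user-years of friction removed per week")
```

Even with a conservative 1% edit rate, a single saved tap works out to several user-years of friction removed every week, which is why small UI reversals at this scale matter.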

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.