Google Photos users are voicing fresh frustration over Magic Eraser, saying the once-reliable tool no longer handles fine, surgical edits the way it used to. Reports shared in an active Reddit thread suggest the feature, which debuted with the Pixel 6 and later expanded beyond Pixel devices for Google One members, now excels mainly at removing larger objects while struggling with tiny distractions and close-up refinements.
Multiple users describe a noticeable drop in precision since last year’s overhaul of the Photos editor. Tasks that were previously routine — zapping stray hairs, cleaning up small text, fixing edge artifacts, or tidying minor blemishes — now often produce smears, warped textures, or simply fail to apply convincingly. One user demonstration on a Pixel 10 Pro even shows Magic Eraser misinterpreting a selection and shifting an object instead of removing it, with edits taking longer to process and undo.
What Users Report Is Breaking in Magic Eraser
The pattern is consistent across dozens of anecdotes: the tool remains decent at obvious removals — think a passerby in the background, a trash can, or a road sign — but falters when selections are small, narrow, or intricately shaped. Heavily zoomed edits appear especially fragile, with the inpainting step (the part that fills in the background) producing telltale blurs or repeating textures that draw the eye.
Some users also report that the brush feels less “sticky” to edges than before. Instead of respecting clear boundaries, selections can bleed into nearby detail, which in turn degrades the fill quality. Others complain about slower responsiveness in the editor, particularly on large images or when quickly iterating through try-undo-try cycles that used to feel near-instant.
These observations don’t appear tied to a single phone model. While the Pixel 10 Pro clip has circulated as an example, similar feedback has surfaced from older Pixels and non-Pixel Android devices as well, pointing more toward changes in the editing pipeline than isolated hardware quirks.
Why Magic Eraser’s Precision Might Have Slipped
Google rolled numerous AI editing capabilities into a revamped Photos editor last year, consolidating features such as Magic Editor into a single interface while also spotlighting newer, prompt-based tools like Help Me Edit powered by Gemini. That shift may have altered the balance between classic, edge-aware inpainting and newer generative workflows optimized for broader scene changes.
From a technical standpoint, small-object removal is the hardest test for any content-aware system. Fine strands of hair, dense textures like gravel, or high-frequency edges near text require extremely accurate segmentation and texture synthesis. If the underlying model skews toward larger, low-frequency fills — the type often showcased in marketing demos — quality at the micro level can degrade, especially when the tool is expected to run quickly on-device.
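To see why, consider a deliberately simple stand-in for the fill step. The sketch below is not Google's model (which is proprietary and unpublished); it uses OpenCV's classical Telea inpainting on a synthetic image, filling the same small mask over a smooth gradient and over fine stripes. Only the high-frequency side shows the smearing users describe.

```python
# Minimal sketch: why small removals over fine detail are harder than
# removals over smooth areas. Classical Telea inpainting stands in for
# the fill step; the image, masks, and sizes are illustrative choices.
import cv2
import numpy as np

h, w = 256, 512
img = np.zeros((h, w, 3), np.uint8)

# Left half: a smooth, low-frequency gradient (easy to fill convincingly).
img[:, :256] = np.linspace(60, 200, 256, dtype=np.uint8)[None, :, None]

# Right half: fine stripes, i.e. high-frequency texture (hard to fill).
stripes = ((np.arange(256) // 4) % 2) * 180 + 40
img[:, 256:] = stripes[None, :, None].astype(np.uint8)

# The same small, round mask in each region.
mask = np.zeros((h, w), np.uint8)
cv2.circle(mask, (128, 128), 20, 255, -1)   # over the gradient
cv2.circle(mask, (384, 128), 20, 255, -1)   # over the stripes

filled = cv2.inpaint(img, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("inpaint_demo.png", np.vstack([img, filled]))

# The gradient-side fill is nearly invisible; the stripe-side fill blurs
# the pattern, mirroring the "telltale blurs" reported on fine detail.
```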
Model updates, quantization for performance, or tweaks to how the app blends AI output back into the image can all affect the “seams” users spot. If Google adjusted processing to unify the experience across devices or to accommodate newer features, that change, however unintentional, could explain why small, delicate fixes feel less dependable today.
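To illustrate how much the blend step alone matters, here is a minimal feathered-compositing sketch. It is an assumption about the general technique, not Photos' actual pipeline: pasting a generated patch back with a hard-edged mask leaves a visible outline, while blurring the mask into an alpha ramp hides the seam.

```python
# Minimal sketch (assumed technique, not the Photos implementation) of
# compositing a generated patch back over the original photo.
import cv2
import numpy as np

def composite(original: np.ndarray, patch: np.ndarray,
              mask: np.ndarray, feather_px: int = 0) -> np.ndarray:
    """Blend `patch` into `original` where `mask` is 255.

    feather_px=0 reproduces a hard seam; larger values soften the edge.
    """
    alpha = mask.astype(np.float32) / 255.0
    if feather_px > 0:
        k = 2 * feather_px + 1                      # odd kernel size
        alpha = cv2.GaussianBlur(alpha, (k, k), 0)  # edge becomes a ramp
    alpha = alpha[..., None]                        # broadcast over channels
    out = alpha * patch.astype(np.float32) + (1 - alpha) * original.astype(np.float32)
    return out.astype(np.uint8)

# Example: a slightly off-color patch over a flat gray image.
orig = np.full((128, 128, 3), 120, np.uint8)
patch = np.full((128, 128, 3), 135, np.uint8)
m = np.zeros((128, 128), np.uint8)
cv2.rectangle(m, (40, 40), (90, 90), 255, -1)

hard = composite(orig, patch, m, feather_px=0)   # visible rectangle outline
soft = composite(orig, patch, m, feather_px=8)   # seam fades into the background
```

Any change to this kind of blending, however small, shifts where and how visibly the generated fill meets the untouched pixels.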
Google’s Silence and the Stakes for Photos Users
As of now, Google has not publicly acknowledged a regression. That silence matters because Photos is not niche software: it serves well over a billion users worldwide and sits at the center of Google’s pitch for everyday, approachable AI. When a long-established flagship tool like Magic Eraser stumbles, it undercuts user trust in the rest of the editing suite, including splashier generative features.
It also invites comparison. Third-party apps such as TouchRetouch are being recommended within user threads for their consistency on small fixes, and heavyweight editors like Adobe Photoshop continue to raise the bar with high-quality object removal. On the mobile side, rivals have stepped up too, from Apple’s Clean Up in Photos to Samsung’s Generative Edit on recent Galaxy devices.
Workarounds and Practical Tips for Better Results
Until Google addresses the concerns, some users report better outcomes by making multiple tiny passes instead of one large selection, or by zooming out slightly so the algorithm has more surrounding context to draw on. Others suggest experimenting with selection shapes and feathering boundaries gently to avoid hard seams.
Basic hygiene can also help: ensure the Photos app is fully updated, clear cache and temporary data, and test on a duplicate image at a slightly lower resolution to reduce processing load. If results remain inconsistent, consider a specialized retouching app for fine-detail edits and keep Magic Eraser for larger removals where it still performs reliably.
What to Watch Next as Google Evaluates Magic Eraser
The pattern points to a software-side issue introduced alongside or after the editor revamp, which means a server-side or app update could restore the surgical accuracy long-time users remember. Submitting examples through the Photos feedback tool, especially side-by-side edits showing prior vs. current behavior, can help product teams pinpoint failure modes.
If Google responds with model or pipeline tweaks, we’ll likely see improvements land quietly rather than as a marquee feature drop. For now, the message from the community is clear: Magic Eraser still shines for big, obvious distractions, but the bar for tiny, precise corrections needs to rise again.