FindArticles © 2025. All Rights Reserved.

Grammarly Pulls AI Expert Feature Amid Identity Suit

By Gregory Zuckerman
Last updated: March 12, 2026 10:02 am

Grammarly has taken its AI-powered Expert Review tool offline after revelations that it generated feedback under the names of real writers and academics — including living authors and deceased cultural figures — without their consent. The move comes as the company faces a class action lawsuit alleging unauthorized commercial use of writers’ identities.

The controversy highlights a fast-emerging fault line in generative AI: when systems don’t just learn from public content, but begin to simulate a person’s voice, name, and authority. For a platform used by tens of millions, the stakes extend beyond reputational damage into legal risk.

Table of Contents
  • How the Feature Worked — And Why It Sparked Outrage
  • Company Response and a Swift Retreat from Backlash
  • Class Action Targets Unauthorized Use of Names
  • A Broader Reckoning for AI Attribution and Identity
  • What to Watch Next as Identity Claims Hit AI Tools
Image: A Grammarly Expert Review interface with the text "Expert Review: Feedback inspired by real experts" on a teal background.

How the Feature Worked — And Why It Sparked Outrage

Launched last year as part of a suite of AI “agents,” Expert Review promised substantive feedback grounded in the work of named subject-matter experts. Marketing copy, later archived by the Internet Archive’s Wayback Machine, said the agent drew on “insights from subject-matter experts and trusted publications,” and let users select specific authors to shape the advice they received.

In practice, as first reported by tech journalists testing the product, the system presented AI-generated comments attributed to real people, from bestselling authors such as Stephen King to scholars like bell hooks. It paired a general disclaimer that no endorsement was implied with feature descriptions suggesting the guidance was expert-derived, and that tension proved combustible. Writers and academics publicly objected to having their names presented as the voice of machine-written feedback they had never reviewed or authorized and, in some cases, could never have seen.

Critics called the setup “exploitative” and “misleading,” arguing it invited users to trust advice precisely because it appeared tied to familiar names. The initial plan to let individuals email the company to opt out only intensified the backlash, since many affected people would not even know their names were being used unless a user happened to notice and tell them. The approach also offered no clear remedy for deceased figures such as bell hooks or astronomer Carl Sagan, whose legacies were invoked without the possibility of consent.

Company Response and a Swift Retreat from Backlash

Following days of criticism from authors, editors, and academics, a company executive acknowledged the concerns and apologized in a public post, saying the agent had “misrepresented” experts’ voices. Grammarly said it would “reimagine” the feature with a model that gives experts meaningful control over whether and how they are represented, and disabled Expert Review while it rethinks the design.

That promise suggests any future iteration may move from an opt-out to an opt-in or licensing-based framework, where named contributors can set terms or decline participation altogether. It’s a familiar pivot for AI companies facing identity and attribution concerns: emphasize transparency, secure explicit permissions, and build compensation or control mechanisms for human contributors.

Image: A screenshot of a document titled "Cats in Art History: Domestic Felines as Domestic Commentary" next to an Expert Review panel suggesting experts, with "Philippa Listeso, historian known for pioneering work on feminist art history" highlighted.

Class Action Targets Unauthorized Use of Names

The legal challenge arrived quickly. Journalist Julia Angwin filed a class action in federal court in New York, alleging that Grammarly’s feature used her identity without consent for commercial purposes. Her counsel at Peter Romer-Friedman Law PLLC framed the case squarely as a right-of-publicity claim, noting that New York law has long prohibited using a person’s name for advertising or trade without permission.

Legal scholars point out that New York strengthened its publicity protections in 2021 by adding postmortem rights for certain deceased individuals, a move that could be relevant when products invoke the identities of late authors. While the precise applicability will turn on the facts, the lawsuit seeks damages and an injunction to block any future use of writers’ names without consent, a remedy that would force product redesign even if monetary exposure proves limited.

A Broader Reckoning for AI Attribution and Identity

Expert Review’s implosion lands amid a broader industry clash over attribution, licensing, and impersonation. News organizations and authors’ groups have already sued AI developers over training data and alleged derivative uses, and courts are beginning to sort out where fair use ends and appropriation begins. Simulating the aura of a specific person — name, reputation, and implied endorsement — is an even riskier frontier than generic style mimicry.

For a platform that has reported more than 30 million daily users and widespread enterprise adoption, the episode underscores a simple product rule that AI does not obviate: if a feature’s trust signal is a human name, that human needs a say. Expect heightened scrutiny of any AI tool that assigns real-world bylines, likenesses, or expert personas to generated output, as regulators and courts draw firmer lines between inspiration, attribution, and impersonation.

What to Watch Next as Identity Claims Hit AI Tools

Key questions now include whether Grammarly commits to an explicit opt-in program for named experts, whether compensation or co-branding becomes part of any relaunch, and how the company proposes to handle estates of deceased writers. The trajectory of the Angwin lawsuit will also be closely watched, as a clear ruling on identity-based AI attributions could set a template for future claims across the industry.

The takeaway for AI builders is already clear: disclaimers are not a substitute for consent, and brand trust erodes quickly when automation wears a borrowed human face.

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.