Grammarly’s Expert Review feature, a paid AI writing tool that attributed advice to real experts without permission, has been disabled after just seven months of operation. The feature, which launched in August 2025 as part of Grammarly’s $12-per-month Pro subscription, generated AI-written suggestions supposedly inspired by Stephen King, Neil deGrasse Tyson, Carl Sagan, Kara Swisher, and deceased academics. On March 11, 2026, Grammarly disabled the feature entirely, with CEO Shishir Mehrotra admitting the company “fell short” and would “rethink our approach”.
Key Takeaways
- Grammarly disabled Expert Review on March 11, 2026, after a week of intense backlash from journalists and authors.
- The feature attributed AI-generated writing advice to real people, including deceased experts, without permission or endorsement.
- Journalist Julia Angwin filed a class action lawsuit the same day, alleging privacy and publicity rights violations with damages exceeding $5 million.
- CEO Shishir Mehrotra apologized and promised to reimagine the feature with genuine expert control and opt-out mechanisms.
- The feature required users to pay $12 monthly after a free trial to access expert-attributed writing suggestions.
How Grammarly’s Expert Review became an identity theft scheme
The Expert Review feature promised users writing guidance “inspired by” leading professionals and authors. In reality, Grammarly trained its AI on public content from these figures and generated suggestions under their names, creating the false impression of direct endorsement or involvement. Grammarly buried a disclaimer in the feature’s documentation stating the expert references were “for informational purposes only” and did not indicate affiliation or endorsement, but the user interface itself prominently displayed expert names alongside writing suggestions.
Journalist Miles Klee’s Wired article, published in early March 2026, exposed the practice and sparked immediate backlash. Tech journalist Kara Swisher, one of the misattributed experts, responded with fury: “You rapacious information and identity thieves better get ready for me to go full McConaughey on you. Also, you suck”. The feature had monetized real people’s identities and reputations without consent, crossing a line that generic AI training on public data does not.
Unlike typical large language models that train on broad internet data without explicit attribution, Grammarly curated expert voices as a deliberate paid product feature. The company took identities, shaped them into a monetized service tier, and sold access to users who believed they were receiving guidance from actual experts. This distinction matters: all AI companies use public information to train models, but few explicitly brand and sell that information as personalized expert advice from identifiable individuals.
Grammarly Expert Review lawsuit and CEO apology
On the same day Grammarly disabled the feature, journalist Julia Angwin filed a class action lawsuit against Superhuman (Grammarly’s parent company), alleging violations of privacy and publicity rights. The lawsuit claims damages exceed $5 million, signaling that the unauthorized use of expert identities carries serious legal consequences beyond reputational damage.
Superhuman CEO Shishir Mehrotra posted on LinkedIn acknowledging the failure: “Over the past week, we received valid critical feedback from experts who are concerned that the agent misrepresented their voices. This kind of scrutiny improves our products, and we take it seriously. We hear the feedback and recognize we fell short on this. I want to apologize and acknowledge that we’ll rethink our approach going forward”. The apology, while direct, came only after the lawsuit was filed and media pressure had peaked.
Superhuman’s director of product management, Ailian Gan, announced the feature’s fate: “After careful consideration, we have decided to disable Expert Review as we reimagine the feature to make it more useful for users, while giving experts real control over how they want to be represented—or not represented at all”. The promise of expert control is crucial; the original feature had none.
What replaces Expert Review and why users may dislike it
Grammarly has not yet unveiled the replacement feature. The company committed only to “reimagining” Expert Review with better expert control and usefulness, but no timeline or specifics have been announced. The vagueness may be deliberate: Grammarly needs to rebuild trust with both experts and users before launching anything new.
The challenge Grammarly faces is fundamental: how do you offer expert-inspired writing advice without either misusing expert identities or requiring explicit consent from dozens of real people? One path is to remove expert names entirely and return to generic AI suggestions. Another is to build a consent-based system where experts opt in and receive compensation. Both options are more expensive and less marketable than the original feature, which is likely why the company is moving slowly.
Users who paid for Expert Review expecting guidance from Stephen King or Neil deGrasse Tyson will eventually receive a different feature, though no refund policy has been announced. Early adopters of Grammarly Pro may feel cheated, especially if the replacement lacks the expert branding that justified the premium price. The company’s silence on what comes next may point to internal disagreement about the path forward.
Why this matters beyond Grammarly
The Expert Review debacle exposes a gap in AI regulation and corporate accountability. Grammarly built a feature that explicitly monetized real people’s identities and reputations without consent, yet the company faced no regulatory action—only private lawsuits and public backlash. The feature existed for seven months before journalists noticed and reported it, suggesting many users never questioned whether Stephen King actually wrote the suggestions they were reading.
This case will likely influence how other AI companies handle expert attribution and identity use. The legal and reputational costs of unauthorized expert use now have a visible price tag: a $5 million+ lawsuit and a complete product shutdown. Competitors building similar features will think twice, or at least ensure they have explicit consent from the people whose names they use.
Did Grammarly’s Expert Review actually help users write better?
No user feedback or performance data on whether Expert Review improved writing quality has been made public. CEO Mehrotra’s admission that the feature “fell short” suggests it underperformed even on its core function, not just on ethics. A feature that misrepresented expert identities and failed to deliver useful writing advice is a double failure.
Will experts get paid if Grammarly uses their names in a new version?
Grammarly has not announced compensation terms for experts in the reimagined feature. The company promised “real control over how they want to be represented,” which implies opt-in consent, but no payment structure has been disclosed. Whether experts will receive royalties, licensing fees, or simple attribution remains unknown.
Can I get a refund if I paid for Expert Review?
Grammarly has not publicly announced a refund policy for users who purchased Pro subscriptions specifically for Expert Review. The company should offer refunds or Pro subscription credits to affected users, but no official statement on this has been made.
Grammarly’s Expert Review shutdown is a rare moment of corporate accountability in AI, but it came only because journalists exposed the practice and lawyers filed suit. The company’s promise to reimagine the feature with expert control is credible in intent but unproven in execution. Until the replacement launches, users and experts alike should remain skeptical—Grammarly has already demonstrated it will cut corners on consent and transparency if the product is profitable enough.
This article was written with AI assistance and editorially reviewed.
Source: TechRadar