Apple has banned the women’s safety app Tea and its companion TeaOnHer from the App Store in every market following an investigation by The Guardian, which found that the apps did not meet content moderation or user privacy requirements. The takedown, first noticed by the analytics firm Appfigures and later confirmed by Apple to TechCrunch, comes after months of criticism over security vulnerabilities and mounting complaints from users.
Why Apple removed Tea and TeaOnHer apps from the App Store
According to Apple’s statement to TechCrunch, Tea (published under the name Tea Dating) and TeaOnHer violated App Review Guideline 1.2, which requires apps with user-generated content to provide robust tools for blocking users and for filtering or removing objectionable content.
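As an illustration of what those tools amount to in practice, here is a minimal Swift sketch of the blocking and reporting hooks Guideline 1.2 describes. Every name in it (ModerationService, Report, and so on) is hypothetical, not drawn from either app’s actual code.

```swift
import Foundation

// Hypothetical sketch only; not Tea's or TeaOnHer's real code.
struct Report {
    let postID: String
    let reporterID: String
    let reason: String
    let filedAt: Date
}

final class ModerationService {
    private var blockedUserIDs: Set<String> = []
    private var pendingReports: [Report] = []

    // Hide all content from a user for the current account
    // (Guideline 1.2's "block users").
    func block(userID: String) {
        blockedUserIDs.insert(userID)
    }

    // File a report for human review; a real app would also hide or
    // rate-limit the flagged post while the review is pending.
    func report(postID: String, by reporterID: String, reason: String) {
        pendingReports.append(
            Report(postID: postID, reporterID: reporterID,
                   reason: reason, filedAt: Date())
        )
    }

    // Filter a feed so posts from blocked authors never reach the UI.
    func visiblePosts<P>(_ posts: [P], authorID: (P) -> String) -> [P] {
        posts.filter { !blockedUserIDs.contains(authorID($0)) }
    }
}
```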

Apple also cited Guideline 5.1.2, which prohibits collecting or sharing personal data without clear consent, and section 5.6, the Developer Code of Conduct, which covers respectful behavior and honest practices.
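In code terms, a consent requirement like 5.1.2 typically translates into an explicit, auditable opt-in gate before any sensitive data is collected or leaves the device. The sketch below is a hypothetical Swift illustration of that pattern, not a description of how either app actually works.

```swift
import Foundation

// Categories of sensitive data a dating-safety app might handle.
enum SensitiveDataCategory: String {
    case phoneNumber, governmentID, privateMessages
}

final class ConsentLedger {
    // Record each explicit opt-in with a timestamp for auditability.
    private var grants: [SensitiveDataCategory: Date] = [:]

    func grantConsent(for category: SensitiveDataCategory) {
        grants[category] = Date()
    }

    // Callers must pass this check before collecting or sharing anything.
    func hasConsent(for category: SensitiveDataCategory) -> Bool {
        grants[category] != nil
    }
}

// Usage: refuse to upload an ID image unless the user has opted in.
let ledger = ConsentLedger()
if ledger.hasConsent(for: .governmentID) {
    // proceed with the upload
} else {
    // prompt for consent instead of silently collecting the data
}
```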
Apple also pointed to a steady stream of complaints and negative reviews, suggesting that the flagged problems had never been resolved.
Sources said Apple had informed the developers of its concerns before removing the apps, but neither app was in compliance by the time of the takedown.
Security and privacy lapses raised by the investigation
Tea, a dating safety app that serves as a digital whisper network for women to warn one another about men, suffered a massive security failure this summer. TechCrunch reported a cyberattack that exposed sensitive information, including driver’s license images. Subsequent reporting revealed that private chats and phone numbers had been accessed as well, prompting serious concerns about privacy.
TeaOnHer, Tea’s counterpart aimed at men, appeared to suffer from security flaws of its own. Although both services were intended to be safety platforms, they apparently lacked strong data protections and effective consent flows. For apps that traffic in highly sensitive claims and identity data, missteps around privacy or moderation aren’t just technical bugs; they’re existential risks.

How Apple’s App Store enforcement is applied in context
Apple’s decision reflects its increasingly hard line on user-generated content and data use. The company’s annual App Store transparency reports describe sweeping enforcement, including millions of app submissions rejected each year for policy violations and routine removals over safety and privacy concerns. Apple has also emphasized in the past that it is clamping down on bad behavior across the App Store as part of a broader anti-fraud effort.
That’s a clear message for developers of community and safety tools: features must be more than just report buttons. Apps need effective content review mechanisms, rapid takedown actions, explicit consent before collecting or displaying personally identifying information, and secure storage of that data. Apps that catalog allegations or sensitive reports, especially around dating, face heightened scrutiny and carry a responsibility to guard against misuse and reduce harm.
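On iOS, “secure storage” for identity data usually means the Keychain rather than plain files or an unencrypted database. The helper below is a hypothetical sketch, but the Security framework calls it uses are Apple’s real APIs; error handling is trimmed for brevity.

```swift
import Foundation
import Security

// Store a piece of sensitive data (e.g., a verification token) in the
// iOS Keychain under the given account name. Returns true on success.
func storeSecurely(_ data: Data, account: String) -> Bool {
    // Remove any existing item for this account so the add doesn't fail
    // with errSecDuplicateItem.
    let deleteQuery: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account
    ]
    SecItemDelete(deleteQuery as CFDictionary)

    let addQuery: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account,
        kSecValueData as String: data,
        // Readable only while the device is unlocked, and never migrated
        // to another device via backups.
        kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly
    ]
    return SecItemAdd(addQuery as CFDictionary, nil) == errSecSuccess
}
```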
Tea and TeaOnHer are still available on Google Play for now
For now, Tea and TeaOnHer remain available on Google Play. That status may prove fleeting: the same shortcomings could run afoul of Google’s Developer Program Policies and User Data guidelines, which likewise mandate transparent disclosure, consent, and careful handling of sensitive information. Enforcement timing has historically varied from store to store, but sustained user complaints or persistent security problems usually prompt both platforms to act.
For existing users, the asymmetry is a practical nuisance. A removal on iOS can splinter communities and erode trust, particularly when a service’s value depends so heavily on network effects. If the developers want to bring their apps back to iOS, they’ll need stronger verification processes, more powerful in-app tools for blocking and removing abusive users, and better privacy protections to pass muster with App Review.
A tough moment for dating safety apps and communities
Safety-minded dating apps tread a fine line between surfacing patterns of harmful behavior and protecting the privacy and rights of everyone involved. That tension can clash with store policies against doxing, defamation, and nonconsensual data sharing. The same pressures bear on large online groups where members try to vet prospective dates: reports of abuse can be useful but are legally and ethically fraught.
In the short term, the apps’ prospects rest on fixing past security failures, establishing transparent moderation workflows, and demonstrating that they can responsibly manage private user data. Apple’s removal sends yet another industry signal: meaningful safety features are table stakes, but they need to be built on consent, privacy by design, and enforceable moderation, especially when real names and reputations are at stake.