SKIMS has once again brought a fraught debate to the fore with its latest product: a “faux bush” thong, on sale now and labeled plainly enough for an alien newly arrived from another planet and in need of undergarments (it’s one-size-fits-all and comes in three colors).
The cheeky campaign has caught attention for its playful embrace of body hair — and for how comfortably it cozies up to Instagram’s limits. Adult creators say that similar, fully clothed posts of theirs are deleted, suppressed or followed by account penalties. It’s difficult to ignore the irony: A megabrand ascends on the same platform where smaller creators encounter invisible barriers.
The imbalance is telling in another sense, too, because it gets at a larger question about content moderation at scale: Who gets to push the line, and who gets pushed off the page?
From sex workers to sex educators to L.G.B.T.Q. creators, users have long said that Instagram’s enforcement of its nudity and sexual solicitation rules is uneven, heavy-handed and sometimes biased in favor of those with corporate power.
A Viral Product Meets a Platform Policy Debate
In the videos, styled like a kitschy 1999 game show and hosted by their producer, the photographer-dreamgirl Nadia Lee Cohen (who did not respond to an emailed request for comment), the models play up the gag — different colors! different vibes! — while never showing explicit nudity.
That detail matters. Explicit nudity and sexual activity are against Instagram’s rules, but suggestive content and most lingerie are allowed as long as genitals are covered. A barely-there garment resembling pubic hair, sexy as it may be, can make it through automated and human review if it stays between those lines.
The brand’s posts were organic — as opposed to paid advertisements, which are subject to stricter advertising policies. Still, the clips made their way around the globe, driving engagement and pushing a $32 thong into backorder, at least as of this writing — a clear sign that reach went unimpeded. Adult creators note that their experience is often different: Many have been swept into moderation dragnets for similar posts featuring lingerie or implied nudity and have faced takedowns, account restrictions or algorithmic freeze-outs from discovery.
Creators Say Enforcement on Instagram Is Spotty
For years, sex workers, independent models and sexual health educators have described shadowbanning — a demotion or outright removal from Search and Explore that devastates reach without a formal ban. Studies from groups like Hacking//Hustling and the Free Speech Coalition have documented widespread deplatforming and revenue declines since FOSTA-SESTA was enacted, legislation that pushed platforms toward an even more conservative stance on sexual content.
Meta’s Community Standards Enforcement Reports record enormous volumes of action on Instagram posts under the category Adult Nudity and Sexual Activity — often tens of millions of pieces of content each quarter. The company says most enforcement is accurate, but creators and digital rights advocates argue that the net is cast too wide. Sex education, L.G.B.T.Q. expression and consensual adult content often get caught in the sweep, while higher-profile accounts face less immediate consequences.
The Meta Oversight Board has raised similar concerns. In a high-profile ruling on adult nudity policies as applied to transgender and nonbinary users, the board faulted vague rules and recommended clearer standards and better reviewer training. Health-related exceptions exist — for breastfeeding and post-mastectomy images, among others — but educators say evidence-based material still gets removed when algorithms conflate anatomy with solicitation.
Why Bigger Brands Often Receive More Leeway Online
Scale and status shape outcomes. Big brands get dedicated account managers, direct escalation paths and coordinated launch plans that anticipate moderation triggers. Previous reporting on Meta’s cross-check program — which gave posts from VIP accounts an additional layer of review — showed how high-profile users regularly receive more time before penalties are imposed and more chances to fix problems before their content is removed.
There’s also the framing problem: Corporate campaigns are read as fashion or comedy, while comparable content from independent adult creators is categorized as solicitation. The content can be nearly identical — a thong, suggestive copy, a tongue-in-cheek tone — but how a reviewer frames its perceived “purpose” varies. It’s at that subjective line that bias enters the picture. As scholars at the NYU Stern Center for Business and Human Rights have observed, moderation at scale marries automated systems with human judgment under time pressure — a recipe for inconsistent calls.
What Fair and Consistent Platform Rules Could Look Like
Experts and advocates frequently argue for precision rather than blanket bans. More refined definitions that distinguish consensual adult expression from sexual solicitation would reduce false positives. A fair appeals process, with human review and prompt feedback, could let creators correct mistakes without losing their livelihoods. Public statistics on appeals — such as reversal rates by category — would make enforcement accountable rather than opaque.
Independent audits of algorithmic recommendations would reveal whether some communities are unfairly downranked. The Oversight Board has already recommended that Meta be more transparent about how it treats borderline content in feeds and discovery. And if big brands keep their VIP-style escalation channels, a parallel route for small businesses and independent artists would at least narrow the gap, even if it can’t close it.
The SKIMS faux-bush stunt is smart marketing. But it also serves another purpose: a stress test for platform rules. If it’s safe enough for a celebrity-backed label to post a faux bush, the same should hold true for the adult creators who built audiences on the platform long before this trend blipped onto anyone’s radar. What stays up should depend on consistency, not clout.