
censHERship Part III: Design as Defiance


How women founders are smuggling truth past censors using aesthetics, humor, and beauty as weapons.

The first time one of our ads was flagged for “adult content,” I was confused.

The second, third, fourth… I got angry. (Cue Hole.) But anger alone never moves a system built on automated shame. Strategy does.

And women’s health founders don’t get the luxury of shutting up unless we want to shutter our doors. We get the responsibility of outsmarting, redesigning, and rerouting, in real time, around a wall these systems create and can’t admit exists.

So here we are speaking in shape and tone. Weaponizing the aesthetic.

Algorithms can read language, but not subtext or metaphor. Which, for now, is our only loophole.

And sometimes platforms can’t even handle the conversation around censorship itself. When BeautyMatter published an article on femtech censorship, LinkedIn deleted every post promoting it without warning; no message, no strike, just vanished. Only after escalation did they call it “a mistake.”

Consider the New York subway. A brand selling erectile dysfunction medication can run ads featuring a literal cactus shaped like a penis—explicit enough to teach a child anatomy before kindergarten. A cactus dick, proudly displayed at 8:12 a.m. next to someone eating a sesame bagel. The MTA calls it “clever,” “destigmatizing,” “bold.”

Meanwhile, when Thinx used a grapefruit to symbolize a vulva, an actual piece of fruit, the transit authority rejected it for being “too explicit.” A grapefruit is offensive; a cactus penis is wholesome commuter fun.

When Thinx pushed back, an Outfront Media rep told them, “This is not a women’s issue. Don’t try to make it a women’s rights thing.”

The message couldn’t be clearer: Men’s bodies are normal. Women’s bodies are suggestive.

Design is our battlefield because language is held hostage. Algorithmic censorship is excellent at scanning text and terrible at understanding meaning, emotion, or intention. So, women founders are forced to communicate differently than their male peers.

At Good Kitty Co, we let material do the heavy lifting. A hammered metal canister designed to live on your counter, not hidden in a cabinet like contraband. A pill pendant for date nights that functions as both jewelry and armor. Everything we make carries multiple meanings: clinically effective, quietly sensual, and built on the belief that women deserve beauty, agency, and pleasure baked into their wellness arsenal. Our packaging is the subtext; what it contains is the truth. Our name winks and our product pages teach.

The algorithm sees “beauty.” Women finally feel seen.

Our branding is survival architecture and a reminder that women deserve rituals that are celebrated. I’m done with all this shame.

“The Menstrual Revolution” report states that 100% of the 60 women’s health brands surveyed by the Center for Intimacy Justice experienced ad rejection on Instagram and Facebook.

These weren’t fringe products. They included breast pumps and products for pelvic pain, endometriosis support, postpartum care, bladder control, and sexual consent education. Even OB/GYN terminology was flagged. “Vagina,” according to the same report, is one of the most frequently flagged terms on Meta, even in clinical contexts. Half of the companies lost access to their ad dashboards entirely.

Men’s health products, meanwhile, sailed through, with ads for condoms, ED pills, and premature ejaculation treatments approved as “family planning.”

When men seek pleasure, it’s medicine. When women seek health, it’s explicit.

Meta advised us to make the website for our doctor-formulated supplement, manufactured in an FDA-registered facility and backed by published clinical research, “less sciencey” so the algorithm might reclassify us as beauty instead of health. We are currently limited to three hashtags per post. Another rep suggested we “start a new account and never mention UTIs” if we wanted growth.

Imagine telling a cardiologist: “Stop mentioning the heart.” Apparently, accuracy is the threat.

But here’s the twist: Meta has no problem with women’s health data when they can profit from it. In 2024, a jury found Meta liable for violating California privacy law by harvesting confidential data from Flo, the popular women’s health app, to sell advertising. They’ll censor our ability to educate women about their bodies, but they’ll absolutely exploit our health data for ad revenue.

Women’s health information is only valuable to Meta when they can monetize it without our consent.

We’re considering a two-account survival strategy because the system makes it compulsory. Instagram, Facebook, TikTok, any platform built on simplistic AI moderation, treats women’s anatomy like porn. Preventative healthcare becomes “adult.” Anything educational becomes “suggestive.”

Interestingly, Maude positions itself as a gender-neutral sexual-wellness brand — and unlike many female-focused companies, its founder has said they’ve “been able to advertise… unlike a lot of brands in the space.”

It suggests something uncomfortable but obvious: when sexual-wellness messaging is framed broadly, rather than exclusively for women, the moderation double-standard seems to ease.

Which leads to the only conclusion that makes sense in this algorithmic funhouse: women’s pleasure is acceptable when men are present. Women seeking pleasure alone is deemed porn.

The imbalance is financial, too. Kegg founder Kristina Cahojova calls it a chicken-and-egg loop: men’s brands get more funding because they can advertise freely; they can advertise because they’re not censored; and they avoid censorship because their messaging fits the norms. Women’s health is penalized at every step of that loop.

So women’s health brands are forced to operate like underground newspapers with burner pages, coded language, and hidden education. One account to sell. One account to tell the truth.

Exhausting. Fragmented. Absurd. Mandatory.

We have no idea if it will be effective.
But here we are, forced to try.

And we’re not alone. Dame Products created vibrators shaped like sculpture and had its ads banned. The company sued the MTA in 2019, pointing out that the same transit system was running ED ads featuring crotch shots while banning Dame’s small, pastel tagline: “Toys, for sex.” Thinx fought for years to get fruit approved. Awkward Essentials swapped vulvas for pastries and magically passed review; founder Frances Tang even made an Instagram Reel sharing these algorithmic workarounds with others.

Kegg was forced to hide its own product behind a box image because Amazon deemed the pelvic-health device “too sexual,” despite its being designed for cervical fluid tracking. Cahojova estimates she could quadruple revenue if she could simply show what she actually sells. Elvie disguised a breast pump as consumer tech so breastfeeding could exist without modesty warnings. Modibodi shot leakproof underwear like lingerie so the algorithm wouldn’t panic at blood, even implied.

And the absurdity is this: half the population are women, and we bleed every month as part of basic biology. Would I like to know about underwear that doesn’t leak? Or tampons that don’t cause cancer considering I’ve used over 15,000 of them in my lifetime? Yes. Obviously. But apparently, women’s reality is too explicit for the platforms where we should be able to learn about our own bodies.

A 2023 Super Bowl ad for menopause reached 113 million people, giving the false impression that women’s health advertising is commonplace. It’s not. That $7 million moment was an outlier, not progress.

We’re designing around patriarchal blindspots coded into the machine. We’re making things pretty to be visible.

Researchers say this problem is not accidental. With 91.7% of software developers being male, algorithms inherit the blindspots of their makers. As professor Angelica Gianchandani puts it: “When women are underrepresented in tech leadership, the systems they build inevitably replicate narrow portrayals and outdated taboos.”

Kristina Cahojova rallied a coalition of women’s-health founders to push back against the absurdity of being labeled “adult content” for talking about bodies half the population has. That early petition was just the beginning. By 2025, that momentum grew into something bigger: more than 190 women’s health leaders from Clue, Essity (Bodyform), Love Honey, Evofem Biosciences, WUKA, Daye, HANX, Bea Fertility and more have co-signed an open letter demanding social platforms end the censorship of women’s health content online.

Published by censHERship and The Case For Her, the letter exposes the biased moderation policies every founder in this space has experienced: posts about menstruation, menopause, libido, postpartum recovery, and fertility are flagged, restricted, or quietly buried under “safety” policies that somehow apply only to women’s health. The content isn’t explicit; it simply dares to name the basic realities of the female body.

Despite Meta announcing updated policies in late 2022 allowing ads for “sexual health and wellness,” the Center for Intimacy Justice filed an FTC complaint in 2023 alleging that Meta still systematically treats women’s sexual health products as adult content. Senators Mazie Hirono, Elizabeth Warren, and Amy Klobuchar asked the FTC to investigate whether Meta violated federal trade practice laws.

BeautyMatter confirmed what founders whisper privately: usually the better-funded brands are placed on informal “whitelists,” granting permissions the rest of us don’t get. When the rules are unevenly enforced, privilege becomes a feature, not a glitch.

And this isn’t new. Artist Betty Tompkins has been censored since the 1970s, when customs seized her paintings for being “pornographic.” Instagram deleted her account in 2019—three days before major gallery openings. She mobilized a thousand people; Instagram restored it. Her current exhibition is titled Will She Ever Shut Up? The answer, of course, is: absolutely not.

When Bodyform ran a campaign normalizing words like vagina, clitoris, and vulva, their ads were flagged as “too sexual,” slapped with 18+ warnings. Hundreds of protesters gathered at Meta’s London office with signs reading “Let us mind our lady business” and “Get ovary it, Meta.” Within ten minutes, security shutters slammed down. A perfect, accidental metaphor: women speak, and the system shutters.

Bodyform’s research found that roughly 40 words related to women’s bodies, including menopause, miscarriage, vulva, and discharge, are quietly suppressed. Ninety-one percent of perimenopausal and menopausal women report they aren’t seeing marketing around menopause, even though the products exist. They’re just not being shown to the people who need them.

Humor is how we metabolize shame. Restriction becomes performance art. It’s why Jon Stewart can say “vagina” seventeen times in a segment without being censored: comedy goes where truth can’t.

Laughter survives as witness to our outrage.

But the cost is real. I’m tired of euphemisms like “down there,” “intimate zone,” “seggs,” “v@gina.” Tired of Puritan hangovers dictating what words women are allowed to use about their own bodies. This censorship costs women everything: information, community, early diagnosis, confidence, solidarity. It deepens shame, reinforces silence, and tells women again and again that our pain is private and our bodies are inappropriate.

This is a war on women’s health disguised as “community appropriateness.”

And let’s be honest about the toll on founders: 2-5x the design work, triple the A/B testing, every caption a gamble, every account a risk. We are forced to be more creative, more strategic, more coded, and more careful than any men’s-health founder will ever need to be. Yes, censorship can spark brilliant design solutions, but it’s outrageous that survival requires this much ingenuity.

So design is now doing the work language is banned from doing. Anatomy becomes fruit. Education becomes gesture. Truth gets smuggled through aesthetic. We’re building Trojan Cats: lifestyle accessories that carry the health literacy algorithms refuse to let us say out loud.

Algorithms can’t flag a mood or ban a metaphor. Not yet. But they are learning, which means we must stay nimble, collaborative, and feisty. Women have always built new languages when denied access to the old ones. Victorian women hid anatomy textbooks inside embroidery manuals. We hide medical education inside gold canisters, product photography, and burner accounts.

Same rebellion in new packaging.

They want ladylike manners for our lady bits. Politeness. Good little girls. But dammit, I’m done being polite.

I want to be precise, shameless, literal, and scientifically accurate. I like my vagina AND my pleasure. I like knowing my body and the tools that keep it in peak performance. I resent needing a metaphor to make any of this “acceptable.” But until someone rebuilds the system, design is the language we use in the meantime.

_____________________________

This is Part III of censHERship. Next Monday: Part IV — The Comedy Crackdown: what happens when even jokes about women’s bodies start getting flagged, and why humor may be the most effective weapon we have left.
