Over 10 years ago, Kim Kardashian “broke the internet” with a fully nude Paper Magazine cover photo, shared with her millions of Instagram followers. Many of us civilians, though, would live in fear were we to share something so risqué: fear of the judgment of our peers, perhaps, but also of Meta’s strict content moderation. While celebrities certainly live under different rules than the rest of us in many regards, censorship isn’t quite the simple binary of “they can and we can’t.” Some of us really can’t. BIPOC folks, fat folks, disabled folks, and queer and trans folks are policed on social media in ways that people with relative privilege often are not. You might be surprised to learn there is a name for Zuckerberg and co.’s unfair application of the terms and conditions: algorithmic bias.
The “algorithm” part of that phrase refers not only to the algorithms behind your “explore” or “for you” pages, but also the algorithms that flag content that might be inappropriate for review. That’s right. If you have ever had a post removed, or had your entire account disabled, a machine examined your content before a person ever did — if a person ever did.
Terms and Conditions
This bias is made possible by the vague rules written into platform policies. Think back to the first time you created an account: The terms and conditions popped up, and your first instinct was likely to scroll to the bottom and press “accept” without reading a word. As conversations about selling data and privacy have become more mainstream, it is clear that long terms and conditions pages, written in confusing legalese, are intentionally overwhelming, wearing users down into complacency. Conversations about data usage and rights have made another thing clear: Even if you take the time to read the agreement, there is no real recourse if you disagree with it. Your options are to accept or not use the platform.
Buried in those terms are community guidelines. The ones that most affect queer accounts are called “Adult Nudity and Sexual Activity” on Meta and “Sensitive and Mature Themes” on TikTok, the latter especially vague. The ambiguity does not stop there. Dig deeper, and you’ll find policies prohibiting things like “imagery of sexualized body parts,” “implicit sexual activity,” “underwear that does not cover the majority of the buttocks,” or “significant body exposure… such as extremely cropped shirts.”
But what does that really mean? Who decides which body parts are “sexualized”? How cropped is “too cropped”? And how exactly does one measure if 49% or 51% of their buttocks is covered in an underwear selfie? This is subjectivity parading as neutrality, and it leaves marginalized creators more vulnerable to censorship.
Sex-negative policies are the result of a decades-long legal battle over whether the duty to protect free speech outweighs the duty to protect minors, and whether the duty to protect minors should fall on platforms rather than parents. From the 1997 decision in Reno v. American Civil Liberties Union, which struck down the anti-indecency provisions of the Communications Decency Act, to the 2018 passage of the Fight Online Sex Trafficking Act (FOSTA) and Stop Enabling Sex Traffickers Act (SESTA), the courts have flipped time and time again. The battle is far from settled. As it stands, platforms can still be held legally liable for evidence of sex trafficking on their sites, incentivizing the overly broad censorship policies that now run rampant.
But sex-related limitations are not the only unequally applied rules. Queer creators speaking out against homophobic politicians, for example, should watch for “bullying and harassment” claims. Even posts containing self-referential slurs, which members of the LGBTQIA+ community may seek to reclaim, can be censored under those same anti-bullying policies. Just last February, Meta quietly made limiting political content the default across feeds, requiring users to manually opt out of the restriction if they wanted more exposure. “Political,” though, is also hard to define objectively, and Meta has yet to provide a definition. Queer lives are inherently political, meaning that speaking openly about queerness carries risk under this vague and broad new rule. And because the restriction applies to accounts rather than individual posts, queer creators risk being shadowbanned even when they carefully limit what they share. The terms and conditions may be vague, but the algorithms that decide who gets restricted and who gets boosted remain closely guarded company secrets.
In one sweeping announcement, Meta’s decision to limit political content on users’ feeds came alongside the elimination of DEI initiatives within the company and the rollback of Meta’s Hateful Conduct Policy. While in force, that policy protected against anti-LGBTQIA+ hate speech by name. Its repeal adds yet another layer of vulnerability for queer people navigating online spaces.
Queer Censorship in Real Time
This issue isn’t just theoretical; it’s impacting users every day. Some have reported noticing content filters on posts under hashtags like #lesbian and #bi. In 2024, the organization Men Having Babies, which supports MLM couples pursuing surrogacy, saw a post flagged with a warning label simply for sharing a photo of a queer family. Facebook has also been known to remove trans users’ accounts for violating its “real name” policy. And because trans content often discusses aspects of medical transition that involve genitalia, it is frequently treated as fair game for content moderation.
In 2019, the editorial platform Salty Mag distributed a survey to their Instagram followers and newsletter subscribers. After hearing through comments and emails that many members of their (largely queer) community were experiencing censorship, they hypothesized that people living with multiple marginalized identities were being disproportionately affected. Since a funded study was out of reach, they set out to start the conversation the best way they could: through grassroots data collection and a publicly accessible report. The results of Salty’s survey featured testimonials, like one from a trans user who was “shadow-banned and taken off for a few days several times…because of female nipples (which is transphobic) … critical political posts, and feminist and queer art.”
Salty themselves had submitted a proposal for paid Instagram advertisements featuring fully clothed queer folks, only to be rejected for “promoting escorting services.” This could be read as specifically transphobic, given how often transfeminine people are profiled as sex workers. It also highlights another component of censorship: While the shadowbanning and removal of organic content from everyday users and influencers may happen quietly, queer-inclusive businesses trying to advertise may face blatant rejection. Queer comedian Matt Marr, for example, was denied the ability to advertise his show CabarGAY because the content was deemed “too political.” Consumer goods conglomerate Procter & Gamble was kept from running Facebook ads expressing support for gay rights for the same reason. For smaller businesses, though, which often have more cause to be values-driven, being unable to market threatens people’s livelihoods IRL.
And these are just the stories we know. There is privilege in being resourced enough to go to the press and share an unfair experience, and another privilege in even recognizing an experience as unjust when the mainstream does not frame it that way, especially when the inner workings of algorithmic bias are kept secret. This isn’t only an issue of not being seen or not getting attention from those in your network. For even the most casual of social media users, creating and sharing content is a form of self-expression. By repressing queer voices, platforms uphold a long legacy of suppressing queer expression for the comfort of the privileged. We owe it to our queer community to pay attention. Because this censorship happens under the radar, these massive platforms are getting away with it and continuing to find new ways to expand censorship’s reach. We need to demand transparency and accountability, and that begins with understanding the issue as fully as we can.