
Instagram Might Finally Be About To Free The Nipple. What Took So Long?

Maxim Shevchenko/Pexels

“It’s about much more than ‘letting women be topless’. It’s about the idea that a topless woman can only ever be two things: sexual, or not sexual enough, and therefore distasteful.”

If I were to post a topless photo on Instagram right now (which, who knows—I might!), here are a few things that would happen. The platform’s artificial intelligence moderators would first detect the presence of nipples. If those nipples were attached to what was deemed a “female breast”, then the photo would swiftly be taken down for violating their community guidelines. And how would they decide what constitutes a “female breast”? Well, put plainly, it depends on how much fatty tissue is behind the nipples (yes, really). And if the breast belongs to a trans man or non-binary person who hasn’t had top surgery? The AI censors will likely mistake the breast for “female”, unless the user tirelessly points out otherwise (as discovered by gender nonconforming model Rain Dove in 2018). 

If all of the above sounds strange and nonsensical, that’s because it is. Which is something that Meta—the company that owns Instagram and Facebook—finally appears to be cottoning on to. Earlier this month, Meta’s oversight board—a group of academics, journalists and politicians who advise the company on its content moderation—recommended that Meta change its adult nudity guidelines “so that it is governed by clear criteria that respect international human rights standards”.

“This policy is based on a binary view of gender and a distinction between male and female bodies,” they wrote, referring to the platform’s nudity ban. “Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”

This is a positive sign from Meta, who appear to be recognizing the murky ethics behind policing a person’s body based on gender markers—or at least leaning that way. But it also feels almost laughably overdue. Teenagers today will likely not remember what a contentious and widespread talking point this once was, with everyone from Willow Smith to Rihanna to Cara Delevingne sharing Free the Nipple hashtags and voicing their outrage about misogynistic double standards online. The global movement—kickstarted by artists and activists—is more than a decade old now. That’s 10 years of pointing out the same thing, over and over again, only for it to fall repeatedly on deaf ears.

The original Free the Nipple movement, which first went viral in 2015, wasn’t without its drawbacks, of course. While the sentiment was good (why should a “female breast” be perceived as inherently sexual, when guys are free to go topless?), the movement quickly became synonymous with white, cisgender, able-bodied skinny girls with perky tits and not much else. The whole thing began to feel vaguely annoying—a bit “Tumblr feminism”. In what way, detractors asked, is a conventionally attractive woman baring her breasts going to genuinely shift any paradigm? Gina Tonic put it best for Bustle in 2015: “Feminism without intersectionality is pointless; only representing nipples that adhere to patriarchal standards of beauty is pointless.”

That said, when thinking about nipple censorship on social media today, it’s worth getting to the bare bones of the matter, which is this: there is simply no way of knowing a person’s gender based on what their body looks like—full stop. And, even if that were possible, or even remotely ethical to police (it’s not), there is no reason that a woman’s breast ought to be inherently offensive, when a man’s chest isn’t. When we consider those two things in tandem, Instagram’s adult nudity policy begins to look not only misogynistic and transphobic, but also genuinely absurd. Take a trans woman, for example, who hasn’t had breast augmentation. Is she expected to blur her nipples out the moment she legally gains gender recognition? Or does the censorship only apply post-surgery, once deemed “woman enough” by the AI censors? 

Meta isn’t the only company backtracking on outdated censorship rules. In November last year, Tumblr announced that it would be allowing nudity again, including freeing the nipple. Though the platform had a thriving NSFW community in its mid-2010s heyday, its 2018 ban on adult content caused a mass exodus, leading to a 30 per cent drop in monthly page views. It didn’t help that users widely reported that Tumblr’s “Safe Mode” filter was not only censoring adult content, but also flagging non-explicit posts and LGBTQIA+ artwork that contained nothing sexual at all (because guess what? AI censorship doesn’t work). Sure, they’ve since backtracked, but for many, it was too little, too late. And though Tumblr may be on the brink of a comeback, it’s hardly the LGBTQIA+ mecca that it once was, or had the potential to be.

It’s worth pointing out here that many of these “adult nudity” bans don’t exist in a vacuum. They’re a reflection of real-life norms in the Western world. If I were to go outside topless, there’s a high chance I’d be arrested. If a guy were to leave the house with his top off, nobody would bat an eyelid. These are centuries-old social conventions that position men as observers, and women as the observed, with no room for gender variation. It’s about much more than “letting women be topless”. It’s about the idea that a topless woman can only ever be two things: sexual, or not sexual enough, and therefore distasteful. Whereas a topless man can just exist. And it will take more than the overturning of a nipple ban to overhaul that view, and the whole binary gender-based system that upholds it.

But that doesn’t mean it’s not a start. And though Meta has yet to actually implement these changes in a real, tangible way—they have 60 days to respond publicly to the board’s recommendations—it’s encouraging to see conversations about bodily autonomy, and the ways automated systems can harm marginalized bodies, playing out so prominently in public.

Still, whether Meta can uphold a porn ban without a nudity ban in ways that are fair and don’t cause harm remains to be seen. We’ve already witnessed how artificial intelligence can make mistakes or uphold biases, often to the detriment of marginalized groups. These mistakes arguably can’t be rectified by simply “fixing” the technology (how?) or tweaking the criteria (in which ways?). Meta now has a chance to properly dismantle a system that was flawed from the outset. Only time will tell whether they actually choose to take it.
