Content creators have long criticized Facebook and Instagram for their content moderation policies relating to photos that show partial nudity, arguing that their practices are inconsistent and often biased against women and L.G.B.T.Q. people.
This week, the oversight board for Meta, the platform’s parent company, strongly recommended that it clarify its guidelines on such photos after Instagram took down two posts depicting nonbinary and transgender people with bare chests.
The posts were quickly reinstated after the couple appealed, and Meta’s oversight board overturned the original decision to remove them. It was the board’s first case directly involving gender-nonconforming users.
“The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people,” Meta’s oversight board said in its case summary on Tuesday. “The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice.”
The issue arose when a transgender and nonbinary couple posted photos in 2021 and 2022 of their bare chests with their nipples covered. Captions included details about a fund-raiser for one member of the couple to have top surgery, a gender-affirming procedure to flatten a person’s chest. Instagram removed the photos after other users reported them, saying their depiction of breasts violated the site’s Sexual Solicitation Community Standard. The couple appealed the decision and the photos were subsequently reinstated.
The couple’s back-and-forth with Instagram underscored criticism that the platform’s guidelines for adult content are unclear. According to its community guidelines, Instagram bars nude photos but makes exceptions for a range of content, including mental health awareness posts, depictions of breastfeeding and other “health related situations” — parameters that Meta’s board described as “convoluted and poorly defined” in its summary.
How to decide what depictions of people’s chests should be allowed on social media platforms has long been a source of debate. Scores of artists and activists contend that there is a double standard under which posts of women’s chests are more likely to be deleted than those of men. Such is also the case for transgender and nonbinary people, advocates say.
Meta’s oversight board, a body of 22 academics, journalists and human rights advocates, is funded by Meta but operates independently of the company and makes binding decisions for it. The group recommended that the platforms further clarify the Adult Nudity and Sexual Activity Community Standard, “so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender.”
It also called on Meta to conduct “a comprehensive human rights impact assessment on such a change, engaging diverse stakeholders, and create a plan to address any harms identified.”
Meta has 60 days to review the oversight board’s summary, and a company spokesman said it would publicly respond to each of the board’s recommendations by mid-March.