Meta's Oversight Board has overturned the company's decision to leave up a Facebook post that appeared to encourage suicide among transgender people.
The Board determined that the post violated Meta's Hate Speech and Suicide and Self-Injury Community Standards, noting that the problem in this case was not the policies themselves but the way they were enforced.
The Board advised Meta to improve the internal guidance it provides to reviewers assessing whether content violates its Community Standards, in order to close gaps in how the company enforces those policies.
In April 2023, a Facebook user in Poland posted an image of a striped curtain in the blue, pink, and white colors of the transgender flag.
Polish text on the image read, "New technology… Curtains that hang themselves."
Above the image, the caption read "spring cleaning" followed by a heart emoticon.
The user's profile bio states, "I am a transphobe," which critics of the post cited as evidence that it was intended to be hostile and deliberately hurtful to trans people.
Fewer than 50 people reacted to the post, but 11 different users reported it a total of 12 times. Meta's automated systems prioritized only two of the 12 reports for human review; the other ten were closed without further action.
The two reports that did receive human review, both alleging that the post violated Facebook's ban on content promoting suicide or self-harm, were judged not to be violations. Meanwhile, none of the reports alleging that the post was a form of hate speech were reviewed by Meta employees.
Three users then appealed Meta's decision to leave the Facebook post up. One of those appeals led a human reviewer to uphold the original determination that the post did not violate the Suicide and Self-Injury Community Standard. Appeals claiming violations of Meta's Hate Speech Community Standard, however, were never reviewed by a human.
The Oversight Board, an independent body that reviews and issues binding decisions on content moderation cases across Meta's platforms, including Facebook and Instagram, received an appeal from one of the users who had reported the original post.
After the Board selected the case, Meta determined that the post violated both its Hate Speech and Suicide and Self-Injury policies, removed it from Facebook, and disabled the account of the user who posted it, though not for the anti-trans post itself but for earlier, related Community Standards violations. The Oversight Board decided to hear the appeal of the original decision to leave the post in place.
The Board subsequently ruled that the post violated Meta's hate speech policy because it contained "violent speech" in the form of a call for an identifiable group to die by suicide. The post, which promoted suicide among trans people, "created an atmosphere of intimidation and exclusion, and could have contributed to physical harm."
In a statement explaining its decision, the Board noted that the nature of the text and image also aggravated the mental health crisis affecting transgender people. It further concluded that Poland's lawmakers and public figures have continued to attack the trans community and deploy hostile rhetoric against it.
"The Board urges Meta to improve the accuracy of its hate speech enforcement toward LGBTQIA+ people, especially when posts include images and text that require context to be understood," the Board wrote. In this case, "malign creativity" manifested in thinly coded references to suicide and the visual depiction of a protected group (the transgender flag). The term refers to bad actors devising novel ways to attack and harass the LGBTQIA+ community through posts and memes they claim are "humorous" or "satirical."
The Oversight Board expressed concern that Meta's human reviewers failed to recognize, or perhaps deliberately disregarded, contextual cues that should have led them to conclude the post violated Community Standards.
It also said that Meta's defense of the reviewers, namely that they adhered closely to the company's guidance, itself revealed shortcomings in that guidance. The Board further criticized Meta's automated review-prioritization systems for failing to flag the offending post.
The Board recommended that Meta clarify its Suicide and Self-Injury Community Standard to explicitly prohibit content that promotes or encourages suicide aimed at an identifiable group of people, rather than only at a single individual.
Meta was also advised to update its internal guidance to make clear to human reviewers that "flag-based visual depictions of gender identity that do not contain a human figure are understood as representations of the group defined by its members."
GLAAD, which in its submission to the Board before the ruling had urged Meta to address the shortcomings in how the company handles alleged Community Standards violations, praised the Oversight Board's decision.
"I personally want to hear Meta CEO Mark Zuckerberg tell the world right now that his company cares about the safety, rights, and dignity of transgender people," said Sarah Kate Ellis, president and CEO of GLAAD. "This dangerous hate on his platforms is wreaking havoc in the real world, and it must stop."
Ellis noted that in July 2023, the Human Rights Campaign, GLAAD, and more than 250 LGBTQ celebrities and allies signed an open letter urging all major social media platforms, including Meta, to develop a strategy for handling content that expresses anti-transgender hatred or animus.
None of the social media companies responded to the letter.
"This new Oversight Board decision presents a critically important opportunity for Meta," Ellis said. "The company must address this horrific and pervasive phenomenon of violent anti-trans hate content. Meta can strongly and swiftly denounce the spread of such abhorrent bigotry as inconsistent with its corporate values. The weaponization of lies targeting historically marginalized groups has a long and sad history."