Facebook’s self-regulatory ‘Oversight Board’ (FOB) has delivered its first batch of rulings on contested content moderation decisions, almost two months after picking its first cases.
A long time in the making, the FOB is part of Facebook’s crisis PR push to distance its business from the impact of controversial content moderation decisions — by creating a review body to handle a tiny fraction of the complaints its content takedowns attract. It started accepting submissions for review in October 2020 — and has faced criticism for being slow to get off the ground.

Announcing the first decisions today, the FOB reveals it has chosen to uphold just one of the content moderation decisions made earlier by Facebook, overturning four of the tech giant’s decisions.

Decisions on the cases were made by five-member panels that contained at least one member from the region in question and a mix of genders, per the FOB. A majority of the full Board then had to review each panel’s findings and approve the decision before it could be issued.

The sole case where the Board has upheld Facebook’s decision to remove content is case 2020-003-FB-UA — in which Facebook had removed a post under its Community Standard on Hate Speech that used the Russian word “тазики” (“taziks”) to describe Azerbaijanis, who the user claimed have no history compared to Armenians.

In the four other cases the Board has overturned Facebook takedowns, rejecting earlier assessments made by the tech giant in relation to its policies on hate speech, adult nudity, dangerous individuals/organizations, and violence and incitement. (You can read an outline of these cases on the Board’s website.)

Each decision relates to a specific piece of content, but the Board has also issued nine policy recommendations. These include suggestions that Facebook [emphasis ours]:
- Create a new Community Standard on health misinformation, consolidating and clarifying the existing rules in one place. This should define key terms such as “misinformation.”
- Adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook’s threshold of imminent physical harm.
- Increase transparency around how it moderates health misinformation, including publishing a transparency report on how the Community Standards have been enforced during the COVID-19 pandemic. This recommendation draws upon the public comments the Board received.
- Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing. (The Board made two identical policy recommendations on this front related to the cases it considered, also noting in relation to the second hate speech case that “Facebook’s lack of transparency left its decision open to the mistaken belief that the company removed the content because the user expressed a view it disagreed with”.)
- Explain and provide examples of the application of key terms from the Dangerous Individuals and Organizations policy, including the meanings of “praise,” “support” and “representation.” The Community Standard should also better advise users on how to make their intent clear when discussing dangerous individuals or organizations.
- Provide a public list of the organizations and individuals designated as ‘dangerous’ under the Dangerous Individuals and Organizations Community Standard or, at the very least, a list of examples.
- Inform users when automated enforcement is used to moderate their content, ensure that users can appeal automated decisions to a human being in certain cases, and improve automated detection of images with text-overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.
- Revise Instagram’s Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between Instagram’s Community Guidelines and Facebook’s Community Standards, the latter take precedence.