ALBAWABA - Following a year-long assessment, Meta’s Oversight Board, which is funded by Meta but operates independently, has called on the company to lift its ban on the commonly used Arabic word “shaheed,” which translates to “martyr.” The board determined that the social media giant’s approach was "overbroad" and had needlessly restricted the expression of millions of users, as reported by Reuters.
Before the board's advisory statement was made public, Human Rights Watch had concluded that Meta was responsible for "systemic censorship of Palestine content" during the current Israeli war on Gaza, as reported by Middle East Eye. The group blamed "flawed Meta policies and their inconsistent and erroneous implementation, over-reliance on automated tools to moderate content, and undue government influence over content removals."
The board found that Meta's restrictive approach has had a "discriminatory and disproportionate" effect on the sharing of information, outweighing the company's concerns that the term could be used to support violence.
The board cited several examples of posts Meta removed: a government press release confirming someone's death, a human rights advocate using the term "shaheed" to condemn an execution, and even a user complaining about the condition of a local road named after a deceased person with the honorific "shaheed."
The decision follows years of criticism of the company's approach to Middle Eastern content, which has intensified since the start of the current Israeli aggression on Gaza. One such study, which Meta itself commissioned in 2021, found that its approach had an "adverse human rights impact" on Arabic-speaking users of its platforms, as well as on Palestinians.
Rights groups have accused Meta of censoring pro-Palestinian content on Facebook and Instagram in the context of a conflict that has killed over 32,414 people in Gaza thus far. “Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all,” said Helle Thorning-Schmidt, Oversight Board co-chair.