
Facebook’s Content Moderation Errors Are Costing Africa Too Much - Slate

Photo: A protester standing on top of a bus stop waves a giant Nigerian flag that says “End SARS Now” during the #EndSARS protests on Oct. 15. Pius Utomi Ekpei/Getty Images

In 2016, Facebook censored one of the world’s most iconic images: the historic photograph of 9-year-old Kim Phúc fleeing naked during the Vietnam War. Facebook’s removal of the “Napalm Girl” photograph sparked global outrage, and the company eventually backed down and apologized for taking the image off its platform.

However, it seems Facebook did not learn its lesson from that event. Recently, Nigerians became the latest victims of Facebook’s overly aggressive content moderation practices.

On Oct. 4, a video surfaced online showing members of the Federal Special Anti-Robbery Squad—a tactical unit of the Nigeria Police Force also known as F-SARS or SARS—dragging two limp bodies out of a hotel and later shooting one of them. The video became a flashpoint, inspiring protests online and offline in a movement that became closely identified with the hashtag #EndSARS. On Twitter, a group of feminists rallied support and financial donations for the protesters. Despite obstacles thrown their way by the Nigerian government, they and others created a structure that supported one of the most successful and organic protests the country has ever seen. The Nigerian government responded with bullets, creating what is now known as the #LekkiMassacre.

On Oct. 20, peaceful protesters gathered at the Lekki Toll Gate in Lagos, as they had done in the preceding weeks. At 1 p.m., citing violence that had erupted in other parts of the city, the Lagos government issued a curfew order to begin at 4 p.m. that same day. It later backtracked and moved the curfew to 9 p.m., a change announced by 8:08 p.m. on its official Twitter handle. Before the protesters could comply with the order, rumors—soon confirmed—began to circulate on Twitter that government workers were removing cameras and switching off lights around the protest area. By 6:50 p.m., military personnel had opened live fire on the peaceful protesters, many of whom kept singing the Nigerian anthem. While the number of casualties is still uncertain, Amnesty International Nigeria stated that at least 12 protesters were killed and many others injured. An eyewitness report said at least 15 dead bodies were carted away by the Nigerian army after the shootings. A week after the shootings, neither the Lagos State Government nor the Federal Government of Nigeria has been able to confirm the number of casualties.

Even as the shooting was still going on, a historic and symbolic image emerged to rally people inside and outside Nigeria: a Nigerian flag, soaked in the blood of some of the protesters shot at by the army. Quickly, the bloodied flag and other alarming images found their way to Facebook and Instagram—which soon labeled them as false information.

The problem seemed to be that their automated systems were confusing the SARS of the hashtag with the initials for severe acute respiratory syndrome, the disease caused by a coronavirus closely related to the one behind COVID-19. In their attempts to block misinformation about the pandemic, the platforms had almost robbed Nigerians of a pivotal moment in their history—and lent a hand to the Nigerian government’s efforts to delegitimize the massacre. (For instance, the Nigerian army’s official Twitter account repeatedly tweeted that the shooting at the Lekki Toll Gate was “fake news”—it even stamped the phrase over images of headlines.)

Instagram later apologized, saying, “[O]ur systems were incorrectly flagging content in support of #EndSARS, and marking posts as false. We are deeply sorry for this. The issue has now been resolved, and we apologize for letting our community down in such a time of need.” Facebook, which owns Instagram, used nearly identical language in statements to outlets reporting on the censorship.

I reached out to Facebook to find out what had happened. Facebook’s head of communications for sub-Saharan Africa, Kezia Anim-Addo, confirmed that the labeling error resulted from an automated system. She said in an email:

In our efforts to address misinformation, once a post is marked false by a third-party fact-checker, we can use technology to “fan out” and find duplicates of that post so if someone sees an exact match of the debunked post, there will also be a warning label on it that it’s been marked as false.

In this situation, there was a post with a doctored image about the SARS virus that was debunked by a Third-Party Fact Checking partner.

The original false image was matched as debunked, and then our systems began fanning out to auto-match to other images.

A technical system error occurred where the doctored image was connected to another, different image, which then also incorrectly started to be matched as debunked. This created a chain of fan outs pulling in more images and continuing to match them as debunked.

This is why the system error accidentally matched some of the #EndSARS posts as misinformation.
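To make that failure mode concrete, here is a minimal, purely illustrative sketch of how a duplicate-matching “fan out” can cascade a single faulty match into many wrongly flagged posts. It is not Facebook’s actual system: the perceptual-hash matching, the `Post` structure, and all names are assumptions made for the example.

```python
# Illustrative sketch only -- not Facebook's real pipeline. It shows how a
# label that propagates to "near duplicates" of a debunked image can cascade
# once a single bad match connects an unrelated image to the debunked one.
from dataclasses import dataclass, field


@dataclass
class Post:
    post_id: str
    image_hash: int                     # stand-in for a perceptual hash of the image
    labels: set = field(default_factory=set)


def is_near_duplicate(hash_a: int, hash_b: int, max_distance: int = 4) -> bool:
    """Treat two images as duplicates if their hashes differ in only a few bits."""
    return bin(hash_a ^ hash_b).count("1") <= max_distance


def fan_out_debunk(seed: Post, all_posts: list[Post]) -> None:
    """Propagate a 'false information' label from a debunked post to every
    near-duplicate, then keep fanning out from each newly labeled post."""
    queue = [seed]
    seen = {seed.post_id}
    while queue:
        current = queue.pop()
        for other in all_posts:
            if other.post_id in seen:
                continue
            if is_near_duplicate(current.image_hash, other.image_hash):
                other.labels.add("false information")   # label propagates...
                seen.add(other.post_id)
                queue.append(other)                     # ...and fans out again


# A doctored "SARS virus" image is debunked by a fact-checker...
debunked = Post("doctored-sars-claim", image_hash=0b1010_1100_0011)
debunked.labels.add("false information")

# ...but its hash lands "too close" to an unrelated #EndSARS photo (the faulty
# match), so that photo and every repost of it inherit the label.
posts = [
    Post("endsars-flag-photo", image_hash=0b1010_1100_0111),
    Post("endsars-flag-repost", image_hash=0b1010_1100_0110),
    Post("unrelated-meme", image_hash=0b0101_0011_1000),
]
fan_out_debunk(debunked, posts)
print({p.post_id: sorted(p.labels) for p in posts})
# Both #EndSARS posts end up labeled "false information"; the unrelated meme does not.
```

The crucial detail in a sketch like this is that the label spreads transitively: one wrong “near duplicate” decision is enough to pull an entire family of unrelated reposts into the debunked set, which is consistent with the chain of fan-outs Anim-Addo describes.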

I asked her what Facebook might be doing to ensure that this does not happen again in an African country, but she did not directly answer the question.

It’s that lack of specific answers that is particularly troubling here. There are so many things we do not know about Facebook’s content moderation practices, especially in African countries. For example, how many content moderators are dedicated to the African region broadly and to Nigeria specifically? How do these moderators work with local fact-checkers, and what informs their actions? When are human judgments brought into automated decision-making systems? Beyond Nigeria, how inclusive and representative are moderators with respect to language, subregion, and so on? How impactful has the Facebook team in Nairobi been so far? Facebook now plans a local office in Lagos—what role will it and other regional offices play in content moderation policies? Beyond sales, partnerships, and communications, what specific policy roles will the Lagos office play?

As with the “Napalm Girl” image, Facebook’s labeling of the #EndSARS images has once again shown that lack of context will always be a challenge for its content moderation policies. Social media companies cannot continue to wish away their responsibility to protect human rights.

Going forward, Facebook needs to reassess its overreliance on automated flagging, especially in the African region, where its already-flawed systems are even more likely to miss important context. It may be useful for Facebook to design alert systems for African countries in which events that raise threat levels for the public are categorized and prioritized—had someone been watching the protests closely, they might have realized that the #EndSARS hashtag could be misread by systems intended to screen for coronavirus misinformation. To be more useful and effective with its content regulation policies, Facebook must also invest more in local African languages and a qualified workforce. This brings us back to the platform’s offices in Nairobi and Lagos: What role are they playing in content moderation? What power do the moderators have to flag things for higher-up staff?

The major obstacle to content moderation is not so much a lack of solutions as the continued complacency of platforms that, time after time, weaponize apologies in place of taking active steps to mitigate online harms, especially outside developed democracies. Facebook’s high-handedness and aloofness are particularly insulting because Nigeria is home to the platform’s largest user base in Africa. Facebook needs to stop seeing itself as an American company and truly embrace its role as a globally responsible business. If Nigeria, and by extension other African countries, are good enough to profit from, they are good enough for Facebook to take some responsibility for the power it holds there.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
