Facebook Wanted to Be a Force for Good in Myanmar. Now It Is Rejecting a Request to Help With a Genocide Investigation

The West African nation of The Gambia is seeking to hold Myanmar accountable for charges of genocide against the Rohingya people, an ethnic and religious minority. In 2016 and 2017, Myanmar soldiers and their civilian proxies massacred Rohingya men, women, and children; raped women and girls; and razed villages, forcing more than 800,000 people to flee into neighboring Bangladesh.
Facebook’s role in these atrocities isn’t news. In 2018, Facebook acknowledged it was used to “foment division and incite offline violence” in Myanmar, where the social media platform is so ubiquitous it’s often synonymous with the internet. An independent report commissioned by the company documented the same, as did independent fact-finders appointed by the U.N.

In response, Facebook took down the account of the commander-in-chief of the Myanmar military, Senior General Min Aung Hlaing, and other military officials and organizations. In 2018 alone it shut down numerous networks that sought to incite violence against Rohingya, removing 484 pages, 157 accounts, and 17 groups for “coordinated inauthentic behavior.”
To its credit, Facebook preserved the data and content it took down, and the company committed to cleaning up its act. “We know we need to do more to ensure we are a force for good in Myanmar,” a company representative said in an official statement in 2018.
Now, two years later, the company is doing exactly the opposite.
In June, The Gambia filed an application in U.S. federal court seeking information from Facebook that would help it hold Myanmar accountable at the International Court of Justice (ICJ). Specifically, The Gambia is seeking documents and communications from Myanmar military officials as well as information from hundreds of other pages and accounts that Facebook took down and preserved. The Gambia is also seeking documents related to Facebook’s internal investigations into the matter as well as a deposition of a relevant Facebook executive. All of this information could help to prove Myanmar’s genocidal intent.
Back in May, The Gambia filed a similar application in U.S. court against Twitter. The case disappeared quickly because The Gambia pulled its application shortly after submitting it, presumably because Twitter agreed to cooperate.
Not Facebook. Earlier this month, the company filed its opposition to The Gambia’s application. Facebook said the request is “extraordinarily broad,” as well as “unduly intrusive or burdensome.” Calling on the U.S. District Court for the District of Columbia to reject the application, the social media giant says The Gambia fails to “identify accounts with sufficient specificity.”
The Gambia was actually quite specific, going so far as to name 17 officials, two military units and dozens of pages and accounts.
Facebook also takes issue with the fact that The Gambia is seeking information dating back to 2012, evidently failing to recognize that two similar waves of atrocities were committed against the Rohingya that year, and that genocidal intent isn't spontaneous but builds over time.
The management and analysis of data is not new to Facebook. It’s a core competency. The company analyzes billions of data points on its users every day, monetizing information collected from personal posts, chats and photos. That is Facebook’s business model. Now the company claims it’s unreasonable or “burdensome” for it to share data on a specific number of accounts that it already removed and preserved. The argument is not only unconvincing; it’s morally appalling.
But the crux of Facebook’s argument is the claim that The Gambia is asking the court to issue a subpoena in violation of U.S. law, specifically a section of the Stored Communications Act (SCA). This important federal law prevents social media companies from releasing communications and data to third parties on a whim.
Here’s the thing: the law is intended to protect the privacy of individuals, not shield unlawful actions of State actors. Under the SCA, there are exceptions to the privacy it protects, including when the government has a valid subpoena. And Facebook can divulge communications to protect its own rights, which would include its right to enforce its terms of service and deter unlawful behavior on the platform.
In addition, if information shared on social media was intended for public viewing, the SCA does not require a company to keep that information private. Many of the pages and accounts that Facebook took down in relation to the Rohingya genocide in Myanmar were public.
Facebook might say it’s concerned about setting a dangerous precedent, but sharing evidence of genocidal intent through a U.S. federal court would seem to be precisely the “precedent” the company should want to set, i.e., deterring State actors from using its platform for criminal purposes. Not to mention that voluntarily complying with The Gambia’s request wouldn’t create any legal precedent, only an internal one at the company.
The Gambia is expected to respond to Facebook’s opposition in court on Aug. 18. Regardless of how this request unfolds, U.S. law should never be used to protect perpetrators of genocide. If Facebook won’t do the right thing now, then Congress should make it.
U.S. legislators should pass a swift amendment to the SCA that would require social media companies to share relevant information with official bodies and litigants at international tribunals attempting to hold perpetrators of genocide and mass atrocity crimes accountable. The law should impose civil liability for any failure to divulge such information when properly requested.
Beyond that, for the time being, taking a stand against genocide could be as easy as deleting your Facebook account.
Source: Time
