OPINION | The Rohingya Genocide: Meta Must Be Held Accountable
By: Edwin Tang
Seemingly bowing to Mr. Trump and the wishes of right-wing extremists, Mark Zuckerberg eliminated fact-checkers on Facebook and Instagram just over two weeks ago, citing “free speech” and “political censorship.”
Meta’s newly permissive policy, which loosens enforcement against misinformation and hate speech, isn’t just damaging our society by fueling racism against minorities and undermining trust in democratic institutions. It also plays a role in violence abroad.
This isn’t something new. For years, Meta, through Facebook, has failed to monitor its platform for hate speech and incitement of violence. Fueled by engagement-driven algorithms designed to keep people in echo chambers, hate speech and nationalistic rhetoric spread rampantly as Facebook and other social media platforms connect vast populations. Though its role is rarely mentioned in American mass media, Facebook was a complicit actor in the Rohingya genocide, helping fuel an ethnic cleansing campaign while remaining shielded from punishment and litigation by its corporate status.
A stateless ethnic Muslim minority in Myanmar, the Rohingya people suffer without citizenship or rights, turned away as refugees by almost every country. Discrimination and restrictions on civil liberties imposed by the government of Myanmar ensure that the Rohingya are treated as lesser than the rest of Burmese society, always a lower class. The Myanmar military junta has systematically committed rape, murder, arson, and other crimes against humanity with the intent of eliminating the Rohingya people, displacing around 1 million of them. Destroyed villages, torture, accounts of sexual violence against women and children, and other atrocities are clear evidence of the abhorrent situation in Myanmar. “Kill all you see, whether children or adults,” military officers ordered soldiers.
Disgustingly, Facebook and Meta are complicit in the Rohingya genocide. Meta’s own internal research, dating back to 2012, confirmed that the company knew its algorithms could push societies toward extremism. Internal documents included statements such as “our recommendation systems grow the problem” and “hate speech, divisive political content, and misinformation on Facebook… are affecting societies around the world.” Advocacy groups and activists also warned Meta that Facebook’s algorithms were exacerbating ethnic tensions in Myanmar, yet the company ignored these concerns, prioritizing profit over safety while the echo chambers its algorithms produced intensified racial violence and attacks. Global Witness found in 2022 that “Facebook’s ability to detect Burmese language hate speech remains abysmally poor.” The reason is simple: the company hasn’t tried, even after nearly a decade of violence that has taken countless lives.
Meta must be held accountable and pay reparations to victims of the Rohingya genocide. Clearly complicit in ethnic cleansing, Meta must immediately remediate the irreversible damage its platform has caused to the Rohingya people. Moreover, it must be required to monitor Facebook and Instagram with fact-checkers and moderators who shut down misinformation and hate speech. Without corporate accountability, companies will never learn where the red line lies, always prioritizing financial gain over ethics and morality. We must demand that the rule of law apply to everyone, not be reserved solely for ordinary people. Too often, corporations get away with their crimes because of their control over our political system and economy.
Despite calls for reparations from the Rohingya people, Meta has refused to provide any monetary remediation for its role in the genocide. With an annual profit nearing $40 billion, it is disgraceful that Meta disclaims any responsibility for the effects of its social media platforms.
To fix this, governments must strip away the protections and excuses that corporations hide behind when they are complicit in such harm. By establishing criminal or civil contributory liability and clear guidelines for content moderation, governments can force corporations to adopt corrective measures. An international mechanism for bringing corporate violators before the International Criminal Court (ICC) would further enforce accountability. It’s time to treat Meta and other corporations as they should be treated: punishable and accountable under the law.