Facebook’s ‘supreme court’ gears up to consider its decision to bar Donald Trump

Operational since October 2020 (Mark Zuckerberg originally floated the idea back in 2018), the Facebook oversight board is responsible for independently overseeing Facebook’s content moderation decisions. The body comprises 20 members, two of whom were reportedly on presidential shortlists for the US Supreme Court, as well as a Yemeni Nobel Peace Prize laureate, a former UN special rapporteur, and Colombia’s leading human rights lawyer.

The board recently came to wider public attention, via coverage in the New York Times, as it began to review a series of appeals against blocked or removed content. Its decisions on these first user appeals will serve as practice for the far more momentous decision due in the next three months – namely, should Donald Trump be permitted to return to Facebook and reconnect with his millions of followers?

The decision about Trump’s Facebook account will have major consequences not only for American politics and for how the right to freedom of expression is balanced against the growing threats of ‘hate speech’ and disinformation (fake news), but also for the way in which social media is regulated and human rights are policed.

The board reaches its decisions guided by Facebook’s own ‘community standards’ as well as international human rights law – especially the International Covenant on Civil and Political Rights and related UN reports. The board is structurally independent, and Mr Zuckerberg has promised that its decisions will be binding.

The sheer power and reach of Facebook (2.7 billion users) embody both the good and bad aspects of social media: the promotion of freedom of expression and freedom of information, versus incitement to hatred and violence and the spread of disinformation. Reflecting the weight of this role, the board’s members and staff are being paid (via a trust) six-figure salaries for roughly 15 hours of work a week. Needless to say, Facebook, the secretariat (which includes two former senior employees of Article 19, a free speech NGO) and the board members have an enormous responsibility to get their decisions right.

The first cases

The board overturned Facebook’s decision to remove content in four of the five cases it considered, upholding the company’s decision in just one – a case concerning hate speech against Azerbaijanis.

Hate speech against Muslims? (case decision 2020-002-FB-UA)

In October last year, a user in Myanmar posted two photographs to a Burmese-language Facebook group showing a Syrian toddler who drowned during his attempted journey to Europe in 2015. The accompanying text claimed that there is something psychologically wrong with Muslims. Referencing the terrorist killings in France that came in response to Charlie Hebdo’s depictions of the Prophet Muhammad, the author of the post stated that his sympathy for the child had been diminished, suggesting that the child might have grown up to become an extremist.

Facebook removed this content under its hate speech community standard.

The oversight board overturned Facebook’s decision to remove the post. It found that, while the post might be considered offensive, it did not reach the threshold of ‘hate speech’. From a human rights perspective, this decision seemed sound: the user was not justifying the death of the young child but was instead drawing attention to the alleged ‘hypocrisy’ of Muslims’ perceived silence over the treatment of Uyghurs in China.

Comparison of Joseph Goebbels and Donald Trump (case decision 2020-005-FB-UA)

In October 2020, a user posted a quote which was incorrectly attributed to Joseph Goebbels, the Reich minister of propaganda in Nazi Germany. The quote claimed that, rather than appealing to intellectuals, arguments should appeal to emotions, and stated that truth does not matter. There were no pictures of Joseph Goebbels or Nazi symbols in the post. The Facebook user said that their intent was to draw a comparison between the sentiment in the quote and the presidency of Donald Trump.

Facebook removed the post because the user did not make clear that they shared the quote to condemn Joseph Goebbels and to counter extremism.

Once again, the oversight board overturned Facebook’s decision to remove the post, finding that the quote did not support the Nazi party’s ideology. Comments on the post from the user’s friends supported the user’s defence that they had sought to compare the presidency of Donald Trump, and its use of identity politics, to the Nazi regime. On this basis, the board – rightly – found that the removal of the post violated the user’s right to freedom of expression.

Covid-19 ‘fake news’ in France? (case decision 2020-006-FB-FBR)

In October 2020, a user posted a video with accompanying French text in a public Facebook group related to Covid-19. The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which had refused to authorise hydroxychloroquine combined with azithromycin for use against Covid-19 while authorising and promoting remdesivir. The user criticised the lack of a health strategy in France and stated that “[Didier] Raoult’s cure” was being used elsewhere to save lives. The user also questioned what society had to lose by allowing doctors to prescribe a “harmless drug” during an emergency.

Facebook removed the content for violating its misinformation and imminent harm rule, which is part of its violence and incitement community standard. It found that the post contributed to the risk of imminent physical harm during a global pandemic, as it contained claims that a cure for Covid-19 exists; the company concluded that this could lead people to ignore health guidance or attempt to self-medicate.

In its referral to the board, Facebook cited this case as an example of the challenges of addressing the risk of offline harm that may be caused by misinformation relating to the Covid-19 pandemic. The oversight board overturned Facebook’s decision to remove the post.

The board observed that the user was opposing a governmental policy and wanted to change that policy. The combination of medicines that the post claimed constitutes a cure is not available without a prescription in France, and the content did not encourage people to buy or take drugs without one. Considering these and other contextual factors, the board decided that the post did not reach the threshold of creating a risk of ‘imminent harm’, as required by Facebook’s own community standards.

The board also found that Facebook’s decision did not comply with international human rights standards on limiting freedom of expression. Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content.

On balance, it seems the board again reached a sound decision in this case. It might have added that the exercise of freedom of speech and opinion to criticise government policy is central to the functioning of a democratic society.

Et tu, Trumpe?

These early decisions are but the entrée ahead of a far more consequential main course: the board’s upcoming review of Facebook’s decision to suspend former President Trump’s account indefinitely after the attack on the US Capitol on 6 January.

The decisions by Twitter and Facebook to bar Trump immediately reshaped the American political landscape. As the New York Times noted, “In the course of a few hours after the Capitol riots, they simply vaporised the most important figure in the history of social media”.

The board took up the case in late January and will appoint a panel of five randomly selected board members, at least one of them American, to consider it and reach a decision. The full 20-person board will then review the decision, and could either reinstate Trump’s direct connection to millions of supporters, or sever it for good.

There are mixed views as to which way the board is likely to swing. Observers seem split between those worried about the implications of permanent bans for free speech – a group that includes the chief executive of Twitter, Jack Dorsey, and the chancellor of Germany, Angela Merkel – and those worried about the consequences for American democracy should Facebook’s decision be overturned.

According to Noah Feldman, professor of law at Harvard Law School, conservatives may be pleased to find an ally in the new board. It will likely be more “responsive to freedom of expression concerns than any platform can be, given real world politics”, he said.

Nick Clegg, Facebook’s vice president for global affairs, on the other hand, has said he is “very confident” the board will affirm the company’s decision to suspend Trump’s account, though he is less sure what the board will decide about the account’s longer-term future.

Whatever the final decision, it is clear it will have enormous consequences for human rights and democracy, not just in the US but around the world. This has led many, including UN Secretary-General António Guterres, to question whether individual companies should have the power to regulate free speech in such a manner.

“I do not think that we can live in a world where too much power is given to a reduced number of companies”, he told reporters. Instead, Guterres appeared to call for intergovernmental institutions to take the lead in setting universal norms so that states can apply consistent rules through “regulatory frameworks” that allow such decisions to be taken “in line with the law”.
