For years, Facebook has had a content moderation problem. It's struggled to make unpopular decisions about content, like whether it should take down a viral doctored video of House Speaker Nancy Pelosi that made it appear as if she were slurring her words (it didn't) or whether it should ban prominent conspiracy theorist Alex Jones (it did). And as the 2020 presidential election gets closer, the company is attempting to do a better job than it did in 2016, when it was accused of allowing misinformation on its platform to influence the democratic process. That's why Facebook will soon outsource some of its toughest content moderation problems to a new, much-anticipated but not yet fully formed outside group: its independent oversight board.
Facebook on Tuesday revealed new details about the board, which will evaluate how the company handles controversial posts on its platform, and which could influence its takedown policies around contentious topics like hate speech, nudity, and misinformation.
Currently, if you object to Facebook's decision to take down content you posted, you can appeal the decision to Facebook's community moderation team. But after that, there's no way for you to further dispute the decision. In the future, the company says, Facebook and Instagram users will be able to petition the independent oversight board, which will have final say on the matter if it decides to take on a user's case. People who care about the future of free speech online have been closely watching to see what Facebook does with its board, because it could shape what's allowed, or not allowed, on the platform for years to come.
And now, more than a year after CEO Mark Zuckerberg initially announced the idea of an oversight board, Facebook has released a series of proposed rules about how it will all work. That includes a process for users to submit appeals, an outline of how the board will decide on cases, and a mandate forcing Facebook to implement the board's decisions on individual posts within seven days, unless a post violates local law. All in all, once the board takes up an appeal, a decision should take around 90 days, Facebook says. In cases that warrant faster action, Facebook can expedite them, and the board should take no longer than 30 days to reach a decision.
We still don't know arguably the most important information about the board: who will be on it. But Facebook expects to announce its first three co-chairs in the coming months. The company's director of governance and strategic initiatives, Brent Harris, said on a press call on Tuesday morning that the board's members will have a diverse set of opinions.
Harris called the board "a new mechanism that is independent, beyond the walls of Silicon Valley," and one that enables review against Facebook's stated principles. If the board chooses, it can amend any of the bylaws Facebook proposed Tuesday, which the company says it suggested for the sake of getting the board up and running.
Several experts Recode spoke with called the updates a step in the right direction for greater transparency, but said that the project's success will depend on how much the company actually listens to this new governing body. Under the proposed rules, Facebook will be required to follow the board's decisions when it rules that the company should not have taken down content. But for broader policy decisions, Facebook will only take guidance, not mandates, from the board.
Ultimately, how the board ends up functioning could affect the day-to-day communications of Facebook's more than 2 billion users around the world. It could turn out to be a success, or a failed experiment in wrangling the social network's gargantuan problem of dealing with controversial speech.
"Content moderation is really hard and these are really consequential decisions about public discourse and human rights," said Evelyn Douek, a doctoral student at Harvard Law School researching the regulation of online speech. Douek, along with many other academics, gave Facebook early feedback on a draft of its new rules. "While it's unacceptable to be having completely unaccountable and private entities making decisions, we also don't want governments having their hands all over it. So the oversight board is a new model for handling this."
A brief history of Facebook's oversight board
Zuckerberg announced the idea for an oversight board in 2018, as the company was facing intense criticism from both liberal and conservative politicians in the US over how it moderates content. At the time, Zuckerberg wrote that the company should not make so many important decisions about free expression and safety on its own, and that it would instead assign responsibility for some of the toughest decisions to an independent body.
So far, setting up the board has been a relatively slow process for a company whose famous tagline used to be "move fast and break things." Facebook says that's because it wants to be sure it's getting the setup right.
What we do know is that the board will be funded by a separate trust, in which Facebook plans to invest more than $130 million. On Tuesday, Facebook also announced that it hired the board's administrative director, Thomas Hughes, a former executive director of Article 19, a nonprofit focused on freedom of expression and digital rights. Hughes will oversee administrative staff but will not make content decisions.
The board plans to hear its first cases in the first half of 2020, in time for the presidential election. So presumably, by that timeline, it will have to hire its members by July.
Remaining questions
The most obvious question about the board following Tuesday's announcement is who will be on it.
Beyond that, experts wonder how many cases the board will take on at a time, exactly what kinds of cases it will consider, and how broadly its rulings will be applied within the company. Facebook's Harris said on the Tuesday press call that he expects the board will initially take on dozens of cases, which would obviously be only a very small percentage of the total volume of posts on Facebook.
"We don't know what kinds of cases we'll hear; the board might spend all of its time on hate speech or nudity," said Douek. "We don't know what the board will actually decide to do with its power." She said it's actually a good thing for the board to have discretion if it's going to be truly independent from Facebook, as it hopes to be.
Douek also pointed out that, for now, Facebook will limit users' appeals to content they think was incorrectly taken off the platform. That leaves out complaints from users who are upset that content remains on the platform (like hate speech, violence, or misinformation). In the future, Facebook plans to expand the appeals process to let users dispute whether a post should remain up, but it hasn't said when.
Another big question is how narrowly Facebook plans to interpret the board's decisions.
If Facebook only plans to take down, or leave up, the handful of posts the board specifically rules on, it could make only a minor dent in the overall landscape of misinformation on the social media platform, some experts say. Dipayan Ghosh, who conducts economic and technology policy research at the Harvard Kennedy School, said that "the devil will really be in the details" of implementation, and that for now, these announcements help contain criticism from the community of academics, policy experts, and regulators who are plugged into speech issues on Facebook.
"If Facebook didn't do this, there would be more calls and pressure on the company to do something," said Ghosh. "The announcement and gradual development of this board is coming at a time when the company needs it to continue to succeed and survive."