Facebook now has 3 billion users, more than a third of humanity.
Of these users, many use Facebook as their primary news source—a dynamic that has suffused the platform with toxic misinformation. Though some countries, like Germany, have passed laws attempting to curb the dissemination of hate speech, laws such as these are enforceable only in select geographical boundaries. In the United States, the question of how to regulate Facebook has highlighted tensions with the First Amendment, becoming a challenging topic of legal debate. Consequently, Facebook has been left to make difficult decisions about speech largely on its own.
That’s where Noah Feldman, a Harvard Law School professor, comes in. A close friend of Sheryl Sandberg, the COO of Facebook, Feldman pitched the idea in 2018 that social-media companies needed “quasi-legal systems” to weigh difficult decisions around freedom of speech. Mark Zuckerberg agreed with Feldman’s proposal, noting that the question of deleting individual, high-profile posts should be left to the experts. From there, the Oversight Board was created.
Though many outside the company have wanted the Board to have as much authority as possible, in reality, its powers are limited. Many of Facebook’s most controversial posts, including conspiracy theories and disinformation, are allowed to remain up. Most significantly, the Board’s rulings do not become Facebook policy in the way that a Supreme Court precedent becomes the law of the land. Even when the Board orders a post taken down, similar posts are removed only at Facebook’s discretion.
Read Kate Klonick’s first-hand glimpse into the Board’s inner workings. And check out this profile on the Board’s first 20 members.