By Mark Latonero
Facebook is a step away from creating its global Oversight Board for content moderation. While there’s good reason to be skeptical of whether Facebook itself can fix problems like hate speech and disinformation on the platform, we should pay closer attention to how the board proposes to make decisions.
Decisions by Facebook to limit content and speech are often met with intense public criticism. Facebook wants the Oversight Board to take responsibility for these decisions. However, since Facebook will select the board’s initial slate of international experts, the board risks becoming stacked with members who are too deferential to the company. A more fundamental problem is baked into the board’s charter, which states that it “will review content enforcement decisions and determine whether they were consistent with Facebook’s content policies and values.” If the board becomes an echo chamber for values dreamed up in Silicon Valley, it will hardly be trusted on the world stage.
The bylaws contain a possible path forward. They say the board will “be guided by relevant human-rights principles” and will provide an “analysis of how the board’s decisions have considered or tracked the international human rights implicated by a case.” While the language is slippery, if the board bases its decision-making more explicitly on international human rights, it could gain legitimacy. These rights are defined not by Facebook but by the United Nations’ Universal Declaration of Human Rights, international treaties, and human-rights courts.
Moreover, if the board agreed to protect all human rights when making decisions, it could produce novel opinions that help others grappling with similar challenges. When I spoke with Noah Feldman of Harvard Law School, who came up with the Supreme Court for Facebook concept and advises Mark Zuckerberg, he imagined that other tech companies might one day bring their predicaments to the Oversight Board if they agreed its decision would be binding.
The more the board limits its scope, however, the more it will miss the big picture. The board should be fully empowered to make policy recommendations, especially those that directly challenge the inner workings of Facebook’s revenue model or News Feed algorithm. If this experiment in self-governance fails, it will leave lawmakers with one clear, if extremely challenging, message: Facebook must be regulated.
Mark Latonero is a Fellow at the Harvard Kennedy School.