Magid: Facebook changes nature of corporate decision making

Larry Magid

Facebook is a company, not a government, but its user base is bigger than the population of any country in the world, and the decisions made by its staff affect people in some of the same ways as decisions made by legislatures and courts in many countries. Nowhere is this more evident than in the way Facebook regulates speech.

What it allows and forbids affects not only people’s ability to communicate but also their safety, privacy, security and human rights. And sometimes those rights conflict, or at least appear to: certain types of speech can endanger people’s rights or personal safety, and some privacy protections, like encryption, can make it harder for law enforcement to protect the public.

As a country, the United States grapples with similar issues, but we have publicly elected legislators to make laws, publicly elected officials to enforce them and duly appointed judges to interpret them. And, based on contentious votes in Congress, leaks about disagreements within administrations and the number of 5-4 decisions by our Supreme Court, it’s clear that making “the right decision” isn’t always obvious or easy.

But aside from the fact that Facebook lacks the legitimacy of a sovereign nation, its job is even harder than that of legislators and judges because it operates globally, in countries with very different laws and traditions: the U.S., which has a strong tradition of free speech; Europe, where free speech is limited by laws banning hate speech; and other places where authoritarian regimes have imposed much broader limitations on what people are allowed to say. And images that are perfectly acceptable in some cultures are considered vulgar, and perhaps illegal, in others.

These issues come up frequently. There have been numerous public arguments about what constitutes acceptable speech on Facebook. Sometimes the right decisions seem pretty obvious, but at other times, there are nuances and competing rights and interests to be considered.

The company recently changed its policy to ban expressions of white nationalism and white supremacy just as some, including President Donald Trump, have accused the company of discriminating against conservatives. Are pictures of people being tortured or beheaded gratuitous violence or terrorist propaganda, or could they be legitimate news stories, rightfully posted to elicit outrage over horrible acts? Years ago, its policies against nudity were successfully challenged by moms wanting to show pictures of themselves nursing.

Who’s in charge?

As a publicly traded company, Facebook is accountable to its stockholders, not the public at large. And, because of his vast stock holdings in the company, CEO Mark Zuckerberg is effectively in complete control. Yet, when it comes to some decisions, even Zuckerberg realizes that the stakes are too high for one person or one company to hold all the cards, and that’s one of the reasons Facebook is in the process of putting together an Oversight Board for Content Decisions.

That board, which will be made up of a diverse group of about 40 people from around the world, will be like what The Verge called a “Supreme Court for content moderation.” The board, according to Facebook, will serve as an “independent authority outside of Facebook,” and have the power to “reverse Facebook’s decisions when necessary.”

Facebook is going to great lengths to ensure that this board has a global perspective. “We have traveled around the world hosting six in-depth workshops and 22 roundtables attended by more than 650 people from 88 different countries. We had personal discussions with more than 250 people and received over 1,200 public consultation submissions,” wrote Brent Harris, Facebook’s director of governance and global affairs.

Board will have extraordinary power

This is an extraordinary and mostly unprecedented undertaking for a private company, one that recognizes the potential impact of its decisions. If the board operates as planned, it will have the ability to overrule Zuckerberg himself on matters of what content is and isn’t allowed on the service.

Zuckerberg discussed the board and its role in a video conversation with Jenny Martinez, dean of Stanford Law School, and Noah Feldman, a Harvard Law professor and adviser on the oversight board.

Martinez set the stage by pointing out “companies like Facebook have really very global power and power that seems a lot like what governments have … in a company that isn’t accountable in a way that a democratically elected government would be.”

She said that historically “there were previous periods where very large companies played a role that also straddled the divide between public and private powers,” like the British East India Company, which engaged in what we would now consider government powers, such as coining money or raising an army. Although it doesn’t have its own army, Facebook is, in a sense, engaged in the coining of money through its participation in the Libra Alliance.

In the video, Harvard’s Noah Feldman said that the review board’s “legitimacy ultimately will be real when people see decisions that are different from what FB would otherwise have decided to do.” Zuckerberg agreed, and pledged to respect the board’s decisions, even if they overrule his own: “That’s my expectation, we can say that this is an independent process, we can have a consultative approach to helping design it and who should be on it, but trust will build up over time.”

Perspective

I’m viewing these conversations about this powerful review board from my perspective as a 10-year founding member of the less powerful Facebook Safety Advisory Board, which is composed of safety experts mostly representing nonprofit organizations in several countries. I serve in my capacity as CEO of ConnectSafely.org, which receives financial support from Facebook.

This board is independent and, at times, has disagreed with decisions made by Facebook management. But even though the board is treated with great respect by Facebook executives, its role is purely advisory. We are not empowered to overrule Facebook’s management. I’m not complaining; the board is doing what it was designed to do. But the design and mandate of the Oversight Board for Content Decisions are profoundly different.

If Facebook does a good job of creating a board that is both representative and independent, and if it faithfully abides by the board’s decisions, even when they conflict with what executives like Zuckerberg want, it will be at least a partial shift in the nature of corporate governance: a body that is controlled neither by the corporation itself nor by the governments of the countries where the corporation operates.

At the end of the day, local law in each jurisdiction will trump any decisions by this board and — I suppose — Facebook could change its mind and fail to implement one or more of the board’s decisions, but if we take the company at its word, that isn’t supposed to happen. And, although content review is an extremely important part of what makes Facebook Facebook, there are other very important decisions — including personnel — that are outside the jurisdiction of this board.

Although Facebook is not completely rewriting the rules of corporate governance, it is making a bold move that changes the way some of its most important decisions will be made by empowering people who represent those affected by the company and who, without such a board, would have no say in how it operates. It is, to an extent, taking on powers held by governments as well as powers held by stockholders and board members. It’s a bold experiment.

Larry Magid is a tech journalist and internet safety activist.

Published at Fri, 05 Jul 2019 11:00:45 +0000