Facebook is biased about bias

easyDNS is pleased to sponsor Jesse Hirsh's "Future Fibre / Future Tools" segments of his new email list, Metaviews

There can be no critical distance in a tyranny

Facebook has a meta problem. They lack the critical distance and proper governance to understand their power or their problems. The problem with their problems is not just that they fail to appreciate them, but that they are arguably not in a position to understand them at all.

One example of this is Facebook's bias about biases. For starters, Facebook doesn't think it has biases; or rather, in aspiring to neutrality and believing it can be neutral, Facebook cannot recognize when it is not.

This bias against biases prevents Facebook from recognizing when it is acting in a biased manner.

Facebook is presently in a growing conflict with the Australian government over the licensing of and royalties for news content. Although no legislation had been passed, Facebook made the sensational announcement that it would not allow news on its platform in Australia.

As a bargaining tactic, or more accurately an intimidation tactic, cutting off news publishers in Australia from the platform was effective for Facebook: it caused the Australian government to blink. The government amended the current draft of the bill, and Facebook re-enabled news on its platform.

I discussed this on the Big Story podcast:

Putting aside the whole policy question and news debate for a moment, there is something absurd about the position Facebook has taken. Specifically, it illustrates their bias against biases.

For example, what is news? Who can create news? Who should be compensated for news?

Since Facebook believes it is neutral, it also believes that it can decide what is news and what isn't, in spite of the fact that Facebook has a "newsfeed" that features "news" from friends, family, and the world at large.

Even if we were to arbitrarily redefine news to exclude personal news, how can we limit who produces it and thereby who deserves compensation for it? The concept of the citizen journalist is not new.

It would be one thing to push back against the Australian government, but to suggest that it is possible to ban news on Facebook illustrates how absurd and dangerous Facebook's view of itself is. They cannot see their own bias because they do not understand what bias or subjectivity is, driven by the financial necessity of regarding themselves as neutral and regarding "news" as something objective.

Speaking of news, here’s an interesting yet somewhat believable assertion:

The above is the start of a great thread by Jason that goes into detail on some of the governance problems with the platform, but also notes the following:

Facebook is using its power (and bias) to pick winners in the news and media industry. This kind of power should not be held by a single company, or in the hands of a single tyrant and his three stooges. The digital gang of four.

Jason's thread above was largely in response to this excellent report recently issued by BuzzFeed News:

In April 2019, Facebook was preparing to ban one of the internet’s most notorious spreaders of misinformation and hate, Infowars founder Alex Jones. Then CEO Mark Zuckerberg personally intervened.

Jones had gained infamy for claiming that the 2012 Sandy Hook elementary school massacre was a “giant hoax,” and that the teenage survivors of the 2018 Parkland shooting were “crisis actors.” But Facebook had found that he was also relentlessly spreading hate against various groups, including Muslims and trans people. That behavior qualified him for expulsion from the social network under the company’s policies for “dangerous individuals and organizations,” which required Facebook to also remove any content that expressed “praise or support” for them.

But Zuckerberg didn’t consider the Infowars founder to be a hate figure, according to a person familiar with the decision, so he overruled his own internal experts and opened a gaping loophole: Facebook would permanently ban Jones and his company — but would not touch posts of praise and support for them from other Facebook users. This meant that Jones’ legions of followers could continue to share his lies across the world’s largest social network.

What's fascinating about this report is not just that it illustrates the holes in Facebook's governance, but that it confirms Facebook (as a rising government) is essentially a one-man dictatorship, enabled by the work of the Facebook Gang of Four (which includes Chairman Mark).

“That was the first time I experienced having to create a new category of policy to fit what Zuckerberg wanted. It’s somewhat demoralizing when we have established a policy and it’s gone through rigorous cycles. Like, what the fuck is that for?” said a second former policy employee who, like the first, asked not to be named so they could speak about internal matters.

“Mark called for a more nuanced policy and enforcement strategy,” Facebook spokesperson Andy Stone said of the Alex Jones decision, which also affected the bans of other extremist figures.

Zuckerberg’s “more nuanced policy” set off a cascading effect, the two former employees said, which delayed the company’s efforts to remove right wing militant organizations such as the Oath Keepers, which were involved in the Jan. 6 insurrection at the US Capitol. It is also a case study in Facebook’s willingness to change its rules to placate America’s right wing and avoid political backlash.

Internal documents obtained by BuzzFeed News and interviews with 14 current and former employees show how the company’s policy team — guided by Joel Kaplan, the vice president of global public policy, and Zuckerberg’s whims — has exerted outsize influence while obstructing content moderation decisions, stymieing product rollouts, and intervening on behalf of popular conservative figures who have violated Facebook’s rules.

This helps to explain why Facebook has a bias against bias. In particular, this story and report delve into how the company responded to accusations of bias from conservatives. Facebook's fear of being accused of bias engendered a new kind of bias, one rooted in anticipating those accusations, which in this case enabled extremist organizing to flourish on the platform.

Since Facebook does not have the ability to recognize this, nor do they have external oversight or accountability, it’s highly likely this problem will get worse before it gets better.
