
Yet Another Story Shows How Facebook Bent Over Backwards To Put In Place Different Rules For Conservatives

It has become an article of faith among some that the big social media sites engage in "anti-conservative bias" in their moderation practices. When we point out, over and over again, that there is no evidence to support these claims, our comments normally fill up with very, very angry people calling us "delusional" and saying things like "just look around!" But they never actually provide any evidence, because it doesn't seem to exist. Instead, what multiple looks at the issue have found is that moderation policies may ban racists, trolls, and bigots -- and unless your argument is that "conservatism" is the same thing as "racism, trolling, and bigotry," you don't have much of an argument. In fact, studies seem to show that Facebook, in particular, has bent over backwards to support conservative voices on the platform.

Last fall a report came out noting that when an algorithmic change was proposed to downgrade news on Facebook overall, some extremist far right sites were so popular on the site that the company's leadership, including Mark Zuckerberg, feared Republicans would accuse them of "anti-conservative bias." Zuckerberg stepped in to make sure the algorithm also downgraded some prominent "left-leaning" sites, even though the algorithm initially wasn't going to -- just so the company could claim that both sides of the traditional political spectrum were downgraded.

Over the weekend a new report came out along similar lines, noting that Facebook's policy team spent a lot of time and effort putting in place a policy to deal with "misinformation and hate." Not surprisingly, this disproportionately impacted far right extremists. While there certainly is misinformation across the political spectrum -- especially at the outer reaches of the traditional political compass -- it's only on the right that it has generally gone mainstream. And, again, the same political calculus appeared to come into play. After the policy team worked out more neutral rules for dealing with misinformation and hate, Zuckerberg apparently stepped in to overrule the policy and make sure that wack job supporters of Alex Jones and similar conspiracy mongers were allowed to continue spewing misinformation:

Jones had gained infamy for claiming that the 2012 Sandy Hook elementary school massacre was a “giant hoax,” and that the teenage survivors of the 2018 Parkland shooting were “crisis actors.” But Facebook had found that he was also relentlessly spreading hate against various groups, including Muslims and trans people. That behavior qualified him for expulsion from the social network under the company's policies for "dangerous individuals and organizations," which required Facebook to also remove any content that expressed “praise or support” for them.

But Zuckerberg didn’t consider the Infowars founder to be a hate figure, according to a person familiar with the decision, so he overruled his own internal experts and opened a gaping loophole: Facebook would permanently ban Jones and his company — but would not touch posts of praise and support for them from other Facebook users. This meant that Jones’ legions of followers could continue to share his lies across the world’s largest social network.

"Mark personally didn’t like the punishment, so he changed the rules,” a former policy employee told BuzzFeed News, noting that the original rule had already been in use and represented the product of untold hours of work between multiple teams and experts.

The political calculus isn't that difficult to understand here, and the BuzzFeed article notes that much of it was driven by policy boss and Republican operative Joel Kaplan. Basically, applying these rules neutrally means that more Republicans/conservatives are impacted, because misinformation and hate are more mainstream on the Republican side of the aisle. So a "fair" policy results in more limitations on Republicans, which only leads to more (false) claims of "anti-conservative bias." Or, paraphrasing Stephen Colbert: reality has an anti-conservative bias.

This goes back to the point that many people have made: that all of these claims of "anti-conservative bias" are really "working the refs." Basically, the people whining about this realize that these moderation decisions, and the policies around them, are not made in a vacuum. They know that users, the media, politicians, and commentators will respond to them. So by constantly screaming about "anti-conservative bias," they make the company fear that if the outcome appears to impact one side more than the other, then the inputs need to change. In other words, in order to keep the outcomes "equal," the entire moderation system has to be graded on a curve: meaning the actual bias is in favor of conservative viewpoints, because those viewpoints tend to include a lot more misinformation and hate.

Facebook's explanation of this, in response to the BuzzFeed article, is pretty silly. The company calls it a "more nuanced" approach. And by that, it means focusing on fairness of outcomes, not fairness of policies. That's not nuance; that's simply caving in to dishonest actors gaming the system. It's a failure state.

“Mark called for a more nuanced policy and enforcement strategy,” Facebook spokesperson Andy Stone said of the Alex Jones decision, which also affected the bans of other extremist figures.

Zuckerberg’s “more nuanced policy” set off a cascading effect, the two former employees said, which delayed the company’s efforts to remove right wing militant organizations such as the Oath Keepers, which were involved in the Jan. 6 insurrection at the US Capitol. It is also a case study in Facebook’s willingness to change its rules to placate America’s right wing and avoid political backlash.

Internal documents obtained by BuzzFeed News and interviews with 14 current and former employees show how the company’s policy team — guided by Joel Kaplan, the vice president of global public policy, and Zuckerberg’s whims — has exerted outsize influence while obstructing content moderation decisions, stymieing product rollouts, and intervening on behalf of popular conservative figures who have violated Facebook’s rules.

In December, a former core data scientist wrote a memo titled, “Political Influences on Content Policy.” Seen by BuzzFeed News, the memo stated that Kaplan’s policy team “regularly protects powerful constituencies” and listed several examples, including: removing penalties for misinformation from right-wing pages, blunting attempts to improve content quality in News Feed, and briefly blocking a proposal to stop recommending political groups ahead of the US election.

Again, the reasoning behind this is not surprising, and completely understandable if the entire calculus was: how do we minimize political risk? But that's silly, and it reflects extremely poorly on Facebook that it caves to dishonest and disingenuous political screamers rather than just telling the truth: there's simply more misinformation and hate on the Republican side of the aisle right now. That's a problem for the Republican Party to deal with, but Facebook seems to think it needs to paper over that issue to avoid any appearance of bias. It's a silly -- and sometimes dangerous -- position to take. Mark Zuckerberg has staked out an understandable position that he doesn't want to be the arbiter of truth. But the reality of the situation is that some people are abusing that stance to make sure that there's more nonsense in the system.

