Another "free speech" controversy has blown up at Facebook. "Free speech" in quotes because Facebook is a private company that can make it own rules about speech it's willing to tolerate, much less protect. It's also one that can make up the rules as it goes along and apply them inconsistently. Welcome to the Internet. That's just how things are done.
So, it comes as no surprise that moderators at Facebook attempted to remove Donald Trump's posts as "hate speech." (via Slashdot)
Facebook employees pushed to remove some of Republican presidential candidate Donald Trump's Facebook posts — such as one proposing the ban of Muslims from entering the U.S. — from the service as hate speech that violated the giant social network's policies, the Wall Street Journal reported Friday.
To some readers, Facebook's attempts to remove posts by a Republican may seem like business as usual. The social media network has been criticized before for playing politics with its news feeds. But digging a little deeper into the details of the story reveals this mini-debacle starts as most censorship stories do: with the site's users, rather than its moderation team.
Issues around Mr. Trump’s posts emerged when he posted on Facebook a link to a Dec. 7 campaign statement “on preventing Muslim immigration.” The statement called for “a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what is going on.”
Users flagged the December content as hate speech, a move that triggered a review by Facebook’s community-operations team, with hundreds of employees in several offices world-wide.
Flagging a policy proposal as "hate speech" sounds very much like certain Facebook users' attempts to create their own echo chambers -- the normal efforts of those who have mistaken the "report" button for Facebook's still-nonexistent "dislike" button.
The problem could have ended there. Moderators could have easily decided this was relevant to the upcoming election and not something that should be declared "hate speech." But it didn't go that way.
Some Facebook employees said in internal chat rooms that the post broke Facebook’s rules on hate speech as detailed in its internal guidelines, according to people familiar with the matter.
Facebook's definition of "hate speech" is far too broad. Even CEO Mark Zuckerberg agreed the post violated the company's "hate speech" policy, but he overrode the moderators and reinstated the posts. The rules will apparently continue to be rewritten on the fly.
On Friday, senior members of Facebook’s policy team posted more details on its policy. “In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” they wrote.
This is a better interpretation of the rules, but one that should be permanently implemented, rather than just half-assed into place to lower the risk of losing campaign advertising dollars. Facebook has earned a lot of the criticism thrown in its direction over its surprisingly terrible post moderation decisions. So, FB earns a golf clap for deciding to prevent user-generated echo chambers, at least up until the second Tuesday in November.
The other problem is that this decision just isn't good enough for some Facebook employees.
[O]thers, including some Muslim employees at Facebook, were upset that the platform would make an exception. In Dublin, where many of Facebook’s content reviewers work, more than a dozen Muslim employees met with their managers to discuss the policy, according to another person familiar with the matter. Some created internal Facebook groups protesting the decision, while others threatened to leave.
Those who threatened to leave should do so. They're only going to make Facebook an even worse place for the world to get its news. There's plenty of unpleasantness out there that is newsworthy, significant, or important to the public interest. Very little of it rises to the level of hate speech -- even under Facebook's broad, constantly-changing definition of the term.
Many of the things Trump has said and advocated for are objectively repugnant and undoubtedly offensive to the races and religions they target. But they are not "hate speech." They are bad ideas borne of worse thought processes. In any event, it's better to know what presidential candidates are supporting than to be unpleasantly surprised post-election.
The same goes for "normal" people. Why police "hate speech" in such a heavy-handed fashion? Wouldn't it be better to have those in your social circles out themselves publicly as repellant human beings, rather than discover this during a child's birthday party or other IRL social gathering?
Facebook isn't a free speech defender. It's a private company with a lot of advertising dollars and billions of users with competing interests on the line. It will play it safe and continue its long run of dubious moderation decisions. But what it shouldn't do is keep expanding its definition of hate speech until its moderators become nothing more than extensions of a heckler's veto.