Transparency Is Important; Mandated Transparency Is Dangerous And Will Stifle Innovation And Competition


While much of yesterday's Senate Commerce Committee hearing was focused on the pointless grievances and grandstanding of sitting Senators, there was a bit of actual news made by Mark Zuckerberg and Jack Dorsey. As we discussed earlier this week, Zuckerberg said for the first time that he supports Section 230 reform, though he declined in his opening remarks to specify which reforms. And while the original draft of Jack Dorsey's opening testimony suggested full support for 230, in his delivered remarks he too suggested that Twitter would support changes to Section 230 focused on getting companies to be more transparent. Later in the hearing, during one of the extraordinarily rare moments when a Senator actually asked the CEOs how they would change 230, Zuckerberg again pointed to transparency reports, before immediately noting that Facebook already issues them.

In other words, it appears that the "compromise" the internet companies are looking to throw to a greedy Congress regarding Section 230 reform is "transparency." I've heard from a variety of policymakers over the last few months who also see transparency as a "narrow" way to reform 230 without mucking up everything else, so it seems like mandated content moderation transparency may become "a thing."

Mandating transparency, however, would be a dangerous move that would stifle both innovation and competition.

Cathy Gellis has covered this in detail in the past, and I addressed it in my comments to the FCC about Section 230. But it seems like we should be a little clearer:

Transparency is important. Mandated transparency is dangerous.

We've been celebrating lots of internet companies and their transparency reports going back to Google's decision nearly a decade ago to start releasing such reports. Over time, every large internet company (and many medium-sized ones) has jumped on the bandwagon. Indeed, after significant public pressure, even the notoriously secretive giant telcos started issuing transparency reports as well (though often in a manner that obscured important details).

So, at the very least, it certainly looks like public pressure, good business practices, and peer pressure within the industry have already pushed companies into releasing such reports. On top of that, many internet companies seem to try to outdo each other in how transparent those reports are -- which, again, is a good thing. The transparency reports are coming, and we should celebrate that.

If nothing else, this suggests that Congress doesn't need to mandate transparency reports, as they're already happening.

But, you might ask, why then should we worry about mandates for transparency reports? Many, many reasons. First off, while transparency reports are valuable, in some cases we've seen governments and government officials use them as tools to celebrate censorship: not to better understand the challenges of content moderation, but to see where more censorship should be targeted. That's a problem.

Furthermore, creating a "baseline" for transparency reports creates two very large problems for competition and innovation. First, it imposes a clear compliance cost, which can be quite burdensome for new and smaller websites. Facebook, Google, and Twitter can devote entire teams to producing transparency reports. Smaller sites cannot. And while you could, in theory, craft a mandate with size thresholds, historically that leads to gaming and other tricks.

Perhaps more importantly, though, a mandate with baseline transparency thresholds locks in certain "rules" for content moderation and does real harm to innovative and different approaches. While most people seem to think of content moderation along the lines of how Facebook, YouTube, and Twitter handle it -- with large (often outsourced) content moderation teams and giant sets of policies -- there are many, many other models out there as well. Reddit is a decently large company, yet it handles content moderation by pushing it out to the volunteer moderators who run each subreddit and get to set their own moderation rules. Would each subreddit have to release its own report? Would Reddit itself have to track how each individual subreddit is moderated and include all of that in its report?

Or how about Wikipedia? It's one of the largest sites on the internet, and its content moderation practices are already incredibly transparent, since every single edit shows up in each page's history -- often with a note about the reasoning. And, again, rather than being done by staff, every Wikipedia edit is made by volunteers. Should Wikipedia also have to file a "standardized" report about how and why each of those moderation decisions was made?
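To see just how transparent that model already is, here's a minimal sketch of pulling a page's recent edit history -- each editor's name, timestamp, and stated reasoning -- straight from the public MediaWiki API. (The endpoint and parameters are the real MediaWiki query API; the article title is just an arbitrary example.)

```python
# Minimal sketch: Wikipedia's "moderation log" is public by design.
# Every edit, and the editor's own explanation for it, is retrievable
# by anyone via the MediaWiki API -- no mandated report required.
import requests

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "prop": "revisions",
        "titles": "Section 230",              # arbitrary example article
        "rvprop": "timestamp|user|comment",   # who edited, when, and why
        "rvlimit": 10,                        # the ten most recent edits
        "format": "json",
        "formatversion": 2,
    },
    headers={"User-Agent": "transparency-demo/0.1"},
)
resp.raise_for_status()

for page in resp.json()["query"]["pages"]:
    for rev in page.get("revisions", []):
        # The edit "comment" is the editor's own rationale -- effectively
        # a public, per-decision moderation explanation.
        print(rev["timestamp"], rev["user"], "--", rev.get("comment", ""))
```

The point isn't that this particular query matters; it's that a standardized reporting mandate would add nothing here while still imposing its compliance machinery on a model that is already more transparent than any report Facebook files.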

And those are just two examples of large sites with different models. The more you look, the more alternative moderation models you'll find -- and many of them would not fit neatly into any "standard" transparency report. Instead, what you'd get is a ham-fisted setup that more or less forces every site into a single (Facebook/YouTube/Twitter) style of content moderation and transparency. And that's very bad for innovation in the space.

Indeed, as someone who is quite hopeful for a future where the content moderation layer is separated entirely from the corporate layer of various social media sites, I worry that mandated transparency rules would make that much, much harder to implement. Many of the proposals I've seen for more distributed/decentralized, protocol-based approaches to social media would not (and often could not) fit into a "standardized" model of content moderation.

And thus, rules mandating transparency reporting modeled on the way those three large companies currently release their reports would only push everyone else into the same mold, imposing significant compliance costs on smaller entities while greatly limiting their ability to experiment with new and different styles of moderation.

