
Mark Zuckerberg Finally Speaks About Cambridge Analytica; It Won't Be Enough


It took way too long, but Mark Zuckerberg finally responded to the Cambridge Analytica mess on Wednesday afternoon. As we've discussed in a series of posts all week, this is a complex issue, where a lot of people are getting the details wrong, and thus most of the suggestions in response are likely to make the problem worse.

To be clear, Mark's statement on the issue is not bad. It has obviously been workshopped through a zillion high-priced PR people, and it avoids all the usual "I'm sorry if we upset you..." kind of tropes. Instead, it's direct: it takes responsibility, admits error, does very little to try to "justify" what happened, and lists concrete steps the company is taking in response to the mess.

We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it.

It runs through the timeline, and appears to recount it accurately based on everything we've seen (so no funny business with the dates). And, importantly, Zuckerberg notes that even if it was Cambridge Analytica that broke Facebook's terms of service on the API, that's not the larger issue -- the loss of trust in the platform is the issue.

This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.

The proactive steps Facebook is taking are all reasonable as well: investigating all the apps that had access before the old API was closed to see who else sucked up what data, further restricting access to data, and finally giving more transparency and control to end users:

First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.

Second, we will restrict developers' data access even further to prevent other kinds of abuse. For example, we will remove developers' access to your data if you haven't used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days.

Third, we want to make sure you understand which apps you've allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you've used and an easy way to revoke those apps' permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.

That's mostly good, though as I explained earlier, I do have some concerns about how the second issue -- locking down the data -- might also limit the ability of end users to export their data to other services.

Also, this does not tackle the better overall solution we mentioned yesterday, originally pitched by Cory Doctorow: opening up the platform not to third-party apps that suck up data, but to third-party apps that help users protect and control their own data. That part is missing, and it's a big part.

If you already hated Zuckerberg and Facebook, this response isn't going to be enough for you (no response would be, short of shutting the whole thing down, as ridiculous as that idea is). If you already trusted him, then you'll probably find this to be okay. But a lot of people are going to fall in the middle, and what Facebook actually does in the next few months will be watched closely and will matter a great deal. Unless and until the company also allows more end-user control of privacy, including by third-party apps, it feels like this will fall short.

And, of course, it seems highly unlikely that these moves will satisfy the dozens of regulators around the world seeking their pound of flesh, or the folks who are already filing lawsuits over this. Facebook has a lot of fixing to do. And Zuckerberg's statement is better than a bad statement, but that's probably not good enough.

Meanwhile, as soon as this response was posted, Zuckerberg went on a grand press tour, hitting (at least): CNN, Wired, the NY Times and Recode. It's entirely possible he did more interviews too, but that's enough for now.

There are some interesting tidbits in the various interviews, but most of what I said above stands. It's not going to be enough. And I'm not sure people will be happy with the results. In all of the interviews he does this sort of weird "Aw, shucks, I guess what people really want is to have us lock down their data, rather than being open" thing that is bothersome. Here's the Wired version:

I do think early on on the platform we had this very idealistic vision around how data portability would allow all these different new experiences, and I think the feedback that we’ve gotten from our community and from the world is that privacy and having the data locked down is more important to people than maybe making it easier to bring more data and have different kinds of experiences.

But, of course, as we pointed out yesterday (and above), all this really does is lock in Facebook, and make it that much harder for individuals to really control their own data. It also limits the ability of upstarts and competitors to challenge Facebook. In other words, the more Facebook locks down its data, the more Facebook locks itself in as the incumbent. Are we really sure that's a good idea? Indeed, when Wired pushes him on this, he basically shrugs and says "Well, the people have spoken, and they want us to control everything."

I think the feedback that we’ve gotten from people—not only in this episode but for years—is that people value having less access to their data above having the ability to more easily bring social experiences with their friends’ data to other places. And I don’t know, I mean, part of that might be philosophical, it may just be in practice what developers are able to build over the platform, and the practical value exchange, that’s certainly been a big one. And I agree. I think at the heart of a lot of these issues we face are tradeoffs between real values that people care about.

In the Recode interview, he repeats some of these lines, and even suggests (incorrectly) that there's a trade-off between data portability and privacy:

“I was maybe too idealistic on the side of data portability, that it would create more good experiences — and it created some — but I think what the clear feedback from our community was that people value privacy a lot more.”

But... that's only true in a world where Facebook controls everything. If Facebook actually gave users more control and transparency, users could decide how their own data is shared, and you could have both portability and privacy.

He raises one other interesting point in that interview: we should not be letting Mark Zuckerberg make all the decisions about what is and is not okay:

“What I would really like to do is find a way to get our policies set in a way that reflects the values of the community, so I am not the one making those decisions,” Zuckerberg said. “I feel fundamentally uncomfortable sitting here in California in an office making content policy decisions for people around the world.”

“[The] thing is like, ‘Where’s the line on hate speech?’ I mean, who chose me to be the person that did that?,” Zuckerberg said. “I guess I have to, because of [where we are] now, but I’d rather not.”

But, again, that's a choice Facebook is making in becoming the data silo. If end users had more control and more tools to manage their own data, it would no longer be Mark's choice. Open up the platform not for "apps" that suck up users' data, but for tools that allow users to control their own data, and you get a very different result. But that's not where we're moving. At all.

So... so far this is moving in exactly the direction I feared when I wrote about this yesterday. "Solving" the problem this way isn't going to solve it for real -- it's just going to end up giving Facebook greater power over our data. That's an unfortunate result.

Oh. And for all of the apologizing and regret Zuckerberg expresses in these interviews, he has yet to explain why, if this is such a big deal, Facebook threatened to sue one of the journalists who broke the story. It seems like that's an important one for him to weigh in on.


