
Good For The World, But Not Good For Us: The Really Damning Bits Of The Facebook Revelations


As expected, UK Parliament Member Damian Collins has released a trove of documents that he previously seized under questionable circumstances. He had already revealed some details in a blatantly misleading way during the public hearing he held; now he's released many more. Collins tees up the 250-page release with a few of his own notes, which also tend to exaggerate and misrepresent what's in the docs, and many people are running with those misrepresentations.

However, that doesn't mean that all of these documents have been misrepresented. Indeed, there are multiple things in here that look pretty bad for Facebook, and that could be very damaging on questions around the privacy protections it promised the FTC it would put in place, as well as in any potential antitrust fight. It's not hard to understand how Facebook arrived at the various decisions it made, but the "move fast and break things" attitude also seems to involve potentially breaking both the law and the company's own promises to its users. And that's bad.

First, the things that really aren't that big a deal: a lot of the reporting has focused on the idea that Facebook would give greater access to data to partners who signed up to give Facebook money via its advertising or other platforms. There doesn't seem to be much of a bombshell there. Lots of companies that have APIs charge for access. This is a standard business model question, and some of the emails in the data dump show what actually appears to be a pretty thoughtful discussion of various business model options and their tradeoffs. This was a company that recognized it had valuable information and was trying to figure out the best way to monetize it. There isn't much of a scandal there, though some people seem to think there is. Perhaps you could argue that, by allowing some third parties greater access, Facebook showed a cavalier attitude towards that data, since it was willing to trade access for money, but there's no evidence presented that this data was used in an abusive way (indeed, by putting a "price" on the access, Facebook likely limited it to companies that had every reason not to abuse the data).

Similarly, there is a lot of discussion about the API change that Facebook implemented to actually start limiting how much data app developers had access to. And the documentation here shows that part of the motivation was to (rightfully) improve user trust in Facebook. It's difficult to see how that's a scandal. In addition, some of the discussions involve moving specific whitelisted partners to a special version of the API that gives them access to more data... but in a way that hashes the data, providing better privacy and security while still keeping it useful. Again, this approach seems to actually be beneficial to end users, rather than harmful, so the attempts to attack it seem misplaced -- and yet they take up the vast majority of the 250 pages. A rough sketch of the general technique appears below.
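To be clear, the documents don't spell out Facebook's actual implementation, so the names and salting scheme below are hypothetical. But the general idea is a common one: share a salted, one-way hash of an identifier instead of the raw value, so a partner can match records across datasets without ever seeing the identifier itself. A minimal Kotlin sketch:

```kotlin
import java.security.MessageDigest

// Hypothetical sketch: hand over SHA-256(salt + id) instead of the raw id.
// Anyone hashing with the same per-partner salt can match records,
// but the raw identifier itself is never exchanged.
fun hashForPartner(userId: String, salt: String): String {
    val digest = MessageDigest.getInstance("SHA-256")
    return digest.digest((salt + userId).toByteArray(Charsets.UTF_8))
        .joinToString("") { "%02x".format(it) }
}

fun main() {
    println(hashForPartner("user-12345", "per-partner-salt"))
}
```

Hashing like this is a standard compromise: the data stays useful for joins and analytics while exposure of the raw values is reduced, which is why it reads as a privacy improvement rather than a scandal.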

The bigger issues involve specific actions that certainly appear to at least raise antitrust questions. That includes cutting off apps that recreated Facebook's own features, or that were suddenly getting a lot of traction (with Facebook using the access it had to users' phones to figure out which apps were gaining that traction). While not definitively violating antitrust laws, that's exactly the kind of evidence an antitrust investigator would explore: looking at whether Facebook held a dominant position at the time of those actions, and whether those actions were designed to deliberately harm competitors rather than to serve any useful purpose for end users. At least from the partial details released in the documents, the focus on competitors does seem to be a driving force. That could create a pretty big antitrust headache for Facebook.

Of course, the details on this... are still a bit vague in the released documents. There are a number of charts from Onavo included, showing the popularity of various apps.

Onavo was a data analytics company that Facebook bought in 2013 for over $100 million. Last year, the Wall Street Journal broke the story that Facebook was using Onavo to understand how well competing apps were doing, and potentially using that data to target acquisitions... or to try to diminish those competing apps' access. The potential "smoking gun" is buried in these files: a short email from the day Twitter launched Vine, its app for six-second videos, in which Facebook decides to cut off Twitter's access to its friend API in response to this move, and Zuckerberg himself says "Yup, go for it."

Now... it's entirely possible that there's more to this than is shown in the documents. But at least on its face, it seems like the kind of thing that deserves more scrutiny. If Facebook truly shut down access to the API because it feared competition from Vine, that is certainly the kind of thing that will raise eyebrows among antitrust folks. If there were more reasons for cutting off Vine, those should come out. But if the only reason was "ooh, that's a potential competitor to our own service," and if Facebook was the dominant distribution channel at the time, it could be a real issue.

Separately, if the name Onavo sounds familiar, that might be because earlier this year Facebook launched what it called a VPN under the Onavo brand... and there was reasonable anger over it once people realized (as per the above discussion) that Onavo was really a form of analytics spyware that tracked which applications you were using and what you were using them for. It was so bad that Apple pulled it from its App Store.

The other big thing that comes out in the released documents is all the way at the end, when Facebook was getting ready to roll out an update to its Android app that would snoop on your SMS and call logs, using that information to suggest more friends and to determine what kinds of content to promote to you. Facebook clearly recognized that this could be a PR nightmare if it got out, and the team worried that Android would ask users for permission, alerting them to this kind of snooping.

That is bad. That's Facebook knowing that its latest snooping move will look bad and trying to figure out a way to sneak it through. Later on, the team is relieved to realize, after testing, that they can roll this out without triggering a permission dialog that would alert users.
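For context on why a dialog might or might not appear (this is a general property of the Android permission model, not something detailed in the documents): on Android 6.0 and later, an app that targets API level 23 or higher must request "dangerous" permissions such as READ_CALL_LOG at runtime, which shows the user a prompt; an app targeting an older SDK level gets its manifest permissions granted silently at install or upgrade time, with no prompt at all. A minimal Kotlin sketch of the runtime check, assuming the androidx support library:

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

// Returns true if the app currently holds READ_CALL_LOG. An app
// targeting API 23+ only gets a grant after the user accepts a
// runtime permission dialog; an app targeting an older SDK level
// has the permission granted at install time, so no dialog is shown.
fun canReadCallLog(context: Context): Boolean =
    ContextCompat.checkSelfPermission(context, Manifest.permission.READ_CALL_LOG) ==
        PackageManager.PERMISSION_GRANTED
```

That gap between install-time and runtime permission models is the kind of thing the team appears to have been counting on to avoid alerting users.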

As reporter Kashmir Hill points out, it's notable that this "phew, we don't really have to alert users to our sketchy plan to get access to their logs" came from Yul Kwon, who was designated as Facebook's "privacy sherpa" and put in charge of making sure that Facebook didn't do anything creepy with user data. From an article that Hill wrote back in 2015:

The face of the new, privacy-conscious Facebook is Yul Kwon, a Yale Law grad who heads the team responsible for ensuring that every new product, feature, proposed study and code change gets scrutinized for privacy problems. His job is to try to make sure that Facebook’s 9,199 employees and the people they partner with don’t set off any privacy dynamite. Facebook employees refer to his group as the XFN team, which stands for “cross-functional,” because its job is to ensure that anyone at Facebook who might spot a problem with a new app — from the PR team to the lawyers to the security guys — has a chance to raise their concerns before that app gets on your phone. “We refer to ourselves as the privacy sherpas,” says Kwon. Instead of helping Facebook employees scale Everest safely, Kwon’s team tries to guide them safely past the potential peril of pissing off users.

And yet, here, he seems to be guiding them past those perils by helping the team hide what's really going on.

This is also doubly notable because Kashmir Hill has been perhaps the most dogged reporter on just how creepy Facebook's "People You May Know" feature can be. Facebook has a history of giving Hill totally conflicting information about how that feature works, and these documents reveal, at the very least, the desire to secretly slurp up your call and SMS records in order to find more "people you may know" (shown as PYMK in the documents).

One final note on all of this. I recently pointed out that Silicon Valley really should stop treating fundamental structural issues as political issues, focusing only on what's best for the short-term bottom line, and should instead focus on the larger goal of doing what's right overall. In a long email from Mark Zuckerberg included in the documents, musing thoughtfully on various business model ideas for the platform, one line stands out. Honestly, the entire email (starting on page 49 of the document) is worth reading, because it really does carefully weigh the various options in front of them.

The email discusses how it's important to enable people to share what they want, and how enabling other apps to help users do that is a good thing. But then he says:

The answer I came to is that we’re trying to enable people to share everything they want, and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that type of content and to make that app social by having Facebook plug into it. However, that may be good for the world but it’s not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform – even the read side – is to increase sharing back into Facebook.

I should note that in Damian Collins' summary of this, he carefully cuts out some of the text of that email to frame it in a way that makes it look worse, but the "that may be good for the world but it's not good for us" line really stands out to me. That's exactly the kind of political decision I was talking about in that earlier post. Taking the short-term view of "do what's good for us, rather than what's good for the world" may be typical, and even understandable, in business, but it's the root of many, many long-term structural problems, not just for Facebook, but for tons of other companies as well.

I wish we could move to a world where companies finally understood that "doing good for the world" leads to a situation in which the long-term result is also "good for us," rather than focusing on "good for us" at the expense of "good for the world."


