
Facebook's Latest Privacy Screwup Shows How Facebook's Worst Enemy Is Still Facebook


There's another Facebook scandal story brewing today and, once again, it appears that Facebook's biggest enemy is the company itself and the way it blunders into messes that were totally unnecessary. When the last story broke, we pointed out that much of the reporting was exaggerated, and that people seemed to be jumping to conclusions that weren't actually warranted by some internal discussions about Facebook's business modeling. The latest big scandal, courtesy of a big New York Times story, reveals that Facebook agreed to share far more information with a bunch of large companies than previously known or reported (though, hilariously, one of those companies is... The NY Times itself, a fact the paper plays down quite a bit).

The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.

As Kash Hill notes in a separate story at Gizmodo, this suddenly explains a story she had explored years ago, in which Amazon rejected a review of a book, claiming the reviewer "knew the author" (which was not true). However, the reviewer had followed the author on Facebook, and Amazon magically appeared to know about that connection even though the reviewer never directly shared her Facebook data with Amazon.

The NY Times report further explains another bit of confusion that Hill has spent years trying to track down: why Facebook's People You May Know feature is so freaking creepy. Apparently, Facebook had data sharing agreements with other companies to peek through their data as well:

Among the revelations was that Facebook obtained data from multiple partners for a controversial friend-suggestion tool called “People You May Know.”

The feature, introduced in 2008, continues even though some Facebook users have objected to it, unsettled by its knowledge of their real-world relationships. Gizmodo and other news outlets have reported cases of the tool’s recommending friend connections between patients of the same psychiatrist, estranged family members, and a harasser and his victim.

Facebook, in turn, used contact lists from the partners, including Amazon, Yahoo and the Chinese company Huawei — which has been flagged as a security threat by American intelligence officials — to gain deeper insight into people’s relationships and suggest more connections, the records show.

As Hill noted on Twitter, when she asked Facebook last year if it uses data from "third parties such as data brokers" to figure out PYMK, Facebook's answer was technically correct, but totally misleading:

Specifically, Facebook responded: "Facebook does not use information from data brokers for People You May Know." Note that the question was if Facebook used information from "third parties" and the "data brokers" were just an example. Facebook responded that it didn't use data brokers, which appears to be correct, but left out the other third parties from which it did use data.

And this is why Facebook is, once again, its own worst enemy. It answers these kinds of questions in the same way that the US Intelligence Community answers questions about its surveillance practices: technically correct, but highly misleading. And, as such, when what the company is actually doing eventually comes out, it has completely burned whatever goodwill it might have had. If the company had just been upfront, honest and transparent about what it was doing, none of this would be an issue. The fact that it chose to be sneaky and misleading about it shows that it knew its actions would upset users. And if you know what you're doing will upset users, and you're unwilling to be frank and upfront about it, that's a recipe for disaster.

And it's a recipe that Facebook keeps making again and again and again.

And that's an issue that goes right to the top. Mark Zuckerberg has done too much apologizing without actually fixing any of this.

One bit in the NY Times piece deserves a particular discussion:

Facebook also allowed Spotify, Netflix and the Royal Bank of Canada to read, write and delete users’ private messages, and to see all participants on a thread — privileges that appeared to go beyond what the companies needed to integrate Facebook into their systems, the records show. Facebook acknowledged that it did not consider any of those three companies to be service providers. Spokespeople for Spotify and Netflix said those companies were unaware of the broad powers Facebook had granted them. A Royal Bank of Canada spokesman disputed that the bank had any such access.

Spotify, which could view messages of more than 70 million users a month, still offers the option to share music through Facebook Messenger. But Netflix and the Canadian bank no longer needed access to messages because they had deactivated features that incorporated it.

This particular issue has raised a lot of alarm bells. As Alvaro Bedoya points out, disclosing the content of private communications is very much illegal under the Stored Communications Act. But the NY Times reporting is not entirely clear here either. Facebook did work hard for a while to turn Messenger into more of a "platform" that would let you do more than just chat -- so I could see where it might "integrate" with third-party services to enable their features within Messenger. But the specifics of how that works are (1) really, really important, and (2) should be 100% transparent to users -- such that if they're agreeing to, say, share Spotify songs via Messenger, they are absolutely told that this means Spotify gets access to their messages. A failure to do that -- as appears to be the case here -- is yet another braindead move by Facebook.

Over and over and over again we see this same pattern with Facebook. Even when the company is making totally reasonable and logical business and product decisions, its blatant unwillingness to be transparent about what it is doing, and about who has access to what data, is what is so damning. It is a total failure of the management team, and until Facebook recognizes that fact, nothing will change.

And, of course, the most annoying part of all of this is that it will come back to bite the entire internet ecosystem. Facebook's continued inability to be open and transparent about its actions -- and to give users a real choice -- is almost certainly going to lead to the kind of hamfisted regulations from Congress that will block useful innovations from other companies: companies that aren't nearly so anti-user, but which will be swept up in whatever punishment Facebook brings down on the entire internet.


