Facebook wants its hand held to help it comply with privacy laws
The social network says it doesn't want another Cambridge Analytica scandal. What it really wants is to demonstrate it's too complex to be split up.
Facebook is asking the Federal Trade Commission (FTC) to help guide it on how to meet new obligations for "data portability," that is, allowing users to export their data or move it to other services, an option it's required to offer in the European Union and California. Bloomberg reports Facebook wants assistance figuring out how to offer the option without violating privacy laws. For instance, if a picture includes other people, should tag data persist when the image is exported? And what about images a user is tagged in but didn't take? Should those be exportable, and if so, how?
The goal, the social network says, is to avoid a repeat of the Cambridge Analytica scandal, in which a political consultancy used a quiz app to extract data on users and their friends, then used the information to help tailor ads for the 2016 U.S. election that saw Donald Trump narrowly grab victory. Facebook paid the FTC a record $5 billion fine in the wake of the scandal (not that it mattered much: as a share of earnings the fine was a slap on the wrist, and the company's share price actually climbed in the days following payment).
One read on this is that Facebook wants to do the right thing. A more likely one is that it's stalling. It's asking the FTC, which isn't a tech company, how to comply with the new legislation. It's palming off its work and feigning innocence by arguing that it wants to comply, it just doesn't know how. The poor thing. Perhaps it could hire some experts with all that ad money it's taken from the Trump campaign?
What belongs to you? — Facebook's questions regarding what can be exported are complicated ones because of the nature of its service. Users are intertwined through a web of connections, from being "friends" on the service to appearing in each other's content, whether because they're tagged by name or included in content like stills and videos. Figuring out how to comply with legislation while also protecting users' data is, in Facebook's defense, a difficult process.
But you can't help but think it's delighted this is the case and has often made deliberate moves to further complicate things. The more complicated the company is, and the more interconnected its products are (see, for instance, the merging of its Messenger and Instagram Direct Messages services), the more difficult it'll be for regulators to split it up should they decide that's necessary.
“The last time we tried to do this at scale, we had Cambridge Analytica happen,” said Bijan Madhani, Facebook’s privacy and public policy manager. “We want to make sure we are crystal clear on the obligations on us.”
Contradictory requirements — Facebook is trying to balance requirements to offer data portability, intended to increase competition, with the somewhat contradictory requirement to protect user data from leaking or otherwise being compromised. The company has fiercely guarded its friend graph, which can be used to find your network of friends in new apps. In the past it has stopped apps like Meerkat and Twitter from using its friend graph, as doing so could help those competing services grow and erode one of Facebook's unique advantages. Privacy laws support protecting the friend graph, but data portability laws do not.
The privacy issues that have plagued the company for years have more recently been overshadowed by its struggle with fake news and disinformation. Recent reports have found that Facebook ignores warnings about right-wing pages that spend heavily on advertising on its platform, and that Holocaust denial posts are allowed to run rampant. It's in Facebook's interest to take as long as possible to figure out how to allow data portability. Until then, the best thing you can do to encourage it is to stop using the service or, better yet, delete your account entirely. Plus, you'll have the added bonus of the deep, restful sleep that comes from knowing you're no longer helping empower Zuckerberg's misinformation and polarization machine.