Facebook whistleblower tells Senate the company can't fix itself
“Almost no one outside Facebook knows what happens inside of Facebook.”
Frances Haugen, a former Facebook employee turned whistleblower.
A popular refrain at today’s Senate subcommittee hearing — which focused entirely on former Facebook employee Frances Haugen’s testimony — was that Facebook and Big Tech are “facing a Big Tobacco moment,” as Sen. Richard Blumenthal (D-Conn.) put it. The comparison is apt. In both cases, enormous conglomerates withheld information about their products’ harmful effects in the pursuit of maximized profits; in both cases, the government had to step in to provide oversight and regulation when the corporations themselves failed to do so.
Haugen agreed with the metaphor. Today’s hearing, which served as the conclusion to the subcommittee’s current schedule of hearings on how Facebook protects its youngest users from harm (or fails to), consisted of a series of Senators asking Haugen questions of their choosing for five minutes at a time. The Senators, on the whole, seemed immensely thankful for Haugen’s testimony. Blumenthal at one point called her an American hero.
The Big Tobacco metaphor has one major flaw, though, and it's a glaring one. Facebook is a tech company. It makes its money by manufacturing social interaction, not rolling cigarettes. The tools used to package Facebook's products are far more complex and far harder for outsiders to scrutinize. They're also more insidious: they're ostensibly free, they're bad at enforcing age restrictions, and they don't come with any warnings.
The big black box — Because it took the form of a quick-fire question-and-answer session, Haugen's testimony today ended up being a little scattered. Much of what she said had already been made evident by the research she leaked to The Wall Street Journal, but her concise, precise, and measured answers hammered it home.
In her responses, Haugen returned many times to her work on Facebook's algorithms, which are now the primary drivers of its business model. Algorithms dictate which content users see on their feeds; algorithms are relied upon to catch harmful content; algorithms choose which ads users see on a minute-to-minute basis.
The problem is that these powerful algorithms are nowhere near perfect, as Haugen confirmed many times over today. Facebook's algorithms cannot catch all the content that might be harmful to teens and children; they cannot catch underage users until it's too late; and, in an attempt to keep users engaged, they often serve downward-spiraling content like posts that promote eating disorders. The algorithms are not designed to make using Facebook as helpful or as wholesome as possible; they're designed to keep users hooked.
Facebook does not disclose to researchers (or any other external parties) how these algorithms work. Haugen made it clear in her testimony that this isn't just about Facebook's refusal to hand over the full data set; it's also about a company culture that treats insularity as the only path forward.
Breaking up is not the answer — As Haugen sees it, Facebook's refusal to disclose information about its platforms' basic functions is one of the most prominent ways the company allows its products to cause harm. The black-box mentality — nothing in, nothing out — is only one facet of this complex tangle, though.
That's why Haugen's ultimate goal in opening this particular can of worms is the creation of new oversight mechanisms for all of Big Tech. Haugen argues that Facebook, a company bringing in more than $40 billion in revenue every year, has plenty of resources to make its platforms safe. Facebook knows who its most vulnerable users are and which content is most harmful to them. But at every turn, the company chooses profits over human safety.
Haugen is far from the only party advocating for external oversight. Unlike some other tech experts, though, she says explicitly that she does not see breaking up Facebook as a solution to its myriad problems, because the company would continue to cause harm even if it were separated from Instagram or WhatsApp.
Haugen's reminder here, that antitrust measures aren't the only remedies available, is a timely one. The U.S. government has spent so much time focusing on Facebook as a monopoly that it has often lost sight of its responsibility to create laws that keep the people who actually use Facebook's services safe. Facebook can't be trusted to do that on its own.