“I haven’t called my mom enough, that’s definitely true,” former Facebook product manager turned whistleblower Frances Haugen told me on a recent summer Friday. Other than that? She doesn’t have many big regrets.
Haugen is dialing in from a moving car: Such is life on the public-speaking circuit, where she has spent much of the last two years since going public as the former employee whose disclosures so tanked the credibility of the company in question that you now know it by its rebranded name, Meta.
But for old time’s sake—and to preserve the historicity of Facebook still being Facebook back in 2021, when Haugen shared tens of thousands of internal documents detailing the platform’s systemic toxicity-for-profit mindset—Haugen and I both avoid that shiny new name throughout our conversation about the ensuing years of post–“Facebook Files” fallout. If we’re getting picky about regrets, Haugen, who resides in Puerto Rico these days, does wish she could have convened a broader consortium of journalists sooner to unpack the documents for the world. Otherwise, she’s glad we’re all here now, at this point in time when the danger of unregulated social media is such a public concern that even the surgeon general is getting involved.
“We’re talking about a culture-change issue, right?” Haugen reminds me when I press for any conclusive sense of societal progress made. She’s thinking of this juncture now as our potential parallel to the seatbelts discourse of the 1960s—and how it took concerted effort, particularly from one individual, Ralph Nader (from whom Haugen clearly draws personal inspiration), to pressure corporate forces to implement safety measures we now take for granted. “Back in 1965, the average person did not know that we could live in a world where we set steadily improving standards for car safety,” Haugen says.
This is the central theme of her work via her nonprofit, Beyond the Screen, as well as her memoir, The Power of One, published this week by Little, Brown and Company: that this could be our seatbelts moment, where we finally demand a basic degree of consumer safety from the increasingly opaque tech platforms bending our reality to their whims—and one day, we’ll look back and shake our heads and wonder how it couldn’t have been more obvious.
In conversation with Vanity Fair, Haugen discusses how she feels about the past two years of tech oversight, why she’s not as nervous about the advent of AI as you might think, the one stipulation she has for maybe even returning to Facebook one day, and why she thinks it’s worth saving at all.
This interview has been edited and condensed for clarity.
Vanity Fair: How do you feel about the timing of your memoir? We’re at quite an interesting point in the public’s relationship with social media.
Frances Haugen: I was blindsided last week when the Surgeon General issued the advisory around teen mental health and social media. Like, two years ago, I was just leaving Facebook. I doubt either you or I thought there was any chance this was in the near-term future. It really symbolizes for me that we are seeing an interesting moment culturally.
One of the things that I think most people aren’t aware of is that the Surgeon General has issued very few advisories, maybe fewer than 15 in the last 60 years, about the things that we take for granted now. It’s things like, seatbelts save lives. Smoking causes cancer. Breastfeeding is good for babies. Real mom and apple pie kind of stuff. Those advisories act as the period at the end of a sentence.
I worry a little bit that we’re reaching an inflection point where we can pass sensible, moderate laws, like the Digital Services Act in the European Union, or we can start passing emotional and extreme laws, like straight-up banning TikTok. I really hope my book can play a role in shaping the conversation around what our options are. Is there a third way, you know, that’s not a Chinese approach, but also not the laissez-faire approach that got us to where we are right now?
Maybe in 2023, we’re finally able to talk about accountability from Meta and Instagram, but from your vantage point, are we remotely equipped for similar conversations around things like TikTok and AI?
There are lessons we need to extend and generalize to these new generative AIs. An example from my book is the idea that transparency really matters, and that you get what you measure. In the case of social platforms, they know how to make the system safer. It’s just that they choose not to, because we force them to report profit and loss numbers, but we don’t force them to report things like how many kids are online at midnight, 1 a.m., 2 a.m., 3 a.m. Imagine if they had to report those numbers every week.
When it comes to generative AI, these are similarly not neutral systems. And if we don’t have legally mandated transparency mechanisms, we don’t get to actually use consumer power to push in one way or another. But one of the things that makes me a little bit less nervous about the generative AI issues in general is that they don’t have Section 230. Right? These are systems where people are going to get hurt. And there will be lawsuits. And I think it’ll be interesting to see, because there will be consequences.
Based on your interactions with legislators over the years, do you think they’ve gotten any better at understanding the internet? I’m thinking of the TikTok CEO’s congressional hearing back in March, of course.
You know, I was slightly pained by the TikTok hearing, because I would have happily coached them or been like, hey, here are some questions you could ask.
But we’ve mostly focused internationally. Some of the lower-hanging fruit is in Canada, Australia, the Nordics, Brazil…you have to remember the United States has the cleanest, safest version of Facebook because Facebook disproportionately spends its safety budget here. So I have not spent as much time supporting, say, legislators in the United States.
While reading your book, I was surprised by how deep in the weeds you get about the actual technology—the “physics of social media,” as you called it. I felt like I was getting a crash course on how everything from recommendation algorithms to basic data science to the inner workings of Meta/Facebook all actually work.
I really wanted the average person to be able to walk away and say, hey, when we shifted from Web 1.0 to Web 2.0, what meaningfully changed, like when you start having personalization, but we’re governing the internet with laws from Web 1.0? How does this change when we can’t have a shared experience of reality? That’s for sure a complicated topic.
So part of why I went into such detail was because I really wanted people to understand that Facebook made incredibly reasonable decisions. And if we don’t change the incentives of the environment that Facebook works with, we’re not going to see a change in behavior. It’s not like there were people scheming in a backroom. It’s just that they made a system that was amoral, that didn’t have humans in the loop. And they did that for business reasons.
Is there any intention that people take The Power of One to also be a kind of whistleblower’s manual?
I think we need more narratives on the whistleblowing experience, partially because whistleblowing is actually a relatively new thing, right? Like we’ve only had whistleblower protections for less than 20 years. It’s not super long. And we are entering into a world where more and more of our economy will be run by opaque systems that are analogous to social media, and we’re going to need to extend the lessons we learned from social media around transparency if we want to have accountability.
I think there are many more people in the world who want to stand up for what they feel is true in their hearts, but they are scared or intimidated. But I want to be very clear, for legal reasons, this is not advice on how to blow the whistle. There’s a disclaimer at the front of the book: your mileage will vary. Please do not sue me!
How did you find the experience of writing the book itself?
I was part of that bleeding edge of nerdy teenagers who just lived on the internet—you know, back in the day, we didn’t send text messages. We sent long, typed-out chat threads. And so I’m a very fast typist. And I did national circuit debate. So language is not a big impediment to me.
I’m super glad I wrote the book because I became a much better public speaker. Because I had to get very clear about how I thought about a lot of these issues—sitting down and being like, I have to actually get my facts right on Ralph Nader. Ralph Nader did a lot of very specific and intentional things to help that moment of cultural change where we altered our relationship with automotive companies, and I deeply appreciate it. I don’t think I’m as good an operator as Ralph Nader. He QB’d a lot harder than I QB. Partially because like, I’m just not as pushy of a person. But it made me appreciate like, we got seatbelts not accidentally. He had an intentional plan. He did a bunch of stuff to make that happen. And that’s how the world changes.
Speaking of Nader then: Would you say you have any political ambitions?
Well, I live in paradise. And if I can run for office without leaving Puerto Rico, maybe. But that sounds really unpleasant. Like, Ralph Nader likes living in DC. I would like us to get some reasonable laws in place, and then I’m going to go back to being a data scientist and getting to sit by the ocean and code. That will be my happy place.
There are all of these skills and experiences that you’ve picked up in other parts of your life that almost make it seem like being a Facebook whistleblower was your destiny. Like being on the high school debate team, being a meticulous product manager, literally studying how professional scrapers operate, or even observing how people at Google Books would capture and enter countless “pages” of documents.
I’m curious if you agree with that narrative—that you were always meant to do this. Or do you fantasize about an alternate-universe Frances?
I think if they hadn’t gotten rid of Civic Integrity, I would probably still be at Facebook, right? And if they’d let me stay in Puerto Rico, because they don’t let people work from territories. Or I’d be at another tech company, right? Because honestly, I was very happy. I was willing to keep grinding away at it as long as I thought they were acting in good faith.
A number of people have asked me like, “Why you? 200,000 people came through the doors before—why you?” I think that number is much smaller; there’s probably 300 or 400 people in the world who really understand how the algorithms work. I had a very odd intersectional role as a product manager, someone who is meant to play this kind of interface role between marketing and legal and product operations and user research and all these different things. So now the pool is down to 20 people, right?
I think it’s less a thing of like, “I was born to do this,” but rather that I happened to be fortunate enough to have had, just by chance, these steps in my life. That’s just how these things go. There’s always some person who is, by random chance, that little statistical anomaly.
You said you’d love to just be a data scientist sitting on the beach one day. Do you mean it? Do you miss working in tech?
I had a conflict I needed to resolve recently, so for the first time in two years, I got to do a nice data science project. I definitely was like, Oh, God, I missed this. Because there’s like a wonderful Zen calm to just sitting and doing data science.
My great-grandfather was still walking to the University of Iowa and writing papers in his nineties. I’m young, I’ve got 50 years. That, too, is a difference between me and Ralph Nader. I think he really likes going after the bad guys; he founded a whole law center doing just that, and I’m really glad he did. But my interests are not as broad as Ralph Nader’s. We might do a simulated social network, in which case I will just go and work on building the simulation and writing lesson plans. I could imagine doing that for a long time.
Back in 2021, you told Vogue, “I hold a slight fantasy that one day Facebook will rehire me to work on something in integrity.” Is that still true?
I think if Mark steps down. I’ve been very sad about the last six months. They’ve taken safety to the woodshed for sure. It’s really sad. I still believe we cannot let Facebook die—Facebook is the internet for billions of people. It’s not enough for people in the United States to opt out. We genuinely have to live up to the obligations that we took on.
I would totally go back to Facebook if they would rehire me. They are the most important social network in the world, and we have to figure out how to fix them.
Looking at the past two years, are you happy with the amount of progress that’s been made—legislatively and/or culturally?
If you looked at the day before the first article came out in the Wall Street Journal, I think there was a sense of fatalism amongst the people who were paying attention. People really bought Facebook’s line that the only conversation was about freedom of speech versus safety; they could kind of shake their heads and be like, oh, I know it’s so tragic, but you know, freedom of speech.
I did a lot of thinking when we hit the one-year anniversary, like was it worth it? And the reality is, there is nothing that feels as good as giving another person hope. Every single day, it’s a little jolt of dopamine or oxytocin to watch someone light up and be like, No, we can do this. So from that perspective, I’m really excited.
The Surgeon General’s warning is a great step in the right direction. Once the Digital Services Act actually starts producing data, I think that’s going to be a game-changer, because even though it will be only available to European researchers, it means Facebook will stop being the only one who gets to grade its own homework. And I think even very small amounts of transparency could be transformative.
Do you think you’ve changed throughout all of this?
Totally. I am a child of two academics—lovely, quiet people who like their garden. I never really took up very much space most of my adult life. And it’s been interesting being forced out into the public eye. I think there are a lot of people out there who could be showing up more in their own lives.