I just can’t quit Facebook

Yes, it’s harmful to children, democracy, and much more. It also makes my life better.
October 19, 2021
Christian Century illustration

I logged in to Facebook for the first time in 2008, when the teenager who babysat for my daughter sat me down and walked me through the process of setting up a profile. I was in my early 30s and had already moved across the country multiple times with my academic husband. Facebook quickly became the easiest way to share photos and information across those distances. It gave me a chance I wouldn’t otherwise have had to watch my friends’ and cousins’ kids grow up.

I didn’t yet have a smartphone. I remember the babysitter showing me how to upload my very first profile photo from my digital camera. I used to print digital photos of my children and organize them in real photo albums or share them with family members via a link to a website, but soon all that seemed redundant. Facebook became my personal baby book. I now have 12 years’ worth of family photos in digital albums hosted by Facebook and Instagram.

I also have 12 years’ worth of online friendships that became cherished real-life friendships, plus acquaintances, publishing contacts, and group memberships that give me access to people around the world who share some of my more obscure interests. Try finding just one person to parse the finer points of 1970s British folk horror teleplays with when you live in rural northern Michigan or the foothills of the Blue Ridge Mountains. On Facebook, I connected to thousands of them. In 2014, friends and I started our own group for Catholic writers and artists, which immediately outpaced the blog we’d started. For the next few years my entire professional life—and much of my spiritual life—was bound up with Mark Zuckerberg’s machine. I was invited to speak about how to build online communities, sitting on panels and giving talks that, looking back, were free advertising for Facebook products.

It’s true that social media—Facebook, in particular—made my life as a writer and stay-at-home mom of young children so much richer and more interesting, and I was a staunch defender until 2016 when, like so many others, I started to sense that something wasn’t quite right. What had once felt good was starting to feel bad.

One night, when yet another distant family member materialized in the comments on one of my enraged posts about Donald Trump’s presidential candidacy, I angrily asked why he only ever commented on my political posts, never the posts about my writing or my children. He responded: “But all you ever post about is politics!” He’d never even seen those other posts in his newsfeed. I think that’s when I first began to consider the costs of being manipulated by an algorithm. I didn’t delete Facebook, though. I just (mostly) stopped arguing about politics there and became a quick draw with the block button.

Frances Haugen, the whistleblower who recently brought thousands of Facebook’s internal documents to Congress and the US Securities and Exchange Commission, isn’t the first to accuse Facebook of dividing people, weakening democracy, and hurting children in its efforts to grow quickly and make what Haugen calls “astronomical profits.” External critics have been raising these issues with Facebook executives for years. But Haugen brought receipts that prove Facebook is aware—from its own internal research—how harmful its practices are to individuals and society and has chosen not to protect the well-being of either.

Facebook, a $1 trillion company, engages in practices that maximize profit but result in “actual violence that harms and even kills people,” Haugen testified before a Senate subcommittee. With 2.8 billion users—that’s 60 percent of all internet users on the planet—we’ve already tasted the bitter fruit. We know Russian troll farms used Facebook to spread misinformation and interfere with US elections; we know the Myanmar military used the platform to incite genocide; we know the insurrectionists used it to orchestrate the January 6 riot at the Capitol. And now we know that Facebook knew it too and did little to protect people.

“No one at Facebook is malevolent,” Haugen told CBS News, “but the incentives are misaligned. Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.” Haugen produced evidence that European leaders have said Facebook’s algorithm pushed them into extremist positions to curry social media attention, the only kind of attention that matters anymore.

Facebook also knows that its products, including Instagram, harm our children. Haugen leaked an internal study that showed 13.5 percent of British teen girls reported an increase in suicidal thoughts after looking at Instagram. Another study found that 17 percent of teen girls said it made their eating disorders worse.

Those of us who watched The Social Dilemma in 2020 were not surprised to hear how harmful social media is, and by design. The Netflix docudrama drew attention to how social media companies rely on research in the psychology of addiction to hook us and our kids, and it precipitated a minor exodus from Facebook, the first wave of people announcing they were deleting their profiles as an act of protest or self-protection. In the days after Haugen’s testimony, I saw more friends leave, some using the hashtag #deletefacebook, and felt a mix of envy and irritation at what amounts to a moral flex I’m not willing to make. I envy their ability to let it all go—the low-commitment “friend” groups, the easy connectivity, the ready-made platform for sharing work and life from anywhere. I’m irritated that their departure makes me consider, again, what I’m trading for all that, and whether “all that” is really worth that much after all. Every time someone goes, I think, Does staying make me a bad person?

In the television show The Good Place, which is set in the afterlife, handwringing moral philosopher Chidi finds himself in hell because he’s always trying to discern the lesser evil and so is unable to make choices at all. He is accompanied by Eleanor, who in life was motivated by outrageous selfishness, and Tahani, who always desperately needed to feel like a good person. I feel like these characters are my companions through the moral realms of Facebook. I know I’m in the Bad Place, where we act as hapless minions making lots of money for the very few by lending our support to a system that destroys families and tears at the fabric of society. But I feel trapped there by real connections, personal and professional, that make my life better. Of course I want to protect children—and democracy. But I don’t want to give up access to my perimenopause support group! It’s almost laughable. The doors of hell really are locked from the inside.

More than the people I see leaving now, I envy those who never logged in, those whose current careers and friendships can’t be traced back through more than a decade’s intricate web of likes, adds, and “people you may know.” But that’s not me. The social network was my only access to a network at all, and I hate that I am grateful for it.