The timing certainly was weird. Facebook says there was no connection between a whistle-blower’s appearance on “60 Minutes” on Sunday and its family of apps going dark the following afternoon, but in cyberspace, one never knows. Certainly, the hours-long outage brought increased attention to how deeply Facebook has wormed its way into contemporary life, as well as to what Frances Haugen had to say on the venerable CBS News show and again on Tuesday when she testified in the Senate.
Ms. Haugen’s core message, first disclosed in a Wall Street Journal series, was that Facebook’s leadership knew the harm that the company and its apps did and that, far from being something they tried to stop, it was the company’s business model. Instagram, which Facebook bought in 2012 for $1 billion, contributes to eating disorders and body-image crises among teenage girls, according to internal documents Ms. Haugen leaked. She described how Facebook developed a plan to hook ever-younger potential users despite knowing all about its destructive content. “I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy and much more,” she said in her prepared testimony.
The charges paint a picture of a corporation unwilling to make socially responsible choices, operating in a climate of fear that it could all be over in a moment if a newer, bigger, better online platform emerged. Corporate paranoia has made the unthinkable thinkable at Facebook, despite its public pronouncements that it is a good neighbor that plays within the lines.
Almost all of Facebook’s problematic aspects can be traced to its resistance to change at the highest levels, even when the harmful effects are clear. After the company changed the algorithm that selects what to display to whom, in an effort to stave off declining user engagement, it knew that it was making people angrier, not bringing them closer together. According to the documents reviewed by The Wall Street Journal, Facebook’s chief executive, Mark Zuckerberg, dug in because he worried that any fixes would lead people to interact less with his product. Making matters worse, Facebook maintains a secret double standard that exempts millions of high-profile users from some or all of its rules. “Many abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions,” the Journal reported. It gets worse.
Other documents show how Facebook employees flagged human trafficking and drug cartel activity only to have their higher-ups do nothing. Examples include women lured into abusive work in the Middle East, incitement of violence against ethnic minorities in Ethiopia, hate groups everywhere, even the sale of human organs. Thousands of Americans may have died as a direct result of vaccine and mask misinformation spread widely on Facebook during the Covid-19 pandemic. Again, Mr. Zuckerberg and other Facebook executives knew but failed to stop the flow. Facebook was also a place where the flames of the fatal Jan. 6 insurrection were fanned.
Facebook is everywhere, and that is at the root of the problem. In some countries it is synonymous with the internet, which can mean there is no escape from its dangerous influences. Soccer leagues use it to post schedules; school districts, to communicate with parents. There are hobbyist groups of every imaginable sort. Most publications, including this one, rely on Facebook to reach audiences not otherwise reachable. Critics of the company have argued, convincingly, that individuals and businesses that remain on its platforms are complicit in its toxicity; it is difficult to offer a counterargument. That so many of us do not delete our accounts, in spite of all of the negatives, is the surest indication that we should.