The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World by Max Fisher

The very first victim of the Facebook newsfeed—before the harassment of women in the video-game industry, before the death threats in Brazil, before the incitement of slaughter in Myanmar and Sri Lanka—was the Facebook newsfeed itself.

Facebook introduced (or imposed) the feature in 2006, the New York Times reporter Max Fisher writes in The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, turning its network of browsable profiles into a constantly updated listing, telling each user everything his or her friends had done on the site. Immediately, the updates that filled up the feed were about people joining groups with names such as “Against News Feed” and “I HATE FACEBOOK.”

Two facts defined this eruption of anger, Fisher explains. One was that the apparent uprising against the newsfeed was “an illusion”—though “only a minority of users ever joined” the protest groups, the architecture of the newsfeed itself picked up and amplified their joining, until it “tricked Facebook’s users, and even its leadership, into misperceiving the platform’s loudest voices as representing everyone.”

The other fact, Fisher writes, was that, “crucially, it had also done something else: driven engagement up.” Seeing other people get mad at Facebook encouraged people to spend more time on Facebook and attracted more people to the site. The people running Facebook realized they had found a winning strategy.

Both parts of the parable recur throughout the book—two different stories closely orbiting each other, with one interpretation eclipsing the other, then being eclipsed in turn. Fisher describes social media, in frequently horrifying terms, as a collection of bad machines and bad systems, with a power and morality beyond human control. The algorithms, following their own opaque internal logic, come across as truly evil: cultivating the worst in humanity, making people miserable and agitated, inspiring mass delusions and violence, eventually even generating their own child pornography.

Behind this mindless mechanical destructiveness, though, Fisher also tells a story of deliberately bad people—the entitled “misfit twentysomethings” who conform to, and perpetuate, “a contrarian, brash, almost millenarian engineering subculture,” designing those machines to pursue “instant, exponential returns” at any cost, for their personal aggrandizement and enrichment.

Silicon Valley, in Fisher’s account, is a place built by hopelessly self-centered and narrow men—he reminds readers that the transistor was co-invented by William Shockley, known for his “‘fondness for humiliating his employees,’ his knee-jerk rejection of any idea not his own, and his inclination toward extremes,” to say nothing of his bigotry and passion for eugenics—who recruited more hopelessly self-centered and narrow followers, in a sort of runaway algorithm in the real world.

Personally, I’m usually more fascinated by the badness of the machines—by the evidence that systems like Facebook and YouTube have attained a size and complexity where, even without literal machine consciousness, they now bend human activity to serve their endless growth and other demands. Fisher has brought together years of reporting, from corporate messaging to whistleblower leaks, to document the inexorable perversity of the algorithms. Again and again he describes how the systems themselves spontaneously stitch together, say, “mainstream-right commentators, oddball conspiracists, and basement racists” to drive users deeper and deeper into stimulating but desolate alternate realities.

But the human element keeps asserting itself, sometimes even when Fisher is focusing on the machines. The story of how the people who run social networks are blindly caught up in technology’s unintended consequences keeps running into the problem that, as Fisher summarizes the warnings of the whistleblower Frances Haugen, “the platforms amplified harm; Facebook knew it; the company had the power to stop it but chose not to; and the company continually lied to regulators and to the public.”

If Facebook was deploying “a robot army bent on defeating each user’s control over their own attention,” whose opaque inner logic promoted falsehood and conflict, it was also repeatedly taking orders from Joel Kaplan, the Republican operative turned company executive, not to rein in those robots, “to avoid even hypothetical Republican objections.”

When tech-company leaders embraced the role of “wartime CEO,” which “provided a kind of moral cover” for their lying, rule-breaking, and abuse, they were following the advice of the investor Ben Horowitz, son of the ’60s radical turned fanatical reactionary David Horowitz, who has dedicated his life to playing the hero on one side or the other of some prospective apocalyptic war.

The people who make the machines keep rolling out new features and new goals—to increase YouTube’s “daily watch time by a factor of ten,” to get Facebook users to break the apparent natural limit of about 150 members on human social groups—while their ethics and responsibility never make it out of beta. Every time the press caught YouTube doing something wrong, Fisher writes, “YouTube, in a paradoxical turnabout, would put out a statement insisting it had already fixed issues that, only weeks earlier, it had dismissed as nonexistent.”

Even as he captures its lies and contradictions, Fisher himself struggles at times to get beyond Silicon Valley’s nerd culture of reductionism and manipulation. The algorithms, and the people who study them, have found that posts with words invoking “disgust, shame, or gratitude” and similar moral concepts “traveled 20 percent farther—for each moral-emotional word.” In search of an explanation for why people are so susceptible to online moral outrage, he concludes, based on Soviet experiments in fox domestication, that human social regulation was driven by the sudden proliferation of our neural crest cells some 250,000 years ago.

Surely some other relevant things happened during those 250,000 years, to say nothing of the past 250, or 25, to bring the human race to the point where Sinhalese people kill their real-world Tamil-speaking neighbors to defend an abstract online community from a fictitious threat they heard about in a video on an app. Primordial tribalism doesn’t quite seem to be a strong enough cause.

Yet it’s also true that neither the Khmer Rouge nor the Turks needed Facebook to carry out genocide, and that actual lynch mobs long pre-dated Twitter mobs. Unraveling these causes and effects, inside and outside the algorithmic black boxes, is going to require years of work, with a range of intellectual tools beyond those of the computer programmer or the newspaper reporter.

Still, Fisher has drawn together a chilling record of the events and the cultural and commercial imperatives behind them, and how the perversity of machines and of people, working in tandem, has upended life around the globe. Beneath all the complexity, there are simple facts: during the genocide in Myanmar—in which the United Nations concluded Facebook “had played a ‘determining role’”—“a single engineer could have shuttered the entire network as they finished their morning coffee.”

During the killing in Sri Lanka, Fisher recounts, desperate political leaders, unable to get Facebook to respond to their pleas and warnings, eventually pulled the plug on the social networks themselves:

Two things happened almost immediately. The violence stopped; without Facebook or WhatsApp driving them, the mobs simply went home. And Facebook representatives, after months of ignoring government ministers, finally returned their calls. But not to talk about the violence. They wanted to know why traffic had zeroed out.

But the machinery will not stay unplugged. In a coda to that episode, Fisher catches up with a Tamil-speaking restaurant owner who had been beaten by a Facebook-inflamed mob, and made the villain of a viral video, all over an imaginary plot to sterilize Sinhalese people: “With long, empty days in hiding, he said, ‘I have more time and I look at Facebook much more.’”

Fisher is incredulous. “He looked up from the floor, shrugging,” he writes. “‘Whether it’s wrong or right, it’s what I read.’”

Tom Scocca is the former politics editor for Slate and the editor of Hmm Weekly.