If you wanted to target a Facebook ad to someone like Mark Zuckerberg you might start by plugging some basic information into his platform’s precision advertising app: gender, location, age. Zuckerberg is male. He lives in California. And this year, the most tumultuous of Facebook’s rise, he turned 34 years old. Potential ad reach, 660,000 people.
Much has changed for Facebook’s founder in 10 years. In 2008 he was celebrating 100 million users as “a big milestone for us”; now 2.2 billion people use Facebook each month, a number so enormous that a Russian agitprop campaign can reach over 150 million users and still weigh in as “relatively small” on Zuckerberg’s titanic scale. Zuckerberg’s world now is painted in large numbers: 30,000 employees, $40 billion in revenue, 15 million square feet of server space in five countries, with a $1 billion, 11-story data center newly announced for Asia. The lows are just as spectacular. Facebook is down $130 billion in market value from its July peak, in a year when the Cambridge Analytica privacy scandal broke large.
Driving everything is Facebook’s power to store and process user information, from the prosaic to the personal, allowing advertisers to deliver their messages with laser-guided accuracy. The tools Facebook built for this purpose are varied and powerful. An advertiser can upload a data file listing the exact users they want to target, or let Facebook’s algorithm find “lookalike” users with similar qualities. They can target or exclude users who’ve visited particular websites. Or they can design a custom audience by selecting personal attributes from a cascade of nested pull-down menus, narrowing the reach with each new detail.
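Those cascading menus ultimately compile into a machine-readable targeting specification. A minimal sketch of what such a spec might look like, loosely modeled on the targeting spec in Facebook’s Marketing API; the field names and values here are illustrative assumptions, not the exact schema:

```python
# Illustrative sketch of an ad-targeting spec; field names are
# assumptions loosely modeled on Facebook's Marketing API, not
# the platform's exact schema.
targeting_spec = {
    "genders": ["male"],
    "age_min": 34,
    "age_max": 34,
    "geo_locations": {"regions": ["California"]},
    # Each added detail narrows the potential reach further.
    "relationship_statuses": ["married"],
    "education_statuses": ["some_college"],
    "interests": ["Civil law (legal system)"],
    # Alternatively: an uploaded list of exact users, or a
    # "lookalike" audience matched by Facebook's algorithm.
    "custom_audiences": [],
    "excluded_custom_audiences": [],
}

# Count how many narrowing criteria the advertiser has stacked up.
active_filters = sum(1 for v in targeting_spec.values() if v)
print(f"{active_filters} targeting criteria set")
```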
Relationship status: Married. Education level: some college. Potential ad reach, 2,900 people.
Zuckerberg famously dropped out of Harvard in his sophomore year to pilot Facebook full time. Four years later he reached the 100 million users landmark, and was clearing the runway for the next decade of growth, settling with the Winklevoss twins, unveiling a website redesign and, on paper, moving his company’s headquarters to Ireland for tax reasons.
If there was an overriding vision guiding Zuckerberg in those years, it was openness. Sharing was his North Star. He spoke often of witnessing a transformation in society. “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people,” he said in one speech. In his vision the internet was drawing everyone closer, knitting the fractured peoples of the world into a global community built on a deeper understanding of one another. This, he’d hasten to add, was happening on its own—“just something that has evolved over time.” Facebook was a reed floating on a river of change.
In reality, Zuckerberg wasn’t so much riding a global shift in thinking as trying to engineer one. In December 2009, a seemingly innocent makeover of Facebook’s privacy controls smuggled in the code for Zuckerberg’s new world. Overnight, Facebook’s default privacy setting switched to full openness: every post shared with everyone. Knowledgeable users could override Zuckerberg’s choice, but few did. The option to keep friend lists private vanished entirely.
Zuckerberg congratulated himself on his boldness—“Doing a privacy change for 350 million users is not the kind of thing that a lot of companies would do.” Facebook, he said, “just went for it.” Users did not. In the resulting backlash, the Federal Trade Commission slammed Facebook with a complaint charging deceptive trade practices. “Facebook retroactively applied these changes to personal information that it had previously collected from users, without their informed consent,” the complaint charged. Facebook settled the case, undid the changes and entered into an FTC consent decree that binds it still.
Interests: Civil law (legal system).
Most of Facebook’s privacy dustups were born in this gap between Zuckerberg’s model of the world and his users’ experience of it, including the scandal that ultimately dragged Zuckerberg to successive public hearings in the Senate and the House this year.
The idea, called the Open Graph, around which Facebook completely redesigned its platform in 2010, was that the web would no longer be a universe of isolated destinations floating in an anonymous void. Instead, all those websites would be connected by tendrils leading to Facebook, and your identity would travel with you wherever you went. Announcing his vision at a developers' conference in San Francisco, Zuckerberg called it “the most transformative thing” Facebook had done for the web. “We’re building toward a web where the default is social,” he told the crowd. “Every application will be designed from the ground up to use real identity and friends.”
The most obvious evidence of the Open Graph is the Facebook “Like” button on over eight million websites. But Zuckerberg’s vision ran far deeper, and included getting third-party apps to plug into his matrix using the Open Graph API.
Unsurprisingly, the API debuted with a strong bias toward sharing. Among other things, a user could authorize an app to collect profile information on all their Facebook friends, as well as their friends’ friends. Two degrees of separation on Facebook amounts to 157,000 people, on average, whose profiles could be silently plucked on the strength of one user’s click on a permissions dialog.
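That 157,000 figure is far larger than squaring a typical friend count would suggest, because of the “friendship paradox”: heavily connected users are over-represented among anyone’s friends, so the average friend has more friends than the average user. A rough sketch of that arithmetic, using a made-up toy friend-count distribution rather than real Facebook data:

```python
import statistics

# Toy, skewed friend-count distribution: most users have few friends,
# a small minority have very many (a heavy tail, as on real networks).
# These numbers are invented for illustration only.
friend_counts = [50] * 700 + [200] * 250 + [2000] * 50

mean_degree = statistics.mean(friend_counts)  # average friends per user

# A randomly chosen *friend* is sampled in proportion to their own
# friend count, so their expected friend count is E[d^2]/E[d], not E[d].
size_biased_mean = sum(d * d for d in friend_counts) / sum(friend_counts)

naive_two_hop = mean_degree ** 2                  # intuition's estimate
skewed_two_hop = mean_degree * size_biased_mean   # closer to reality

print(f"mean friends per user:       {mean_degree:.0f}")
print(f"mean friends of a friend:    {size_biased_mean:.0f}")
print(f"naive two-hop estimate:      {naive_two_hop:.0f}")
print(f"skew-aware two-hop estimate: {skewed_two_hop:.0f}")
```

The skew-aware estimate lands far above the naive one, which is why one click on a permissions dialog could expose six-figure numbers of profiles.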
Four years later, Facebook concluded its users didn’t want that much sharing, and in 2014, Zuckerberg dialed down the API’s power. But by then the damage was done. A UK psychology researcher had harvested information on as many as 87 million people through Open Graph. The data wound up in the hands of Cambridge Analytica, a shady campaign consulting firm building “psychographic profiles” of voters.
Interests: Online Advertising, Campaigns and Elections, Current Events, Big Data. Household income: top 5 percent of ZIP codes (U.S.). Potential ad reach, under 1,000—too specific for ads to be shown.
The Cambridge Analytica story broke in the pages of the Guardian newspaper in December 2015. It had little impact at first, but came roaring back last March with more detail and better timing. By then the truth had come out about Russia’s weaponization of Facebook during the 2016 election, and a privacy scandal that looped in Facebook and a consulting firm that worked for Trump’s campaign looked like another piece of the same puzzle.
In some ways, though, the privacy scandal was a distraction from the core issue around Zuckerberg’s empire. That was in full evidence during his congressional testimony, when Zuckerberg was forced to make one particular point again and again. “There is a common misperception,” he told legislators in the House, “that for some reason we sell data. I can't be clearer on this topic. We do not sell data.”
That’s true. Facebook’s privacy flaps all involved the company giving away data for free. Facebook sells something much more valuable—influence. In 10 years, Zuckerberg’s dorm-room project has evolved into the most effective and powerful machine ever built for planting ideas, changing minds, and impelling action.
The Russians figured that out. Zuckerberg isn’t quite there yet. He finds the notion that Facebook could be used to trick people in an election “almost viscerally offensive,” he said in a recent New Yorker profile. “Because it goes against the whole notion that you should trust people and that individuals are smart and can understand their own experience and can make their own assessments.”
Whether the Kremlin’s influence campaign tipped the election to Trump will probably never be known. But there’s no dispute that Russia used Facebook’s machinery to get some number of Americans to believe things that they didn’t believe before, and to do things they hadn’t planned on doing. Zuckerberg’s model of the world and the humans that inhabit it is once again out of tune.