Mark Zuckerberg has never really understood privacy. From Facebook’s earliest days, he figured people would eventually grow comfortable sharing everything with everyone—indeed, his business depended on it. Every time Facebook rolled out a new feature, a subset of us questioned our ability to control who gets to see the personal information we’d been uploading in the form of pokes and pictures and witty wall posts.
But these worries never triggered an insurgency. If the violation was truly egregious, Zuckerberg would just apologize, usually via a Facebook post, and announce a fix. The fix often involved updating Facebook’s privacy tools. (In 2014, the company even introduced a charming blue dinosaur as a privacy instructor.) As the site expanded, these labyrinthine settings became increasingly hard to grok, and often made more information public by default, rather than putting true control in the hands of its users.
On March 28, twelve days after The Guardian and The New York Times revealed that Cambridge Analytica had misused Facebook user data (and that the company had known about the violation and done nothing), Facebook once again announced newly redesigned privacy tools, this time placed at the top of its News Feed.
Maybe these tools will put users in control of their personal information once and for all, and as a result, they will trust Facebook to protect their data better in the future. But if history is any guide, we’ll see this episode again, judging by this not-at-all exhaustive list of the times Zuckerberg has apologized for giving you privacy jitters, and assured you it would all be absolutely fine, eventually.
News Feed Nuisance, September 2006
“We really messed this one up.” –Mark Zuckerberg
When Facebook first launched News Feed, the design that has come to define all social software, its users revolted because suddenly their posts were revealed in one centralized place. Zuckerberg penned his first apology letter, writing “we did a bad job of explaining what the new features were and an even worse job of giving you control of them.” Eventually, concern died down as people got used to the News Feed.
The Opt-In Assumption, December 2007
“We simply did a bad job with this release, and I apologize for it.” –Mark Zuckerberg
When Zuckerberg announced Facebook’s first targeted advertising product, Beacon, users were outraged that Facebook was using their information—including information it had gotten from third-party sites, like whether they bought concert tickets from Fandango—to help advertisers target them. Zuckerberg addressed criticism in a post, saying “It took us too long after people started contacting us to change the product so that users had to explicitly approve what they wanted to share.” As a result of complaints, he said the company had made Beacon opt-in, and that Facebook planned to release a feature that let people turn it off completely. (Facebook shut down Beacon in 2009.)
Creating Confusion, December 2009
“We’re adding something that many of you have asked for.” –Mark Zuckerberg
This time, Facebook attempted to get in front of its users’ data concerns. As it stepped up its efforts with advertisers, Facebook launched new privacy tools that aimed to “empower people to personalize control over their information,” according to an announcement from Zuckerberg. Yet critics complained the tools were overly confusing and pushed users to make even more of their personal information public, rather than giving them control. This triggered a Federal Trade Commission investigation.
Break Things, May 2010
“Sometimes we move too fast.” –Mark Zuckerberg
Reporting in The Wall Street Journal revealed that advertisers were using a privacy loophole to retrieve revealing personal information, like users’ names, from Facebook (and other social networks like MySpace and Xanga). Facebook quickly made a change to its software to get rid of the identifying code, and a few days later announced plans to redesign its privacy settings. This time, Zuckerberg detailed the changes in an apologetic Washington Post op-ed, writing “sometimes we move too fast,” which he then shared on his Facebook profile.
Privacy Predicament, November 2011
“I’m the first to admit that we’ve made a bunch of mistakes.” –Mark Zuckerberg
After the FTC charged Facebook with deceiving consumers by saying they could keep their information private, and then allowing it to be shared and made public, Facebook agreed to a settlement. Zuckerberg addressed the agreement, which he never explicitly called a “settlement,” in a blog post. The company had made a number of mistakes, Zuckerberg wrote, but he understood “that many people are just naturally skeptical of what it means for hundreds of millions of people to share so much personal information online, especially using any one service.” He then offered up a list of new tools Facebook had made available to help users control their privacy and announced he’d have two chief privacy officers going forward, instead of one.
One Version of Control, January 2013
“You control who you share your interests and likes with on Facebook.” –Michael Richter, former Chief Privacy Officer, Facebook
In January, Facebook rolled out a search product called Graph Search, designed to let users search any topic inside Facebook. This riled privacy activists, because it allowed people to unearth any information a user hadn’t proactively protected. Days later, one of Facebook’s chief privacy officers responded to the concerns, kicking off a trend of the company addressing privacy concerns after the fact. This became a steady coda to nearly every product launch that followed.
Facebook Under Fire, March 2018
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.” –Mark Zuckerberg
After five days of silence, Zuckerberg finally weighed in on the reports that Cambridge Analytica had misused Facebook user data, and that Facebook had failed to protect its users. Less than a week later, the company promised once again to improve its privacy settings. The announcement reads, as usual, like a mea culpa, headlined: It’s Time to Make Our Privacy Tools Easier to Find. “We’ve heard loud and clear that privacy settings and other important tools are too hard to find and that we must do more to keep people informed,” wrote the two executives who penned the announcement.
Superficially, it seems like the company is taking things seriously and working hard to make changes. But review the evidence, and it sounds like the same trite old news. Indeed, it’s a version of something we’ve all heard before.
This article was syndicated from wired.com