Following weeks of public and congressional pressure, Facebook CEO Mark Zuckerberg has agreed to testify at two congressional hearings next week. The decision marks a shift for Zuckerberg, who just last month suggested that the company’s engineers and lawyers were better equipped to answer Congress’s detailed questions. What Zuckerberg seemed to miss when he gave that excuse—and what he now has an opportunity to address—is that the problems plaguing Facebook have far less to do with the company’s technical flaws than with its fundamental ethos.
“If it is ever the case that I am the most informed person at Facebook in the best position to testify, I will happily do that,” Zuckerberg said in an interview with WIRED last month. He is, and always has been.
It’s true that when Zuckerberg assumes the hot seat—at a joint hearing of the Senate Committee on the Judiciary and the Senate Commerce, Science, and Transportation Committee on Tuesday, followed by the House Energy and Commerce Committee on Wednesday—lawmakers will undoubtedly grill the embattled tech founder on the specifics of how the private data of some 87 million Facebook users ended up in the hands of the shadowy British data firm SCL and its American offshoot, Cambridge Analytica. They will almost certainly interrogate him over how a Russian propaganda group managed to conduct a paid political influence campaign during the 2016 US election, and question him about why Facebook didn’t find and shut down another roughly 300 of these propaganda-spewing accounts and pages until just this week. They may try to pin him down on what sort of regulation he supports or ask for his predictions about the next big threat in election meddling.
‘This could be a missed opportunity if it’s focused on what happened in the last few months and what bills Congress can pass in the next 12 months.’
Siva Vaidhyanathan, University of Virginia
If history is any guide, Zuckerberg will come prepared with a full list of bullet points branded into his brain, acknowledging Facebook’s mistakes, admitting these are hard problems to solve, and explaining in detail the ways Facebook plans to fix each one. Zuckerberg struck this tone on a call with reporters Wednesday afternoon. But the most important question he may not be prepared to answer is far more philosophical. It has little to do with engineering fixes or the technicalities of Facebook’s privacy policies in 2014. It’s this: How can Facebook ward off tomorrow’s crisis if its guiding principle is and always has been connection at all costs, maximizing the flow of data between people and their friends—as well as advertisers and apps? And if it radically alters that ethos—the one that allowed it to grow to 2.2 billion users and counting—can it sustain itself?
“This could be a missed opportunity if it’s focused on what happened in the last few months and what bills Congress can pass in the next 12 months,” says Siva Vaidhyanathan, a professor of media studies at the University of Virginia and author of the forthcoming book Antisocial Media: How Facebook Disconnects Us.
Facebook executives have often been called to Congress to answer for the most recent headlines. This past January, Facebook’s head of global policy management Monika Bickert testified about the strides the company has made in detecting terrorist content with machine learning and image recognition technology. Last fall, the company dispatched general counsel Colin Stretch to answer questions about the Russian Internet Research Agency’s infiltration of the platform. Stretch spouted off numbers about the percentage of ads purchased by the Russian trolls, the reach of those ads, and the total posts these phony accounts shared. His careful defenses and roundabout responses left some members of Congress feeling unfulfilled.
“I went home last night with profound disappointment. I asked specific questions. I got vague answers, and that just won’t do,” California senator Dianne Feinstein said during the Senate Select Intelligence Committee hearing in November.
Cambridge Analytica is just the poster child for a far more pervasive problem.
Zuckerberg will likely walk lawmakers through the changes Facebook has made since the Cambridge Analytica scandal broke—and there have been many. Just Wednesday, Facebook announced a slew of restrictions to its APIs that make it harder for app developers to collect data from Facebook users. Previously, the company had announced forthcoming audits of app developers to see what data they had collected under Facebook’s previous, looser rules. And of course, Democratic lawmakers in particular will be all too happy to press Zuckerberg on Cambridge Analytica, which was a vendor to President Trump’s 2016 campaign. Zuckerberg, who has already cut both Cambridge Analytica and SCL off from Facebook, will undoubtedly oblige.
But lawmakers who listen closely will realize what Facebook advertisers and app developers already know: Cambridge Analytica is just the poster child for a far more pervasive problem. Cambridge Analytica was only able to harvest so much data because Facebook’s policies at the time allowed developers to do so, and the company had no safeguards in place to ensure developers wouldn’t share that data. Those policies existed because, throughout Facebook’s entire history, Zuckerberg and his staff have believed that connectivity and sharing do more good than harm, and the company has relied on those connections to maximize profits.
That thesis may be evolving, but it hasn’t changed entirely. On Wednesday, Zuckerberg maintained that when forced to choose between privacy and a better, more targeted experience on Facebook, “the feedback is overwhelmingly on the side of wanting a better experience.”
That may be. But if Facebook wants to truly explain why all this has happened—why terrorists have been radicalized on Facebook, why fake news has proliferated, why foreign actors can buy political ads, and why data gets passed around with minimal oversight—Zuckerberg is the only person qualified to provide the real answer: This is how Facebook was designed to work.
This article was syndicated from wired.com