The Case for Regulating Facebook

Photo from Flickr.

On April 10 and 11, 2018, Mark Zuckerberg stood before congressional panels and testified on behalf of his company Facebook, the social media giant.

The stakes of the hearing were high. What Zuckerberg said would shape Facebook’s future and determine how much, or how little, the federal government would intervene.

The charge? The mismanagement of 87 million Facebook users’ data, a veritable swarm of Russian bots and fake ads, and inadequate responses to both breaches.

The nature of this mismanagement is long and complicated, involving not only a political research firm called Cambridge Analytica and a researcher named Aleksandr Kogan, but also the separate issue of the Russian government’s efforts to undermine the 2016 election, and the question of a social media monopoly.

After four hours of softball questions, Kamala Harris (D-CA) addressed Zuckerberg. “You personally, as CEO, became aware in December of 2015 that Dr. Kogan and Cambridge Analytica misappropriated data from 87 million Facebook users. That’s 27 months ago.”

That is the heart of the issue. Facebook knew about the breach in 2015. Before the New York Times broke the story, the company repeatedly downplayed its scope to reporters, releasing a statement only after it was too late.

But nothing was done to close the breach in 2015. Nothing was done about Russian ad buys in the 2016 election, and Zuckerberg himself made his media rounds and dismissed the idea that Russia could influence the election as “crazy.”

Facebook later admitted in written testimony to Congress that Russian operatives published 80,000 posts on its platform between 2015 and 2017 in an effort to influence the election.

The pattern is simple. When did they know? Early. What did they do? Nothing, until they were exposed, and then they did the bare minimum: banning Cambridge Analytica, releasing a statement, testifying before Congress.

“I actually am not sure we shouldn’t be regulated,” Zuckerberg told CNN’s Laurie Segall, before pivoting to Facebook’s self-regulation tools: more transparency about ads.

But it is not enough.

Big business can’t be trusted to self-regulate. We trust the government to enforce anti-monopoly laws, to test paint for lead, to regulate emissions, and to enforce OSHA compliance. The same should be true of the tech industry, as new and unexplored as that territory is.

That is why it’s truly worrisome to watch 80-year-old congressmen fumble their opportunities to get answers from Zuckerberg. That these senior politicians are all that stands between us and a world where our information is free to be exploited by social media firms is a frightening thought, but it is the reality.

The government should act decisively to leash social media companies.

Last October, Sens. Amy Klobuchar (D-MN), Mark Warner (D-VA), and John McCain (R-AZ) introduced the Honest Ads Act, which would regulate online advertising the same way print and television ads are regulated. Social media companies would have to disclose which groups are running political ads and work to keep foreign agents from taking advantage of their systems.

Zuckerberg came out in support of this measure, which raises suspicions that it’s exactly the sort of half-measure that Facebook wants. Not enough.

By contrast, a California ballot initiative up this November, the California Consumer Privacy Act, would require tech firms to disclose whom they share users’ information with and how they sell it, and would empower customers to refuse to allow their information to be shared in those ways. Facebook, along with AT&T, Comcast, and others, sharply opposes this measure.

But Facebook had its chance to decide what was done, and that chance was in 2015, when these problems first showed themselves. When capitalism fails to self-regulate, it becomes the government’s responsibility to safeguard the public. It is now necessary to protect our private information, to shackle Facebook with the responsibility of safeguarding that information, and to hold the company liable for its failures.