A decade ago, in the social media business book Groundswell, Charlene Li and I described online social technologies as an uncontrollable grass-roots movement. I had hoped social media would be a force for good. But now Facebook, along with its subsidiary Instagram, dominates that movement. It controls more of our collective attention – and gathers more of our collective data – than any other entity on earth. And it’s flailing.
Facebook has now admitted that Cambridge Analytica abused its data to influence the election; that its job, housing, and loan ads were discriminatory; that it shared data with 150 other companies without users’ permission; that its advertisers can target users based on phone numbers supplied solely for security purposes; that it stored hundreds of millions of unencrypted passwords in files accessible to its employees; and finally, that it broadcast the Christchurch massacre live. And that’s just in the last year or so.
We rigorously regulate financial institutions like JP Morgan Chase that are “too big to fail” because their failure would crater the world economy. But the attention economy that Facebook dominates is far more pervasive in our lives. Facebook now counts 2.3 billion active users; the average American user spends nearly an hour a day there, plus another 53 minutes on Facebook’s Instagram platform.
Facebook’s failure to behave responsibly with our data and our attention is a threat to civil society. It’s too big to flail the way it does.
Facebook CEO Mark Zuckerberg’s endless string of apologies can’t fix its repeated violations of trust. Neither can a welter of privacy settings that just emphasize how its default is to gather data on nearly everything. To ensure the public good, we regulate companies that build useful, compelling, or addictive products. We require seat belts in cars, determine how drug makers are allowed to interact with doctors, and block cigarette companies from advertising. What should we do about Facebook?
We must regulate it in two key areas: data stewardship and newsfeed fairness.
Start with data. Facebook has historically treated our data as its property. Forrester Research vice president and privacy expert Fatemeh Khatibloo explains, “Facebook has been incredibly cavalier and paternalistic when it comes to how they use people’s data. Their perspective is that the world must be open and connected. That is not good and right for everyone, and yet Facebook has unilaterally made the decision to make the world more open and connected for everyone.”
Last December the New York Times exposed how Facebook shared our data with its partners, allowing Amazon to get our friends’ contact information and Netflix to access our private messages. It then used a loophole in its own terms and conditions to justify this access. As Christian J. Ward, data partnership expert and coauthor of Data Leverage, explains, Facebook “acted as though those major companies were acting as its agencies, so it didn’t have to disclose the data sharing.”
The remedy starts with reviving the government’s moribund regulatory regime. In Move Fast and Break Things, Jonathan Taplin’s eye-opening book about the excesses of tech companies, he explains how Robert Bork’s libertarian theories have neutered antitrust enforcement, restricting it to cases of unfair price manipulation. “Google, Amazon, and Facebook are all monopolies that would be prosecuted under antitrust statutes if it hadn’t been for Robert Bork,” he writes. Some legislators and candidates agree; Rhode Island Congressman David N. Cicilline has called for the government to investigate Facebook under antitrust laws, and congressional candidate Brianna Wu says “an antitrust conversation is long overdue.”
Splitting Facebook from its acquisitions Instagram and WhatsApp, as Senator Elizabeth Warren proposes, wouldn’t reform its culture of data abuse. Instead, the Federal Trade Commission (FTC) should immediately initiate enforcement against Facebook for its repeated abuse of users’ data – and for skirting its 2012 FTC consent decree about data-sharing. The next settlement must include monthly public reporting on all new data-related programs and relationships.
Rather than attempting to undo past acquisitions, the FTC should block what’s happening now: Zuckerberg’s recently announced plan to integrate messaging within Messenger, Instagram, and WhatsApp. The government should also block future acquisition of social media or messaging competitors, like the video sharing social network TikTok.
And what of the news feed? While Facebook says it will de-emphasize the feed, it remains a powerful influence on consumers. Even as Facebook hides behind the Communications Decency Act of 1996, which says that it is not liable for content others post, it blocks nudity and ejects provocateurs like Infowars’ Alex Jones for inciting violence and hate. As author Taplin says, “They need to begin to take responsibility for what’s on their platforms.”
I don’t want the government deciding what’s shown on Facebook. I want Facebook to be clearer about how it decides. The next FTC consent decree should require Facebook to invest far more in the detection and blocking of pernicious content. It must clearly and publicly describe how the algorithm works and how it is changing. And it should make that algorithm available for testing by news organizations, non-profits, and even advocacy organizations. A media property with this much influence should be subject to regular audits for bias by groups across the political spectrum.
Regulation will gum up Facebook’s operations a bit. But with its 37 to 40 per cent profit margins, it can easily afford to spend more on compliance, transparency, and fairness. As a Globe article earlier this month pointed out, regulation and antitrust actions didn’t kill IBM, Microsoft, or AT&T – they simply made it possible for competitors to generate a diversity of new ways to interact.
As my Groundswell coauthor, Altimeter Group senior fellow Charlene Li, reminded me, Facebook is now “screwed” by the groundswell of its own users’ desires. “They can’t control what people want to share, it’s an absolutely impossible situation,” she says. But we can hold the company to a higher standard of data stewardship – and require a lot more transparency about how it works. Otherwise it will continue to use our data and attention to its own benefit, amplifying the worst of our impulses as it flails. Regulators must act, because a company this wired into our body politic is simply too big to flail.
About this post: This is the op-ed I originally submitted to the Boston Globe, presented here for your evaluation and enjoyment. I researched it with eight interviews over the course of a week. (They requested a rewrite because it was too similar to other op-eds they had requested on the same topic; they eventually published this version.)